Medicare Program; Hospital Inpatient Prospective Payment Systems for Acute Care Hospitals and the Long-Term Care Hospital Prospective Payment System and Fiscal Year 2015 Rates; Quality Reporting Requirements for Specific Providers; Reasonable Compensation Equivalents for Physician Services in Excluded Hospitals and Certain Teaching Hospitals; Provider Administrative Appeals and Judicial Review; Enforcement Provisions for Organ Transplant Centers; and Electronic Health Record (EHR) Incentive Program

79 Fed. Reg. 49853 (Aug. 22, 2014)

AGENCY:

Centers for Medicare & Medicaid Services (CMS), HHS.

ACTION:

Final rule.

SUMMARY:

We are revising the Medicare hospital inpatient prospective payment systems (IPPS) for operating and capital-related costs of acute care hospitals to implement changes arising from our continuing experience with these systems. Some of these changes implement certain statutory provisions contained in the Patient Protection and Affordable Care Act and the Health Care and Education Reconciliation Act of 2010 (collectively known as the Affordable Care Act), the Protecting Access to Medicare Act of 2014, and other legislation. These changes are applicable to discharges occurring on or after October 1, 2014, unless otherwise specified in this final rule. We also are updating the rate-of-increase limits for certain hospitals excluded from the IPPS that are paid on a reasonable cost basis subject to these limits. The updated rate-of-increase limits are effective for cost reporting periods beginning on or after October 1, 2014.

We also are updating the payment policies and the annual payment rates for the Medicare prospective payment system (PPS) for inpatient hospital services provided by long-term care hospitals (LTCHs) and implementing certain statutory changes to the LTCH PPS under the Affordable Care Act and the Pathway for Sustainable Growth Rate (SGR) Reform Act of 2013 and the Protecting Access to Medicare Act of 2014. In addition, we discuss our proposals on the interruption of stay policy for LTCHs and on retiring the “5 percent” payment adjustment for co-located LTCHs. While many of the statutory mandates of the Pathway for SGR Reform Act apply to discharges occurring on or after October 1, 2014, others will not begin to apply until 2016 and beyond.

In addition, we are making a number of changes relating to direct graduate medical education (GME) and indirect medical education (IME) payments. We are establishing new requirements or revising requirements for quality reporting by specific providers (acute care hospitals, PPS-exempt cancer hospitals, and LTCHs) that are participating in Medicare.

We are updating policies relating to the Hospital Value-Based Purchasing (VBP) Program, the Hospital Readmissions Reduction Program, and the Hospital-Acquired Condition (HAC) Reduction Program. In addition, we are making technical corrections to the regulations governing provider administrative appeals and judicial review; updating the reasonable compensation equivalent (RCE) limits, and revising the methodology for determining such limits, for services furnished by physicians to certain teaching hospitals and hospitals excluded from the IPPS; making regulatory revisions to broaden the specified uses of Medicare Advantage (MA) risk adjustment data and to specify the conditions for release of such risk adjustment data to entities outside of CMS; and making changes to the enforcement procedures for organ transplant centers.

We are aligning the reporting and submission timelines for clinical quality measures for the Medicare EHR Incentive Program for eligible hospitals and critical access hospitals (CAHs) with the reporting and submission timelines for the Hospital IQR Program. In addition, we provide guidance and clarification of certain policies for eligible hospitals and CAHs, such as our policy for reporting zero denominators on clinical quality measures and our policy for case threshold exemptions.

In this document, we are finalizing two interim final rules with comment period relating to criteria for disproportionate share hospital uncompensated care payments and extensions of temporary changes to the payment adjustment for low-volume hospitals and of the Medicare-Dependent, Small Rural Hospital (MDH) Program.

DATES:

Effective Date: These final rules are effective on October 1, 2014.

Applicability Dates: The amendments to 42 CFR 405.1811 and 405.1835 are applicable to appeals based on untimely contractor determinations that are pending or were filed on or after August 21, 2008, subject to the rules of administrative finality and reopening at 42 CFR 405.1807 and 405.1885. The provisions discussed in section IV.I.4.c. of the preamble of this final rule are applicable on or after July 1, 2015; and the provisions discussed in section IV.I.5.a. of the preamble of this final rule are applicable on or after January 1, 2015.

FOR FURTHER INFORMATION CONTACT:

Ing-Jye Cheng, (410) 786-4548 and Donald Thompson, (410) 786-4487, Operating Prospective Payment, MS-DRGs, Hospital-Acquired Conditions (HAC), Wage Index, New Medical Service and Technology Add-On Payments, Hospital Geographic Reclassifications, Graduate Medical Education, Capital Prospective Payment, Excluded Hospitals, and Medicare Disproportionate Share Hospital (DSH) Issues.

Michele Hudson, (410) 786-4487, and Judith Richter, (410) 786-2590, Long-Term Care Hospital Prospective Payment System and MS-LTC-DRG Relative Weights Issues.

Siddhartha Mazumdar, (410) 786-6673, Rural Community Hospital Demonstration Program Issues.

James Poyer, (410) 786-2261, Hospital Inpatient Quality Reporting and Hospital Value-Based Purchasing—Program Administration, Validation, and Reconsideration Issues.

Pierre Yong, (410) 786-8896, Hospital Inpatient Quality Reporting—Measures Issues Except Hospital Consumer Assessment of Healthcare Providers and Systems Issues; and Readmission Measures for Hospitals Issues.

Elizabeth Goldstein, (410) 786-6665, Hospital Inpatient Quality Reporting—Hospital Consumer Assessment of Healthcare Providers and Systems Measures Issues.

Mary Pratt, (410) 786-6867, LTCH Quality Data Reporting Issues.

Kim Spalding Bush, (410) 786-3232, Hospital Value-Based Purchasing Efficiency Measures Issues.

James Poyer, (410) 786-2261, PPS-Exempt Cancer Hospital Quality Reporting Issues.

Kellie Shannon, (410) 786-0416, Administrative Appeals by Providers and Judicial Review Issues.

Amelia Citerone, (410) 786-3901, and Robert Kuhl, (410) 786-4597, Reasonable Compensation Equivalent (RCE) Limits for Physician Services Provided in Providers.

Anne Calinger, (410) 786-3396, and Jennifer Harlow, (410) 786-4549, Medicare Advantage Risk Adjustment Data Issues.

Thomas Hamilton, (410) 786-6763, Organ Transplant Center Issues.

Jennifer Phillips, (410) 786-1023, 2-Midnight Rule Benchmark Issues.

SUPPLEMENTARY INFORMATION:

Electronic Access

This Federal Register document is also available from the Federal Register online database through the Federal Digital System (FDsys), a service of the U.S. Government Printing Office. This database can be accessed via the Internet at: http://www.gpo.gov/fdsys.

Tables Available Only Through the Internet on the CMS Web site

In the past, a majority of the tables referred to throughout this preamble and in the Addendum to the proposed rule and the final rule were published in the Federal Register as part of the annual proposed and final rules. However, beginning in FY 2012, some of the IPPS tables and LTCH PPS tables are no longer published in the Federal Register. Instead, these tables are available only through the Internet. The IPPS tables for this final rule are available only through the Internet on the CMS Web site at: http://www.cms.hhs.gov/Medicare/medicare-Fee-for-Service-Payment/AcuteInpatientPPS/index.html. Click on the link on the left side of the screen titled, “FY 2015 IPPS Final Rule Home Page” or “Acute Inpatient—Files for Download”. The LTCH PPS tables for this FY 2015 final rule are available only through the Internet on the CMS Web site at: http://www.cms.gov/Medicare/Medicare-Fee-for-Service-Payment/LongTermCareHospitalPPS/index.html under the list item for Regulation Number CMS-1607-F. For complete details on the availability of the tables referenced in this final rule, we refer readers to section VI. of the Addendum to this final rule.

Readers who experience any problems accessing any of the tables that are posted on the CMS Web sites identified above should contact Michael Treitel at (410) 786-4552.

Acronyms

3M 3M Health Information System

AAMC Association of American Medical Colleges

ACGME Accreditation Council for Graduate Medical Education

ACoS American College of Surgeons

AHA American Hospital Association

AHIC American Health Information Community

AHIMA American Health Information Management Association

AHRQ Agency for Healthcare Research and Quality

AJCC American Joint Committee on Cancer

ALOS Average length of stay

ALTHA Acute Long Term Hospital Association

AMA American Medical Association

AMGA American Medical Group Association

AMI Acute myocardial infarction

AOA American Osteopathic Association

APR DRG All Patient Refined Diagnosis Related Group System

APRN Advanced practice registered nurse

ARRA American Recovery and Reinvestment Act of 2009, Pub. L. 111-5

ASCA Administrative Simplification Compliance Act of 2002, Pub. L. 107-105

ASITN American Society of Interventional and Therapeutic Neuroradiology

ATRA American Taxpayer Relief Act of 2012, Pub. L. 112-240

BBA Balanced Budget Act of 1997, Pub. L. 105-33

BBRA Medicare, Medicaid, and SCHIP [State Children's Health Insurance Program] Balanced Budget Refinement Act of 1999, Pub. L. 106-113

BIPA Medicare, Medicaid, and SCHIP [State Children's Health Insurance Program] Benefits Improvement and Protection Act of 2000, Pub. L. 106-554

BLS Bureau of Labor Statistics

CABG Coronary artery bypass graft [surgery]

CAH Critical access hospital

CARE [Medicare] Continuity Assessment Record & Evaluation [Instrument]

CART CMS Abstraction & Reporting Tool

CAUTI Catheter-associated urinary tract infection

CBSAs Core-based statistical areas

CC Complication or comorbidity

CCN CMS Certification Number

CCR Cost-to-charge ratio

CDAC [Medicare] Clinical Data Abstraction Center

CDAD Clostridium difficile-associated disease

CDC Centers for Disease Control and Prevention

CERT Comprehensive error rate testing

CDI Clostridium difficile infection (C. difficile)

CFR Code of Federal Regulations

CLABSI Central line-associated bloodstream infection

CIPI Capital input price index

CMI Case-mix index

CMS Centers for Medicare & Medicaid Services

CMSA Consolidated Metropolitan Statistical Area

COBRA Consolidated Omnibus Budget Reconciliation Act of 1985, Pub. L. 99-272

COLA Cost-of-living adjustment

CoP [Hospital] condition of participation

COPD Chronic obstructive pulmonary disease

CPI Consumer price index

CQM Clinical quality measure

CRNA Certified registered nurse anesthetist

CY Calendar year

DACA Data Accuracy and Completeness Acknowledgement

DPP Disproportionate patient percentage

DRA Deficit Reduction Act of 2005, Pub. L. 109-171

DRG Diagnosis-related group

DSH Disproportionate share hospital

EBRT External Beam Radiotherapy

ECI Employment cost index

eCQM Electronic clinical quality measure

EDB [Medicare] Enrollment Database

EHR Electronic health record

EMR Electronic medical record

EMTALA Emergency Medical Treatment and Labor Act of 1986, Pub. L. 99-272

EP Eligible professional

FAH Federation of American Hospitals

FDA Food and Drug Administration

FFY Federal fiscal year

FPL Federal poverty line

FQHC Federally qualified health center

FR Federal Register

FTE Full-time equivalent

FY Fiscal year

GAF Geographic Adjustment Factor

GME Graduate medical education

HAC Hospital-acquired condition

HAI Healthcare-associated infection

HCAHPS Hospital Consumer Assessment of Healthcare Providers and Systems

HCFA Health Care Financing Administration

HCO High-cost outlier

HCRIS Hospital Cost Report Information System

HHA Home health agency

HHS Department of Health and Human Services

HICAN Health Insurance Claims Account Number

HIPAA Health Insurance Portability and Accountability Act of 1996, Pub. L. 104-191

HIPC Health Information Policy Council

HIS Health information system

HIT Health information technology

HMO Health maintenance organization

HPMP Hospital Payment Monitoring Program

HSA Health savings account

HSCRC [Maryland] Health Services Cost Review Commission

HSRV Hospital-specific relative value

HSRVcc Hospital-specific relative value cost center

HQA Hospital Quality Alliance

HQI Hospital Quality Initiative

IRB Intern- and Resident-to-Bed Ratio

ICD-9-CM International Classification of Diseases, Ninth Revision, Clinical Modification

ICD-10-CM International Classification of Diseases, Tenth Revision, Clinical Modification

ICD-10-PCS International Classification of Diseases, Tenth Revision, Procedure Coding System

ICR Information collection requirement

IGI IHS Global Insight, Inc.

IHS Indian Health Service

IME Indirect medical education

I-O Input-Output

IOM Institute of Medicine

IPF Inpatient psychiatric facility

IPFQR Inpatient Psychiatric Facility Quality Reporting [Program]

IPPS [Acute care hospital] inpatient prospective payment system

IRF Inpatient rehabilitation facility

IQR Inpatient Quality Reporting

LAMCs Large area metropolitan counties

LOS Length of stay

LTC-DRG Long-term care diagnosis-related group

LTCH Long-term care hospital

LTCHQR Long-Term Care Hospital Quality Reporting

MA Medicare Advantage

MAC Medicare Administrative Contractor

MAP Measure Applications Partnership

MCC Major complication or comorbidity

MCE Medicare Code Editor

MCO Managed care organization

MDC Major diagnostic category

MDH Medicare-dependent, small rural hospital

MedPAC Medicare Payment Advisory Commission

MedPAR Medicare Provider Analysis and Review File

MEI Medicare Economic Index

MGCRB Medicare Geographic Classification Review Board

MIEA-TRHCA Medicare Improvements and Extension Act, Division B of the Tax Relief and Health Care Act of 2006, Pub. L. 109-432

MIPPA Medicare Improvements for Patients and Providers Act of 2008, Pub. L. 110-275

MMA Medicare Prescription Drug, Improvement, and Modernization Act of 2003, Pub. L. 108-173

MMEA Medicare and Medicaid Extenders Act of 2010, Pub. L. 111-309

MMSEA Medicare, Medicaid, and SCHIP Extension Act of 2007, Pub. L. 110-173

MRHFP Medicare Rural Hospital Flexibility Program

MRSA Methicillin-resistant Staphylococcus aureus

MSA Metropolitan Statistical Area

MS-DRG Medicare severity diagnosis-related group

MS-LTC-DRG Medicare severity long-term care diagnosis-related group

MU Meaningful Use [EHR Incentive Program]

NAICS North American Industry Classification System

NALTH National Association of Long Term Hospitals

NCD National coverage determination

NCHS National Center for Health Statistics

NCQA National Committee for Quality Assurance

NCVHS National Committee on Vital and Health Statistics

NECMA New England County Metropolitan Areas

NHSN National Healthcare Safety Network

NOP Notice of Participation

NQF National Quality Forum

NQS National Quality Strategy

NTIS National Technical Information Service

NTTAA National Technology Transfer and Advancement Act of 1995, Pub. L. 104-113

NVHRI National Voluntary Hospital Reporting Initiative

OACT [CMS] Office of the Actuary

OBRA 86 Omnibus Budget Reconciliation Act of 1986, Pub. L. 99-509

OES Occupational employment statistics

OIG Office of the Inspector General

OMB [Executive] Office of Management and Budget

OPM [U.S.] Office of Personnel Management

OQR [Hospital] Outpatient Quality Reporting

O.R. Operating room

OSCAR Online Survey Certification and Reporting [System]

PAMA Protecting Access to Medicare Act of 2014, Pub. L. 113-93

PCH PPS-exempt cancer hospital

PCHQR PPS-exempt cancer hospital quality reporting

PMSAs Primary metropolitan statistical areas

POA Present on admission

PPI Producer price index

PPS Prospective payment system

PRM Provider Reimbursement Manual

ProPAC Prospective Payment Assessment Commission

PRRB Provider Reimbursement Review Board

PRTFs Psychiatric residential treatment facilities

PSF Provider-Specific File

PSI Patient safety indicator

PS&R Provider Statistical and Reimbursement [System]

PQRS Physician Quality Reporting System

QIG Quality Improvement Group [CMS]

QIO Quality Improvement Organization

QRDA Quality Reporting Document Architecture

RCE Reasonable compensation equivalent

RFA Regulatory Flexibility Act, Pub. L. 96-354

RHC Rural health clinic

RHQDAPU Reporting hospital quality data for annual payment update

RNHCI Religious nonmedical health care institution

RPL Rehabilitation psychiatric long-term care (hospital)

RRC Rural referral center

RSMR Risk-standardized mortality rate

RSRR Risk-standardized readmission rate

RTI Research Triangle Institute, International

RUCAs Rural-urban commuting area codes

RY Rate year

SAF Standard Analytic File

SCH Sole community hospital

SCIP Surgical Care Improvement Project

SFY State fiscal year

SIC Standard Industrial Classification

SNF Skilled nursing facility

SOCs Standard occupational classifications

SOM State Operations Manual

SSI Surgical site infection

SSI Supplemental Security Income

SSO Short-stay outlier

SUD Substance use disorder

TEFRA Tax Equity and Fiscal Responsibility Act of 1982, Pub. L. 97-248

TEP Technical expert panel

THA/TKA Total hip arthroplasty/Total knee arthroplasty

TMA TMA [Transitional Medical Assistance], Abstinence Education, and QI [Qualifying Individuals] Programs Extension Act of 2007, Pub. L. 110-90

TPS Total Performance Score

UHDDS Uniform hospital discharge data set

UMRA Unfunded Mandates Reform Act of 1995, Pub. L. 104-4

VBP [Hospital] Value-Based Purchasing [Program]

VTE Venous thromboembolism

Table of Contents

I. Executive Summary and Background

A. Executive Summary

1. Purpose and Legal Authority

2. Summary of the Major Provisions

3. Summary of Costs and Benefits

B. Summary

1. Acute Care Hospital Inpatient Prospective Payment System (IPPS)

2. Hospitals and Hospital Units Excluded From the IPPS

3. Long-Term Care Hospital Prospective Payment System (LTCH PPS)

4. Critical Access Hospitals (CAHs)

5. Payments for Graduate Medical Education (GME)

C. Summary of Provisions of Recent Legislation Discussed in This Final Rule

1. Patient Protection and Affordable Care Act (Pub. L. 111-148) and the Health Care and Education Reconciliation Act of 2010 (Pub. L. 111-152)

2. American Taxpayer Relief Act of 2012 (Pub. L. 112-240)

3. Pathway for Sustainable Growth Rate (SGR) Reform Act of 2013 (Pub. L. 113-67)

4. Protecting Access to Medicare Act of 2014 (Pub. L. 113-93)

D. Issuance of Notice of Proposed Rulemaking

E. Public Comments Received in Response to the FY 2015 IPPS/LTCH PPS Proposed Rule

F. Finalization of Interim Final Rule With Comment Period on Extension of Payment Adjustment for Low-Volume Hospitals and the MDH Program

G. Finalization of Interim Final Rule With Comment Period Related to Changes to Certain Cost Reporting Procedures for Disproportionate Share Hospital Uncompensated Care Payments

II. Changes to Medicare Severity Diagnosis-Related Group (MS-DRG) Classifications and Relative Weights

A. Background

B. MS-DRG Reclassifications

C. Adoption of the MS-DRGs in FY 2008

D. FY 2015 MS-DRG Documentation and Coding Adjustment

1. Background on the Prospective MS-DRG Documentation and Coding Adjustments for FY 2008 and FY 2009 Authorized by Pub. L. 110-90

2. Adjustment to the Average Standardized Amounts Required by Pub. L. 110-90

a. Prospective Adjustment Required by Section 7(b)(1)(A) of Pub. L. 110-90

b. Recoupment or Repayment Adjustments in FYs 2010 Through 2012 Required by Section 7(b)(1)(B) of Pub. L. 110-90

3. Retrospective Evaluation of FY 2008 and FY 2009 Claims Data

4. Prospective Adjustments for FY 2008 and FY 2009 Authorized by Section 7(b)(1)(A) of Pub. L. 110-90

5. Recoupment or Repayment Adjustment Authorized by Section 7(b)(1)(B) of Pub. L. 110-90

6. Recoupment or Repayment Adjustment Authorized by Section 631 of the American Taxpayer Relief Act of 2012 (ATRA)

7. Prospective Adjustment for the MS-DRG Documentation and Coding Effect Through FY 2010

E. Refinement of the MS-DRG Relative Weight Calculation

1. Background

2. Discussion for FY 2015

F. Adjustment to MS-DRGs for Preventable Hospital-Acquired Conditions (HACs), Including Infections for FY 2015

1. Background

2. HAC Selection

3. Present on Admission (POA) Indicator Reporting

4. HACs and POA Reporting in Preparation for Transition to ICD-10-CM and ICD-10-PCS

5. Current HACs and Previously Considered Candidate HACs

6. RTI Program Evaluation

7. Current and Previously Considered Candidate HACs—RTI Report on Evidence-Based Guidelines

G. Changes to Specific MS-DRG Classifications

1. Discussion of Changes to Coding System and Basis for MS-DRG Updates

a. Conversion of MS-DRGs to the International Classification of Diseases, 10th Revision (ICD-10)

b. Basis for FY 2015 MS-DRG Updates

2. MDC 1 (Diseases and Disorders of the Nervous System)

a. Intracerebral Therapies: Gliadel® Wafer

b. Endovascular Embolization or Occlusion of Head and Neck

3. MDC 4 (Diseases and Disorders of the Respiratory System): Avery Breathing Pacemaker System

4. MDC 5 (Diseases and Disorders of the Circulatory System)

a. Exclusion of Left Atrial Appendage

b. Transcatheter Mitral Valve Repair: MitraClip®

c. Endovascular Cardiac Valve Replacement Procedures

d. Abdominal Aorta Graft

5. MDC 8 (Diseases and Disorders of the Musculoskeletal System and Connective Tissue)

a. Shoulder Replacement Procedures

b. Ankle Replacement Procedures

c. Back and Neck Procedures

6. MDC 10 (Endocrine, Nutritional and Metabolic Diseases and Disorders): Disorders of Porphyrin Metabolism

7. MDC 15 (Newborns and Other Neonates With Conditions Originating in the Perinatal Period)

8. Medicare Code Editor (MCE) Changes

9. Changes to Surgical Hierarchies

10. Changes to the MS-DRG Diagnosis Codes for FY 2015

a. Major Complications or Comorbidities (MCCs) and Complications or Comorbidities (CCs) Severity Levels for FY 2015

b. Coronary Atherosclerosis Due to Calcified Coronary Lesion

11. Complications or Comorbidity (CC) Exclusions List

a. Background of the CC List and the CC Exclusions List

b. CC Exclusions List for FY 2015

12. Review of Procedure Codes in MS-DRGs 981 Through 983, 984 Through 986, and 987 Through 989

a. Moving Procedure Codes From MS-DRGs 981 Through 983 or MS-DRGs 987 Through 989 Into MDCs

b. Reassignment of Procedures Among MS-DRGs 981 Through 983, 984 Through 986, and 987 Through 989

c. Adding Diagnosis or Procedure Codes to MDCs

13. Changes to the ICD-9-CM Coding System

a. ICD-10 Coordination and Maintenance Committee

b. Code Freeze

14. Public Comments on Issues Not Addressed in the Proposed Rule

a. Request for Review and MS-DRG Assignment for ICD-9-CM Diagnosis Code 784.7 Reported with Procedure Code 39.75

b. Coding for Extracorporeal Membrane Oxygenation (ECMO) Procedures

c. Adding Severity Levels to MS-DRGs 245 Through 251

H. Recalibration of the FY 2015 MS-DRG Relative Weights

1. Data Sources for Developing the Relative Weights

2. Methodology for Calculation of the Relative Weights

3. Development of National Average CCRs

4. Bundled Payments for Care Improvement (BPCI) Initiative

I. Add-On Payments for New Services and Technologies

1. Background

2. Public Input Before Publication of a Notice of Proposed Rulemaking on Add-On Payments

3. FY 2015 Status of Technologies Approved for FY 2014 Add-On Payments

a. Glucarpidase (Trade Brand Voraxaze®)

b. DIFICIDTM (Fidaxomicin) Tablets

c. Zenith® Fenestrated Abdominal Aortic Aneurysm (AAA) Endovascular Graft

d. KcentraTM

e. Argus® II Retinal Prosthesis System

f. Zilver® PTX® Drug Eluting Stent

4. FY 2015 Applications for New Technology Add-On Payments

a. Dalbavancin (Durata Therapeutics, Inc.)

b. Heli-FXTM EndoAnchor System (Aptus Endosystems, Inc.)

c. CardioMEMSTM HF (Heart Failure) System

d. MitraClip® System

f. Responsive Neurostimulator (RNS®) System

III. Changes to the Hospital Wage Index for Acute Care Hospitals

A. Background

B. Core-Based Statistical Areas for the Hospital Wage Index

1. Background

2. Implementation of New Labor Market Area Delineations

a. Micropolitan Statistical Areas

b. Urban Counties That Became Rural Under the New OMB Delineations

c. Rural Counties That Became Urban Under the New OMB Delineations

d. Urban Counties That Moved to a Different Urban CBSA Under the New OMB Delineations

e. Transition Period

C. Worksheet S-3 Wage Data for the FY 2015 Wage Index

1. Included Categories of Costs

2. Excluded Categories of Costs

3. Use of Wage Index Data by Suppliers and Providers Other Than Acute Care Hospitals Under the IPPS

D. Verification of Worksheet S-3 Wage Data

E. Method for Computing the FY 2015 Unadjusted Wage Index

F. Occupational Mix Adjustment to the FY 2015 Wage Index

1. Development of Data for the FY 2015 Occupational Mix Adjustment Based on the 2010 Occupational Mix Survey

2. New 2013 Occupational Mix Survey for the FY 2016 Wage Index

3. Calculation of the Occupational Mix Adjustment for FY 2015

G. Analysis and Implementation of the Occupational Mix Adjustment and the FY 2015 Occupational Mix Adjusted Wage Index

1. Analysis of the Occupational Mix Adjustment and the Occupational Mix Adjusted Wage Index

2. Application of the Rural, Imputed, and Frontier Floors

a. Rural Floor

b. Imputed Floor and Alternative, Temporary Methodology for Computing the Rural Floor for FY 2015

c. Frontier Floor

3. FY 2015 Wage Index Tables

H. Revisions to the Wage Index Based on Hospital Redesignations and Reclassifications

1. General Policies and Effects of Reclassification and Redesignation

2. FY 2015 MGCRB Reclassifications

a. FY 2015 Reclassification Requirements and Approvals

b. Effects of Implementation of New OMB Labor Market Area Delineations on Reclassified Hospitals

c. Applications for Reclassifications for FY 2016

3. Hospitals Redesignated Under Section 1886(d)(8)(B) of the Act

a. New Lugar Areas for FY 2015

b. Hospitals Redesignated Under Section 1886(d)(8)(B) of the Act Seeking Reclassification by the MGCRB

c. Rural Counties No Longer Meeting the Criteria to be Redesignated as Lugar

4. Waiving Lugar Redesignation for the Out-Migration Adjustment

5. Update of Application of Urban to Rural Reclassification Criteria

I. FY 2015 Wage Index Adjustment Based on Commuting Patterns of Hospital Employees

J. Process for Requests for Wage Index Data Corrections

K. Notice of Change to Wage Index Development Timetable

L. Labor-Related Share for the FY 2015 Wage Index

IV. Other Decisions and Changes to the IPPS for Operating Costs and Graduate Medical Education (GME) Costs

A. Changes to MS-DRGs Subject to the Postacute Care Transfer Policy (§ 412.4)

B. Changes in the Inpatient Hospital Updates for FY 2015 (§§ 412.64(d) and 412.211(c))

1. FY 2015 Inpatient Hospital Update

2. FY 2015 Puerto Rico Hospital Update

C. Rural Referral Centers (RRCs): Annual Updates to Case-Mix Index (CMI) and Discharge Criteria (§ 412.96)

1. Case-Mix Index (CMI)

2. Discharges

D. Payment Adjustment for Low-Volume Hospitals (§ 412.101)

1. Background

2. Provisions of the Protecting Access to Medicare Act of 2014

3. Low-Volume Hospital Definition and Payment Adjustment for FY 2015

E. Indirect Medical Education (IME) Payment Adjustment (§ 412.105)

1. IME Adjustment Factor for FY 2015

2. IME Add-On Payments for Medicare Part C Discharges to Sole Community Hospitals (SCHs) That Are Paid According to Their Hospital-Specific Rates and Change in Methodology in Determining Payment to SCHs

3. Other Policy Changes Affecting IME

F. Payment Adjustment for Medicare Disproportionate Share Hospitals (DSHs) (§ 412.106)

1. Background

2. Impact on Medicare DSH Payment Adjustment of Implementation of New OMB Labor Market Area Delineations

3. Payment Adjustment Methodology for Medicare Disproportionate Share Hospitals (DSHs) under Section 3133 of the Affordable Care Act (§ 412.106)

a. General Discussion

b. Eligibility for Empirically Justified Medicare DSH Payments and Uncompensated Care Payments

c. Empirically Justified Medicare DSH Payments

d. Uncompensated Care Payments

e. Limitations on Review

G. Medicare-Dependent, Small Rural Hospital (MDH) Program (§ 412.108) and Sole Community Hospitals (§ 412.92)

1. Background for the MDH Program

2. PAMA of 2014 Provisions for FY 2015

3. Expiration of the MDH Program

4. Effects on MDHs of Adoption of New OMB Delineations

5. Effects on SCHs of Adoption of New OMB Delineations

H. Hospital Readmissions Reduction Program: Changes for FY 2015 Through FY 2017 (§§ 412.150 Through 412.154)

1. Statutory Basis for the Hospital Readmissions Reduction Program

2. Regulatory Background

3. Overview of Policies for the FY 2015 Hospital Readmissions Reduction Program

4. Refinement of the Readmissions Measures and Related Methodology for FY 2015 and Subsequent Years Payment Determinations

a. Refinement of Planned Readmission Algorithm for Acute Myocardial Infarction (AMI), Heart Failure (HF), Pneumonia (PN), Chronic Obstructive Pulmonary Disease (COPD), and Total Hip Arthroplasty and Total Knee Arthroplasty (THA/TKA) 30-Day Readmission Measures

b. Refinement of Total Hip Arthroplasty and Total Knee Arthroplasty (THA/TKA) 30-Day Readmission Measure Cohort

c. Anticipated Effect of Refinements on Measures

5. No Expansion of the Applicable Conditions for FY 2016

6. Expansion of the Applicable Conditions for FY 2017 To Include Patients Readmitted Following Coronary Artery Bypass Graft (CABG) Surgery Measure

a. Background

b. Overview of the CABG Readmissions Measure: Hospital-Level, 30-Day, All-Cause, Unplanned Readmission Following Coronary Artery Bypass Graft (CABG) Surgery

c. Methodology for the CABG Measure: Hospital-Level, 30-Day, All-Cause, Unplanned Readmission Following Coronary Artery Bypass Graft (CABG) Surgery

7. Maintenance of Technical Specifications for Quality Measures

8. Waiver From the Hospital Readmissions Reduction Program for Hospitals Formerly Paid under Section 1814(b)(3) of the Act (§ 412.152 and § 412.154(d))

9. Floor Adjustment Factor for FY 2015 (§ 412.154(c)(2))

10. Applicable Period for FY 2015

11. Inclusion of THA/TKA and COPD Readmissions Measures to Calculate Aggregate Payments for Excess Readmissions Beginning in FY 2015

12. Hospital Readmissions Reduction Program Extraordinary Circumstances Exceptions

I. Hospital Value-Based Purchasing (VBP) Program

1. Statutory Background

2. Overview of Previous Hospital VBP Program Rulemaking

3. FY 2015 Payment Details

a. Payment Adjustments

b. Base Operating DRG Payment Amount Definition for Medicare-Dependent, Small Rural Hospitals (MDHs)

4. Measures for the FY 2017 Hospital VBP Program

a. Measures Previously Adopted

b. Changes Affecting Topped-Out Measures

c. New Measures for the FY 2017 Hospital VBP Program

d. Adoption of the Current CLABSI Measure (NQF #0139) for the FY 2017 Hospital VBP Program

e. Summary of Previously Adopted and New Measures for the FY 2017 Hospital VBP Program

5. Additional Measures for the FY 2019 Hospital VBP Program

a. Hospital-level Risk-Standardized Complication Rate (RSCR) Following Elective Primary Total Hip Arthroplasty (THA) and Total Knee Arthroplasty (TKA)

b. PSI-90 Measure

6. Possible Measure Topics for Future Program Years

a. Care Transition Measure (CTM-3) Items for HCAHPS Survey

b. Possible Future Efficiency and Cost Reduction Domain Measure Topics

7. Previously Adopted and Final Performance Periods and Baseline Periods for the FY 2017 Hospital VBP Program

a. Background

b. Previously Adopted Baseline and Performance Periods for the FY 2017 Hospital VBP Program

c. Clinical Care—Process Domain Performance Period and Baseline Period for the FY 2017 Hospital VBP Program

d. Patient and Caregiver-Centered Experience of Care/Care Coordination Domain Performance Period and Baseline Period for the FY 2017 Hospital VBP Program

e. Performance Period and Baseline Period for NHSN Measures in the Safety Domain for the FY 2017 Hospital VBP Program

f. Efficiency and Cost Reduction Domain Performance Period and Baseline Period for the FY 2017 Hospital VBP Program

g. Summary of Previously Adopted and Finalized Performance Periods and Baseline Periods for the FY 2017 Hospital VBP Program

8. Previously Adopted and Finalized Performance Periods and Baseline Periods for Certain Measures for the FY 2019 Hospital VBP Program

a. Previously Adopted and Finalized Performance Period and Baseline Period for the FY 2019 Hospital VBP Program for Clinical Care—Outcomes Domain Measures

b. Performance Period and Baseline Period for the PSI-90 Safety Domain Measure for the FY 2019 Hospital VBP Program

c. Summary of Previously Adopted and Finalized Performance Periods and Baseline Periods for Certain Measures for the FY 2019 Hospital VBP Program

9. Performance Period and Baseline Period for the Clinical Care—Outcomes Domain for the FY 2020 Hospital VBP Program

10. Performance Standards for the Hospital VBP Program

a. Background

b. Performance Standards for the FY 2016 Hospital VBP Program

c. Previously Adopted Performance Standards for the FY 2017, FY 2018, and FY 2019 Hospital VBP Programs

d. Additional Performance Standards for the FY 2017 Hospital VBP Program

e. Performance Standards for the FY 2019 and FY 2020 Hospital VBP Programs

f. Technical Updates Policy for Performance Standards

g. Solicitation of Public Comments on ICD-10-CM/PCS Transition

11. FY 2017 Hospital VBP Program Scoring Methodology

a. General Hospital VBP Program Scoring Methodology

b. Domain Weighting for the FY 2017 Hospital VBP Program for Hospitals That Receive a Score on All Domains

c. Domain Weighting for the FY 2017 Hospital VBP Program for Hospitals Receiving Scores on Fewer Than Four Domains

12. Minimum Numbers of Cases and Measures for the FY 2016 and FY 2017 Hospital VBP Program's Quality Domains

a. Previously Adopted Minimum Numbers of Cases and FY 2016 Minimum Numbers of Cases

b. Minimum Number of Measures—Safety Domain

c. Minimum Number of Measures—Clinical Care Domain

d. Minimum Number of Measures—Efficiency and Cost Reduction Domain

e. Minimum Number of Measures—Patient and Caregiver Centered Experience of Care/Care Coordination (PEC/CC) Domain

13. Applicability of the Hospital VBP Program to Maryland Hospitals

14. Disaster/Extraordinary Circumstance Exception under the Hospital VBP Program

J. Hospital-Acquired Condition (HAC) Reduction Program

1. Background

2. Statutory Basis for the HAC Reduction Program

3. Implementation of the HAC Reduction Program for FY 2015

a. Overview

b. Payment Adjustment Under the HAC Reduction Program, Including Exemptions

c. Measure Selection and Conditions, Including Risk Adjustment Scoring Methodology

d. Criteria for Applicable Hospitals and Performance Scoring Policy

e. Reporting Hospital-Specific Information, Including the Review and Correction of Information

f. Limitation on Administrative and Judicial Review

4. Maintenance of Technical Specifications for Quality Measures

5. Extraordinary Circumstances Exceptions/Exemptions

6. Implementation of the HAC Reduction Program for FY 2016

a. Measure Selection and Conditions, including a Risk-Adjustment Scoring Methodology

b. Measure Risk Adjustment

c. Measure Calculation

d. Applicable Time Period

e. Criteria for Applicable Hospitals and Performance Scoring

f. Rules To Calculate the Total HAC Score for FY 2016

7. Future Consideration for the Use of Electronically Specified Measures

K. Payments for Indirect and Direct Graduate Medical Education (GME) Costs (§§ 412.105 and 413.75 through 413.83)

1. Background

2. Changes in the Effective Date of the FTE Resident Cap, 3-Year Rolling Average, and Intern- and Resident-to-Bed (IRB) Ratio Cap for New Programs in Teaching Hospitals

3. Changes to IME and Direct GME Policies as a Result of New OMB Labor Market Area Delineations

a. New Program FTE Cap Adjustment for Rural Hospitals Redesignated as Urban

b. Participation of Redesignated Hospitals in Rural Training Track

4. Clarification of Policies on Counting Resident Time in Nonprovider Settings Under Section 5504 of the Affordable Care Act

5. Changes to the Review and Award Process for Resident Slots Under Section 5506 of the Affordable Care Act

a. Effective Date of Slots Awarded Under Section 5506 of the Affordable Care Act

b. Removal of Seamless Requirement

c. Revisions to Ranking Criteria One, Seven, and Eight for Applications Under Section 5506

d. Clarification to Ranking Criterion Two Regarding Emergency Medicare GME Affiliation Agreements

6. Regulatory Clarification Applicable To Direct GME Payments to Federally Qualified Health Centers (FQHCs) and Rural Health Clinics (RHCs) for Training Residents in Approved Programs

L. Rural Community Hospital Demonstration Program

1. Background

2. FY 2015 Budget Neutrality Offset Amount

M. Requirement for Transparency of Hospital Charges Under the Affordable Care Act

1. Overview

2. Transparency Requirement Under the Affordable Care Act

N. Medicare Payment for Short Inpatient Hospital Stays

O. Suggested Exceptions to the 2-Midnight Benchmark

P. Finalization of Interim Final Rule With Comment Period on Extension of Payment Adjustment for Low-Volume Hospitals and the Medicare-Dependent, Small Rural Hospital (MDH) Program for FY 2014 Discharges Through March 31, 2014

1. Background

2. Summary of the Provisions of the Interim Final Rule With Comment Period

Q. Finalization of Interim Final Rule With Comment Period on Changes to Certain Cost Reporting Procedures Related to Disproportionate Share Hospital Uncompensated Care Payments

V. Changes to the IPPS for Capital-Related Costs

A. Overview

B. Additional Provisions

1. Exception Payments

2. New Hospitals

3. Hospitals Located in Puerto Rico

C. Annual Update for FY 2015

VI. Changes for Hospitals Excluded From the IPPS

A. Rate-of-Increase in Payments to Excluded Hospitals for FY 2015

B. Report on Adjustment (Exception) Payments

C. Updates to the Reasonable Compensation Equivalent (RCE) Limits on Compensation for Physician Services Provided in Providers (§ 415.70)

1. Background

2. Overview of the Current RCE Limits

a. Application of the RCE Limits

b. Exceptions to the RCE Limits

c. Methodology for Establishing the RCE Limits

3. Changes to the RCE Limits

D. Critical Access Hospitals (CAHs)

1. Background

2. Proposed and Final Policy Changes Related to Reclassifications as Rural for CAHs

3. Revision of the Requirements for Physician Certification of CAH Inpatient Services

VII. Changes to the Long-Term Care Hospital Prospective Payment System (LTCH PPS) for FY 2015

A. Background of the LTCH PPS

1. Legislative and Regulatory Authority

2. Criteria for Classification as an LTCH

a. Classification as an LTCH

b. Hospitals Excluded From the LTCH PPS

3. Limitation on Charges to Beneficiaries

4. Administrative Simplification Compliance Act (ASCA) and Health Insurance Portability and Accountability Act (HIPAA) Compliance

B. Medicare Severity Long-Term Care Diagnosis-Related Group (MS-LTC-DRG) Classifications and Relative Weights for FY 2015

1. Background

2. Patient Classifications into MS-LTC-DRGs

a. Background

b. Changes to the MS-LTC-DRGs for FY 2015

3. Development of the FY 2015 MS-LTC-DRG Relative Weights

a. General Overview of the Development of the MS-LTC-DRG Relative Weights

b. Development of the MS-LTC-DRG Relative Weights for FY 2015

c. Data

d. Hospital-Specific Relative Value (HSRV) Methodology

e. Treatment of Severity Levels in Developing the MS-LTC-DRG Relative Weights

f. Low-Volume MS-LTC-DRGs

g. Steps for Determining the FY 2015 MS-LTC-DRG Relative Weights

C. LTCH PPS Payment Rates for FY 2015

1. Overview of Development of the LTCH Payment Rates

2. FY 2015 LTCH PPS Annual Market Basket Update

a. Overview

b. Revision of Certain Market Basket Updates as Required by the Affordable Care Act

c. Adjustment to the Annual Update to the LTCH PPS Standard Federal Rate Under the Long-Term Care Hospital Quality Reporting (LTCHQR) Program

1. Background

2. Reduction to the Annual Update to the LTCH PPS Standard Federal Rate under the LTCHQR Program

d. Market Basket Under the LTCH PPS for FY 2015

e. Annual Market Basket Update for LTCHs for FY 2015

3. Adjustment for the Final Year of the Phase-In of the One-Time Prospective Adjustment to the Standard Federal Rate under § 412.523(d)(3)

D. Revision of LTCH PPS Geographic Classifications

1. Background

2. Use of New OMB Labor Market Area Delineations (“New OMB Delineations”)

a. Micropolitan Statistical Areas

b. Urban Counties That Became Rural Under the New OMB Labor Market Area Delineations

c. Rural Counties That Became Urban Under the New OMB Labor Market Area Delineations

d. Urban Counties That Moved to a Different Urban CBSA Under the New OMB Labor Market Area Delineations

e. Transition Period

E. Reinstatement and Extension of Certain Payment Rules for LTCH Services—The 25-Percent Threshold Payment Adjustment

1. Background

2. Implementation of Section 1206(b)(1) of Pub. L. 113-67

F. Discussion of the “Greater Than 3-Day Interruption of Stay” Policy and the Transfer to Onsite Providers Policies Under the LTCH PPS

G. Moratoria on the Establishment of LTCHs and LTCH Satellite Facilities and on the Increase in the Number of Beds in Existing LTCHs or LTCH Satellite Facilities

H. Evaluation and Treatment of LTCHs Classified Under Section 1886(d)(1)(B)(iv)(II) of the Act

I. Description of Statutory Framework for Patient-Level Criteria-Based Payment Adjustment Under the LTCH PPS Under Pub. L. 113-67

1. Overview

2. Additional LTCH PPS Issues

J. Technical Change

VIII. Administrative Appeals by Providers and Judicial Review

A. Proposed and Final Changes Regarding the Claims Required in Provider Cost Reports and for Provider Administrative Appeals

B. Proposed and Final Changes to Conform Terminology From “Intermediary” to “Contractor”

C. Technical Correction to § 405.1835 of the Regulations and Corresponding Amendment to § 405.1811 of the Regulations

1. Background and Technical Correction to §§ 405.1811 and 405.1835 of the Regulations

2. Waiver of Notice of Proposed Rulemaking

3. Effective Date and Applicability Date; Finality and Reopening

IX. Quality Data Reporting Requirements for Specific Providers and Suppliers

A. Hospital Inpatient Quality Reporting (IQR) Program

1. Background

a. History of the Hospital IQR Program

b. Maintenance of Technical Specifications for Quality Measures

c. Public Display of Quality Measures

2. Removal and Suspension of Hospital IQR Program Measures

a. Considerations in Removing Quality Measures From the Hospital IQR Program

b. Removal of Hospital IQR Program Measures for the FY 2017 Payment Determination and Subsequent Years

3. Process for Retaining Previously Adopted Hospital IQR Program Measures for Subsequent Payment Determinations

4. Additional Considerations in Expanding and Updating Quality Measures Under the Hospital IQR Program

5. Previously Adopted Hospital IQR Program Measures for the FY 2016 Payment Determination and Subsequent Years

6. Refinements and Clarification to Existing Measures in the Hospital IQR Program

a. Refinement of Planned Readmission Algorithm for 30-Day Readmission Measures

b. Refinement of Total Hip Arthroplasty and Total Knee Arthroplasty (THA/TKA) 30-Day Complication and Readmission Measures

c. Anticipated Effect of Refinements to Existing Measures

d. Clarification Regarding Influenza Vaccination for Healthcare Personnel

7. Additional Hospital IQR Program Measures for the FY 2017 Payment Determination and Subsequent Years

a. Hospital 30-day, All-Cause, Unplanned, Risk-Standardized Readmission Rate (RSRR) Following Coronary Artery Bypass Graft (CABG) Surgery

b. Hospital 30-day, All-Cause, Risk-standardized Mortality Rate (RSMR) Following Coronary Artery Bypass Graft (CABG) Surgery

c. Hospital-Level, Risk-Standardized 30-Day Episode-of-Care Payment Measure for Pneumonia

d. Hospital-Level, Risk-Standardized 30-Day Episode-of-Care Payment Measure for Heart Failure

e. Severe Sepsis and Septic Shock: Management Bundle Measure (NQF #0500)

f. Electronic Health Record-Based Voluntary Measures

g. Readoption of Measures as Voluntarily Reported Electronic Clinical Quality Measures

h. Electronic Clinical Quality Measures

8. Possible New Quality Measures and Measure Topics for Future Years

a. Mandatory Electronic Clinical Quality Measure Reporting for FY 2018 Payment Determination

b. Possible Future Electronic Clinical Quality Measures

9. Form, Manner, and Timing of Quality Data Submission

a. Background

b. Procedural Requirements for the FY 2017 Payment Determination and Subsequent Years

c. Data Submission Requirements for Chart-Abstracted Measures

d. Alignment of the Medicare EHR Incentive Program Reporting and Submission Timelines for Clinical Quality Measures With Hospital IQR Program Reporting and Submission Timelines

e. Sampling and Case Thresholds for the FY 2017 Payment Determination and Subsequent Years

f. HCAHPS Requirements for the FY 2017 Payment Determination and Subsequent Years

g. Data Submission Requirements for Structural Measures for the FY 2017 Payment Determination and Subsequent Years

h. Data Submission and Reporting Requirements for Healthcare-Associated Infection (HAI) Measures Reported via NHSN

10. Submission and Access of HAI Measures Data Through the CDC's NHSN Web Site

11. Modifications to the Existing Processes for Validation of Chart-Abstracted Hospital IQR Program Data

a. Eligibility Criteria for Hospitals Selected for Validation

b. Number of Charts To Be Submitted per Hospital for Validation

c. Combining Scores for HAI and Clinical Process of Care Topic Areas

d. Processes To Submit Patient Medical Records for Chart-Abstracted Measures

e. Plans To Validate Electronic Clinical Quality Measure Data

f. Data Submission Requirements for Quality Measures That May Be Voluntarily Electronically Reported for the FY 2017 Payment Determination

12. Data Accuracy and Completeness Acknowledgement Requirements for the FY 2017 Payment Determination and Subsequent Years

13. Public Display Requirements for the FY 2017 Payment Determination and Subsequent Years

14. Reconsideration and Appeal Procedures for the FY 2017 Payment Determination and Subsequent Years

15. Hospital IQR Program Extraordinary Circumstances Extensions or Exemptions

B. PPS-Exempt Cancer Hospital Quality Reporting (PCHQR) Program

1. Statutory Authority

2. Covered Entities

3. Previously Finalized PCHQR Program Quality Measures

4. Update to the Clinical Process/Oncology Care Measures Beginning With the FY 2016 Program

5. New Quality Measures Beginning With the FY 2017 Program

a. Considerations in the Selection of Quality Measures

b. New Quality Measure Beginning With the FY 2017 Program

6. Possible New Quality Measure Topics for Future Years

7. Maintenance of Technical Specifications for Quality Measures

8. Public Display Requirements Beginning With the FY 2014 Program

9. Form, Manner, and Timing of Data Submission Beginning With the FY 2017 Program

a. Background

b. Reporting Requirements for the Proposed New Measure: External Beam Radiotherapy for Bone Metastases (NQF #1822) Beginning With the FY 2017 Program

c. Reporting Options for the Clinical Process/Cancer Specific Treatment Measures Beginning With the FY 2015 Program and the SCIP and Clinical Process/Oncology Care Measures Beginning With the FY 2016 Program

d. New Sampling Methodology for the Clinical Process/Oncology Care Measures Beginning With the FY 2016 Program

10. Exceptions From Program Requirements

C. Long-Term Care Hospital Quality Reporting (LTCHQR) Program

1. Background

2. General Considerations Used for Selection of Quality Measures for the LTCHQR Program

3. Policy for Retention of LTCHQR Program Measures Adopted for Previous Payment Determinations

4. Policy for Adopting Changes to LTCHQR Program Measures

5. Previously Adopted Quality Measures

a. Previously Adopted Quality Measures for the FY 2015 and FY 2016 Payment Determinations and Subsequent Years

b. Previously Adopted Quality Measures for the FY 2017 and FY 2018 Payment Determinations and Subsequent Years

6. Revision to Data Collection Timelines and Submission Deadlines for Previously Adopted Quality Measures

a. Revisions to Data Collection Timelines and Submission Deadlines for Percent of Residents or Patients Who Were Assessed and Appropriately Given the Seasonal Influenza Vaccine (Short Stay) (NQF #0680)

b. Revisions to Data Collection Timelines and Submission Deadlines for the Application of Percent of Residents Experiencing One or More Falls With Major Injury (Long Stay) (NQF #0674)

7. New LTCHQR Program Quality Measures for the FY 2018 Payment Determination and Subsequent Years

a. New LTCHQR Program Functional Status Quality Measures for the FY 2018 Payment Determination and Subsequent Years

b. Quality Measure: National Healthcare Safety Network (NHSN) Ventilator-Associated Event (VAE) Outcome Measure

8. LTCHQR Program Quality Measures and Concepts Under Consideration for Future Years

9. Form, Manner, and Timing of Quality Data Submission for the FY 2016 Payment Determinations and Subsequent Years

a. Background

b. Finalized Timeline for Data Submission Under the LTCHQR Program for the FY 2016 and FY 2017 Payment Determinations (Except NQF #0680 and NQF #0431)

c. Revision to the Previously Adopted Data Collection Timelines and Submission Deadlines for Percent of Residents or Patients Who Were Assessed and Appropriately Given the Seasonal Influenza Vaccine (Short Stay) (NQF #0680) for the FY 2016 Payment Determination and Subsequent Years

d. Data Submission Mechanisms for the FY 2018 Payment Determination and Subsequent Years for New LTCHQR Program Quality Measures and for Revision to Previously Adopted Quality Measure

e. Data Collection Timelines and Submission Deadlines Under the LTCHQR Program for the FY 2018 Payment Determination

f. Data Collection Timelines and Submission Deadlines for the Application of Percent of Residents Experiencing One or More Falls With Major Injury (Long Stay) (NQF #0674) Measure for the FY 2018 Payment Determination and Subsequent Years

g. Data Collection Timelines and Submission Deadlines Under the LTCHQR Program for the FY 2019 Payment Determination

10. LTCHQR Program Data Completion Threshold for the FY 2016 Payment Adjustment and Subsequent Years

a. Overview

b. LTCHQR Program Data Completion Threshold for the Required LTCH CARE Data Set (LCDS) Data Items

c. LTCHQR Program Data Completion Threshold for Measures Submitted Using the Centers for Disease Control and Prevention (CDC) National Healthcare Safety Network (NHSN)

d. Application of the 2 Percentage Point Reduction for LTCHs That Fail To Meet the Data Completion Thresholds

11. Data Validation Process for the FY 2016 Payment Determination and Subsequent Years

a. Data Validation Process

b. Application of the 2 Percentage Point Reduction for LTCHs That Fail To Meet the Data Accuracy Threshold

12. Public Display of Quality Measure Data for the LTCHQR Program

13. LTCHQR Program Submission Exception and Extension Requirements for the FY 2017 Payment Determination and Subsequent Years

14. LTCHQR Program Reconsideration and Appeals Procedures for the FY 2016 Payment Determination and Subsequent Years

a. Previously Finalized LTCHQR Program Reconsideration and Appeals Procedures for the FY 2014 and FY 2015 Payment Determinations

b. LTCHQR Program Reconsideration and Appeals Procedures for the FY 2016 Payment Determination and Subsequent Years

15. Electronic Health Records (EHR) and Health Information Exchange (HIE)

D. Electronic Health Record (EHR) Incentive Program and Meaningful Use (MU)

1. Background

2. Alignment of the Medicare EHR Incentive Program Reporting and Submission Timelines for Clinical Quality Measures With Hospital IQR Program Reporting and Submission Timelines

3. Quality Reporting Document Architecture Category III (QRDA-III) Option in 2015

4. Electronically Specified Clinical Quality Measures (CQMs) Reporting for 2015

5. Clarification Regarding Reporting Zero Denominators

X. Revision of Regulations Governing Use and Release of Medicare Advantage Risk Adjustment Data

A. Background

B. Regulatory Changes

1. Expansion of Uses and Reasons for Disclosure of Risk Adjustment Data

2. Conditions for CMS Release of Data

3. Technical Change

XI. Changes to Enforcement Provisions for Organ Transplant Centers

A. Background

B. Basis for Changes

1. Expansion of Mitigating Factors Based on CMS' Experience

2. Coordination With Efforts of the Organ Procurement and Transplantation Network (OPTN) and Health Resources and Services Administration

C. Provisions of the Proposed and Final Regulations

1. Expansion of Mitigating Factors List, Content, and Timeframe

2. Content and Timeframe for Mitigating Factors Requests

3. System Improvement Agreements (SIAs)

a. Purpose and Intent of an SIA

b. Description and Contents of an SIA

c. Effective Period for an SIA

XII. MedPAC Recommendations

XIII. Other Required Information

A. Requests for Data from the Public

B. Collection of Information Requirements

1. Statutory Requirement for Solicitation of Comments

2. ICRs for Add-On Payments for New Services and Technologies

3. ICRs for the Occupational Mix Adjustment to the FY 2015 Wage Index (Hospital Wage Index Occupational Mix Survey)

4. Hospital Applications for Geographic Reclassifications by the MGCRB

5. ICRs for Application for GME Resident Slots

6. ICRs for the Hospital Inpatient Quality Reporting (IQR) Program

7. ICRs for PPS-Exempt Cancer Hospital Quality Reporting (PCHQR) Program

8. ICRs for Hospital Value-Based Purchasing (VBP) Program

9. ICRs for the Long-Term Care Hospital Quality Reporting (LTCHQR) Program

10. ICR Regarding Electronic Health Record (EHR) Incentive Program and Meaningful Use (MU)

11. ICR Regarding Revision of Regulations Governing Use and Release of Medicare Advantage (MA) Risk Adjustment Data (§ 422.310(f))

Regulation Text

Addendum—Schedule of Standardized Amounts, Update Factors, and Rate-of-Increase Percentages Effective with Cost Reporting Periods Beginning on or After October 1, 2014 and Payment Rates for LTCHs Effective With Discharges Occurring on or After October 1, 2014

I. Summary and Background

II. Changes to the Prospective Payment Rates for Hospital Inpatient Operating Costs for Acute Care Hospitals for FY 2015

A. Calculation of the Adjusted Standardized Amount

B. Adjustments for Area Wage Levels and Cost-of-Living

C. Calculation of the Prospective Payment Rates

III. Changes to Payment Rates for Acute Care Hospital Inpatient Capital-Related Costs for FY 2015

A. Determination of Federal Hospital Inpatient Capital-Related Prospective Payment Rate Update

B. Calculation of the Inpatient Capital-Related Prospective Payments for FY 2015

C. Capital Input Price Index

IV. Changes to Payment Rates for Excluded Hospitals: Rate-of-Increase Percentages for FY 2015

V. Updates to the Payment Rates for the LTCH PPS for FY 2015

A. LTCH PPS Standard Federal Rate for FY 2015

1. Background

2. Development of the FY 2015 LTCH PPS Standard Federal Rate

B. Adjustment for Area Wage Levels under the LTCH PPS for FY 2015

1. Background

2. Geographic Classifications Based on the New OMB Delineations

3. LTCH PPS Labor-Related Share

4. LTCH PPS Wage Index for FY 2015

5. Budget Neutrality Adjustment for Changes to the Area Wage Level Adjustment

C. LTCH PPS Cost-of-Living Adjustment (COLA) for LTCHs Located in Alaska and Hawaii

D. Adjustment for LTCH PPS High-Cost Outlier (HCO) Cases

1. Background

2. Determining LTCH CCRs Under the LTCH PPS

3. Establishment of the LTCH PPS Fixed-Loss Amount for FY 2015

4. Application of the Outlier Policy to SSO Cases

E. Update to the IPPS Comparable/Equivalent Amounts To Reflect the Statutory Changes to the IPPS DSH Payment Adjustment Methodology

F. Computing the Adjusted LTCH PPS Federal Prospective Payments for FY 2015

VI. Tables Referenced in This Final Rule and Available Through the Internet on the CMS Web site

Appendix A—Economic Analyses

I. Regulatory Impact Analysis

A. Introduction

B. Need

C. Objectives of the IPPS

D. Limitations of Our Analysis

E. Hospitals Included in and Excluded From the IPPS

F. Effects on Hospitals and Hospital Units Excluded From the IPPS

G. Quantitative Effects of the Policy Changes Under the IPPS for Operating Costs

1. Basis and Methodology of Estimates

2. Analysis of Table I

3. Impact Analysis of Table II

H. Effects of Other Policy Changes

1. Effects of Policy on MS-DRGs for Preventable HACs, Including Infections

2. Effects of Policy Relating to New Medical Service and Technology Add-On Payments

3. Effects of Changes to List of MS-DRGs Subject to Postacute Care Transfer and DRG Special Pay Policy

4. Effects of Payment Adjustment for Low-Volume Hospitals for FY 2015

5. Effects of Policy Changes Related to IME Medicare Part C Add-On Payments to SCHs Paid According to Their Hospital-Specific Rates

6. Effects of the Extension of the MDH Program for the First Half of FY 2015

7. Effects of Changes Under the FY 2015 Hospital Value-Based Purchasing (VBP) Program

8. Effects of the Changes to the HAC Reduction Program for FY 2015

9. Effects of Policy Changes Relating to Payments for Direct GME and IME

10. Effects of Implementation of Rural Community Hospital Demonstration Program

11. Effects of Changes Related to Reclassifications as Rural for CAHs

12. Effects of Revision of the Requirements for Physician Certification of CAH Inpatient Services

13. Effects of Changes Relating to Administrative Appeals by Providers and Judicial Review for Appropriate Claims in Provider Cost Reports

I. Effects of Changes to Updates to the Reasonable Compensation Equivalent (RCE) Limits for Physician Services Provided to Providers

J. Effects of Changes in the Capital IPPS

1. General Considerations

2. Results

K. Effects of Payment Rate Changes and Policy Changes Under the LTCH PPS

1. Introduction and General Considerations

2. Impact on Rural Hospitals

3. Anticipated Effects of LTCH PPS Payment Rate Changes and Policy Changes

4. Effect on the Medicare Program

5. Effect on Medicare Beneficiaries

L. Effects of Requirements for Hospital Inpatient Quality Reporting (IQR) Program

M. Effects of Requirements for the PPS-Exempt Cancer Hospital Quality Reporting (PCHQR) Program for FY 2015

N. Effects of Requirements for the LTCH Quality Reporting (LTCHQR) Program for FY 2015 Through FY 2019

O. Effects of Policy Changes Regarding Electronic Health Record (EHR) Incentive Program and Hospital IQR Program

P. Effects of Revision of Regulations Governing Use and Release of Medicare Advantage Risk Adjustment Data

Q. Effects of Changes to Enforcement Provisions for Organ Transplant Centers

II. Alternatives Considered

III. Overall Conclusion

A. Acute Care Hospitals

B. LTCHs

IV. Accounting Statements and Tables

A. Acute Care Hospitals

B. LTCHs

V. Regulatory Flexibility Act (RFA) Analysis

VI. Impact on Small Rural Hospitals

VII. Unfunded Mandates Reform Act (UMRA) Analysis

VIII. Executive Order 12866

Appendix B: Recommendation of Update Factors for Operating Cost Rates of Payment for Inpatient Hospital Services

I. Background

II. Inpatient Hospital Update for FY 2015

A. FY 2015 Inpatient Hospital Update

B. Update for SCHs for FY 2015

C. FY 2015 Puerto Rico Hospital Update

D. Update for Hospitals Excluded From the IPPS for FY 2015

E. Update for LTCHs for FY 2015

III. Secretary's Recommendation

IV. MedPAC Recommendation for Assessing Payment Adequacy and Updating Payments in Traditional Medicare

I. Executive Summary and Background

A. Executive Summary

1. Purpose and Legal Authority

This final rule makes payment and policy changes under the Medicare inpatient prospective payment systems (IPPS) for operating and capital-related costs of acute care hospitals as well as for certain hospitals and hospital units excluded from the IPPS. In addition, it makes payment and policy changes for inpatient hospital services provided by long-term care hospitals (LTCHs) under the long-term care hospital prospective payment system (LTCH PPS). It also makes policy changes to programs associated with Medicare IPPS hospitals, IPPS-excluded hospitals, and LTCHs.

Under various statutory authorities, we are making changes to the Medicare IPPS, to the LTCH PPS, and to other related payment methodologies and programs for FY 2015 and subsequent fiscal years. These statutory authorities include, but are not limited to, the following:

  • Section 1886(d) of the Social Security Act (the Act), which sets forth a system of payment for the operating costs of acute care hospital inpatient stays under Medicare Part A (Hospital Insurance) based on prospectively set rates. Section 1886(g) of the Act requires that, instead of paying for capital-related costs of inpatient hospital services on a reasonable cost basis, the Secretary use a prospective payment system (PPS).
  • Section 1886(d)(1)(B) of the Act, which specifies that certain hospitals and hospital units are excluded from the IPPS. These hospitals and units are: rehabilitation hospitals and units; LTCHs; psychiatric hospitals and units; children's hospitals; cancer hospitals; and short-term acute care hospitals located in the Virgin Islands, Guam, the Northern Mariana Islands, and American Samoa. Religious nonmedical health care institutions (RNHCIs) are also excluded from the IPPS.
  • Sections 123(a) and (c) of Pub. L. 106-113 and section 307(b)(1) of Public Law 106-554 (as codified under section 1886(m)(1) of the Act), which provide for the development and implementation of a prospective payment system for payment for inpatient hospital services of long-term care hospitals (LTCHs) described in section 1886(d)(1)(B)(iv) of the Act.
  • Sections 1814(l), 1820, and 1834(g) of the Act, which specify that payments are made to critical access hospitals (CAHs) (that is, rural hospitals or facilities that meet certain statutory requirements) for inpatient and outpatient services and that these payments are generally based on 101 percent of reasonable cost.
  • Section 1866(k) of the Act, as added by section 3005 of the Affordable Care Act, which establishes a quality reporting program for hospitals described in section 1886(d)(1)(B)(v) of the Act, referred to as “PPS-Exempt Cancer Hospitals.”
  • Section 1886(d)(4)(D) of the Act, which addresses certain hospital-acquired conditions (HACs), including infections. Section 1886(d)(4)(D) of the Act specifies that, by October 1, 2007, the Secretary was required to select, in consultation with the Centers for Disease Control and Prevention (CDC), at least two conditions that: (a) are high cost, high volume, or both; (b) are assigned to a higher paying MS-DRG when present as a secondary diagnosis (that is, conditions under the MS-DRG system that are complications or comorbidities (CCs) or major complications or comorbidities (MCCs)); and (c) could reasonably have been prevented through the application of evidence-based guidelines. Section 1886(d)(4)(D) of the Act also specifies that the list of conditions may be revised, again in consultation with CDC, from time to time as long as the list contains at least two conditions. Section 1886(d)(4)(D)(iii) of the Act requires that hospitals, effective with discharges occurring on or after October 1, 2007, submit information on Medicare claims specifying whether diagnoses were present on admission (POA). Section 1886(d)(4)(D)(i) of the Act specifies that effective for discharges occurring on or after October 1, 2008, Medicare no longer assigns an inpatient hospital discharge to a higher paying MS-DRG if a selected condition is not POA.
  • Section 1886(a)(4) of the Act, which specifies that costs of approved educational activities are excluded from the operating costs of inpatient hospital services. Hospitals with approved graduate medical education (GME) programs are paid for the direct costs of GME in accordance with section 1886(h) of the Act. A payment for indirect medical education (IME) is made under section 1886(d)(5)(B) of the Act.
  • Section 1886(b)(3)(B)(viii) of the Act, which requires the Secretary to reduce the applicable percentage increase in payments to a subsection (d) hospital for a fiscal year if the hospital does not submit data on measures in a form and manner, and at a time, specified by the Secretary.
  • Section 1886(o) of the Act, which requires the Secretary to establish a Hospital Value-Based Purchasing (VBP) Program under which value-based incentive payments are made in a fiscal year to hospitals meeting performance standards established for a performance period for such fiscal year.
  • Section 1886(p) of the Act, as added by section 3008 of the Affordable Care Act, which establishes an adjustment to hospital payments for hospital-acquired conditions (HACs), or a Hospital-Acquired Condition (HAC) Reduction Program, under which payments to applicable hospitals are adjusted to provide an incentive to reduce hospital-acquired conditions.
  • Section 1886(q) of the Act, as added by section 3025 of the Affordable Care Act and amended by section 10309 of the Affordable Care Act, which establishes the “Hospital Readmissions Reduction Program” effective for discharges from an “applicable hospital” beginning on or after October 1, 2012, under which payments to those hospitals under section 1886(d) of the Act will be reduced to account for certain excess readmissions.
  • Section 1886(r) of the Act, as added by section 3133 of the Affordable Care Act, which provides for a reduction to disproportionate share hospital payments under section 1886(d)(5)(F) of the Act and for a new uncompensated care payment to eligible hospitals. Specifically, section 1886(r) of the Act now requires that, for “fiscal year 2014 and each subsequent fiscal year,” “subsection (d) hospitals” that would otherwise receive a “disproportionate share hospital payment . . . made under subsection (d)(5)(F)” will receive two separate payments: (1) 25 percent of the amount they previously would have received under subsection (d)(5)(F) for DSH (“the empirically justified amount”), and (2) an additional payment for the DSH hospital's proportion of uncompensated care, determined as the product of three factors. These three factors are: (1) 75 percent of the payments that would otherwise be made under subsection (d)(5)(F); (2) 1 minus the percent change in the percent of individuals under the age of 65 who are uninsured (minus 0.1 percentage points for FY 2014, and minus 0.2 percentage points for FY 2015 through FY 2017); and (3) a hospital's uncompensated care amount relative to the uncompensated care amount of all DSH hospitals expressed as a percentage.
  • Section 1886(m)(6) of the Act, as added by section 1206(a)(1) of the Pathway for SGR Reform Act of 2013, which provided for the establishment of patient criteria for payment under the LTCH PPS for implementation beginning in FY 2016.
  • Section 1206(b)(1) of the Pathway for SGR Reform Act of 2013, which further amended section 114(c) of the MMSEA, as amended by section 4302(a) of the ARRA and sections 3106(c) and 10312(a) of the Affordable Care Act, by retroactively reestablishing and extending the statutory moratorium on the full implementation of the 25-percent threshold payment adjustment policy under the LTCH PPS so that the policy will be in effect for 9 years (except for “grandfathered” hospitals-within-hospitals (HwHs), which are permanently exempt from this policy); and section 1206(b)(2) (as amended by section 112(b) of Pub. L. 113-93), which together further amended section 114(d) of the MMSEA, as amended by section 4302(a) of the ARRA and sections 3106(c) and 10312(a) of the Affordable Care Act, to establish new moratoria (subject to certain defined exceptions) on the development of new LTCHs and LTCH satellite facilities and a new moratorium on increases in the number of beds in existing LTCHs and LTCH satellite facilities beginning April 1, 2014 and ending September 30, 2017; and section 1206(d), which instructs the Secretary to evaluate payments to LTCHs classified under section 1886(d)(1)(B)(iv)(II) of the Act and to adjust payment rates in FY 2015 or FY 2016 under the LTCH PPS, as appropriate, based upon the evaluation findings.
  • Section 1886(m)(5)(D)(iv) of the Act, as added by section 1206(c) of the Pathway for SGR Reform Act of 2013, which provides for the establishment, no later than October 1, 2015, of a functional status quality measure under the LTCHQR Program for change in mobility among inpatients requiring ventilator support.

In this final rule, we are making technical and conforming changes and nomenclature changes to the regulations regarding the claims required in provider cost reports and for provider administrative appeals to conform the terminology from “intermediary” to “contractor.”

We are aligning the reporting and submission timelines for clinical quality measures for the Medicare EHR Incentive Program for eligible hospitals and critical access hospitals (CAHs) with the reporting and submission timelines for the Hospital IQR Program. In addition, we provide guidance and clarification of certain policies for eligible hospitals and CAHs such as our policy for reporting zero denominators on clinical quality measures and our policy for case threshold exemptions.

In addition, this final rule contains several provisions that are not directly related to these Medicare payment systems, such as regulatory revisions to broaden the specified uses and reasons for disclosure of risk adjustment data and to specify the conditions for release of risk adjustment data to entities outside of CMS and changes to the enforcement procedures for organ transplant centers. The specific statutory authority for these other provisions is discussed in the relevant sections below.

2. Summary of the Major Provisions

a. MS-DRG Documentation and Coding Adjustment

Section 631 of the American Taxpayer Relief Act (ATRA, Pub. L. 112-240) amended section 7(b)(1)(B) of Public Law 110-90 to require the Secretary to make a recoupment adjustment to the standardized amount of Medicare payments to acute care hospitals to account for changes in MS-DRG documentation and coding that do not reflect real changes in case-mix, totaling $11 billion over a 4-year period of FYs 2014, 2015, 2016, and 2017. This adjustment represents the amount of the increase in aggregate payments as a result of not completing the prospective adjustment authorized under section 7(b)(1)(A) of Public Law 110-90 until FY 2013. Prior to the ATRA, this amount could not have been recovered under Public Law 110-90.

While our actuaries estimated that a -9.3 percent adjustment to the standardized amount would be necessary if CMS were to fully recover the $11 billion recoupment required by section 631 of the ATRA in FY 2014, it is often our practice to delay or phase in rate adjustments over more than one year, in order to moderate the effects on rates in any one year. Therefore, consistent with the policies that we have adopted in many similar cases, we made a -0.8 percent recoupment adjustment to the standardized amount in FY 2014. We are making an additional -0.8 percent recoupment adjustment to the standardized amount in FY 2015.
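The cumulative effect of these recoupment adjustments can be seen with a short sketch. The dollar estimates used below (roughly $1 billion recovered in FY 2014, up to $2 billion in FY 2015, and approximately $8 billion remaining of the $11 billion) are the rounded estimates discussed in the summary of costs and benefits later in this section; the code and its names are illustrative only, not a payment calculation method.

```python
# Illustrative sketch of the ATRA section 631 recoupment arithmetic.
# Figures are rounded estimates from this final rule, for illustration only.

TOTAL_RECOUPMENT = 11.0e9          # $11 billion required over FYs 2014-2017

# Cumulative recoupment adjustments to the standardized amount, by fiscal year:
# -0.8 percent for FY 2014, plus an additional -0.8 percent for FY 2015,
# for a cumulative adjustment of roughly -1.6 percent.
adjustments = {"FY2014": -0.008, "FY2015": -0.008}
cumulative = 1.0
for fy, adj in adjustments.items():
    cumulative *= (1.0 + adj)
print(f"Cumulative standardized-amount adjustment: {cumulative - 1.0:.3%}")

# Estimated dollars recovered (rounded estimates from this rule).
recovered = {"FY2014": 1.0e9, "FY2015": 2.0e9}
remaining = TOTAL_RECOUPMENT - sum(recovered.values())
print(f"Estimated recoupment remaining after FY 2015: ${remaining / 1e9:.0f} billion")
```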

b. Reduction of Hospital Payments for Excess Readmissions

We are making changes in policies to the Hospital Readmissions Reduction Program, which is established under section 1886(q) of the Act, as added by section 3025 of the Affordable Care Act. The Hospital Readmissions Reduction Program requires a reduction to a hospital's base operating DRG payment to account for excess readmissions of selected applicable conditions. For FYs 2013 and 2014, these conditions are acute myocardial infarction, heart failure, and pneumonia. For FY 2014, we established additional exclusions to the three existing readmission measures (that is, the excess readmission ratio) to account for additional planned readmissions. We also established additional readmissions measures for Chronic Obstructive Pulmonary Disease (COPD) and for Total Hip Arthroplasty and Total Knee Arthroplasty (THA/TKA), to be used in the Hospital Readmissions Reduction Program for FY 2015 and future years. We are expanding the readmissions measures for FY 2017 and future years by adding a measure of patients readmitted following coronary artery bypass graft (CABG) surgery. We also are refining the readmission measures and related methodology for FY 2015 and subsequent years' payment determinations. In addition, we are providing that the readmissions payment adjustment factors for FY 2015 can result in no more than a 3-percent reduction, in accordance with the statute. We also are revising the calculation of aggregate payments for excess readmissions to include the THA/TKA and COPD readmissions measures beginning in FY 2015.
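To illustrate the 3-percent cap described above, the sketch below applies the statutory structure of the adjustment factor (the higher of 1 minus the ratio of a hospital's aggregate payments for excess readmissions to its aggregate payments for all discharges, or the 0.97 floor) to a hypothetical hospital. The dollar amounts and function names are illustrative assumptions, and the excess readmissions themselves are derived from the risk-adjusted readmission measures discussed later in this rule.

```python
def readmissions_payment_adjustment_factor(aggregate_excess_readmission_payments: float,
                                           aggregate_payments_all_discharges: float,
                                           floor: float = 0.97) -> float:
    """Return the FY 2015 readmissions payment adjustment factor.

    The factor is the higher of (1 minus the ratio of aggregate payments for
    excess readmissions to aggregate payments for all discharges) or the
    statutory floor of 0.97, i.e., at most a 3-percent reduction to the base
    operating DRG payment amount.
    """
    ratio = 1.0 - (aggregate_excess_readmission_payments / aggregate_payments_all_discharges)
    return max(ratio, floor)

# Hypothetical hospital: $500,000 in aggregate payments for excess readmissions
# against $10 million in aggregate payments for all discharges.
factor = readmissions_payment_adjustment_factor(500_000, 10_000_000)
adjusted_payment = factor * 20_000.0   # applied to an illustrative base operating DRG payment
print(factor, adjusted_payment)        # 0.97 (the floor applies), 19400.0
```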

c. Hospital Value-Based Purchasing (VBP) Program

Section 1886(o) of the Act requires the Secretary to establish a Hospital Value-Based Purchasing (VBP) Program under which value-based incentive payments are made in a fiscal year to hospitals meeting performance standards established for a performance period for such fiscal year. Both the performance standards and the performance period for a fiscal year are to be established by the Secretary.

In this final rule, we are adopting quality measures for the FY 2017, FY 2019, and FY 2020 Hospital VBP Program years and establishing performance periods and performance standards for measures we are adopting for those fiscal years. We are also adopting additional policies related to performance standards and revising the domain weighting previously adopted for the FY 2017 Hospital VBP Program.

d. Hospital-Acquired Condition (HAC) Reduction Program

In this final rule, we are making a change in the scoring methodology to reflect the addition of a previously finalized measure for the FY 2016 payment adjustment under the HAC Reduction Program. Section 1886(p) of the Act, as added by section 3008(a) of the Affordable Care Act, establishes an adjustment to hospital payments for HACs, or a HAC Reduction Program, under which payments to applicable hospitals are adjusted to provide an incentive to reduce HACs, effective for discharges beginning on October 1, 2014 and for subsequent program years. This 1-percent payment reduction applies to a hospital whose ranking is in the worst-performing quartile (top 25 percent) of all applicable hospitals with respect to conditions acquired during the applicable period, relative to the national average, and it applies to all of the hospital's discharges for the specified fiscal year. The amount of payment shall be equal to 99 percent of the amount of payment that would otherwise apply to such discharges under section 1886(d) or 1814(b)(3) of the Act, as applicable.
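As a rough illustration of the payment mechanics described above (not of the scoring methodology, which is described elsewhere in this rule), the sketch below flags a hypothetical hospital in the worst-performing quartile and applies the 1-percent reduction. The scores, the quartile cutoff logic, and the function names are illustrative assumptions, and the sketch assumes a higher score indicates worse performance.

```python
# Minimal sketch of the HAC Reduction Program payment mechanics described above.
# Scores below are hypothetical; a higher score is assumed to indicate worse performance.

def hac_reduction_applies(hospital_score: float, all_scores: list[float]) -> bool:
    """True if the hospital falls in the worst-performing quartile of scores."""
    ranked = sorted(all_scores)
    cutoff = ranked[int(len(ranked) * 0.75)]   # 75th percentile cutoff (top 25 percent above it)
    return hospital_score >= cutoff

def hac_adjusted_payment(otherwise_applicable_payment: float, reduced: bool) -> float:
    """Apply the 1-percent reduction: payment equals 99 percent of what would otherwise apply."""
    return otherwise_applicable_payment * (0.99 if reduced else 1.0)

scores = [1.2, 3.4, 5.0, 6.75, 7.1, 8.0, 2.2, 4.9]   # hypothetical scores for applicable hospitals
flag = hac_reduction_applies(8.0, scores)             # this hypothetical hospital is in the worst quartile
print(hac_adjusted_payment(10_000.0, flag))           # 9900.0 (1-percent reduction applied)
```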

e. DSH Payment Adjustment and Additional Payment for Uncompensated Care

Section 3133 of the Affordable Care Act modified the Medicare disproportionate share hospital (DSH) payment methodology beginning in FY 2014. Under section 1886(r) of the Act, which was added by section 3133 of the Affordable Care Act, starting in FY 2014, DSHs will receive 25 percent of the amount they previously would have received under the statutory formula for Medicare DSH payments in section 1886(d)(5)(F) of the Act. The remaining amount, equal to 75 percent of what otherwise would have been paid as Medicare DSH payments, will be paid as additional payments after the amount is reduced for changes in the percentage of individuals that are uninsured. Each Medicare DSH hospital will receive its additional amount based on its share of the total amount of uncompensated care for all Medicare DSH hospitals for a given time period. In this final rule, we are updating the uncompensated care amount to be distributed for FY 2015, and we are making changes to the methodology for calculating the uncompensated care payment amounts such that we will combine uncompensated care data for hospitals that have merged in order to calculate the relative share of uncompensated care for the surviving hospital.
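To make the structure of these two payments concrete, the sketch below mirrors the description above. The dollar amounts, the hospital's 0.5 percent share of aggregate uncompensated care, and the function names are hypothetical; the approximately 76.19 percent uninsured-change factor for FY 2015 is discussed in the summary of costs and benefits below. This is an illustration of the formula's structure, not the operational payment methodology.

```python
# Illustrative sketch of the two DSH-related payments under section 1886(r).
# All amounts are hypothetical.

def empirically_justified_dsh(hospital_statutory_dsh: float) -> float:
    """25 percent of what the hospital would otherwise receive under section 1886(d)(5)(F)."""
    return 0.25 * hospital_statutory_dsh

def uncompensated_care_payment(aggregate_estimated_dsh: float,
                               uninsured_change_factor: float,
                               hospital_uncompensated_care: float,
                               total_uncompensated_care_all_dsh: float) -> float:
    """Product of the three statutory factors: the 75 percent pool, the
    uninsured-change factor, and the hospital's share of aggregate uncompensated care."""
    pool = 0.75 * aggregate_estimated_dsh
    share = hospital_uncompensated_care / total_uncompensated_care_all_dsh
    return pool * uninsured_change_factor * share

# Hypothetical hospital: $4 million statutory DSH amount and 0.5 percent of aggregate
# uncompensated care, against a $10 billion aggregate DSH estimate and the approximately
# 76.19 percent FY 2015 uninsured-change factor.
print(empirically_justified_dsh(4_000_000.0))                                           # 1,000,000
print(uncompensated_care_payment(10_000_000_000.0, 0.7619, 200_000_000.0, 40_000_000_000.0))
```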

f. Hospital Inpatient Quality Reporting (IQR) Program

Under section 1886(b)(3)(B)(viii) of the Act, hospitals are required to report data on measures selected by the Secretary for the Hospital IQR Program in order to receive the full annual percentage increase. In past rules, we have established measures for reporting and the process for submittal and validation of the data.

We are finalizing a total of 63 measures (47 required and 16 voluntary electronic clinical quality measures) in the Hospital IQR Program measure set for the FY 2017 payment determination and subsequent years. In this final rule, we are finalizing 11 new measures (1 chart-abstracted, 4 claims-based, and 6 voluntary electronic clinical quality measures). We proposed to remove 20 measures, but are only finalizing the removal of 19. The SCIP-INF-4 measure was proposed for removal, but will be retained as it was recently retooled for the 2014 collection period. Ten of these 19 measures are topped-out, chart-abstracted measures that are being retained as voluntary electronic clinical quality measures.

While we are finalizing our proposal to align the reporting and submission timelines of the Medicare EHR Incentive Program with those of the Hospital IQR Program on the calendar year for CQMs that are reported electronically for 2015, we are not finalizing the proposal to require quarterly submission of CQM data. Hospitals can voluntarily submit one calendar year (CY) quarter of data for Q1, Q2, or Q3 of 2015 by November 30, 2015, in order to partially fulfill requirements for both programs for CY 2015. In addition, we are finalizing a number of new policies related to the administration of the program, including access to specific NHSN data, updates to validation, and an electronic clinical quality measures validation pilot test.

g. Changes to the LTCH PPS

Section 1206(b) of the Pathway for SGR Reform Act provides for the retroactive reinstatement and extension, for an additional 4 years, of the moratorium on the full implementation of the 25-percent threshold payment adjustment under the LTCH PPS established under section 114(c) of the MMSEA, as further amended by subsequent legislation. In keeping with this mandate, we are reinstating this moratorium retroactively for LTCH cost reporting periods beginning on or after July 1, 2013, or October 1, 2013.

Section 1206(b)(2) of the Pathway for SGR Reform Act, as amended by section 112(b) of the Protecting Access to Medicare Act of 2014, provides for new statutory moratoria on the establishment of new LTCHs and LTCH satellite facilities (subject to certain defined exceptions) and a new statutory moratorium on bed increases in existing LTCHs effective for the period beginning April 1, 2014 and ending September 30, 2017.

In accordance with section 1206(d) of the Pathway for SGR Reform Act of 2013, we are applying a payment adjustment under the LTCH PPS to subclause (II) LTCHs beginning in FY 2015 that will result in payments to this type of LTCH resembling reasonable cost payments under the TEFRA payment system model.

We also discuss our proposed changes to the LTCH interruption of stay policy, which is a payment adjustment that applies when, during the course of an LTCH hospitalization, a patient is discharged to an inpatient acute care hospital, an IRF, or a SNF for treatment or services not available at the LTCH and is readmitted to the same LTCH within a specified period. In addition, we are finalizing our proposal to remove the 5-percent payment threshold policy for patient transfers between LTCHs and onsite providers.

3. Summary of Costs and Benefits

  • Adjustment for MS-DRG Documentation and Coding Changes. We are making a −0.8 percent recoupment adjustment to the standardized amount for FY 2015 to implement, in part, the requirement of section 631 of the ATRA that the Secretary make an adjustment totaling $11 billion over a 4-year period of FYs 2014, 2015, 2016, and 2017. This recoupment adjustment represents the amount of the increase in aggregate payments as a result of not completing the prospective adjustment authorized under section 7(b)(1)(A) of Public Law 110-90 until FY 2013. Prior to the ATRA, this amount could not have been recovered under Public Law 110-90.

While our actuaries estimated that a −9.3 percent recoupment adjustment to the standardized amount would be necessary if CMS were to fully recover the $11 billion recoupment required by section 631 of the ATRA in FY 2014, it is often our practice to delay or phase in rate adjustments over more than one year, in order to moderate the effects on rates in any one year. Therefore, consistent with the policies that we have adopted in many similar cases and the adjustment we made for FY 2014, we are making a −0.8 percent recoupment adjustment to the standardized amount in FY 2015. We estimate that this level of adjustment, combined with leaving the −0.8 percent adjustment made for FY 2014 in place, will recover up to $2 billion in FY 2015. Taking into account the approximately $1 billion recovered in FY 2014, this will leave approximately $8 billion remaining to be recovered by FY 2017.

  • Reduction to Hospital Payments for Excess Readmissions. The provisions of section 1886(q) of the Act, which establishes the Hospital Readmissions Reduction Program, are not budget neutral. For FY 2015, a hospital's readmissions payment adjustment factor is the higher of 1 minus the ratio of the hospital's aggregate payments for excess readmissions to its aggregate payments for all discharges, or 0.97 (that is, a maximum 3-percent reduction). In this final rule, we estimate that the reduction to a hospital's base operating DRG payment amount to account for excess readmissions of selected applicable conditions under the Hospital Readmissions Reduction Program will result in a 0.2 percent decrease in payments to hospitals for FY 2015 relative to FY 2014.
  • Value-Based Incentive Payments under the Hospital Value-Based Purchasing (VBP) Program. We estimate that there will be no net financial impact to the Hospital VBP Program for FY 2015 in the aggregate because, by law, the amount available for value-based incentive payments under the program in a given fiscal year must be equal to the total amount of base operating DRG payment amount reductions for that year, as estimated by the Secretary. The estimated amount of base operating DRG payment amount reductions for FY 2015 and, therefore, the estimated amount available for value-based incentive payments for FY 2015 discharges is approximately $1.4 billion. We believe that the program's benefits will be seen in improved patient outcomes, safety, and in the patient's experience of care. However, we cannot estimate these benefits in actual dollar and patient terms.
  • Payment Adjustment under the HAC Reduction Program for FY 2015. Under section 1886(p) of the Act (as added by section 3008 of the Affordable Care Act), a payment adjustment is applied to applicable hospitals under the HAC Reduction Program, beginning in FY 2015, to provide an incentive to reduce hospital-acquired conditions. We estimate that, under this provision, overall payments will decrease by approximately 0.3 percent, or $369 million.
  • Medicare DSH Payment Adjustment and Additional Payment for Uncompensated Care. Under section 1886(r) of the Act (as added by section 3133 of the Affordable Care Act), disproportionate share hospital payments to hospitals under section 1886(d)(5)(F) of the Act are reduced and an additional payment is made to eligible hospitals beginning in FY 2014. Hospitals that receive Medicare DSH payments will receive 25 percent of the amount they previously would have received under the current statutory formula for Medicare DSH payments in section 1886(d)(5)(F) of the Act. The remainder, equal to 75 percent of what otherwise would have been paid as Medicare DSH payments, will be the basis for determining the additional payments for uncompensated care after the amount is reduced for changes in the percentage of individuals that are uninsured and additional statutory adjustments. Each hospital that receives Medicare DSH payments will receive an additional payment based on its share of the total uncompensated care amount reported by Medicare DSH hospitals. The reduction to Medicare DSH payments is not budget neutral.

For FY 2015, we are providing that the 75 percent of what otherwise would have been paid for Medicare DSH is adjusted to approximately 76.19 percent of that amount to account for changes in the percentage of individuals that are uninsured and additional statutory adjustments. In other words, our estimate of Medicare DSH payments prior to the application of section 3133 of the Affordable Care Act is adjusted to approximately 57.1 percent (the product of 75 percent and 76.19 percent), and the resulting payment amount is used to create an additional payment to hospitals for their relative share of the total amount of uncompensated care. We project that Medicare DSH payments and additional payments for uncompensated care made for FY 2015 will reduce payments overall by 1.3 percent as compared to the Medicare DSH payments and uncompensated care payments distributed in FY 2014. The additional payments have redistributive effects based on a hospital's uncompensated care amount relative to the uncompensated care amount for all hospitals that are estimated to receive Medicare DSH payments, and the final payment amount is not tied to a hospital's discharges.

  • Hospital Inpatient Quality Reporting (IQR) Program. In this final rule, we are finalizing 11 new measures (1 chart-abstracted, 4 claims-based, and 6 voluntary electronic clinical quality measures). We proposed to remove 20 measures, but are only finalizing the removal of 19. The SCIP-INF-4 measure was proposed for removal, but will be retained as it was recently retooled for the 2014 collection period. Ten of these 19 measures are topped-out, chart-abstracted measures that are being retained as voluntary electronic clinical quality measures. We estimate that the adoption and removal of these measures will decrease hospital costs by $39.8 million.
  • Update to the LTCH PPS Standard Federal Rate and Other Payment Factors. Based on the best available data for the 423 LTCHs in our database, we estimate that the changes to the payment rates and factors we are presenting in the preamble and Addendum of this final rule, including the update to the standard Federal rate for FY 2015, the changes to the area wage adjustment for FY 2015, and the expected changes to short-stay outliers and high-cost outliers, will result in an increase in estimated payments from FY 2014 of approximately $62 million (or 1.1 percent). In addition, we estimate that the net effect of certain other LTCH PPS policy changes (that is, the reinstatement of the moratorium on the full implementation of the “25 percent threshold” payment adjustment; the reinstatement of the moratorium on the development of new LTCHs and LTCH satellite facilities and additional LTCH beds; the revocation of the onsite discharges and readmissions policy; and the payment adjustment for “subclause (II)” LTCHs) will be an increase in LTCH PPS payments of approximately $116 million.

Taken together, the changes to the payment rates and factors presented in this final rule under the LTCH PPS and the estimated payment impacts of certain other LTCH PPS policy changes are projected to result in a net increase of $178 million in payments to LTCH providers. Additionally, we estimate that the costs to LTCHs associated with the completion of the data for the LTCHQR Program will be approximately $4.7 million more than in FY 2014.
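The net LTCH PPS impact cited above is simply the sum of the two estimated components. A minimal sketch, using the rounded estimates from this section:

```python
# Rounded estimates from this final rule (in millions of dollars).
rate_and_factor_changes = 62    # standard Federal rate update, area wage adjustment, outlier changes
other_policy_changes = 116      # moratoria reinstatements, onsite policy revocation, subclause (II) adjustment
print(f"Net estimated change in LTCH PPS payments: ${rate_and_factor_changes + other_policy_changes} million")
```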

B. Summary

1. Acute Care Hospital Inpatient Prospective Payment System (IPPS)

Section 1886(d) of the Social Security Act (the Act) sets forth a system of payment for the operating costs of acute care hospital inpatient stays under Medicare Part A (Hospital Insurance) based on prospectively set rates. Section 1886(g) of the Act requires the Secretary to use a prospective payment system (PPS) to pay for the capital-related costs of inpatient hospital services for these “subsection (d) hospitals.” Under these PPSs, Medicare payment for hospital inpatient operating and capital-related costs is made at predetermined, specific rates for each hospital discharge. Discharges are classified according to a list of diagnosis-related groups (DRGs).

The base payment rate comprises a standardized amount that is divided into a labor-related share and a nonlabor-related share. The labor-related share is adjusted by the wage index applicable to the area where the hospital is located. If the hospital is located in Alaska or Hawaii, the nonlabor-related share is adjusted by a cost-of-living adjustment factor. This base payment rate is multiplied by the DRG relative weight.
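The wage index and cost-of-living adjustments described above combine with the DRG relative weight in a straightforward calculation, illustrated by the sketch below. The standardized amount, labor-related share, wage index, cost-of-living factor, and relative weight shown are placeholder values, not the FY 2015 figures published in the Addendum and tables of this final rule, and the function name is illustrative.

```python
# Minimal sketch of the IPPS base payment computation described above.
# All input values below are placeholders, not FY 2015 payment figures.

def ipps_base_payment(standardized_amount: float,
                      labor_related_share: float,
                      wage_index: float,
                      drg_relative_weight: float,
                      cola: float = 1.0) -> float:
    """Wage-adjust the labor-related share, apply the cost-of-living factor to the
    nonlabor-related share (Alaska and Hawaii only), then multiply by the DRG relative weight."""
    labor_portion = standardized_amount * labor_related_share * wage_index
    nonlabor_portion = standardized_amount * (1.0 - labor_related_share) * cola
    return (labor_portion + nonlabor_portion) * drg_relative_weight

# Hypothetical case: $5,500 standardized amount, 0.696 labor-related share,
# wage index of 1.05, relative weight of 1.5, no cost-of-living adjustment.
print(ipps_base_payment(5_500.0, 0.696, 1.05, 1.5))
```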

If the hospital treats a high percentage of certain low-income patients, it receives a percentage add-on payment applied to the DRG-adjusted base payment rate. This add-on payment, known as the disproportionate share hospital (DSH) adjustment, provides for a percentage increase in Medicare payments to hospitals that qualify under either of two statutory formulas designed to identify hospitals that serve a disproportionate share of low-income patients. For qualifying hospitals, the amount of this adjustment varies based on the outcome of the statutory calculations. The Affordable Care Act revised the Medicare DSH payment methodology and provides for a new additional Medicare payment that considers the amount of uncompensated care beginning on October 1, 2013.

If the hospital is an approved teaching hospital, it receives a percentage add-on payment for each case paid under the IPPS, known as the indirect medical education (IME) adjustment. This percentage varies, depending on the ratio of residents to beds.

Additional payments may be made for cases that involve new technologies or medical services that have been approved for special add-on payments. To qualify, a new technology or medical service must demonstrate that it is a substantial clinical improvement over technologies or services otherwise available, and that, absent an add-on payment, it would be inadequately paid under the regular DRG payment.

The costs incurred by the hospital for a case are evaluated to determine whether the hospital is eligible for an additional payment as an outlier case. This additional payment is designed to protect the hospital from large financial losses due to unusually expensive cases. Any eligible outlier payment is added to the DRG-adjusted base payment rate, plus any DSH, IME, and new technology or medical service add-on adjustments.

Although payments to most hospitals under the IPPS are made on the basis of the standardized amounts, some categories of hospitals are paid in whole or in part based on their hospital-specific rate, which is determined from their costs in a base year. For example, sole community hospitals (SCHs) receive the higher of a hospital-specific rate based on their costs in a base year (the highest of FY 1982, FY 1987, FY 1996, or FY 2006) or the IPPS Federal rate based on the standardized amount. Through and including FY 2006, a Medicare-dependent, small rural hospital (MDH) received the higher of the Federal rate or the Federal rate plus 50 percent of the amount by which the Federal rate is exceeded by the higher of its FY 1982 or FY 1987 hospital-specific rate. As discussed below, for discharges occurring on or after October 1, 2007, but before April 1, 2015, an MDH will receive the higher of the Federal rate or the Federal rate plus 75 percent of the amount by which the Federal rate is exceeded by the highest of its FY 1982, FY 1987, or FY 2002 hospital-specific rate. (We note that the statutory provision for payments to MDHs expires on March 31, 2015, under current law.) SCHs are the sole source of care in their areas, and MDHs are a major source of care for Medicare beneficiaries in their areas. Specifically, section 1886(d)(5)(D)(iii) of the Act defines an SCH as a hospital that is located more than 35 road miles from another hospital or that, by reason of factors such as isolated location, weather conditions, travel conditions, or absence of other like hospitals (as determined by the Secretary), is the sole source of hospital inpatient services reasonably available to Medicare beneficiaries. In addition, certain rural hospitals previously designated by the Secretary as essential access community hospitals are considered SCHs. Section 1886(d)(5)(G)(iv) of the Act defines an MDH as a hospital that is located in a rural area, has not more than 100 beds, is not an SCH, and has a high percentage of Medicare discharges (not less than 60 percent of its inpatient days or discharges in its cost reporting year beginning in FY 1987 or in two of its three most recently settled Medicare cost reporting years). Both of these categories of hospitals are afforded this special payment protection in order to maintain access to services for beneficiaries.
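The SCH and MDH payment comparisons described above reduce to simple "higher of" tests. The sketch below illustrates both, using hypothetical per-discharge rates; the function names are illustrative, and the MDH calculation applies only to discharges occurring before April 1, 2015, under current law.

```python
# Illustrative comparison logic for the SCH and MDH payment protections described above.
# Rates are hypothetical per-discharge amounts.

def sch_payment(federal_rate: float, hospital_specific_rate: float) -> float:
    """SCHs are paid the higher of the IPPS Federal rate or their hospital-specific rate."""
    return max(federal_rate, hospital_specific_rate)

def mdh_payment(federal_rate: float, hospital_specific_rate: float) -> float:
    """For discharges before April 1, 2015, MDHs receive the higher of the Federal rate
    or the Federal rate plus 75 percent of the amount by which the applicable
    hospital-specific rate exceeds it."""
    excess = max(hospital_specific_rate - federal_rate, 0.0)
    return federal_rate + 0.75 * excess

print(sch_payment(6_000.0, 6_400.0))   # 6400.0
print(mdh_payment(6_000.0, 6_400.0))   # 6000 + 0.75 * 400 = 6300.0
```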

Section 1886(g) of the Act requires the Secretary to pay for the capital-related costs of inpatient hospital services “in accordance with a prospective payment system established by the Secretary.” The basic methodology for determining capital prospective payments is set forth in our regulations at 42 CFR 412.308 and 412.312. Under the capital IPPS, payments are adjusted by the same DRG for the case as they are under the operating IPPS. Capital IPPS payments are also adjusted for IME and DSH, similar to the adjustments made under the operating IPPS. In addition, hospitals may receive outlier payments for those cases that have unusually high costs.

The existing regulations governing payments to hospitals under the IPPS are located in 42 CFR Part 412, Subparts A through M.

2. Hospitals and Hospital Units Excluded From the IPPS

Under section 1886(d)(1)(B) of the Act, as amended, certain hospitals and hospital units are excluded from the IPPS. These hospitals and units are: Rehabilitation hospitals and units; long-term care hospitals (LTCHs); psychiatric hospitals and units; children's hospitals; certain cancer hospitals; and short-term acute care hospitals located in Guam, the U.S. Virgin Islands, the Northern Mariana Islands, and American Samoa. Religious nonmedical health care institutions (RNHCIs) are also excluded from the IPPS. Various sections of the Balanced Budget Act of 1997 (BBA, Pub. L. 105-33), the Medicare, Medicaid and SCHIP [State Children's Health Insurance Program] Balanced Budget Refinement Act of 1999 (BBRA, Pub. L. 106-113), and the Medicare, Medicaid, and SCHIP Benefits Improvement and Protection Act of 2000 (BIPA, Pub. L. 106-554) provide for the implementation of PPSs for rehabilitation hospitals and units (referred to as inpatient rehabilitation facilities (IRFs)), LTCHs, and psychiatric hospitals and units (referred to as inpatient psychiatric facilities (IPFs)). (We note that the annual updates to the LTCH PPS are now included as part of the IPPS annual update document. Updates to the IRF PPS and IPF PPS are issued as separate documents.) Children's hospitals, certain cancer hospitals, short-term acute care hospitals located in Guam, the U.S. Virgin Islands, the Northern Mariana Islands, and American Samoa, and RNHCIs continue to be paid solely under a reasonable cost-based system subject to a rate-of-increase ceiling on inpatient operating costs, as updated annually by the percentage increase in the IPPS operating market basket.

The existing regulations governing payments to excluded hospitals and hospital units are located in 42 CFR Parts 412 and 413.

3. Long-Term Care Hospital Prospective Payment System (LTCH PPS)

The Medicare prospective payment system (PPS) for LTCHs applies to hospitals described in section 1886(d)(1)(B)(iv) of the Act effective for cost reporting periods beginning on or after October 1, 2002. The LTCH PPS was established under the authority of section 123 of the BBRA and section 307(b) of the BIPA (as codified under section 1886(m)(1) of the Act). During the 5-year (optional) transition period, an LTCH's payment under the PPS was based on an increasing proportion of the LTCH Federal rate with a corresponding decreasing proportion based on reasonable cost principles. Effective for cost reporting periods beginning on or after October 1, 2006, all LTCHs are paid 100 percent of the Federal rate. The existing regulations governing payment under the LTCH PPS are located in 42 CFR Part 412, Subpart O. Beginning with FY 2009, annual updates to the LTCH PPS are published in the same documents that update the IPPS (73 FR 26797 through 26798).

4. Critical Access Hospitals (CAHs)

Under sections 1814(l), 1820, and 1834(g) of the Act, payments made to critical access hospitals (CAHs) (that is, rural hospitals or facilities that meet certain statutory requirements) for inpatient and outpatient services are generally based on 101 percent of reasonable cost. Reasonable cost is determined under the provisions of section 1861(v)(1)(A) of the Act and existing regulations under 42 CFR Part 413.

5. Payments for Graduate Medical Education (GME)

Under section 1886(a)(4) of the Act, costs of approved educational activities are excluded from the operating costs of inpatient hospital services. Hospitals with approved graduate medical education (GME) programs are paid for the direct costs of GME in accordance with section 1886(h) of the Act. The amount of payment for direct GME costs for a cost reporting period is based on the hospital's number of residents in that period and the hospital's costs per resident in a base year. The existing regulations governing payments to the various types of hospitals are located in 42 CFR Part 413.

C. Summary of Provisions of Recent Legislation Discussed in This Final Rule

The Patient Protection and Affordable Care Act (Pub. L. 111-148), enacted on March 23, 2010, and the Health Care and Education Reconciliation Act of 2010 (Pub. L. 111-152), enacted on March 30, 2010, made a number of changes that affect the IPPS and the LTCH PPS. (Pub. L. 111-148 and Pub. L. 111-152 are collectively referred to as the “Affordable Care Act.”) A number of the provisions of the Affordable Care Act affect the updates to the IPPS and the LTCH PPS and providers and suppliers. The provisions of the Affordable Care Act that were applicable to the IPPS and the LTCH PPS for FYs 2010, 2011, and 2012 were implemented in the June 2, 2010 Federal Register notice (75 FR 31118), the FY 2011 IPPS/LTCH PPS final rule (75 FR 50042) and the FY 2012 IPPS/LTCH PPS final rule (76 FR 51476).

The American Taxpayer Relief Act of 2012 (ATRA) (Pub. L. 112-240), enacted on January 2, 2013, also made a number of changes that affect the IPPS. We announced changes related to certain IPPS provisions for FY 2013 in accordance with sections 605 and 606 of Public Law 112-240 in a document that appeared in the Federal Register on March 7, 2013 (78 FR 14689).

The Pathway for SGR Reform Act of 2013 (Pub. L. 113-67), enacted on December 26, 2013, also made a number of changes that affect the IPPS and the LTCH PPS. We implemented changes related to the low-volume hospital payment adjustment and MDH provisions for FY 2014 in accordance with sections 1105 and 1106 of Public Law 113-67 in an interim final rule with comment period that appeared in the Federal Register on March 18, 2014 (79 FR 15022).

The Protecting Access to Medicare Act of 2014 (Pub. L. 113-93), enacted on April 1, 2014, also made a number of changes that affect the IPPS and LTCH PPS.

1. The Patient Protection and Affordable Care Act (Pub. L. 111-148) and the Health Care and Education Reconciliation Act of 2010 (Pub. L. 111-152)

In this final rule, we are making policy changes to implement (or, as applicable, continue to implement in FY 2015) the following provisions (or portions of the following provisions) of the Affordable Care Act that are applicable to the IPPS, the LTCH PPS, and PPS-exempt cancer hospitals for FY 2015:

  • Section 3001(a) of Public Law 111-148, which requires the establishment of a hospital inpatient value-based purchasing program under which value-based incentive payments are made in a fiscal year to hospitals that meet performance standards for the performance period for that fiscal year.
  • Section 3004 of Public Law 111-148, which provides for the submission of quality data by LTCHs in order for them to receive the full annual update to the payment rates beginning with the FY 2014 rate year.
  • Section 3005 of Public Law 111-148, which provides for the establishment of a quality reporting program for PPS-exempt cancer hospitals beginning with FY 2014, and for subsequent program years.
  • Section 3008 of Public Law 111-148, which establishes the Hospital-Acquired Condition (HAC) Reduction Program and requires the Secretary to make an adjustment to hospital payments for applicable hospitals, effective for discharges beginning on October 1, 2014, and for subsequent program years.
  • Section 3025 of Public Law 111-148, which establishes a hospital readmissions reduction program and requires the Secretary to reduce payments to applicable hospitals with excess readmissions effective for discharges beginning on or after October 1, 2012.
  • Section 3133 of Public Law 111-148, as amended by section 10316 of Public Law 111-148 and section 1104 of Public Law 111-152, which modifies the methodologies for determining Medicare DSH payments and creates a new additional payment for uncompensated care effective for discharges beginning on or after October 1, 2013.
  • Section 3401 of Public Law 111-148, which provides for the incorporation of productivity adjustments into the market basket updates for IPPS hospitals and LTCHs.
  • Section 10324 of Public Law 111-148, which provides for a wage adjustment for hospitals located in frontier States.
  • Sections 3401 and 10319 of Public Law 111-148 and section 1105 of Public Law 111-152, which revise certain market basket update percentages for IPPS and LTCH PPS payment rates for FY 2015.
  • Section 5506 of Public Law 111-148, which added a provision to the Act that instructs the Secretary to establish a process by regulation under which, in the event a teaching hospital closes, the Secretary will permanently increase the FTE resident caps for hospitals that meet certain criteria up to the number of the closed hospital's FTE resident caps.

2. American Taxpayer Relief Act of 2012 (ATRA) (Pub. L. 112-240)

In this final rule, we are making policy changes to implement section 631 of the American Taxpayer Relief Act of 2012, which amended section 7(b)(1)(B) of Public Law 110-90 and requires a recoupment adjustment to the standardized amounts under section 1886(d) of the Act based upon the Secretary's estimates for discharges occurring in FY 2014 through FY 2017 to fully offset $11 billion (which represents the amount of the increase in aggregate payments from FYs 2008 through 2013 for which an adjustment was not previously applied).

3. Pathway for SGR Reform Act of 2013 (Pub. L. 113-67)

In this final rule, we are making policy changes to implement, or discuss the need for future policy changes, to carry out provisions under section 1206 of the Pathway for SGR Reform Act of 2013. These include:

  • Section 1206(a), which provides for the establishment of patient criteria for “site neutral” payment rates under the LTCH PPS, portions of which will begin to be implemented in FY 2016.
  • Section 1206(b)(1), which further amended section 114(c) of the MMSEA, as amended by section 4302(a) of the ARRA and sections 3106(c) and 10312(a) of the Affordable Care Act by retroactively reestablishing, and extending, the statutory moratorium on the full implementation of the 25-percent threshold payment adjustment policy under the LTCH PPS so that the policy will be in effect for 9 years (except for grandfathered hospitals-within-hospitals (HwHs), which are permanently exempt from this policy).
  • Section 1206(b)(2), which amended section 114(d) of the MMSEA, as amended by section 4302(a) of the ARRA and sections 3106(c) and 10312(a) of the Affordable Care Act to establish new moratoria (subject to certain defined exceptions) on the development of new LTCHs and LTCH satellite facilities and a new moratorium on increases in the number of beds in existing LTCHs and LTCH satellite facilities.
  • Section 1206(d), which instructs the Secretary to evaluate payments to LTCHs classified under section 1886(d)(1)(B)(iv)(II) of the Act and to adjust payment rates in FY 2015 or 2016 under the LTCH PPS, as appropriate, based upon the evaluation findings.

4. Protecting Access to Medicare Act of 2014 (Pub. L. 113-93)

In this final rule, we are making policy changes to implement, or making conforming changes to regulations in accordance with, the following provisions (or portions of the following provisions) of the Protecting Access to Medicare Act of 2014 that are applicable to the IPPS and the LTCH PPS for FY 2015:

  • Section 105, which extends the temporary changes to the Medicare inpatient hospital payment adjustment for low-volume subsection (d) hospitals through March 31, 2015.
  • Section 106, which extends the MDH program through March 31, 2015.
  • Section 112, which makes certain changes to Medicare LTCH provisions, including modifications to the statutory moratoria on the establishment of new LTCHs and LTCH satellite facilities.
  • Section 212, which prohibits the Secretary from requiring implementation of ICD-10 code sets before October 1, 2015.

D. Issuance of Notice of Proposed Rulemaking

Earlier this year, we published a proposed rule that set forth proposed changes to the Medicare IPPS for operating costs and for capital-related costs of acute care hospitals for FY 2015. The proposed rule appeared in the Federal Register on May 15, 2014 (79 FR 27978). In the proposed rule, we also set forth proposed changes relating to payments for IME and GME costs and payments to certain hospitals that continue to be excluded from the IPPS and paid on a reasonable cost basis. In addition, in the proposed rule, we set forth proposed changes to the payment rates, factors, and other payment rate policies under the LTCH PPS for FY 2015.

Below is a summary of the major changes that we proposed to make:

1. Proposed Changes to MS-DRG Classifications and Recalibrations of Relative Weights

In section II. of the preamble of the proposed rule, we included—

  • Proposed changes to MS-DRG classifications based on our yearly review, including a discussion of the conversion of MS-DRGs to ICD-10 and the status of the implementation of the ICD-10-CM and ICD-10-PCS systems.
  • Proposed application of the documentation and coding adjustment for FY 2015 resulting from implementation of the MS-DRG system.
  • Proposed recalibrations of the MS-DRG relative weights.
  • Proposed changes to hospital-acquired conditions (HACs) and a listing and discussion of HACs, including infections, that would be subject to the statutorily required adjustment in MS-DRG payments for FY 2015.
  • A discussion of the FY 2015 status of new technologies approved for add-on payments for FY 2014 and a presentation of our evaluation and analysis of the FY 2015 applicants for add-on payments for high-cost new medical services and technologies (including public input, as directed by Pub. L. 108-173, obtained in a town hall meeting).

2. Proposed Changes to the Hospital Wage Index for Acute Care Hospitals

In section III. of the preamble to the proposed rule, we proposed revisions to the wage index for acute care hospitals and the annual update of the wage data. Specific issues addressed included the following:

  • Proposed changes in CBSAs as a result of new OMB labor market area delineations and proposed policies related to the proposed changes in CBSAs.
  • The proposed FY 2015 wage index update using wage data from cost reporting periods beginning in FY 2011.
  • Analysis and implementation of the proposed FY 2015 occupational mix adjustment to the wage index for acute care hospitals, including the proposed application of the rural floor, the proposed imputed rural floor, and the proposed frontier State floor.
  • Proposed revisions to the wage index for acute care hospitals based on hospital redesignations and reclassifications.
  • The proposed adjustment to the wage index for acute care hospitals for FY 2015 based on commuting patterns of hospital employees who reside in a county and work in a different area with a higher wage index.
  • The timetable for reviewing and verifying the wage data used to compute the proposed FY 2015 hospital wage index and proposed revisions to that timetable.
  • Determination of the labor-related share for the proposed FY 2015 wage index.

3. Other Decisions and Proposed Changes to the IPPS for Operating Costs and GME Costs

In section IV. of the preamble of the proposed rule, we discussed proposed changes or clarifications of a number of the provisions of the regulations in 42 CFR Parts 412 and 413, including the following:

  • Proposed changes in postacute care transfer policies as a result of proposed new MS-DRGs.
  • Proposed changes to the inpatient hospital updates for FY 2015, including incorporation of the adjustment for hospitals that are not meaningful EHR users under section 1886(b)(3)(B)(ix) of the Act.
  • The proposed updated national and regional case-mix values and discharges for purposes of determining RRC status.
  • Proposed payment adjustment for low-volume hospitals for FY 2015.
  • The statutorily required IME adjustment factor for FY 2015 and proposed IME add-on payments for Medicare Part C discharges to SCHs that are paid according to their hospital-specific rates.
  • Effect of expiration of the MDH program on April 1, 2015.
  • Proposed changes to the methodologies for determining Medicare DSH payments and the additional payments for uncompensated care.
  • Proposed changes to the measures and payment adjustments under the Hospital Readmissions Reduction Program.
  • Proposed changes to the requirements and provision of value-based incentive payments under the Hospital Value-Based Purchasing Program.
  • Proposed requirements for payment adjustments to hospitals under the HAC Reduction Program for FY 2015.
  • Proposed IME and direct GME policy changes regarding the effective date of the FTE resident cap, 3-year rolling average, and IRB ratio cap in new programs in teaching hospitals; effect of new OMB labor market area delineations on certain teaching hospitals training residents in rural areas; clarification of effective date of provisions on counting resident time in nonprovider settings; proposed changes to the process for reviewing applications for and awarding slots made available under section 5506 of the Affordable Care Act by teaching hospitals that close; and clarification regarding direct GME payment to FQHCs and RHCs that train residents in approved programs.
  • Discussion of the Rural Community Hospital Demonstration Program and a proposal for making a budget neutrality adjustment for the demonstration program.
  • Discussion of the requirements for transparency of hospital charges under the Affordable Care Act.
  • Discussion of and solicitation of comments on an alternative payment methodology under the Medicare program for short inpatient hospital stays.
  • Discussion of the process for submitting suggested exceptions to the 2-midnight benchmark.

4. Proposed FY 2015 Policy Governing the IPPS for Capital-Related Costs

In section V. of the preamble to the proposed rule, we discussed the proposed payment policy requirements for capital-related costs and capital payments to hospitals for FY 2015 and other related proposed policy changes.

5. Proposed Changes to the Payment Rates for Certain Excluded Hospitals: Rate-of-Increase Percentages

In section VI. of the preamble of the proposed rule, we discussed—

  • Proposed changes to payments to certain excluded hospitals for FY 2015.
  • Proposed updates to the RCE limits and proposed changes to the methodology for determining such limits for services furnished by physicians to IPPS-excluded hospitals and certain teaching hospitals.
  • Proposed CAH related changes regarding reclassifications as rural.
  • Proposed changes to the physician certification requirements for services furnished in CAHs.

6. Proposed Changes to the LTCH PPS

In section VII. of the preamble of the proposed rule, we set forth—

  • Proposed changes to the payment rates, factors, and other payment rate policies under the LTCH PPS for FY 2015.
  • Proposed revisions to the LTCH PPS geographic classifications based on the new OMB delineations.
  • Proposals to implement section 1206(b)(1) of the Pathway for SGR Reform Act, which provides for the retroactive reinstatement and extension, for an additional 4 years, of the statutory moratorium on the full implementation of the 25-percent threshold payment adjustment established under section 114(c) of the MMSEA, as further amended by subsequent legislation.
  • Proposals to implement section 1206(b)(2) of the Pathway for SGR Reform Act, as amended by section 112(b) of the Protecting Access to Medicare Act of 2014, which provides for moratoria (subject to certain defined exceptions) on the establishment of new LTCHs and LTCH satellite facilities and a moratorium on bed increases in LTCHs effective for the period beginning April 1, 2014, and ending September 30, 2017.
  • Proposed changes to the LTCH interruption of stay policy by revising the fixed-day thresholds under the “greater than 3-day interruption of stay policy” to apply a uniform 30-day threshold as an “acceptable standard” for determining a linkage between an index discharge and a readmission.
  • Proposal to remove the discharge and readmission requirement, “Special Payment Provisions for Patients Who are Transferred to Onsite Providers and Readmitted to an LTCH” (the “5 percent payment threshold”) beginning in FY 2015.
  • Proposal to apply a payment adjustment under the LTCH PPS to subclause (II) LTCHs beginning in FY 2015 that would result in payments to this type of LTCH resembling reasonable cost payment under the TEFRA payment system model, consistent with the provisions of section 1206(d) of the Pathway for SGR Reform Act of 2013.

7. Proposed Changes to Regulations Governing Administrative Appeals by Providers and Judicial Review of Provider Claims

In section VIII. of the preamble of the proposed rule, we set forth proposals to revise the regulations governing administrative appeals and judicial review of provider claims in Medicare cost reports.

8. Proposed Changes Relating to Quality Data Reporting for Specific Providers and Suppliers

In section IX. of the preamble of the proposed rule, we addressed—

  • Proposed requirements for the Hospital Inpatient Quality Reporting (IQR) Program as a condition for receiving the full applicable percentage increase.
  • Proposed changes to the requirements for the quality reporting program for PPS-exempt cancer hospitals (PCHQR Program).
  • Proposed changes to the requirements under the LTCH Quality Reporting (LTCHQR) Program.

9. Proposed Uses and Release of Medicare Advantage Risk Adjustment Data

In section X. of the preamble of the proposed rule, we set forth proposed regulatory revisions to broaden the specified uses of Medicare Advantage (MA) risk adjustment data and to specify the conditions for release of such risk adjustment data to entities outside of CMS.

10. Proposed Changes to Enforcement Provisions for Organ Transplant Centers

In section XI. of the preamble of the proposed rule, we proposed to revise the regulations governing organ transplant centers that request approval, based on mitigating factors for initial approval and re-approval, for participation in Medicare when the centers have not met one or more of the conditions of participation.

11. Determining Prospective Payment Operating and Capital Rates and Rate-of-Increase Limits for Acute Care Hospitals

In the Addendum to the proposed rule, we set forth proposed changes to the amounts and factors for determining the proposed FY 2015 prospective payment rates for operating costs and capital-related costs for acute care hospitals. We also proposed to establish the threshold amounts for outlier cases. In addition, we addressed the proposed update factors for determining the rate-of-increase limits for cost reporting periods beginning in FY 2015 for certain hospitals excluded from the IPPS.

12. Determining Prospective Payment Rates for LTCHs

In the Addendum to the proposed rule, we set forth proposed changes to the amounts and factors for determining the proposed FY 2015 LTCH PPS standard Federal rate. We proposed to establish the adjustments for wage levels (including proposed changes to the LTCH PPS labor market area delineations based on the new OMB delineations), the labor-related share, the cost-of-living adjustment, and high-cost outliers, including the fixed-loss amount, and the LTCH cost-to-charge ratios (CCRs) under the LTCH PPS.

13. Impact Analysis

In Appendix A of the proposed rule, we set forth an analysis of the impact that the proposed changes would have on affected acute care hospitals, LTCHs, and PCHs.

14. Recommendation of Update Factors for Operating Cost Rates of Payment for Hospital Inpatient Services

In Appendix B of the proposed rule, as required by sections 1886(e)(4) and (e)(5) of the Act, we provided our recommendations of the appropriate percentage changes for FY 2015 for the following:

  • A single average standardized amount for all areas for hospital inpatient services paid under the IPPS for operating costs of acute care hospitals (and hospital-specific rates applicable to SCHs).
  • Target rate-of-increase limits to the allowable operating costs of hospital inpatient services furnished by certain hospitals excluded from the IPPS.
  • The standard Federal rate for hospital inpatient services furnished by LTCHs.

15. Discussion of Medicare Payment Advisory Commission Recommendations

Under section 1805(b) of the Act, MedPAC is required to submit a report to Congress, no later than March 15 of each year, in which MedPAC reviews and makes recommendations on Medicare payment policies. MedPAC's March 2014 recommendations concerning hospital inpatient payment policies address the update factor for hospital inpatient operating costs and capital-related costs for hospitals under the IPPS. We addressed these recommendations in Appendix B of the proposed rule. For further information relating specifically to the MedPAC March 2014 report or to obtain a copy of the report, contact MedPAC at (202) 220-3700 or visit MedPAC's Web site at: http://www.medpac.gov.

E. Public Comments Received in Response to the FY 2015 IPPS/LTCH PPS Proposed Rule

We received approximately 653 timely pieces of correspondence containing multiple comments on the FY 2015 IPPS/LTCH PPS proposed rule. We note that some of these public comments were outside of the scope of the proposed rule. These out-of-scope public comments are not addressed in the policy responses in this final rule. Summaries of the public comments that are within the scope of the proposed rule and our responses to those public comments are set forth in the various sections of this final rule under the appropriate headings.

F. Finalization of Interim Final Rule With Comment Period on Extension of Payment Adjustment for Low-Volume Hospitals and the Medicare-Dependent, Small Rural Hospital (MDH) Program for FY 2014 Discharges Through March 31, 2014

In an interim final rule with comment period (CMS-1599-IFC2) that appeared in the Federal Register on March 18, 2014, we implemented the extension of the temporary changes to the payment adjustment for low-volume hospitals and the MDH program under the IPPS for FY 2014 (through March 31, 2014) in accordance with sections 1105 and 1106, respectively, of the Pathway for SGR Reform Act of 2013 (79 FR 15022 through 15030). We received four timely pieces of correspondence on this interim final rule with comment period. In section IV.P. of the preamble of this final rule, we summarize the provisions of the interim final rule, summarize and respond to the public comments received, and finalize the provisions of the interim final rule with comment period.

G. Finalization of Interim Final Rule With Comment Period on Changes to Certain Cost Reporting Procedures Related to Disproportionate Share Hospital Uncompensated Care Payments

In an interim final rule with comment period (CMS-1599-IFC) that appeared in the Federal Register on October 13, 2013 (78 FR 61191), we revised certain operational considerations for hospitals with Medicare cost reporting periods that span more than one Federal fiscal year and also made changes to the data that will be used in the uncompensated care payment calculation in order to ensure that data from Indian Health Service (IHS) hospitals are included in Factor 1 and Factor 3 of that calculation (78 FR 61191 through 61197). We received 12 timely pieces of correspondence in response to this interim final rule with comment period. In section IV.Q. of the preamble of this final rule, we summarize the provisions of the interim final rule with comment period, summarize and respond to the public comments received, and finalize the provisions of the interim final rule with comment period.

II. Changes to Medicare Severity Diagnosis-Related Group (MS-DRG) Classifications and Relative Weights

A. Background

Section 1886(d) of the Act specifies that the Secretary shall establish a classification system (referred to as diagnosis-related groups (DRGs)) for inpatient discharges and adjust payments under the IPPS based on appropriate weighting factors assigned to each DRG. Therefore, under the IPPS, Medicare pays for inpatient hospital services on a rate per discharge basis that varies according to the DRG to which a beneficiary's stay is assigned. The formula used to calculate payment for a specific case multiplies an individual hospital's payment rate per case by the weight of the DRG to which the case is assigned. Each DRG weight represents the average resources required to care for cases in that particular DRG, relative to the average resources used to treat cases in all DRGs.
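
For illustration only, the following minimal Python sketch restates the per-discharge payment formula described above; the rate per case and relative weight shown are hypothetical values, not actual FY 2015 figures.

```python
# Minimal sketch of the IPPS per-discharge payment formula described above.
# The rate and weight below are hypothetical, not actual FY 2015 values.
def ipps_case_payment(hospital_rate_per_case: float, drg_relative_weight: float) -> float:
    """Payment for one discharge: the hospital's rate per case times the DRG relative weight."""
    return hospital_rate_per_case * drg_relative_weight

# A hypothetical $6,000 rate per case and a DRG relative weight of 1.5
# would yield a payment of $9,000 for that discharge.
print(ipps_case_payment(6_000, 1.5))  # 9000.0
```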

Congress recognized that it would be necessary to recalculate the DRG relative weights periodically to account for changes in resource consumption. Accordingly, section 1886(d)(4)(C) of the Act requires that the Secretary adjust the DRG classifications and relative weights at least annually. These adjustments are made to reflect changes in treatment patterns, technology, and any other factors that may change the relative use of hospital resources.

B. MS-DRG Reclassifications

For general information about the MS-DRG system, including yearly reviews and changes to the MS-DRGs, we refer readers to the previous discussions in the FY 2010 IPPS/RY 2010 LTCH PPS final rule (74 FR 43764 through 43766), the FY 2011 IPPS/LTCH PPS final rule (75 FR 50053 through 50055), the FY 2012 IPPS/LTCH PPS final rule (76 FR 51485 through 51487), the FY 2013 IPPS/LTCH PPS final rule (77 FR 53273), and the FY 2014 IPPS/LTCH PPS final rule (78 FR 50512).

C. Adoption of the MS-DRGs in FY 2008

For information on the adoption of the MS-DRGs in FY 2008, we refer readers to the FY 2008 IPPS final rule with comment period (72 FR 47140 through 47189).

D. FY 2015 MS-DRG Documentation and Coding Adjustment

1. Background on the Prospective MS-DRG Documentation and Coding Adjustments for FY 2008 and FY 2009 Authorized by Pub. L. 110-90

In the FY 2008 IPPS final rule with comment period (72 FR 47140 through 47189), we adopted the MS-DRG patient classification system for the IPPS, effective October 1, 2007, to better recognize severity of illness in Medicare payment rates for acute care hospitals. The adoption of the MS-DRG system resulted in the expansion of the number of DRGs from 538 in FY 2007 to 745 in FY 2008. (In FY 2014, there are 751 MS-DRGs.) By increasing the number of MS-DRGs and more fully taking into account patient severity of illness in Medicare payment rates for acute care hospitals, MS-DRGs encourage hospitals to improve their documentation and coding of patient diagnoses.

In the FY 2008 IPPS final rule with comment period (72 FR 47175 through 47186), we indicated that the adoption of the MS-DRGs had the potential to lead to increases in aggregate payments without a corresponding increase in actual patient severity of illness due to the incentives for additional documentation and coding. In that final rule with comment period, we exercised our authority under section 1886(d)(3)(A)(vi) of the Act, which authorizes us to maintain budget neutrality by adjusting the national standardized amount, to eliminate the estimated effect of changes in coding or classification that do not reflect real changes in case-mix. Our actuaries estimated that maintaining budget neutrality required an adjustment of −4.8 percent to the national standardized amount. We provided for phasing in this −4.8 percent adjustment over 3 years. Specifically, we established prospective documentation and coding adjustments of −1.2 percent for FY 2008, −1.8 percent for FY 2009, and −1.8 percent for FY 2010.
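
For illustration only, the brief Python sketch below restates the phase-in arithmetic described above, using only the percentages stated in this paragraph.

```python
# The three prospective documentation and coding adjustments originally
# established in the FY 2008 IPPS final rule with comment period (in percent).
phase_in = {"FY 2008": -1.2, "FY 2009": -1.8, "FY 2010": -1.8}

# Together they implement the full -4.8 percent adjustment estimated by the actuaries.
print(sum(phase_in.values()))  # -4.8
```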

On September 29, 2007, Congress enacted the TMA [Transitional Medical Assistance], Abstinence Education, and QI [Qualifying Individuals] Programs Extension Act of 2007 (Pub. L. 110-90). Section 7(a) of Public Law 110-90 reduced the documentation and coding adjustment made as a result of the MS-DRG system that we adopted in the FY 2008 IPPS final rule with comment period to −0.6 percent for FY 2008 and −0.9 percent for FY 2009, and we finalized the FY 2008 adjustment through rulemaking, effective October 1, 2007 (72 FR 66886).

For FY 2009, section 7(a) of Public Law 110-90 required a documentation and coding adjustment of −0.9 percent, and we finalized that adjustment through rulemaking effective October 1, 2008 (73 FR 48447). The documentation and coding adjustments established in the FY 2008 IPPS final rule with comment period, which reflected the amendments made by section 7(a) of Public Law 110-90, are cumulative. As a result, the −0.9 percent documentation and coding adjustment for FY 2009 was in addition to the −0.6 percent adjustment for FY 2008, yielding a combined effect of −1.5 percent.

2. Adjustment to the Average Standardized Amounts Required by Pub. L. 110-90

a. Prospective Adjustment Required by Section 7(b)(1)(A) of Pub. L. 110-90

Section 7(b)(1)(A) of Public Law 110-90 requires that, if the Secretary determines that implementation of the MS-DRG system resulted in changes in documentation and coding that did not reflect real changes in case-mix for discharges occurring during FY 2008 or FY 2009 that are different than the prospective documentation and coding adjustments applied under section 7(a) of Public Law 110-90, the Secretary shall make an appropriate adjustment under section 1886(d)(3)(A)(vi) of the Act. Section 1886(d)(3)(A)(vi) of the Act authorizes adjustments to the average standardized amounts for subsequent fiscal years in order to eliminate the effect of such coding or classification changes. These adjustments are intended to ensure that future annual aggregate IPPS payments are the same as the payments that otherwise would have been made had the prospective adjustments for documentation and coding applied in FY 2008 and FY 2009 reflected the change that occurred in those years.

b. Recoupment or Repayment Adjustments in FYs 2010 Through 2012 Required by Section 7(b)(1)(B) of Pub. L. 110-90

If, based on a retroactive evaluation of claims data, the Secretary determines that implementation of the MS-DRG system resulted in changes in documentation and coding that did not reflect real changes in case-mix for discharges occurring during FY 2008 or FY 2009 that are different from the prospective documentation and coding adjustments applied under section 7(a) of Public Law 110-90, section 7(b)(1)(B) of Public Law 110-90 requires the Secretary to make an additional adjustment to the standardized amounts under section 1886(d) of the Act. This adjustment must offset the estimated increase or decrease in aggregate payments for FYs 2008 and 2009 (including interest) resulting from the difference between the estimated actual documentation and coding effect and the documentation and coding adjustment applied under section 7(a) of Public Law 110-90. This adjustment is in addition to making an appropriate adjustment to the standardized amounts under section 1886(d)(3)(A)(vi) of the Act as required by section 7(b)(1)(A) of Public Law 110-90. That is, these adjustments are intended to recoup (or repay, in the case of underpayments) spending in excess of (or less than) spending that would have occurred had the prospective adjustments for changes in documentation and coding applied in FY 2008 and FY 2009 matched the changes that occurred in those years. Public Law 110-90 requires that the Secretary only make these recoupment or repayment adjustments for discharges occurring during FYs 2010, 2011, and 2012.

3. Retrospective Evaluation of FY 2008 and FY 2009 Claims Data

In order to implement the requirements of section 7 of Public Law 110-90, we performed a retrospective evaluation of the FY 2008 data for claims paid through December 2008 using the methodology first described in the FY 2009 IPPS/LTCH PPS final rule (73 FR 43768 and 43775) and later discussed in the FY 2010 IPPS/RY 2010 LTCH PPS final rule (74 FR 43768 through 43772). We performed the same analysis for FY 2009 claims data using the same methodology as we did for FY 2008 claims (75 FR 50057 through 50068). The results of the analysis for the FY 2011 IPPS/LTCH PPS proposed and final rules, and subsequent evaluations in FY 2012, supported that the 5.4 percent estimate accurately reflected the FY 2009 increases in documentation and coding under the MS-DRG system. We were persuaded by both MedPAC's analysis (as discussed in the FY 2011 IPPS/LTCH PPS final rule (75 FR 50064 through 50065)) and our own review of the methodologies recommended by various commenters that the methodology we employed to determine the required documentation and coding adjustments was sound.

As in prior years, the FY 2008, FY 2009, and FY 2010 MedPAR files are available to the public to allow independent analysis of the FY 2008 and FY 2009 documentation and coding effects. Interested individuals may still order these files through the CMS Web site at: http://www.cms.gov/Research-Statistics-Data-and-Systems/Files-for-Order/LimitedDataSets/ by clicking on MedPAR Limited Data Set (LDS)-Hospital (National). This CMS Web page describes the file and provides directions and further detailed instructions for how to order.

Persons placing an order must send the following: A Letter of Request, the LDS Data Use Agreement and Research Protocol (refer to the Web site for further instructions), the LDS Form, and a check (refer to the Web site for the required payment amount) to:

Mailing address if using the U.S. Postal Service: Centers for Medicare & Medicaid Services, RDDC Account, Accounting Division, P.O. Box 7520, Baltimore, MD 21207-0520.

Mailing address if using express mail: Centers for Medicare & Medicaid Services, OFM/Division of Accounting—RDDC, 7500 Security Boulevard, C3-07-11, Baltimore, MD 21244-1850.

4. Prospective Adjustments for FY 2008 and FY 2009 Authorized by Section 7(b)(1)(A) of Pub. L. 110-90

In the FY 2010 IPPS/RY 2010 LTCH PPS final rule (74 FR 43767 through 43777), we opted to delay the implementation of any documentation and coding adjustment until a full analysis of case-mix changes based on FY 2009 claims data could be completed. We refer readers to the FY 2010 IPPS/RY LTCH PPS final rule for a detailed description of our proposal, responses to comments, and finalized policy. After analysis of the FY 2009 claims data for the FY 2011 IPPS/LTCH PPS final rule (75 FR 50057 through 50073), we found a total prospective documentation and coding effect of 5.4 percent. After accounting for the −0.6 percent and the −0.9 percent documentation and coding adjustments in FYs 2008 and 2009, we found a remaining documentation and coding effect of 3.9 percent. As we have discussed, an additional cumulative adjustment of −3.9 percent would be necessary to meet the requirements of section 7(b)(1)(A) of Public Law 110-90 to make an adjustment to the average standardized amounts in order to eliminate the full effect of the documentation and coding changes that do not reflect real changes in case-mix on future payments. Unlike section 7(b)(1)(B) of Public Law 110-90, section 7(b)(1)(A) does not specify when we must apply the prospective adjustment, but merely requires us to make an “appropriate” adjustment. Therefore, as we stated in the FY 2011 IPPS/LTCH PPS final rule (75 FR 50061), we believed the law provided some discretion as to the manner in which we applied the prospective adjustment of −3.9 percent. As we discussed extensively in the FY 2011 IPPS/LTCH PPS final rule, it has been our practice to moderate payment adjustments when necessary to mitigate the effects of significant downward adjustments on hospitals, to avoid what could be widespread, disruptive effects of such adjustments on hospitals. Therefore, we stated that we believed it was appropriate to not implement the −3.9 percent prospective adjustment in FY 2011 because we finalized a −2.9 percent recoupment adjustment for that fiscal year. Accordingly, we did not propose a prospective adjustment under section 7(b)(1)(A) of Public Law 110-90 for FY 2011 (75 FR 23868 through 23870). We noted that, as a result, payments in FY 2011 (and in each future fiscal year until we implemented the requisite adjustment) would be higher than they would have been if we had implemented an adjustment under section 7(b)(1)(A) of Public Law 110-90.
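
For illustration only, the brief Python sketch below restates the arithmetic behind the remaining −3.9 percent prospective adjustment, using only the percentages stated above.

```python
# Sketch of the remaining prospective adjustment calculation (values in percent).
total_effect = 5.4            # estimated documentation and coding effect from FY 2009 claims
already_applied = 0.6 + 0.9   # adjustments applied for FY 2008 and FY 2009
remaining = total_effect - already_applied
print(round(remaining, 1))    # 3.9 -> a further -3.9 percent adjustment would be needed
```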

In the FY 2012 IPPS/LTCH PPS final rule (76 FR 51489 and 51497), we indicated that, because further delay of this prospective adjustment would result in a continued accrual of unrecoverable overpayments, it was imperative that we implement a prospective adjustment for FY 2012, while recognizing CMS' continued desire to mitigate the effects of any significant downward adjustments to hospitals. Therefore, we implemented a −2.0 percent prospective adjustment to the standardized amount instead of the full −3.9 percent.

In the FY 2013 IPPS/LTCH PPS final rule (77 FR 53274 through 53276), we completed the prospective portion of the adjustment required under section 7(b)(1)(A) of Public Law 110-90 by finalizing a −1.9 percent adjustment to the standardized amount for FY 2013. We stated that this adjustment would remove the remaining effect of the documentation and coding changes that do not reflect real changes in case-mix that occurred in FY 2008 and FY 2009. We believed that it was imperative to implement the full remaining adjustment, as any further delay would result in an overstated standardized amount in FY 2013 and any future fiscal years until a full adjustment was made.

We noted again that delaying full implementation of the prospective portion of the adjustment required under section 7(b)(1)(A) of Public Law 110-90 until FY 2013 resulted in payments in FY 2010 through FY 2012 being overstated. These overpayments could not be recovered by CMS as section 7(b)(1)(B) of Public Law 110-90 limited recoupments to overpayments made in FY 2008 and FY 2009.

5. Recoupment or Repayment Adjustment Authorized by Section 7(b)(1)(B) of Pub. L. 110-90

Section 7(b)(1)(B) of Public Law 110-90 requires the Secretary to make an adjustment to the standardized amounts under section 1886(d) of the Act to offset the estimated increase or decrease in aggregate payments for FY 2008 and FY 2009 (including interest) resulting from the difference between the estimated actual documentation and coding effect and the documentation and coding adjustments applied under section 7(a) of Public Law 110-90. This determination must be based on a retrospective evaluation of claims data. Our actuaries estimated that there was a 5.8 percentage point difference resulting in an increase in aggregate payments of approximately $6.9 billion. Therefore, as discussed in the FY 2011 IPPS/LTCH PPS final rule (75 FR 50062 through 50067), we determined that an aggregate adjustment of −5.8 percent in FYs 2011 and 2012 would be necessary in order to meet the requirements of section 7(b)(1)(B) of Public Law 110-90 to adjust the standardized amounts for discharges occurring in FYs 2010, 2011, and/or 2012 to offset the estimated amount of the increase in aggregate payments (including interest) in FYs 2008 and 2009.

It is often our practice to phase in payment rate adjustments over more than one year in order to moderate the effect on payment rates in any one year. Therefore, consistent with the policies that we have adopted in many similar cases, in the FY 2011 IPPS/LTCH PPS final rule, we made an adjustment to the standardized amount of −2.9 percent, representing approximately half of the aggregate adjustment required under section 7(b)(1)(B) of Public Law 110-90, for FY 2011. An adjustment of this magnitude allowed us to moderate the effects on hospitals in one year while simultaneously making it possible to implement the entire adjustment within the timeframe required under section 7(b)(1)(B) of Public Law 110-90 (that is, no later than FY 2012). For FY 2012, in accordance with the timeframes set forth by section 7(b)(1)(B) of Public Law 110-90, and consistent with the discussion in the FY 2011 IPPS/LTCH PPS final rule, we completed the recoupment adjustment by implementing the remaining −2.9 percent adjustment, in addition to removing the effect of the −2.9 percent adjustment to the standardized amount finalized for FY 2011 (76 FR 51489 and 51498). Because these adjustments, in effect, balanced out, there was no year-to-year change in the standardized amount due to this recoupment adjustment for FY 2012. In the FY 2013 IPPS/LTCH PPS final rule (77 FR 53276), we made a final +2.9 percent adjustment to the standardized amount, completing the recoupment portion of section 7(b)(1)(B) of Public Law 110-90. We note that with this positive adjustment, according to our estimates, all overpayments made in FY 2008 and FY 2009 have been fully recaptured with appropriate interest, and the standardized amount has been returned to the appropriate baseline.
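
As a rough summary, the Python sketch below lays out the year-to-year recoupment adjustments described above (values in percent, as stated in this section); it is illustrative only.

```python
# One-time recoupment adjustments under section 7(b)(1)(B) of Pub. L. 110-90
# (values in percent, as described above).
fy2011 = -2.9              # first half of the -5.8 percent aggregate recoupment
fy2012 = +2.9 - 2.9        # remove the FY 2011 adjustment, apply the remaining half (no net change)
fy2013 = +2.9              # remove the recoupment; the standardized amount returns to baseline
print(fy2011, fy2012, fy2013)  # -2.9 0.0 2.9
```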

6. Recoupment or Repayment Adjustment Authorized by Section 631 of the American Taxpayer Relief Act of 2012 (ATRA)

Section 631 of the ATRA amended section 7(b)(1)(B) of Public Law 110-90 to require the Secretary to make a recoupment adjustment or adjustments totaling $11 billion by FY 2017. This adjustment represents the amount of the increase in aggregate payments as a result of not completing the prospective adjustment authorized under section 7(b)(1)(A) of Public Law 110-90 until FY 2013. As discussed earlier, this delay in implementation resulted in overstated payment rates in FYs 2010, 2011, and 2012. The resulting overpayments could not have been recovered under Public Law 110-90.

Similar to the adjustments authorized under section 7(b)(1)(B) of Public Law 110-90, the adjustment required under section 631 of the ATRA is a one-time recoupment of a prior overpayment, not a permanent reduction to payment rates. Therefore, any adjustment made to reduce payment rates in one year would eventually be offset by a positive adjustment, once the necessary amount of overpayment is recovered.

As we stated in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50515 through 50517), our actuaries estimate that a −9.3 percent adjustment to the standardized amount would be necessary if CMS were to fully recover the $11 billion recoupment required by section 631 of the ATRA in FY 2014. It is often our practice to phase in payment rate adjustments over more than one year, in order to moderate the effect on payment rates in any one year. Therefore, consistent with the policies that we have adopted in many similar cases, and after consideration of the public comments we received, in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50515 through 50517), we implemented a −0.8 percent recoupment adjustment to the standardized amount in FY 2014. We stated that if adjustments of approximately −0.8 percent are implemented in FYs 2014, 2015, 2016, and 2017, using standard inflation factors, we estimate that the entire $11 billion will be accounted for by the end of the statutory 4-year timeline. As estimates of any future adjustments are subject to slight variations in total savings, we did not provide for specific adjustments for FYs 2015, 2016, or 2017 at that time. We stated that we believed that this level of adjustment for FY 2014 was a reasonable and fair approach that satisfies the requirements of the statute while mitigating extreme annual fluctuations in payment rates. In addition, we again noted that this −0.8 percent recoupment adjustment, and future adjustments under this authority, will be eventually offset by an equivalent positive adjustment once the full $11 billion recoupment requirement has been realized.

Consistent with the approach discussed in the FY 2014 rulemaking for recouping the $11 billion required by section 631 of the ATRA, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 27997 through 27998), we proposed an additional −0.8 percent recoupment adjustment to the standardized amount for FY 2015. We estimated that this level of adjustment, combined with leaving the −0.8 percent adjustment made for FY 2014 in place, would recover up to $2 billion in FY 2015. Taking into account the approximately $1 billion recovered in FY 2014, this would leave approximately $8 billion remaining to be recovered by FY 2017.

Comment: Several commenters restated their previous position, as set forth in comments submitted in response to the FY 2014 IPPS/LTCH PPS proposed rule and summarized in the FY 2014 IPPS/LTCH PPS final rule, that CMS overstated the impact of documentation and coding effects for prior years. Commenters cited potential deficiencies in the CMS methodology and disagreed that the congressionally mandated adjustment is warranted. However, the majority of these commenters conceded that CMS is required by section 631 of the ATRA to recover $11 billion by FY 2017, and supported CMS' policy to phase in the adjustments over a 4-year period.

Response: We appreciate the commenters' support. We refer readers to the FY 2014 IPPS/LTCH PPS final rule (78 FR 50515 through 50517) for our response to the commenters' position that CMS overstated the impact of documentation and coding effects.

After consideration of the public comments we received, we are finalizing the proposal to make an additional −0.8 percent adjustment to the standardized amount for FY 2015. Considering the −0.8 percent adjustment made in FY 2014, we expect the combined impact of these adjustments will be to recover $2 billion in overpayments in FY 2015. Combined with the estimated $1 billion recovered in FY 2014, we estimate that $3 billion of the $11 billion in overpayments required to be recovered by section 631 of the ATRA will be accounted for.

We continue to believe that if adjustments of approximately −0.8 percent are implemented in FYs 2014, 2015, 2016, and 2017, using standard inflation factors, the entire $11 billion will be accounted for by the end of the statutory 4-year timeline. As we explained in the FY 2014 IPPS/LTCH PPS final rule, estimates of any future adjustments are subject to slight variations in total savings. Therefore, we have not yet addressed specific adjustments for FY 2016 and FY 2017. We continue to believe that the −0.8 percent adjustment for FY 2015 is a reasonable and fair approach that will help satisfy the requirements of the statute while mitigating extreme annual fluctuations in payment rates. In addition, we again note that this −0.8 percent recoupment adjustment, and future adjustments under this authority, will be eventually offset by an equivalent positive adjustment once the full $11 billion recoupment requirement has been realized.
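
For illustration only, the brief Python sketch below tracks the running recoupment totals described above (estimates in billions of dollars, as stated in this section).

```python
# Running totals toward the $11 billion recoupment required by section 631 of the ATRA
# (estimates in billions of dollars, as stated above).
target = 11.0
recovered = {"FY 2014": 1.0, "FY 2015": 2.0}  # FY 2015 reflects the combined -0.8 percent adjustments
to_date = sum(recovered.values())             # 3.0 accounted for through FY 2015
print(target - to_date)                       # 8.0 remaining to be recovered by FY 2017
```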

7. Prospective Adjustment for the MS-DRG Documentation and Coding Effect Through FY 2010

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50515 through 50517), we discussed the possibility of applying an additional prospective adjustment to account for the cumulative MS-DRG documentation and coding effect through FY 2010. In that final rule, we stated that if we were to apply such an adjustment, we believed the most appropriate additional adjustment was −0.55 percent. However, we decided not to apply such an adjustment in FY 2014, in light of the need to make the retrospective adjustments required by the ATRA. We continue to believe that if we were to apply an additional prospective adjustment for the cumulative MS-DRG documentation and coding effect through FY 2010, the most appropriate additional adjustment is −0.55 percent. However, we did not propose such an adjustment for FY 2015, in light of the ongoing recoupment required by the ATRA. We will consider whether such an additional adjustment is appropriate in future years' rulemaking.

Comment: Commenters reiterated their concern, as set forth in comments submitted in response to the FY 2014 IPPS/LTCH PPS proposed rule and summarized in the FY 2014 IPPS/LTCH PPS final rule, that CMS overstated the adjustment factor for documentation and coding, including the revised −0.55 percent factor to adjust for documentation and coding that occurred in FY 2010. Commenters believed that adjustments related to FY 2010 documentation and coding are not required under section 631 of the ATRA. Commenters urged CMS to not consider additional adjustments, other than those required by section 631 of the ATRA.

Response: We appreciate the commenters' concerns. We refer readers to the FY 2014 IPPS/LTCH PPS final rule (78 FR 50515 through 50517) for our response to the commenters' position that CMS overstated the impact of documentation and coding effects. We did not propose to make any additional prospective adjustment to address the cumulative documentation and coding effect through FY 2010 for FY 2015. We will consider these comments in future years' rulemaking.

E. Refinement of the MS-DRG Relative Weight Calculation

1. Background

Beginning in FY 2007, we implemented relative weights for DRGs based on cost report data instead of charge information. We refer readers to the FY 2007 IPPS final rule (71 FR 47882) for a detailed discussion of our final policy for calculating the cost-based DRG relative weights and to the FY 2008 IPPS final rule with comment period (72 FR 47199) for information on how we blended relative weights based on the CMS DRGs and MS-DRGs.

As we implemented cost-based relative weights, some public commenters raised concerns about potential bias in the weights due to “charge compression,” which is the practice of applying a higher percentage charge markup over costs to lower cost items and services, and a lower percentage charge markup over costs to higher cost items and services. As a result, the cost-based weights would undervalue high-cost items and overvalue low-cost items if a single cost-to-charge ratio (CCR) is applied to items of widely varying costs in the same cost center. To address this concern, in August 2006, we awarded a contract to the Research Triangle Institute, International (RTI) to study the effects of charge compression in calculating the relative weights and to consider methods to reduce the variation in the CCRs across services within cost centers. For a detailed summary of RTI's findings, recommendations, and public comments that we received on the report, we refer readers to the FY 2009 IPPS/LTCH PPS final rule (73 FR 48452 through 48453). In addition, we refer readers to RTI's July 2008 final report titled “Refining Cost to Charge Ratios for Calculating APC and MS-DRG Relative Payment Weights” (http://www.rti.org/reports/cms/HHSM-500-2005-0029I/PDF/Refining_Cost_to_Charge_Ratios_200807_Final.pdf).
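
For illustration only, the following hypothetical Python sketch shows how applying a single CCR across items with very different markups distorts estimated costs; all dollar amounts and ratios are invented for the example and are not drawn from Medicare cost report data.

```python
# Hypothetical illustration of charge compression; all figures are invented.
supply_cost, supply_charge = 10.0, 50.0          # high markup  -> item-level CCR of 0.20
device_cost, device_charge = 10_000.0, 12_500.0  # low markup   -> item-level CCR of 0.80

# A single cost center CCR is dominated by the high-charge device.
combined_ccr = (supply_cost + device_cost) / (supply_charge + device_charge)  # ~0.798

# Estimating costs from charges with the single combined CCR:
print(supply_charge * combined_ccr)   # ~39.9   vs. a true cost of 10     (low-cost item overvalued)
print(device_charge * combined_ccr)   # ~9970.1 vs. a true cost of 10,000 (high-cost item undervalued)
```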

In the FY 2009 IPPS final rule (73 FR 48458 through 48467), in response to the RTI's recommendations concerning cost report refinements, we discussed our decision to pursue changes to the cost report to split the cost center for Medical Supplies Charged to Patients into one line for “Medical Supplies Charged to Patients” and another line for “Implantable Devices Charged to Patients.” We acknowledged, as RTI had found, that charge compression occurs in several cost centers that exist on the Medicare cost report. However, as we stated in the FY 2009 IPPS final rule, we focused on the CCR for Medical Supplies and Equipment because RTI found that the largest impact on the MS-DRG relative weights could result from correcting charge compression for devices and implants. In determining the items that should be reported in these respective cost centers, we adopted the commenters' recommendations that hospitals should use revenue codes established by the AHA's National Uniform Billing Committee to determine the items that should be reported in the “Medical Supplies Charged to Patients” and the “Implantable Devices Charged to Patients” cost centers. Accordingly, a new subscripted line for “Implantable Devices Charged to Patients” was created in July 2009. This new subscripted cost center has been available for use for cost reporting periods beginning on or after May 1, 2009.

As we discussed in the FY 2009 IPPS final rule (73 FR 48458) and in the CY 2009 OPPS/ASC final rule with comment period (73 FR 68519 through 68527), in addition to the findings regarding implantable devices, RTI also found that the costs and charges of computed tomography (CT) scans, magnetic resonance imaging (MRI), and cardiac catheterization differ significantly from the costs and charges of other services included in the standard associated cost center. RTI also concluded that both the IPPS and the OPPS relative weights would better estimate the costs of those services if CMS were to add standard cost centers for CT scans, MRIs, and cardiac catheterization in order for hospitals to report separately the costs and charges for those services and in order for CMS to calculate unique CCRs to estimate the costs from charges on claims data. In the FY 2011 IPPS/LTCH PPS final rule (75 FR 50075 through 50080), we finalized our proposal to create standard cost centers for CT scans, MRIs, and cardiac catheterization, and to require that hospitals report the costs and charges for these services under new cost centers on the revised Medicare cost report Form CMS-2552-10. (We refer readers to the FY 2011 IPPS/LTCH PPS final rule (75 FR 50075 through 50080) for a detailed discussion of the reasons for the creation of standard cost centers for CT scans, MRIs, and cardiac catheterization.) The new standard cost centers for CT scans, MRIs, and cardiac catheterization are effective for cost reporting periods beginning on or after May 1, 2010, on the revised cost report Form CMS-2552-10.

In the FY 2009 IPPS final rule (73 FR 48468), we stated that, due to what is typically a 3-year lag between the reporting of cost report data and the availability for use in ratesetting, we anticipated that we might be able to use data from the new “Implantable Devices Charged to Patients” cost center to develop a CCR for “Implantable Devices Charged to Patients” in the FY 2012 or FY 2013 IPPS rulemaking cycle. However, as noted in the FY 2010 IPPS/RY 2010 LTCH PPS final rule (74 FR 43782), due to delays in the issuance of the revised cost report Form CMS-2552-10, we determined that a new CCR for “Implantable Devices Charged to Patients” might not be available before FY 2013. Similarly, when we finalized the decision in the FY 2011 IPPS/LTCH PPS final rule to add new cost centers for CT scans, MRIs, and cardiac catheterization, we explained that data from any new cost centers that may be created will not be available until at least 3 years after they are first used (75 FR 50077). In preparation for the FY 2012 IPPS/LTCH PPS rulemaking, we checked the availability of data in the “Implantable Devices Charged to Patients” cost center on the FY 2009 cost reports, but we did not believe that there was a sufficient amount of data from which to generate a meaningful analysis in this particular situation. Therefore, we did not propose to use data from the “Implantable Devices Charged to Patients” cost center to create a distinct CCR for “Implantable Devices Charged to Patients” for use in calculating the MS-DRG relative weights for FY 2012. We indicated that we would reassess the availability of data for the “Implantable Devices Charged to Patients” cost center for the FY 2013 IPPS/LTCH PPS rulemaking cycle and, if appropriate, we would propose to create a distinct CCR at that time.

During the development of the FY 2013 IPPS/LTCH PPS proposed and final rules, hospitals were still in the process of transitioning from the previous cost report Form CMS-2552-96 to the new cost report Form CMS-2552-10. Therefore, we were able to access only those cost reports in the FY 2010 HCRIS with fiscal year begin dates on or after October 1, 2009, and before May 1, 2010; that is, those cost reports on Form CMS-2552-96. Data from the Form CMS-2552-10 cost reports were not available because cost reports filed on the Form CMS-2552-10 were not accessible in the HCRIS. Further complicating matters was that, due to additional unforeseen technical difficulties, the corresponding information regarding charges for implantable devices on hospital claims was not yet available to us in the MedPAR file. Without the breakout in the MedPAR file of charges associated with implantable devices to correspond to the costs of implantable devices on the cost report, we believed that we had no choice but to continue computing the relative weights with the current CCR that combines the costs and charges for supplies and implantable devices. We stated in the FY 2013 IPPS/LTCH PPS final rule (77 FR 53281 through 53283) that when we do have the necessary data for supplies and implantable devices on the claims in the MedPAR file to create distinct CCRs for the respective cost centers for supplies and implantable devices, we hoped that we would also have data for an analysis of creating distinct CCRs for CT scans, MRIs, and cardiac catheterization, which could then be finalized through rulemaking. In the FY 2013 IPPS/LTCH PPS final rule (77 FR 53281), we stated that, prior to proposing to create these CCRs, we would first thoroughly analyze and determine the impacts of the data, and that distinct CCRs for these new cost centers would be used in the calculation of the relative weights only if they were first finalized through rulemaking.

At the time of the development of the FY 2014 IPPS/LTCH PPS proposed rule (78 FR 27506 through 27507), we had a substantial number of hospitals completing all, or some, of these new cost centers on the FY 2011 Medicare cost reports, compared to prior years. We stated that we believed that the analytic findings described using the FY 2011 cost report data and FY 2012 claims data supported our original decision to break out and create new cost centers for implantable devices, MRIs, CT scans, and cardiac catheterization, and we saw no reason to further delay proposing to implement the CCRs of each of these cost centers. Therefore, beginning in FY 2014, we proposed to calculate the MS-DRG relative weights using 19 CCRs, creating distinct CCRs from cost report data for implantable devices, MRIs, CT scans, and cardiac catheterization (78 FR 27509).

We refer readers to the FY 2014 IPPS/LTCH PPS proposed rule (78 FR 27507 through 27509) and final rule (78 FR 50518 through 50523) in which we presented data analyses using distinct CCRs for implantable devices, MRIs, CT scans, and cardiac catheterization. The FY 2014 IPPS/LTCH PPS final rule also set forth our responses to public comments we received on our proposal to implement these CCRs. As explained in more detail in the FY 2014 IPPS/LTCH PPS final rule, we finalized our proposal to use 19 CCRs to calculate MS-DRG relative weights beginning in FY 2014—the then existing 15 cost centers and the 4 new CCRs for implantable devices, MRIs, CT scans, and cardiac catheterization. Therefore, beginning in FY 2014, we calculated the IPPS MS-DRG relative weights using 19 CCRs, creating distinct CCRs for implantable devices, MRIs, CT scans, and cardiac catheterization.

2. Discussion of Policy for FY 2015

As we stated in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 27999), to calculate the MS-DRG relative weights for FY 2015, we used two data sources: the MedPAR file as the claims data source and the HCRIS as the cost report data source. We adjusted the charges from the claims to costs by applying the 19 national average CCRs developed from the cost reports. The description of the calculation of the 19 CCRs and the MS-DRG relative weights for FY 2015 is included in section II.H. of the preamble of this final rule.
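
For illustration only, the following minimal Python sketch shows the charge-to-cost step described above; the cost center names and CCR values are placeholders, not the actual 19 national average CCRs.

```python
# Minimal sketch of adjusting a claim's charges to estimated costs using
# national average CCRs. Names and values are placeholders for illustration.
national_ccrs = {"routine": 0.45, "implantable devices": 0.35, "MRI": 0.10}

def estimated_case_cost(charges_by_cost_center: dict) -> float:
    """Sum each cost center's charges multiplied by its national average CCR."""
    return sum(charges_by_cost_center[center] * ccr
               for center, ccr in national_ccrs.items()
               if center in charges_by_cost_center)

print(estimated_case_cost({"routine": 20_000, "MRI": 3_000}))  # 9300.0
```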

Comment: One commenter supported CMS' plans to continue to use data from the implantable devices cost center to create a distinct CCR for implantable devices in the calculation of the FY 2015 relative weights. The commenter also urged CMS to promote transparency by making detailed data from the implantable device cost center available to the public so that hospitals could evaluate these costs in the context of overall hospital charges.

Response: We did not propose any changes to the methodology or data sources for the FY 2015 CCRs and relative weights. Regarding the commenter's request to make data from the implantable devices cost center available to the public, we note that hospital cost report data, via HCRIS, are available to the public. For more information, we refer readers to the CMS Web site at: http://www.cms.gov/Research-Statistics-Data-and-Systems/Files-for-Order/CostReports/index.html?redirect=/costReports.

F. Adjustment to MS-DRGs for Preventable Hospital-Acquired Conditions (HACs), Including Infections for FY 2015

1. Background

Section 1886(d)(4)(D) of the Act addresses certain hospital-acquired conditions (HACs), including infections. This provision is part of an array of Medicare tools that we are using to promote increased quality and efficiency of care. Under the IPPS, hospitals are encouraged to treat patients efficiently because they receive the same DRG payment for stays that vary in length and in the services provided, which gives hospitals an incentive to avoid unnecessary costs in the delivery of care. In some cases, conditions acquired in the hospital do not generate higher payments than the hospital would otherwise receive for cases without these conditions. To this extent, the IPPS encourages hospitals to avoid complications.

However, the treatment of these conditions can generate higher Medicare payments in two ways. First, if a hospital incurs exceptionally high costs treating a patient, the hospital stay may generate an outlier payment. Because the outlier payment methodology requires that hospitals experience large losses on outlier cases before outlier payments are made, hospitals have an incentive to prevent outliers. Second, under the MS-DRG system that took effect in FY 2008 and that has been refined through rulemaking in subsequent years, certain conditions can generate higher payments even if the outlier payment requirements are not met. Under the MS-DRG system, there are currently 261 sets of MS-DRGs that are split into 2 or 3 subgroups based on the presence or absence of a complication or comorbidity (CC) or a major complication or comorbidity (MCC). The presence of a CC or an MCC generally results in a higher payment.

Section 1886(d)(4)(D) of the Act specifies that, by October 1, 2007, the Secretary was required to select, in consultation with the Centers for Disease Control and Prevention (CDC), at least two conditions that: (a) Are high cost, high volume, or both; (b) are assigned to a higher paying MS-DRG when present as a secondary diagnosis (that is, conditions under the MS-DRG system that are CCs or MCCs); and (c) could reasonably have been prevented through the application of evidence-based guidelines. Section 1886(d)(4)(D) of the Act also specifies that the list of conditions may be revised, again in consultation with the CDC, from time to time as long as the list contains at least two conditions.

Effective for discharges occurring on or after October 1, 2008, under the authority of section 1886(d)(4)(D) of the Act, Medicare no longer assigns an inpatient hospital discharge to a higher paying MS-DRG if a selected condition is not present on admission (POA). Thus, if a selected condition that was not POA manifests during the hospital stay, it is considered a HAC and the case is paid as though the secondary diagnosis were not present. However, even if a HAC manifests during the hospital stay, if any nonselected CC or MCC appears on the claim, the claim will be paid at the higher MS-DRG rate. In addition, Medicare continues to assign a discharge to a higher paying MS-DRG if a selected condition is POA. When a HAC is not POA, payment can be affected in the manner shown in the diagram below.

2. HAC Selection

Beginning in FY 2007, we have set forth proposals, and solicited and responded to public comments, to implement section 1886(d)(4)(D) of the Act through the IPPS annual rulemaking process. For specific policies addressed in each rulemaking cycle, including a detailed discussion of the collaborative interdepartmental process and public input regarding selected and potential candidate HACs, we refer readers to the following rules: The FY 2007 IPPS proposed rule (71 FR 24100) and final rule (71 FR 48051 through 48053); the FY 2008 IPPS proposed rule (72 FR 24716 through 24726) and final rule with comment period (72 FR 47200 through 47218); the FY 2009 IPPS proposed rule (73 FR 23547) and final rule (73 FR 48471); the FY 2010 IPPS/RY 2010 LTCH PPS proposed rule (74 FR 24106) and final rule (74 FR 43782); the FY 2011 IPPS/LTCH PPS proposed rule (75 FR 23880) and final rule (75 FR 50080); the FY 2012 IPPS/LTCH PPS proposed rule (76 FR 25810 through 25816) and final rule (76 FR 51504 through 51522); the FY 2013 IPPS/LTCH PPS proposed rule (77 FR 27892 through 27898) and final rule (77 FR 53283 through 53303); and the FY 2014 IPPS/LTCH PPS proposed rule (78 FR 27509 through 27512) and final rule (78 FR 50523 through 50527). A complete list of the 11 current categories of HACs is included on the CMS Web site at: http://www.cms.gov/Medicare/Medicare-Fee-for-Service-Payment/HospitalAcqCond/Hospital-Acquired_Conditions.html.

3. Present on Admission (POA) Indicator Reporting

Collection of POA indicator data is necessary to identify which conditions were acquired during hospitalization for the HAC payment provision as well as for broader public health uses of Medicare data. In previous rulemaking, we provided both CMS and CDC Web site resources that are available to hospitals for assistance in this reporting effort. For detailed information regarding these sites and materials, including the application and use of POA indicators, we refer the reader to the FY 2012 IPPS/LTCH PPS final rule (76 FR 51506 through 51507).

Currently, as we have discussed in the prior rulemaking cited under section II.F.2. of the preamble of this final rule, the POA indicator reporting requirement applies only to IPPS hospitals because they are subject to this HAC provision. Non-IPPS hospitals, including CAHs, LTCHs, IRFs, IPFs, cancer hospitals, children's hospitals, RNHCIs, and the Department of Veterans Affairs/Department of Defense hospitals, are exempt from POA reporting.

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50524 through 50525), we noted that hospitals in Maryland operating under a statutory waiver were not paid under the IPPS, but rather were paid under the provisions of section 1814(b)(3) of the Act, and therefore prior to FY 2014 these hospitals were exempt from reporting POA indicators. However, we believed it was appropriate to require them to use POA indicator reporting on their claims so that we could include their data and have as complete a dataset as possible when we analyze trends and make further payment policy determinations, such as those authorized under section 1886(p) of the Act. Therefore, in the FY 2014 IPPS/LTCH PPS final rule, we finalized our policy that hospitals in Maryland that formerly operated under section 1814(b)(3) of the Act were no longer exempted from the POA indicator reporting requirement beginning with claims submitted on or after October 1, 2013, including all claims for discharges on or after October 1, 2013. We noted that, while this requirement was not effective until October 1, 2013, hospitals in Maryland could submit data with POA indicators before that date with the expectation that these data would be accepted by Medicare's claims processing systems. (We refer readers to the FY 2014 IPPS/LTCH PPS final rule (78 FR 50707 through 50712) for a discussion of our FY 2014 final policies to implement section 1886(p) of the Act that are applicable to Maryland hospitals.)

Subsequent to our FY 2014 rulemaking, the State of Maryland entered into an agreement with CMS, effective January 1, 2014, to participate in CMS' new Maryland All-Payer Model, a 5-year hospital payment model. This model is being implemented under section 1115A of the Act, as added by section 3021 of the Affordable Care Act, which authorizes the testing of innovative payment and service delivery models, including models that allow States to “test and evaluate systems of all-payer payment reform for the medical care of residents of the State, including dual eligible individuals.” Section 1115A of the Act authorizes the Secretary to waive such requirements of titles XI and XVIII of the Act as may be necessary solely for purposes of carrying out section 1115A of the Act with respect to testing models.

Under the agreement with CMS, Maryland will limit per capita total hospital cost growth for all payers, including Medicare. In order to implement the new model, effective January 1, 2014, Maryland elected to no longer have Medicare make payments to Maryland hospitals in accordance with section 1814(b)(3) of the Act. Maryland also represented that it is no longer in continuous operation of a demonstration project reimbursement system since July 1, 1977, as specified under section 1814(b)(3) of the Act. Because Maryland hospitals are no longer paid under section 1814(b)(3) of the Act, they are no longer subject to those provisions of the Act and related implementing regulations that are specific to section 1814(b)(3) hospitals. Although CMS has waived certain provisions of the Act for Maryland hospitals, as set forth in the agreement between CMS and Maryland and subject to Maryland's compliance with the terms of the agreement, CMS has not waived the POA indicator reporting requirement. In other words, the changes to the status of Maryland hospitals under section 1814(b)(3) of the Act as described above do not in any way change the POA indicator reporting requirement for Maryland hospitals.

There are currently four POA indicator reporting options, “Y”, “W”, “N”, and “U”, as defined by the ICD-9-CM Official Guidelines for Coding and Reporting. We note that prior to January 1, 2011, we also used a POA indicator reporting option “1”. However, beginning on or after January 1, 2011, hospitals were required to begin reporting POA indicators using the 5010 electronic transmittal standards format. The 5010 format removes the need to report a POA indicator of “1” for codes that are exempt from POA reporting. We issued CMS instructions on this reporting change as a One-Time Notification, Pub. No. 100-20, Transmittal No. 756, Change Request 7024, effective on August 13, 2010, which can be located at the following link on the CMS Web site: http://www.cms.gov/manuals/downloads/Pub100_20.pdf. The current POA indicators and their descriptors are shown in the chart below:

Indicator  Descriptor
Y          Indicates that the condition was present on admission.
W          Affirms that the hospital has determined that, based on data and clinical judgment, it is not possible to document when the onset of the condition occurred.
N          Indicates that the condition was not present on admission.
U          Indicates that the documentation is insufficient to determine if the condition was present at the time of admission.

Under the HAC payment policy, we treat HACs coded with “Y” and “W” indicators as POA and allow the condition on its own to cause an increased payment at the CC and MCC level. We treat HACs coded with “N” and “U” indicators as Not Present on Admission (NPOA) and do not allow the condition on its own to cause an increased payment at the CC and MCC level. We refer readers to the following rules for a detailed discussion of POA indicator reporting: the FY 2009 IPPS proposed rule (73 FR 23559) and final rule (73 FR 48486 through 48487); the FY 2010 IPPS/RY 2010 LTCH PPS proposed rule (74 FR 24106) and final rule (74 FR 43784 through 43785); the FY 2011 IPPS/LTCH PPS proposed rule (75 FR 23881 through 23882) and final rule (75 FR 50081 through 50082); the FY 2012 IPPS/LTCH PPS proposed rule (76 FR 25812 through 25813) and final rule (76 FR 51506 through 51507); the FY 2013 IPPS/LTCH PPS proposed rule (77 FR 27893 through 27894) and final rule (77 FR 53284 through 53285); and the FY 2014 IPPS/LTCH PPS proposed rule (78 FR 27510 through 27511) and final rule (78 FR 50524 through 50525).
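
For illustration only, the following minimal Python sketch restates how the POA indicators map to the HAC payment treatment described above; the function name is illustrative and is not part of any CMS system.

```python
# Sketch of the POA indicator handling described above for the HAC payment policy.
def hac_counts_toward_higher_payment(poa_indicator: str) -> bool:
    """Return True if a selected HAC may, on its own, support the higher CC/MCC-level payment."""
    if poa_indicator in ("Y", "W"):   # treated as present on admission (POA)
        return True
    if poa_indicator in ("N", "U"):   # treated as not present on admission (NPOA)
        return False
    raise ValueError("invalid POA indicator")

print(hac_counts_toward_higher_payment("W"))  # True
print(hac_counts_toward_higher_payment("U"))  # False
```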

In addition, as discussed previously in the FY 2013 IPPS/LTCH PPS final rule (77 FR 53324), the 5010 format allows the reporting and, effective January 1, 2011, the processing of up to 25 diagnoses and 25 procedure codes. As such, it is necessary to report a valid POA indicator for each diagnosis code, including the principal diagnosis and all secondary diagnoses up to 25.
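As a schematic illustration of this reporting requirement, the sketch below checks that a claim reports no more than 25 diagnosis codes and that each reported diagnosis, beginning with the principal diagnosis, carries one of the four valid POA indicators. The field names and example codes are hypothetical and do not reflect the actual 5010 segment layout.

```python
# Hypothetical sketch of the completeness check implied by the 5010 reporting
# requirement: up to 25 diagnosis codes, each paired with a valid POA
# indicator. Field names and example codes are illustrative only.

VALID_POA_INDICATORS = {"Y", "W", "N", "U"}
MAX_DIAGNOSES = 25

def validate_poa_reporting(diagnoses):
    """diagnoses: list of (diagnosis_code, poa_indicator) tuples, principal
    diagnosis first. Returns a list of error messages (empty if valid)."""
    errors = []
    if len(diagnoses) > MAX_DIAGNOSES:
        errors.append(f"More than {MAX_DIAGNOSES} diagnosis codes reported.")
    for position, (code, poa) in enumerate(diagnoses, start=1):
        if poa not in VALID_POA_INDICATORS:
            errors.append(
                f"Diagnosis {position} ({code}): invalid POA indicator {poa!r}.")
    return errors

# Example: the second (illustrative) diagnosis code is missing a POA indicator.
print(validate_poa_reporting([("431", "Y"), ("25000", "")]))
```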

4. HACs and POA Reporting in Preparation for Transition to ICD-10-CM and ICD-10-PCS

In the FY 2012 IPPS/LTCH PPS final rule (76 FR 51506 and 51507), in preparation for the transition to the ICD-10-CM and ICD-10-PCS code sets, we indicated that further information regarding the use of the POA indicator with the ICD-10-CM/ICD-10-PCS classifications as they pertain to the HAC policy would be discussed in future rulemaking.

At the March 5, 2012 and the September 19, 2012 meetings of the ICD-9-CM Coordination and Maintenance Committee, an announcement was made with regard to the availability of the ICD-9-CM HAC list translation to ICD-10-CM and ICD-10-PCS code sets. Participants were informed that the list of the ICD-9-CM selected HACs has been translated into codes using the ICD-10-CM and ICD-10-PCS classification system. It was recommended that the public review this list of ICD-10-CM/ICD-10-PCS code translations of the selected HACs available on the CMS Web site at: http://www.cms.gov/Medicare/Coding/ICD10/ICD-10-MS-DRG-Conversion-Project.html. The translations can be found under the link titled “ICD-10-CM/PCS MS-DRG v30 Definitions Manual Table of Contents—Full Titles—HTML Version in Appendix I—Hospital-Acquired Conditions (HACs).” This CMS Web site regarding the ICD-10-MS-DRG Conversion Project is also available on the CMS Web site at: http://www.cms.gov/Medicare/Medicare-Fee-for-Service-Payment/HospitalAcqCond/icd10_hacs.html. We encouraged the public to submit comments on these translations through the HACs Web page using the CMS ICD-10-CM/PCS HAC Translation Feedback Mailbox that was set up for this purpose under the Related Links section titled “CMS HAC Feedback.”

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50525), we stated that the final HAC list translation from ICD-9-CM to ICD-10-CM/ICD-10-PCS would be subject to formal rulemaking. We encouraged readers to review the educational materials and draft code sets available for ICD-10-CM/ICD-10-PCS on the CMS Web site at: http://www.cms.gov/ICD10/. In addition, we stated that the draft ICD-10-CM/ICD-10-PCS Coding Guidelines could be viewed on the CDC Web site at: http://www.cdc.gov/nchs/icd/icd10cm.htm.

The HAC code translation list from ICD-9-CM to ICD-10-CM/ICD-10-PCS is available to the public on the CMS Web site at: http://www.cms.gov/Medicare/Coding/ICD10/ICD-10-MS-DRG-Conversion-Project.html. We note that the ICD-10-CM/PCS MS-DRG V31R Definitions Manual Table of Contents—Full Titles files (available in both text and HTML formats) are posted on the Web site, and Appendix I of those files contains the DRA HACs translated to ICD-10.

We note that section 212 of the Protecting Access to Medicare Act of 2014 (Pub. L. 113-93), enacted on April 1, 2014, provides that the Secretary may not adopt ICD-10 prior to October 1, 2015. This effectively delayed the transition from ICD-9-CM to ICD-10. The Secretary expects to release a final rule in the near future that will include a new compliance date for use of ICD-10.

5. Current HACs and Previously Considered Candidate HACs

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28002), we did not propose to add or remove categories of the HACs. However, we indicated that we continue to encourage public dialogue about refinements to the HAC list through written stakeholder comments about both previously selected and potential candidate HACs. We refer readers to section II.F.6. of the FY 2008 IPPS final rule with comment period (72 FR 47202 through 47218) and to section II.F.7. of the FY 2009 IPPS final rule (73 FR 48474 through 48491) for detailed discussion supporting our determination regarding each of these conditions. We also refer readers to section II.F.5. of the FY 2013 IPPS/LTCH PPS proposed rule (77 FR 27892 through 27898), the FY 2013 IPPS/LTCH PPS final rule (77 FR 53285 through 53292) for the HAC policy for FY 2013, and the FY 2014 IPPS/LTCH PPS proposed rule (78 FR 27509 through 27512) and the FY 2014 IPPS/LTCH PPS final rule (78 FR 50523 through 50527) for the HAC policy for FY 2014.

Comment: Some commenters stated that they were pleased that CMS did not propose to expand the list of categories or conditions subject to the Deficit Reduction Act of 2005 provisions that would reduce payment for HACs not present on admission. However, one commenter suggested that CMS remove “falls and trauma” from the categories of conditions to which the HAC policy applies. Another commenter believed that iatrogenic pneumothorax with thoracentesis and accidental puncture/bleeding with paracentesis are two conditions that meet the HAC criteria for inclusion and urged CMS to expand the HAC program in FY 2015 to include them.

Response: We value and appreciate these public comments, and we will take the comments and suggestions into consideration in future rulemaking.

Comment: One commenter recognized the importance of targeting HACs, but stated that the DRA HAC program does not recognize that certain conditions are not 100 percent preventable, despite adherence to evidence-based practices. The commenter noted that facilities that treat patients with greater comorbidities and complex conditions are at a greater risk for penalties. Specifically, the commenter reiterated concerns about the inclusion of Surgical Site Infections (SSI) Following Cardiac Implantable Electronic Device (CIED) as a HAC category. The commenter stated that there are many variables that may contribute to the risk of CIED-related infections and that the implanting physician may not be able to control all circumstances (for example, pre-operative white blood cell count, fever within 24 hours, and timing of perioperative antibiotic administration).

Response: In the FY 2012 IPPS/LTCH PPS final rule (76 FR 51510 through 51511), we addressed commenters' concerns regarding the preventability of DRA HACs and noted that the statute does not require that a condition be “always preventable” in order to qualify as an HAC. We stated that the statute indicated that the condition be “reasonably preventable,” which necessarily implies something less than 100 percent.

Comment: One commenter recommended that CMS address the question that its hospital customers have posed regarding the effect of the DRA HAC policy when a patient is discharged from a hospital and then returns to a hospital to have a foreign object removed. Specifically, the commenter stated that hospitals need to be better informed about how Medicare payment changes if the hospital removing the foreign object is the same hospital at which the foreign object was left or is a different hospital, and if the foreign object is removed during an outpatient procedure or during an inpatient procedure.

Response: Questions related to payment for HACs are dependent upon how the conditions are coded and reported with ICD-9-CM and the corresponding POA indicator. The American Hospital Association (AHA) Central Office™ is the national clearinghouse for medical coding advice. Coding inquiries can be directed to the following AHA Web site: http://www.CodingClinicAdvisor.com. Instructions for how to assign the correct POA indicator can be found in the ICD-9-CM Official Guidelines for Coding and Reporting located at the CDC Web site: http://www.cdc.gov/nchs/icd/icd9cm_addenda_guidelines.htm. Also, illustrations of how to assign POA indicators are included in the Present on Admission (POA) Indicator Reporting by Acute Inpatient Prospective Payment System (IPPS) Hospitals Fact Sheet located on the CMS Hospital-Acquired Conditions Web site at: http://www.cms.gov/Medicare/Medicare-Fee-for-Service-Payment/HospitalAcqCond/EducationalResources.html in the “Downloads” section. Table 1: CMS POA Indicator Reporting Options, Description, and Payment contains an explanation of when payment for a condition is made or not made, based on the POA indicator assigned, as shown below.

POA indicator | Description | Medicare payment
Y | Diagnosis was present at time of inpatient admission | Payment made for condition by Medicare, when an HAC is present.
N | Diagnosis was not present at time of inpatient admission | No payment made for condition by Medicare, when an HAC is present.
U | Documentation insufficient to determine if condition was present at the time of inpatient admission | No payment made for condition by Medicare, when an HAC is present.
W | Clinically undetermined. Provider unable to clinically determine whether the condition was present at the time of inpatient admission | Payment made for condition by Medicare, when an HAC is present.

6. RTI Program Evaluation

On September 30, 2009, a contract was awarded to RTI to evaluate the impact of the Hospital-Acquired Condition-Present on Admission (HAC-POA) provisions on the changes in the incidence of selected conditions, effects on Medicare payments, impacts on coding accuracy, unintended consequences, and infection and event rates. This was an intra-agency project with funding and technical support from CMS, OPHS, AHRQ, and CDC. The evaluation also examined the implementation of the program and evaluated additional conditions for future selection. The contract with RTI ended on November 30, 2012. Summary reports of RTI's analysis of the FYs 2009, 2010, and 2011 MedPAR data files for the HAC-POA program evaluation were included in the FY 2011 IPPS/LTCH PPS final rule (75 FR 50085 through 50101), the FY 2012 IPPS/LTCH PPS final rule (76 FR 51512 through 51522), and the FY 2013 IPPS/LTCH PPS final rule (77 FR 53292 through 53302). Summary and detailed data also were made publicly available on the CMS Web site at: http://www.cms.gov/HospitalAcqCond/01_Overview.asp and the RTI Web site at: http://www.rti.org/reports/cms/.

In addition to the evaluation of HAC and POA MedPAR claims data, RTI also conducted analyses on readmissions due to HACs, the incremental costs of HACs to the health care system, a study of spillover effects and unintended consequences, as well as an updated analysis of the evidence-based guidelines for selected and previously considered HACs. Reports on these analyses have been made publicly available on the CMS Web site at: http://www.cms.gov/Medicare/Medicare-Fee-for-Service-Payment/HospitalAcqCond/index.html.

7. Current and Previously Considered Candidate HACs—RTI Report on Evidence-Based Guidelines

The RTI program evaluation includes a report that provides references for all evidence-based guidelines available for each of the selected and previously considered candidate HACs that provide recommendations for the prevention of the corresponding conditions. Guidelines were primarily identified using the AHRQ National Guideline Clearinghouse (NGC) and the CDC, along with relevant professional societies. Guidelines published in the United States were used, if available. In the absence of U.S. guidelines for a specific condition, international guidelines were included.

Evidence-based guidelines that included specific recommendations for the prevention of the condition were identified for each of the selected conditions. In addition, evidence-based guidelines also were found for the previously considered candidate conditions. RTI prepared a final report to summarize its findings regarding evidence-based guidelines. This report can be found on the CMS Web site at: http://www.cms.gov/Medicare/Medicare-Fee-for-Service-Payment/HospitalAcqCond/Downloads/Evidence-Based-Guidelines.pdf.

Subsequent to this final report, RTI was awarded an FY 2014 Evidence-Based Guidelines Monitoring contract. Under the contract, RTI was to provide a summary report of all evidence-based guidelines available for each of the selected and previously considered candidate HACs that provide recommendations for the prevention of the corresponding conditions. This report is usually delivered to CMS annually in a May/June timeframe. We received the updated 2014 report and have made it available to the public on the CMS Hospital-Acquired Conditions Web page in the “Downloads” section at: http://www.cms.gov/Medicare/Medicare-Fee-for-Service-Payment/HospitalAcqCond/index.html?redirect=/HospitalAcqCond/.

G. Changes to Specific MS-DRG Classifications

1. Discussion of Changes to Coding System and Basis for MS-DRG Updates

a. Conversion of MS-DRGs to the International Classification of Diseases, 10th Revision (ICD-10)

Providers use the code sets under the ICD-9-CM coding system to report diagnoses and procedures for Medicare hospital inpatient services under the MS-DRG system. A later coding edition, the ICD-10 coding system, includes the International Classification of Diseases, 10th Revision, Clinical Modification (ICD-10-CM) for diagnosis coding and the International Classification of Diseases, 10th Revision, Procedure Coding System (ICD-10-PCS) for inpatient hospital procedure coding, as well as the Official ICD-10-CM and ICD-10-PCS Guidelines for Coding and Reporting. The ICD-10 coding system was initially adopted for transactions conducted on or after October 1, 2013, as described in the Health Insurance Portability and Accountability Act of 1996 (HIPAA) Administrative Simplification: Modifications to Medical Data Code Set Standards to Adopt ICD-10-CM and ICD-10-PCS Final Rule published in the Federal Register on January 16, 2009 (74 FR 3328 through 3362) (hereinafter referred to as the “ICD-10-CM and ICD-10-PCS final rule”). However, the Secretary of Health and Human Services issued a final rule that delayed the compliance date for ICD-10 from October 1, 2013, to October 1, 2014. That final rule, entitled “Administrative Simplification: Adoption of a Standard for a Unique Health Plan Identifier; Addition to the National Provider Identifier Requirements; and a Change to the Compliance Date for ICD-10-CM and ICD-10-PCS Medical Data Code Sets,” CMS-0040-F, was published in the Federal Register on September 5, 2012 (77 FR 54664) and is available for viewing on the Internet at: http://www.gpo.gov/fdsys/pkg/FR-2012-09-05/pdf/2012-21238.pdf. On April 1, 2014, the Protecting Access to Medicare Act of 2014 (PAMA) (Pub. L. 113-93) was enacted, which specified that the Secretary may not adopt ICD-10 prior to October 1, 2015. Section 212 of Public Law 113-93, titled “Delay in Transition from ICD-9 to ICD-10 Code Sets,” provides that “[t]he Secretary of Health and Human Services may not, prior to October 1, 2015, adopt ICD-10 code sets as the standard for code sets under section 1173(c) of the Act.” On May 1, 2014, the Secretary announced plans to release an interim final rule in the near future that will include a new compliance date to require the use of ICD-10 beginning October 1, 2015. The rule will also require HIPAA covered entities to continue to use ICD-9-CM through September 30, 2015.

The anticipated move to ICD-10 necessitated the development of an ICD-10-CM/ICD-10-PCS version of the MS-DRGs. CMS began a project to convert the ICD-9-CM-based MS-DRGs to ICD-10 MS-DRGs. In response to the FY 2011 IPPS/LTCH PPS proposed rule, we received public comments on the creation of the ICD-10 version of the MS-DRGs, which will be implemented at the same time as ICD-10 (75 FR 50127 and 50128). While we did not propose an ICD-10 version of the MS-DRGs in the FY 2011 IPPS/LTCH PPS proposed rule, we noted that we have been actively involved in converting current MS-DRGs from ICD-9-CM codes to ICD-10 codes and sharing this information through the ICD-10 (previously ICD-9-CM) Coordination and Maintenance Committee. We undertook this early conversion project to assist other payers and providers in understanding how to implement their own conversion projects. We posted ICD-10 MS-DRGs based on Version 26.0 (FY 2009) of the MS-DRGs. We also posted a paper that describes how CMS went about completing this project and suggestions for other payers and providers to follow. Information on the ICD-10 MS-DRG conversion project can be found on the ICD-10 MS-DRG Conversion Project Web site at: http://cms.hhs.gov/Medicare/Coding/ICD10/ICD-10-MS-DRG-Conversion-Project.html. We have continued to keep the public updated on our maintenance efforts for ICD-10-CM and ICD-10-PCS coding systems, as well as the General Equivalence Mappings that assist in conversion through the ICD-10 (previously ICD-9-CM) Coordination and Maintenance Committee. Information on these committee meetings can be found on the CMS Web site at: http://www.cms.hhs.gov/Medicare/Coding/ICD9ProviderDiagnosticCodes/index.html.

During FY 2011, we developed and posted Version 28.0 of the ICD-10 MS-DRGs based on the FY 2011 MS-DRGs (Version 28.0) that we finalized in the FY 2011 IPPS/LTCH PPS final rule on the CMS Web site. This ICD-10 MS-DRGs Version 28.0 also included the CC Exclusion List and the ICD-10 version of the hospital-acquired conditions (HACs), which was not posted with Version 26.0. We also discussed this update at the September 15-16, 2010 and the March 9-10, 2011 meetings of the ICD-9-CM Coordination and Maintenance Committee. The minutes of these two meetings are posted on the CMS Web site at: http://www.cms.hhs.gov/Medicare/Coding/ICD9ProviderDiagnosticCodes/index.html.

We reviewed comments on the ICD-10 MS-DRGs Version 28.0 and made updates as a result of these comments. We called the updated version the ICD-10 MS-DRGs Version 28-R1. We posted a Definitions Manual of ICD-10 MS-DRGs Version 28-R1 on our ICD-10 MS-DRG Conversion Project Web site. To make the review of Version 28-R1 updates easier for the public, we also made available pilot software on a CD ROM that could be ordered through the National Technical Information Service (NTIS). A link to the NTIS ordering page was provided on the CMS ICD-10 MS-DRGs Web page. We stated that we believed that, by providing the ICD-10 MS-DRGs Version 28-R1 Pilot Software (distributed on CD ROM), the public would be able to more easily review and provide feedback on updates to the ICD-10 MS-DRGs. We discussed the updated ICD-10 MS-DRGs Version 28-R1 at the September 14, 2011 ICD-9-CM Coordination and Maintenance Committee meeting. We encouraged the public to continue to review and provide comments on the ICD-10 MS-DRGs so that CMS could continue to update the system.

In FY 2012, we prepared the ICD-10 MS-DRGs Version 29.0, based on the FY 2012 MS-DRGs (Version 29.0) that we finalized in the FY 2012 IPPS/LTCH PPS final rule. We posted a Definitions Manual of ICD-10 MS-DRGs Version 29.0 on our ICD-10 MS-DRG Conversion Project Web site. We also prepared a document that describes changes made from Version 28.0 to Version 29.0 to facilitate a review. The ICD-10 MS-DRGs Version 29.0 was discussed at the ICD-9-CM Coordination and Maintenance Committee meeting on March 5, 2012. Information was provided on the types of updates made. Once again the public was encouraged to review and comment on the most recent update to the ICD-10 MS-DRGs.

CMS prepared the ICD-10 MS-DRGs Version 30.0 based on the FY 2013 MS-DRGs (Version 30.0) that we finalized in the FY 2013 IPPS/LTCH PPS final rule. We posted a Definitions Manual of the ICD-10 MS-DRGs Version 30.0 on our ICD-10 MS-DRG Conversion Project Web site. We also prepared a document that describes changes made from Version 29.0 to Version 30.0 to facilitate a review. We produced mainframe and computer software for Version 30.0, which was made available to the public in February 2013. Information on ordering the mainframe and computer software through NTIS was posted on the ICD-10 MS-DRG Conversion Project Web site. The ICD-10 MS-DRGs Version 30.0 computer software facilitated additional review of the ICD-10 MS-DRGs conversion.

We provided information on a study conducted on the impact of converting MS-DRGs to ICD-10. Information on this study is summarized in a paper entitled “Impact of the Transition to ICD-10 on Medicare Inpatient Hospital Payments.” This paper was posted on the CMS ICD-10 MS-DRGs Conversion Project Web site and was distributed and discussed at the September 15, 2010 ICD-9-CM Coordination and Maintenance Committee meeting. The paper described CMS' approach to the conversion of the MS-DRGs from ICD-9-CM codes to ICD-10 codes. The study was undertaken using the ICD-9-CM MS-DRGs Version 27.0 (FY 2010) which was converted to the ICD-10 MS-DRGs Version 27.0. The study estimated the impact on aggregate payment to hospitals and the distribution of payments across hospitals. The impact of the conversion from ICD-9-CM to ICD-10 on Medicare MS-DRG hospital payments was estimated using FY 2009 Medicare claims data. The study found a hospital payment increase of 0.05 percent using the ICD-10 MS-DRGs Version 27.0.

CMS provided an overview of this hospital payment impact study at the March 5, 2012 ICD-9-CM Coordination and Maintenance Committee meeting. This presentation followed presentations on the creation of ICD-10 MS-DRGs Version 29.0. A summary report of this meeting can be found on the CMS Web site at: http://www.cms.hhs.gov/Medicare/Coding/ICD9ProviderDiagnosticCodes/index.html. At this March 2012 meeting, CMS announced that it would produce an update on this impact study based on an updated version of the ICD-10 MS-DRGs. This update of the impact study was presented at the March 5, 2013 ICD-9-CM Coordination and Maintenance Committee meeting. The study found that moving from an ICD-9-CM-based system to an ICD-10 MS-DRG replicated system would lead to DRG reassignments on only 1 percent of the 10 million MedPAR sample records used in the study. Ninety-nine percent of the records did not shift to another MS-DRG when using an ICD-10 MS-DRG system. For the 1 percent of the records that shifted, 45 percent of the shifts were to a higher weighted MS-DRG, while 55 percent of the shifts were to lower weighted MS-DRGs. The net impact across all MS-DRGs was a reduction by 4/10000 or minus 4 pennies per $100. The updated paper is posted on the CMS Web site at: http://cms.hhs.gov/Medicare/Coding/ICD10/ICD-10-MS-DRG-Conversion-Project.html under the “Downloads” section. Information on the March 5, 2013 ICD-9-CM Coordination and Maintenance Committee meeting can be found on the CMS Web site at: http://cms.hhs.gov/Medicare/Coding/ICD9ProviderDiagnosticCodes/ICD-9-CM-C-and-M-Meeting-Materials.html. This update of the impact paper and the ICD-10 MS-DRG Version 30.0 software provided additional information to the public who were evaluating the conversion of the MS-DRGs to ICD-10 MS-DRGs.
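For readers checking the arithmetic behind the reported net impact, the conversion of the 4/10,000 figure to a per-$100 amount is simply the following restatement of the figures above, not an additional analysis:

```latex
\[
\frac{4}{10{,}000} = 0.0004 = 0.04\%, \qquad 0.0004 \times \$100 = \$0.04,
\]
```

that is, a reduction of roughly 4 cents for every $100 of aggregate MS-DRG payment.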

CMS prepared the ICD-10 MS-DRGs Version 31.0 based on the FY 2014 MS-DRGs (Version 31.0) that we finalized in the FY 2014 IPPS/LTCH PPS final rule. In November 2013, we posted a Definitions Manual of the ICD-10 MS-DRGs Version 31.0 on the ICD-10 MS-DRG Conversion Project Web site at: http://www.cms.hhs.gov/Medicare/Coding/ICD10/ICD-10-MS-DRG-Conversion-Project.html. We also prepared a document that described changes made from Version 30.0 to Version 31.0 to facilitate a review. We produced mainframe and computer software for Version 31.0, which was made available to the public in December 2013. Information on ordering the mainframe and computer software through NTIS was posted on the CMS Web site at: http://cms.hhs.gov/Medicare/Coding/ICD10/ICD-10-MS-DRG-Conversion-Project.html under the “Related Links” section. This ICD-10 MS-DRGs Version 31.0 computer software facilitated additional review of the ICD-10 MS-DRGs conversion. We encouraged the public to submit to CMS any comments on areas where they believed the ICD-10 MS-DRGs did not accurately reflect grouping logic found in the ICD-9-CM MS-DRGs Version 31.0.

We reviewed comments received and developed an update of ICD-10 MS-DRGs Version 31.0, which we called ICD-10 MS-DRGs Version 31.0-R. We have posted a Definitions Manual of the ICD-10 MS-DRGs Version 31.0-R on the ICD-10 MS-DRG Conversion Project Web site at: http://www.cms.hhs.gov/Medicare/Coding/ICD10/ICD-10-MS-DRG-Conversion-Project.html. We also prepared a document that describes changes made from Version 31.0 to Version 31.0-R to facilitate a review. We will continue to share ICD-10-MS-DRG conversion activities with the public through this Web site.

b. Basis for FY 2015 MS-DRG Updates

We encourage input from stakeholders concerning the annual IPPS updates when that input is made available to us by December 7 of the year prior to the next annual proposed rule update. For example, to be considered for any updates or changes in FY 2016, comments and suggestions should be submitted by December 7, 2014. The comments that were submitted in a timely manner for FY 2015 are discussed below in this section.

Following are the changes we proposed to the MS-DRGs for FY 2015. In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28004), we invited public comment on each of the MS-DRG classification proposed changes described below, as well as our proposals to maintain certain existing MS-DRG classifications, which also are discussed below. In some cases, we proposed changes to the MS-DRG classifications based on our analysis of claims data. In other cases, we proposed to maintain the existing MS-DRG classification based on our analysis of claims data. For the FY 2015 proposed rule, our MS-DRG analysis was based on claims data from the December 2013 update of the FY 2013 MedPAR file, which contains hospital bills received through September 30, 2013, for discharges occurring through September 30, 2013. In our discussion of the proposed MS-DRG reclassification changes that follows, we refer to our analysis of claims data from the “December 2013 update of the FY 2013 MedPAR file.”

As explained in previous rulemaking (76 FR 51487), in deciding whether to propose to make further modification to the MS-DRGs for particular circumstances brought to our attention, we considered whether the resource consumption and clinical characteristics of the patients with a given set of conditions are significantly different from those of the remaining patients in the MS-DRG. We evaluated patient care costs using average costs and lengths of stay and relied on the judgment of our clinical advisors to decide whether patients are clinically distinct or similar to other patients in the MS-DRG. In evaluating resource costs, we considered both the absolute and percentage differences in average costs between the cases we selected for review and the remainder of cases in the MS-DRG. We also considered variation in costs within these groups; that is, whether observed average differences were consistent across patients or attributable to cases that were extreme in terms of costs or length of stay, or both. Further, we considered the number of patients who will have a given set of characteristics and generally preferred not to create a new MS-DRG unless it would include a substantial number of cases.
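The type of comparison described above can be illustrated with a brief sketch. The function below, which is illustrative only and not the analytic code actually run against the MedPAR file, computes the absolute and percentage differences in average costs between a selected subset of cases and the remaining cases in an MS-DRG, along with a simple measure of spread to flag whether a difference is driven by a few extreme cases.

```python
# Minimal sketch of the cost comparison described above: compare average costs
# (and their spread) for a selected subset of cases against the remaining
# cases in the same MS-DRG. Illustrative only; not the actual MedPAR analysis.
from statistics import mean, pstdev

def compare_subset(subset_costs, remainder_costs):
    subset_avg = mean(subset_costs)
    remainder_avg = mean(remainder_costs)
    absolute_diff = subset_avg - remainder_avg
    percent_diff = 100.0 * absolute_diff / remainder_avg
    return {
        "subset_average": round(subset_avg, 2),
        "remainder_average": round(remainder_avg, 2),
        "absolute_difference": round(absolute_diff, 2),
        "percent_difference": round(percent_diff, 1),
        # Spread within each group, to flag averages driven by extreme cases.
        "subset_std_dev": round(pstdev(subset_costs), 2),
        "remainder_std_dev": round(pstdev(remainder_costs), 2),
    }

# Hypothetical per-case costs for a handful of cases, for illustration only.
print(compare_subset([41_000, 39_500, 44_200], [28_000, 30_500, 27_250, 31_800]))
```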

2. MDC 1 (Diseases and Disorders of the Nervous System)

a. Intracerebral Therapies: Gliadel® Wafer

During the comment period for the FY 2014 IPPS/LTCH PPS proposed rule, we received a public comment that we considered to be outside the scope of that proposed rule. We stated in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50550) that we would consider this issue in future rulemaking as part of our annual review process. The commenter requested that a new MS-DRG be created for intracerebral therapies, including implantation of chemotherapeutic agents. Specifically, the commenter referred to the Gliadel® Wafer for the treatment of High-Grade Malignant Gliomas (HGGs) defined as aggressive tumors originating in the brain.

The Gliadel® Wafer has been discussed in prior rulemaking, including the FY 2004 IPPS proposed rule (68 FR 27187) and final rule (68 FR 45354 through 45355 and 68 FR 45391 through 45392); the FY 2005 IPPS proposed rule (69 FR 28221 through 28222) and final rule (69 FR 48957 through 48971); and the FY 2008 IPPS/LTCH PPS final rule (72 FR 47252 through 47253). We refer readers to these prior discussions for further background information regarding the Gliadel® Wafer.

Effective October 1, 2002, ICD-9-CM procedure code 00.10 (Implantation of chemotherapeutic agent) was created to identify and describe insertion of the Gliadel® Wafer. This procedure code is assigned to MS-DRG 023 (Craniotomy with Major Device Implant/Acute Complex Central Nervous System (CNS) PDX with MCC or Chemo Implant) in MDC 1. According to the commenter, this current MS-DRG assignment does not compensate providers adequately for the expenses incurred to perform the surgery and implantation of the wafer device. The commenter noted that MS-DRG 023 has a national average payment rate of approximately $28,016. However, the commenter stated, “the acquisition cost for 1 box of the Gliadel® Wafer alone (typical utilization per procedure is 8 wafers or 1 box) is $29,035.”

We conducted an analysis using claims data from the December 2013 update of the FY 2013 MedPAR file. Our findings are shown in the table below.

MS-DRG | Number of cases | Average length of stay | Average costs
MS-DRG 023—All cases | 5,383 | 10.98 | $36,982
MS-DRG 023—Cases with procedure code 00.10 | 158 | 7.0 | $34,027

As shown in the table above, there were a total of 5,383 cases in MS-DRG 023 with an average length of stay of 10.98 days and average costs of $36,982. The number of cases reporting procedure code 00.10 in MS-DRG 023 totaled 158, with an average length of stay of 7.0 days and average costs of $34,027.

The data demonstrate that cases reporting procedure code 00.10 within MS-DRG 023 have a shorter average length of stay and lower average costs in comparison to all the cases in the MS-DRG. As we stated in the proposed rule, given the low volume of cases, shorter average length of stay, and lower average costs, the data do not support the creation of a new MS-DRG for cases utilizing the Gliadel® Wafer. In addition, our clinical advisors determined that cases reporting procedure code 00.10 are appropriately assigned within MS-DRG 023.

As discussed in the FY 2005 IPPS final rule (69 FR 48959), Gliadel® Wafer cases were assigned to a new DRG that was clinically coherent and reflected the resources used to treat those cases, which appropriately addressed the concerns of commenters who raised questions regarding DRG assignment for those cases at that time. Subsequently, with the adoption of the MS-DRGs, in the FY 2008 IPPS/LTCH PPS final rule (72 FR 47252 through 47253), we assigned all cases utilizing the Gliadel® Wafer technology to MS-DRG 023, the higher severity level, and revised the title of this MS-DRG in recognition of the complexity and costs associated with the implantation. Our clinical advisors continue to support this assignment for these same reasons. Therefore, in the FY 2015 IPPS/LTCH PPS proposed rule, we did not propose to create a new MS-DRG for FY 2015 for cases where ICD-9-CM procedure code 00.10 is reported. We invited public comments on our proposal to maintain the current MS-DRG structure.

Comment: Several commenters supported CMS' proposal to maintain cases reporting procedure code 00.10 in MS-DRG 23, stating it was reasonable given the data and information provided.

Response: We appreciate the commenters' support.

Comment: Some commenters believed that MS-DRG 23 does not provide adequate payment to hospitals that perform craniotomies with insertion of the Gliadel® Wafer. These commenters suggested that the MedPAR data are flawed for a number of reasons. The commenters indicated that, upon conducting their own analysis of FY 2012 MedPAR data, there appears to be confusion among providers on how to accurately report procedure code 00.10. The commenters reported that, during their analysis, they encountered claims where procedure code 00.10 was reported for diagnoses of several other types of cancers (small and large bowel, pancreatic, and liver) that were completely unrelated to the brain. One commenter suggested that several providers who reported procedure code 00.10 had never purchased the Gliadel® Wafer product. This commenter noted that it is unclear if the product should be classified as an implant or a drug within the revenue codes and that this uncertainty results in additional confusion. The same commenter urged CMS to consider more input from the professional community and Medicare beneficiaries, as well as data sources other than the MedPAR file when evaluating MS-DRG assignments for low-volume procedures so as not to restrict access to care for patients in need of this intracerebral therapy.

Response: We acknowledge the commenters' concerns. With regard to confusion on how to accurately report procedure code 00.10 and concern that the code is being reported for other types of cancers besides brain cancer, we point out that the AHA's Coding Clinic for ICD-9-CM has provided coding instruction and examples for how to appropriately assign and report this code. Specifically, Coding Clinic Fourth Quarter, 2002, explains how the chemotherapy wafer is utilized in brain cancer and that chemotherapy wafers also have been used to treat the liver and bladder as well as other sites. We also note that the terms associated with procedure code 00.10 within ICD-9-CM are not restricted solely for use of the Gliadel® Wafer product. The ICD-9-CM coding classification system is not device specific.

With respect to the comment that providers are confused as to assigning an implant or drug revenue code to the Gliadel® Wafer product, we note that, where explicit instructions are not provided, providers should report their charges under the revenue code that will result in the charges being assigned to the same cost center to which the costs of those services are assigned in the cost report. We appreciate the commenter's suggestion to obtain additional input from the professional community.

Comment: One commenter recommended that a new MS-DRG be created specifically for the Gliadel® Wafer product. The commenter stated that it is unacceptable for CMS to state there are too few cases to do so.

Response: As explained in the FY 2015 IPPS/LTCH PPS proposed rule, our analysis of the claims data and our clinical advisors did not support creation of a new MS-DRG. Furthermore, the MS-DRGs are a classification system intended to group together those diagnoses and procedures with similar clinical characteristics and utilization of resources. Basing a new MS-DRG on such a small number of cases could lead to distortions in the relative payment weights for the MS-DRG because several expensive cases could impact the overall relative payment weight. Having larger clinical cohesive groups within an MS-DRG provides greater stability for annual updates to the relative payment weights. Moreover, our clinical advisors have examined this issue and continue to advise us that the procedure code 00.10 cases are appropriately classified within MS-DRG 23 because they are clinically similar based on both the craniotomy and the insertion of the device, among other reasons. Our advisors reaffirmed their assessment that the groupings were not overly broad or heterogeneous, reiterating that the clinical flexibility of both physicians and hospitals is maximized when larger cohorts of clinically similar patients are grouped and the costs averaged. They noted that many factors are considered when comparing groups of patients, including such factors as length of stay, cost of specific devices, type of device, type of procedure, and anatomical location, among others, and stated that the commenter did not identify any factors that would necessitate an atypical small, separate grouping when these cases are categorized. Our clinical advisors do not support creating a new MS-DRG for such a small number of cases but would not support creating a separate DRG even if the volume of cases were large.

After consideration of the public comments we received, we are finalizing our proposal to maintain the current structure for MS-DRG 23 for FY 2015.

b. Endovascular Embolization or Occlusion of Head and Neck

We received a request to change the MS-DRG assignment for the following three ICD-9-CM procedure codes representing endovascular embolization or occlusion procedures of the head and neck:

  • 39.72 (Endovascular (total) embolization or occlusion of head and neck vessels);
  • 39.75 (Endovascular embolization or occlusion of vessel(s) of head or neck using bare coils); and
  • 39.76 (Endovascular embolization or occlusion of vessel(s) of head or neck using bioactive coils).

These three procedure codes are currently assigned to the following eight MS-DRGs under MDC 1. Cases assigned to MS-DRGs 020, 021, and 022 require a principal diagnosis of hemorrhage. Cases assigned to MS-DRGs 023 and 024 require the insertion of a major implant or an acute complex central nervous system (CNS) principal diagnosis. Cases assigned to MS-DRGs 025, 026, and 027 do not have a principal diagnosis of hemorrhage, an acute complex CNS principal diagnosis, or a major device implant.

  • MS-DRG 020 (Intracranial Vascular Procedures with Principal Diagnosis of Hemorrhage with MCC)
  • MS-DRG 021 (Intracranial Vascular Procedures with Principal Diagnosis of Hemorrhage with CC)
  • MS-DRG 022 (Intracranial Vascular Procedures with Principal Diagnosis of Hemorrhage without CC/MCC)
  • MS-DRG 023 (Craniotomy with Major Device Implant/Acute Complex CNS Principal Diagnosis with MCC or Chemo Implant)
  • MS-DRG 024 (Craniotomy with Major Device Implant/Acute Complex CNS Principal Diagnosis without MCC)
  • MS-DRG 025 (Craniotomy & Endovascular Intracranial Procedures with MCC)
  • MS-DRG 026 (Craniotomy & Endovascular Intracranial Procedures with CC)
  • MS-DRG 027 (Craniotomy & Endovascular Intracranial Procedures without CC/MCC)

The requestor recommended that cases with procedure codes 39.72, 39.75, and 39.76 be moved from MS-DRGs 025, 026, and 027 to MS-DRGs 023 and 024, even when there is no reported acute complex CNS principal diagnosis or a major device implant. The requestor stated that unruptured aneurysms can be treated by a minimally invasive technique utilizing endovascular coiling. The requestor noted that a microcatheter is inserted into a groin artery and navigated through the vascular system to the location of the aneurysm. The coils are inserted through the microcatheter into the aneurysm in order to occlude (fill) the aneurysm from inside the blood vessel. Once the coils are implanted, the blood flow pattern within the aneurysm is altered. The requestor stated that these cases do not have a principal diagnosis of hemorrhage because the treatment is for an unruptured aneurysm that has not hemorrhaged. Furthermore, the requestor stated that only a few of these cases without hemorrhage have a complex CNS principal diagnosis. Therefore, the requestor believed that most of these cases would be assigned to MS-DRGs 025, 026, and 027.

The requestor stated that the average costs of coil cases captured by procedure codes 39.72, 39.75, and 39.76 are significantly higher than those of other cases within MS-DRGs 025, 026, and 027, where most of the coil cases are assigned. As stated earlier, the requestor recommended that cases with procedure codes 39.72, 39.75, and 39.76 be moved to MS-DRGs 023 and 024, even when there is not an acute complex CNS principal diagnosis or a major device implant reported.

We examined claims data from the December 2013 update of the FY 2013 MedPAR file for cases of endovascular embolization or occlusion of head and neck. The table below shows our findings. For MS-DRGs 025, 026, and 027, the cases identified by procedure code 39.72, 39.75, or 39.76 (endovascular embolization or occlusion of head and neck) have higher average costs and shorter lengths of stay in comparison to all the cases within each of those respective MS-DRGs. The average costs of cases in MS-DRG 024 are $4,049 higher than the average costs of the 1,731 endovascular embolization or occlusion of head and neck procedures cases in MS-DRG 027 ($26,250 versus $22,201). The findings also show that the 524 cases with procedure code 39.72, 39.75, or 39.76 with average costs of $41,030 in MS-DRG 025 are closer to the average costs of $36,982 for cases in MS-DRG 023. Lastly, we found that the 721 endovascular embolization or occlusion of head and neck procedure cases in MS-DRG 026 have average costs of $27,998 compared to average costs of $26,250 for cases in MS-DRG 024.

MS-DRG | Number of cases | Average length of stay | Average costs
MS-DRG 023—All cases | 5,383 | 10.98 | $36,982
MS-DRG 024—All cases | 1,745 | 6.30 | $26,250
MS-DRG 025—All cases | 15,937 | 9.68 | $29,722
MS-DRG 025—Cases with procedure code 39.72, 39.75, or 39.76 | 524 | 7.97 | $41,030
MS-DRG 026—All cases | 8,520 | 6.16 | $21,194
MS-DRG 026—Cases with procedure code 39.72, 39.75, or 39.76 | 721 | 3.14 | $27,998
MS-DRG 027—All cases | 10,326 | 3.30 | $16,389
MS-DRG 027—Cases with procedure code 39.72, 39.75, or 39.76 | 1,731 | 1.66 | $22,201

Our clinical advisors reviewed the results of our examination and determined that the endovascular embolization or occlusion of head and neck procedures are appropriately classified within MS-DRGs 025, 026, and 027 because they do not have an acute complex CNS principal diagnosis or a major device implant which would add to their clinical complexity. Cases in MS-DRG 024 have average costs that are $4,049 higher than cases in MS-DRG 027 with procedure code 39.72, 39.75, or 39.76. We acknowledge that the 1,245 cases with procedure code 39.72, 39.75, or 39.76 in MS-DRGs 025 and 026 have average costs that are closer to those in MS-DRGs 023 and 024. However, these cases are 1,245 of the total 2,976 cases that would be involved if we moved all MS-DRGs 025, 026, and 027 cases with procedure code 39.72, 39.75, or 39.76 to MS-DRGs 023 and 024, even if they did not have an acute complex CNS principal diagnosis or a major device implant. Based on these findings and the recommendations from our clinical advisors, we determined that proposing to move endovascular embolization or occlusion of head and neck procedures from MS-DRGs 025, 026, and 027 to MS-DRGs 023 and 024 was not warranted. Therefore, in the FY 2015 IPPS/LTCH PPS proposed rule, we proposed to maintain the current MS-DRG assignments for endovascular embolization or occlusion of head and neck procedures. We invited public comments on our proposal.

Comment: A number of commenters supported CMS' proposal to maintain the current MS-DRG assignment for codes 39.72, 39.75, or 39.76 in MS-DRGs 025, 026, and 027. The commenters stated this was reasonable, given the data and information provided.

A number of commenters objected to the proposal to maintain the current MS-DRG assignments for endovascular embolizations captured in codes 39.72, 39.75, and 39.76. The commenters recommended that CMS move the three codes to MS-DRGs 023 and 024. The commenters stated that the coils used in the endovascular embolizations are expensive and the endovascular procedures require substantial additional resources. The commenters stated that their hospitals are significantly underpaid for these cases. The commenters recommended that endovascular embolization codes 39.72, 39.75, and 39.76 be classified as “Major Device Implants” and therefore assigned to MS-DRGs 023 and 024.

Several commenters recommended that CMS create new severity subgroups within MS-DRG 024 to indicate cases with CC and cases without CC/MCC. The commenters recommended a three-level severity split as follows:

  • MS-DRG 023 (Craniotomy with Major Device Implant/Acute Complex CNS Principal Diagnosis with MCC or Chemo Implant);
  • MS-DRG 024 (Craniotomy with Major Device Implant/Acute Complex CNS Principal Diagnosis with CC); and
  • MS-DRG XXX (Craniotomy with Major Device Implant/Acute Complex CNS Principal Diagnosis without CC/MCC)

The commenters recommended that endovascular embolizations captured in codes 39.72, 39.75 and 39.76 be added to these three recommended MS-DRGs as part of the Major Device Implant group.

One of the commenters recommended the creation of a new set of MS-DRGs to capture intracranial endovascular embolization procedures if CMS decided not to modify the current MS-DRGs by moving codes 39.72, 39.75, and 39.76 to MS-DRGs 023 and 024. The commenter suggested the following titles for the recommended new MS-DRGs:

  • Recommended new MS-DRG 043 (Intracranial Endovascular Embolization Procedures with MCC)
  • Recommended new MS-DRG 044 (Intracranial Endovascular Embolization Procedures with CC)
  • Recommended new MS-DRG 045 (Intracranial Endovascular Embolization Procedures with Device Implant without CC/MCC).

The commenter acknowledged that there were a limited number of other intracranial endovascular procedures that could also be considered for inclusion in the new base MS-DRG with this new option. The commenter supported including any additional intracranial endovascular embolization procedures that CMS deemed to be clinically appropriate.

Response: We appreciate the commenters' support of our proposal to maintain the current MS-DRG assignment. We examined the commenters' recommendation of subdividing MS-DRG 024 by adding an additional severity level (with CC and without CC/MCC). The findings from the examination of the claims data in the December 2013 update of the FY 2013 MedPAR file on endovascular embolization or occlusion of head and neck procedures are shown in the first table below. We applied the following criteria established in FY 2008 (72 FR 47169) to determine if the creation of a new CC or MCC subgroup within a base MS-DRG was warranted:

  • A reduction in variance of costs of at least 3 percent.
  • At least 5 percent of the patients in the MS-DRG fall within the CC or MCC subgroup.
  • At least 500 cases are in the CC or MCC subgroup.
  • There is at least a 20 percent difference in average costs between subgroups.
  • There is a $2,000 difference in average costs between subgroups.

In order to warrant creation of a CC or MCC subgroup within a base MS-DRG, the subgroup must meet all five of the criteria.

Endovascular Embolization or Occlusion of Head and Neck Procedures

MS-DRG | Number of cases | Average length of stay | Average costs
MS-DRG 023—All cases | 5,383 | 10.98 | $36,982
MS-DRG 024—All cases | 1,745 | 6.30 | $26,250

The following table shows the number of cases that would be within each of the new requested three MS-DRGs, including the two proposed severity levels.

MS-DRG | Number of cases | Average length of stay | Average costs
MS-DRG 023 (Craniotomy with Major Device Implant/Acute Complex CNS Principal Diagnosis with MCC or Chemo Implant) | 5,383 | 10.98 | $36,982
Proposed MS-DRG 024 (Craniotomy with Major Device Implant/Acute Complex CNS Principal Diagnosis with CC or Chemo Implant) | 1,211 | 7.65 | $27,360
Proposed MS-DRG XX (Craniotomy with Major Device Implant/Acute Complex CNS Principal Diagnosis without CC/MCC or Chemo Implant) | 534 | 3.25 | $23,733

We determined that the requested new severity subdivision into “with CC” and “without CC/MCC” subgroups would meet only four of the five criteria. The requested new “with CC” and “without CC/MCC” severity levels do not meet the criterion that there is at least a 20 percent difference in average costs between subgroups.

Because the requested new severity level does not meet all five criteria, we are not modifying MS-DRG 024 to create severity levels for cases with CC and cases without CC/MCC.
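To make the application of the five criteria concrete, the sketch below encodes them as a simple check and applies it to the figures shown in the tables above for the requested subdivision of MS-DRG 024 (1,211 “with CC” cases with average costs of $27,360 and 534 “without CC/MCC” cases with average costs of $23,733). The variance-reduction figure is not shown in the tables, so it is supplied as an assumed input; with the cost figures shown, the 20 percent criterion is the one that is not met, consistent with the determination above. This is an illustrative sketch only, not the software used for the actual evaluation.

```python
# Sketch of the five CC/MCC subgroup criteria established in FY 2008
# (72 FR 47169), applied to the MS-DRG 024 figures shown in the tables above.
# Illustrative only; the variance-reduction input is an assumption, because
# that figure is not published in the tables.

def meets_subgroup_criteria(variance_reduction_pct, subgroup_cases,
                            total_cases, subgroup_avg_cost,
                            other_subgroup_avg_cost):
    cost_diff = abs(subgroup_avg_cost - other_subgroup_avg_cost)
    # One reasonable reading of the percentage test: difference relative to
    # the lower-cost subgroup.
    pct_diff = 100.0 * cost_diff / min(subgroup_avg_cost, other_subgroup_avg_cost)
    checks = {
        "variance reduction of at least 3 percent": variance_reduction_pct >= 3.0,
        "subgroup is at least 5 percent of MS-DRG cases": subgroup_cases / total_cases >= 0.05,
        "subgroup has at least 500 cases": subgroup_cases >= 500,
        "at least 20 percent difference in average costs": pct_diff >= 20.0,
        "at least $2,000 difference in average costs": cost_diff >= 2000.0,
    }
    checks["all five criteria met"] = all(checks.values())
    return checks

# Requested "without CC/MCC" subgroup of MS-DRG 024, using the table figures;
# the 3 percent variance reduction is assumed for illustration only.
for criterion, met in meets_subgroup_criteria(
        variance_reduction_pct=3.0, subgroup_cases=534, total_cases=1_745,
        subgroup_avg_cost=23_733.0, other_subgroup_avg_cost=27_360.0).items():
    print(f"{criterion}: {met}")
```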

We also evaluated the request to add endovascular embolizations captured by codes 39.72, 39.75, and 39.76 to the group labeled “Major Device Implants” within MS-DRGs 023 and 024. Major Device Implants within MS-DRGs 023 and 024 include the following three sets of intracranial neurostimulator procedures. Each of the three combines implantation of an intracranial neurostimulator pulse generator in the patient with insertion of a neurostimulator lead through a burr hole in the skull into the patient's brain.

  • 01.20 (Cranial implantation or replacement of neurostimulator pulse generator) and 02.93 (Implantation or replacement of intracranial neurostimulator lead(s))
  • 02.93 (Implantation or replacement of intracranial neurostimulator lead(s)) and 86.95 (Insertion or replacement of multiple array neurostimulator pulse generator, not specified as rechargeable)
  • 02.93 (Implantation or replacement of intracranial neurostimulator lead(s)) and 86.98 (Insertion or replacement of multiple array (two or more) rechargeable neurostimulator pulse generator)

Our clinical advisors reviewed this issue and advised us not to classify endovascular embolization procedures in the same manner as the procedures performed for patients who receive intracranial neurostimulators. They advised against classifying endovascular embolizations as Major Device Implants for several reasons. First, the endovascular embolization device itself is a simple mechanical device, such as a wire, not a complex electronic device. The work involved in configuring the neurostimulator device to the patient, both before and after insertion, is significantly different from that of the endovascular embolizations. Second, endovascular embolizations are not devices implanted through an open procedure as are intracranial neurostimulator pulse generators and neurostimulator leads. Our clinical advisors stated that open procedures, including open procedures to implant the generator but especially including open skull procedures, from a clinical standpoint are significantly different from endovascular procedures in terms of the work, the facilities, the risks, and recovery rates (length of stay). Our clinical advisors specifically stated that the insertion of coils through an endovascular approach is not similar to the insertion of a complex electronic device. Endovascular embolizations do not match the clinical complexity and severity of the intracranial neurostimulators, which have greater lengths of stay. Our clinical advisors stated that care of patients who receive endovascular embolizations is not at the same severity level as for those patients who have a major device implant such as an intracranial neurostimulator or those patients with an acute complex central nervous system principal diagnosis. Therefore, our clinical advisors recommended not moving endovascular embolizations to MS-DRGs 023 or 024. They recommended maintaining their current assignments in MS-DRGs 025, 026, and 027.

We evaluated the request to create a new set of MS-DRGs to capture intracranial endovascular embolization procedures. The requestor recommended including codes 39.72, 39.75, and 39.76 and any other procedures that CMS deemed appropriate. Our clinical advisors stated that codes 39.72, 39.75, and 39.76 were appropriately assigned to MS-DRGs 025, 026, and 027 because they are clinically similar to other cases in MS-DRGs 025, 026, and 027. In addition, as stated earlier, these cases do not match the clinical complexity and severity of the intracranial neurostimulators within MS-DRGs 023 and 024. For these reasons, our clinical advisors did not support creating a new set of MS-DRGs for these codes and any additional intracranial endovascular embolization procedures.

After consideration of public comments we received, we are finalizing our proposal to maintain the current MS-DRG assignments for codes 39.72, 39.75 and 39.76 in MS-DRGs 025, 026, and 027.

3. MDC 4 (Diseases and Disorders of the Ear, Nose, Mouth and Throat): Avery Breathing Pacemaker System

We received a request to create a new MS-DRG for the Avery Breathing Pacemaker System. This system is also called a diaphragmatic pacemaker and is captured by ICD-9-CM procedure code 34.85 (Implantation of diaphragmatic pacemaker). The requestor stated that the diaphragmatic pacemaker is indicated for adult and pediatric patients with chronic respiratory insufficiency who would otherwise be dependent on ventilator support. The procedure consists of surgically implanted receivers and electrodes mated to an external transmitter by antennas worn over the implanted receivers. The external transmitter and antennas send radiofrequency energy to the implanted receivers under the skin. The receivers then convert the radio waves into stimulating pulses sent down the electrodes to the phrenic nerves, causing the diaphragm to contract. The requestor stated that this normal pattern is superior to mechanical ventilators that force air into the chest. The requestor also stated that the system is expensive; the device cost is approximately $57,000. According to the requestor, given the cost of the device, hospitals are reluctant to use it. The requestor did not make a specific MS-DRG reassignment request.

When used for a respiratory failure patient, procedure code 34.85 is assigned to MS-DRGs 163, 164, and 165 (Major Chest Procedures with MCC, with CC, and without CC/MCC, respectively).

We examined claims data from the December 2013 update of the FY 2013 MedPAR file for diaphragmatic pacemaker cases. The following table shows our findings.

MS-DRG | Number of cases | Average length of stay | Average costs
MS-DRG 163—All cases | 11,766 | 13.13 | $34,308
MS-DRG 163—Cases with procedure code 34.85 | 13 | 2.23 | $29,406
MS-DRG 164—All cases | 16,087 | 6.58 | $18,352
MS-DRG 164—Cases with procedure code 34.85 | 34 | 1.71 | $23,406
MS-DRG 165—All cases | 9,207 | 3.91 | $13,081
MS-DRG 165—Cases with procedure code 34.85 | 1 | 1.00 | $22,977

There were only 48 cases of diaphragmatic pacemakers within MS-DRGs 163, 164, and 165. The average costs of these diaphragmatic pacemaker cases ranged from $22,977 for the single case in MS-DRG 165 to $29,406 for the cases in MS-DRG 163, compared to the average costs for all cases in MS-DRGs 163, 164, and 165, which range from $13,081 to $34,308. The average cost for diaphragmatic pacemaker cases in MS-DRG 163 was lower than that for all cases in MS-DRG 163, $29,406 compared to $34,308 for all cases. The average cost for diaphragmatic pacemaker cases was higher for MS-DRG 164, $23,406 compared to $18,352 for all cases. While the average cost for the single diaphragmatic pacemaker case was significantly higher for MS-DRG 165, $22,977 compared to $13,081, we were unable to determine if additional factors might have impacted the higher cost for this single case.

We stated in the FY 2015 IPPS/LTCH PPS proposed rule that, given the small number of diaphragmatic pacemaker cases that we found, we did not believe that there was justification for creating a new MS-DRG. Basing a new MS-DRG on such a small number of cases could lead to distortions in the relative payment weights for the MS-DRG because several expensive cases could impact the overall relative payment weight. Having larger clinical cohesive groups within an MS-DRG provides greater stability for annual updates to the relative payment weights. We noted that, as discussed in section II.G.4.c. of the preamble of the proposed rule, one of the criteria we apply in evaluating whether to create new severity subgroups within an MS-DRG is whether there are at least 500 cases in the CC or MCC subgroup. While this criterion is used to evaluate whether to create a severity subgroup within an MS-DRG, applying it here suggests that creating a new MS-DRG for only 48 cases would not be appropriate. Although the average costs of these diaphragmatic pacemaker cases are higher than the average costs of all cases in MS-DRG 164, the average costs are lower than all cases in MS-DRG 163. We believe the current MS-DRG assignment is appropriate and that the data do not support creating an MS-DRG because there are so few cases.

Our clinical advisors reviewed this issue and determined that the diaphragmatic pacemaker cases are appropriately classified within MS-DRGs 163, 164, and 165 because they are clinically similar to other cases of patients with major chest procedures within MS-DRGs 163, 164, and 165. Our clinical advisors did not support creating a new MS-DRG for such a small number of cases.

Based on the results of the examination of the claims data, the recommendations from our clinical advisors, and the small number of diaphragmatic pacemaker cases, in the FY 2015 IPPS/LTCH PPS proposed rule, we did not propose to create a new MS-DRG for diaphragmatic pacemaker cases for FY 2015. We proposed to maintain the current MS-DRG assignments for diaphragmatic pacemaker cases. We invited public comments on our proposal.

Comment: A number of commenters supported CMS' proposal to maintain the current MS-DRG assignment for diaphragmatic pacemakers. The commenters stated that the proposal was reasonable given the data and information presented.

Another commenter expressed appreciation for the analysis performed on this issue, but disagreed with the conclusion to leave diaphragmatic pacemakers in MS-DRGs 163, 164, and 165. The commenter stated that, although the number of cases identified (48) is small, they are unique in both their costs and their length of stay. The commenter stated that these cases do not represent the full universe of Medicare beneficiaries who would be good candidates for the diaphragmatic pacemaker. The commenter expressed surprise at the average cost data presented in the table in the proposed rule. The commenter stated that it sells this system directly to hospitals and does not know which insurance plans cover the procedure. However, in investigating the systems hospitals reported with code 34.85, the commenter stated that it discovered that this code covers systems provided by other manufacturers and that the cost of devices from other manufacturers is lower than that of the Avery system and is closer to the costs in CMS' claims data. The commenter stated that the Avery system is fully implantable, whereas other systems are not. The commenter asserted that one other system has percutaneous lead wires that exit the patient's body; therefore, that system is not totally implantable. The commenter made inquiries of hospitals and found that a majority of those hospitals contacted were using a lower-priced system. The commenter stated that by grouping multiple manufacturers' devices into the same MS-DRG, with the same payment rate, CMS was limiting physician and patient choice of a device. The commenter recommended that MS-DRG payments be made based on the equipment provided, allowing hospitals to recoup the costs of each system used.

The commenter stated that inadequate payment discourages hospitals from offering the service to patients. The commenter also stated that these cases are anomalies in the current MS-DRGs to which they are assigned and should be classified into a single, unique MS-DRG that would be clinically and financially coherent. The commenter believed that such a correction could increase the number of eligible Medicare beneficiaries who would benefit from use of the device, allowing them to stop using mechanical ventilation, which would greatly improve their overall health and quality of life.

The commenter also stated that the average costs for 35 of the cases with procedure code 34.85 exceed the average costs of the other cases in the MS-DRG to which they are assigned. The commenter stated that it found the average length of stay for all 48 cases to be substantially less than the average length of stay for all of the other cases. Therefore, the commenter stated that the costs for the hospital are related primarily to the device and not to the direct hospital care provided to the patients. The commenter stated that the small number of diaphragmatic pacemaker cases compared to the large volume of other cases in each MS-DRG means that the unique cost factors of most of the pacemaker cases will never be reflected in the payment for these MS-DRGs. The commenter stated that hospitals have no incentive to make the service available to patients who could use the system. The commenter stated that the number of individuals who can use the pacemaker is small because of the comparatively small volume of individuals who suffer from the conditions that make the pacemaker necessary, but there are more than 48 Medicare beneficiaries who could benefit from the device.

The commenter further questioned the rationale for not basing a new MS-DRG on such a small number of cases. The commenter questioned the reference to the use of 500 cases, which is one of the criteria for a severity level, when the requestor did not want a severity level, but instead was requesting a new MS-DRG for these Avery Diaphragmatic Pacemaker cases.

In conclusion, the commenter urged CMS to create a new MS-DRG for procedure code 34.85.

Response: We appreciate the commenters' support for our proposal not to change the MS-DRG for diaphragmatic pacemakers. As noted by one commenter, the ICD-9-CM procedure codes capture the procedure performed, in this case the implantation of a diaphragmatic pacemaker. The codes are not manufacturer specific. This is the case for all types of implanted devices such as cardiac pacemakers, defibrillators, and orthopedic devices. The procedure codes are grouped into clinically appropriate MS-DRGs. MS-DRGs were not created to capture a device by a single manufacturer. It is assumed that hospitals and their physician staff will select the appropriate devices. CMS makes Medicare payments to hospitals for groups of similar patients within each MS-DRG. The average costs provided in the tables above were based on Medicare patients reported to have received a diaphragmatic pacemaker. Hospitals have been receiving payments by diagnosis-related groups for several decades and are aware that average payments will exceed the costs of some cases and be less than the costs of other cases. They are aware that the selection of a particular manufacturer, or a particular device made by one manufacturer, should be consistent with the needs of the patient. Our data do not identify which manufacturer's devices the hospitals and physicians chose to utilize.

As stated earlier, given the small number of diaphragmatic pacemaker cases, we do not believe there is justification for creating a new MS-DRG. Basing a new MS-DRG on such a small number of cases could lead to distortions in the relative payment weights for the MS-DRG because several expensive cases could impact the overall relative payment weight. Having larger, clinically cohesive groups within an MS-DRG provides greater stability for annual updates to the relative payment weights.

Our clinical advisors reviewed this issue and the public comments received and continue to advise that the diaphragmatic pacemaker cases are appropriately classified within MS-DRGs 163, 164, and 165 because they are clinically similar to other cases of patients with major chest procedures within those MS-DRGs. They stated that the clinical flexibility of both physicians and hospitals is maximized when larger cohorts of clinically similar patients are grouped and the costs averaged. Our clinical advisors note that many factors are considered when comparing groups of patients, including length of stay, cost of specific devices, type of device, type of procedure, and anatomical location, among others. They stated that the commenter did not identify any factors that they had failed to consider when categorizing these cases. Our clinical advisors do not support creating a new MS-DRG for such a small number of cases.

After consideration of the public comments we received, we are finalizing our proposal to maintain the current MS-DRG assignments for diaphragmatic pacemaker cases within MS-DRGs 163, 164, and 165.

4. MDC 5 (Diseases and Disorders of the Circulatory System)

a. Exclusion of Left Atrial Appendage

We received a request to move the exclusion of the left atrial appendage procedure, which is a non-O.R. procedure and captured by ICD-9-CM procedure code 37.36 (Excision, destruction or exclusion of left atrial appendage (LAA)), from MS-DRGs 250 (Percutaneous Cardiovascular Procedures without Coronary Artery Stent with MCC) and 251 (Percutaneous Cardiovascular Procedures without Coronary Artery Stent without MCC) to MS-DRGs 237 (Major Cardiovascular Procedures with MCC) and 238 (Major Cardiovascular Procedures without MCC). The requestor stated that the exclusion of the left atrial appendage procedure code 37.36 is not clinically coherent with the other procedures in MS-DRGs 250 and 251 and that the current assignment to MS-DRGs 250 and 251 does not compensate providers adequately for the expenses incurred to perform this procedure and place the device.

The exclusion of the left atrial appendage procedure involves a percutaneous placement of a snare/suture around the left atrial appendage to close it off. The exclusion of the left atrial appendage procedure takes place in the cardiac catheterization laboratory under general anesthesia and is a catheter-based, closed-chest procedure rather than an open heart surgical technique to treat the same clinical condition, with the same intended results. The procedure can be performed by either an interventional cardiologist or an electrophysiologist.

We analyzed claims data from the December 2013 update of the FY 2013 MedPAR file for cases assigned to MS-DRGs 250 and 251 and MS-DRGs 237 and 238. Our findings are shown in the table below.

MS-DRG | Number of cases | Average length of stay | Average costs
MS-DRG 250—All cases | 9,174 | 6.90 | $21,319
MS-DRG 250—Cases with procedure code 37.36 | 61 | 7.21 | $29,637
MS-DRG 251—All cases | 26,331 | 3.01 | $14,614
MS-DRG 251—Cases with procedure code 37.36 | 341 | 3.01 | $18,298
MS-DRG 237—All cases | 17,813 | 9.66 | $35,642
MS-DRG 238—All cases | 33,644 | 3.73 | $24,511

The data in the table above show that, while the average costs of the left atrial appendage exclusion procedures are higher ($29,637) than those for all cases ($21,319) within MS-DRG 250 and are higher ($18,298) than those for all cases ($14,614) within MS-DRG 251, they are lower than those in MS-DRGs 237 ($35,642) and 238 ($24,511). Our clinical advisors reviewed this issue and recommended not moving these stand-alone percutaneous cases to MS-DRGs 237 and 238 because they do not consider them to be major cardiovascular procedures. Our clinical advisors stated that cases reporting ICD-9-CM procedure code 37.36 are appropriately assigned within MS-DRGs 250 and 251 because they are percutaneous cardiovascular procedures and are clinically similar to other procedures within those MS-DRGs. Therefore, in the FY 2015 IPPS/LTCH PPS proposed rule, we did not propose to reassign exclusion of the left atrial appendage procedure cases from MS-DRGs 250 and 251 to MS-DRGs 237 and 238 for FY 2015. We invited public comments on our proposal to maintain the current MS-DRG structure for the exclusion of the left atrial appendage.

Comment: Several commenters supported CMS' proposal to maintain the current assignment of the exclusion of the left atrial appendage procedure to MS-DRGs 250 and 251. Several other commenters disagreed with the proposal and recommended that CMS assign the exclusion of the left atrial appendage to MS-DRGs 237 and 238 because the procedure can be performed as a stand-alone percutaneous procedure or in combination with an open chest procedure such as cardiac bypass surgery. The commenters stated that, when the procedure is performed in conjunction with an open chest procedure, the procedure is performed in a surgical suite. Therefore, the commenters recommended that the exclusion of the left atrial appendage be assigned to MS-DRGs 237 and 238 when it is a stand-alone procedure.

Response: We appreciate the commenters' support for our proposal to maintain the current MS-DRG assignment for the exclusion of the left atrial appendage procedures. We are not accepting the commenters' recommendation to move the cases to MS-DRGs 237 and 238. Our clinical advisors reviewed these public comments and continue to maintain that cases reporting ICD-9-CM procedure code 37.36 are appropriately assigned within MS-DRGs 250 and 251 because they are percutaneous cardiovascular procedures and are clinically similar to other procedures within those MS-DRGs. They also stated that, when performed with an open chest procedure, these procedures would map to a clinically appropriate open chest MS-DRG under the current MS-DRG logic. Our clinical advisors confirmed that, although these are not insignificant procedures, they are not considered to be major cardiovascular procedures on the same scale and with similar characteristics as the cases grouped together in MS-DRGs 237 and 238.

After consideration of the public comments we received, we are finalizing our proposal to maintain the current MS-DRG assignment for exclusion of atrial appendage in MS-DRGs 250 and 251 for FY 2015.

b. Transcatheter Mitral Valve Repair: MitraClip®

The MitraClip® System (hereafter referred to as MitraClip®) for transcatheter mitral valve repair has been discussed in extensive detail in previous rulemaking, including the FY 2012 IPPS/LTCH PPS proposed rule (76 FR 25822) and final rule (76 FR 51528 through 51529) and the FY 2013 IPPS/LTCH PPS proposed rule (77 FR 27902 through 27903) and final rule (77 FR 53308 through 53310), in response to requests for MS-DRG reclassification, as well as in the FY 2014 IPPS/LTCH PPS proposed rule (78 FR 27547 through 27552) under the new technology add-on payment policy. As discussed in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50575), we were unable to consider the application for a new technology add-on payment for MitraClip® further because the technology had not received FDA approval by the July 1, 2013 deadline.

Subsequently, on October 24, 2013, MitraClip® received FDA approval. As a result, the manufacturer has submitted new requests for both an MS-DRG reclassification and new technology add-on payment for FY 2015. We refer readers to section II.I. of the preamble of the proposed rule and this final rule for a discussion regarding the application for MitraClip® under the new technology add-on payment policy. Below we discuss the MS-DRG reclassification request.

The manufacturer's request for MS-DRG reclassification involves two components. The first component consists of reassigning cases reporting a transcatheter mitral valve repair using the MitraClip® from MS-DRGs 250 and 251 (Percutaneous Cardiovascular Procedure without Coronary Artery Stent with MCC and without MCC, respectively) to MS-DRGs 216 (Cardiac Valve & Other Major Cardiothoracic Procedures with Cardiac Catheterization with MCC), 217 (Cardiac Valve & Other Major Cardiothoracic Procedures with Cardiac Catheterization with CC), 218 (Cardiac Valve & Other Major Cardiothoracic Procedures with Cardiac Catheterization without CC/MCC), 219 (Cardiac Valve & Other Major Cardiothoracic Procedures without Cardiac Catheterization with MCC), 220 (Cardiac Valve & Other Major Cardiothoracic Procedures without Cardiac Catheterization with CC), and 221 (Cardiac Valve & Other Major Cardiothoracic Procedures without Cardiac Catheterization without CC/MCC). The second component of the manufacturer's request was for CMS to examine the creation of a new base MS-DRG for transcatheter valve therapies.

Effective October 1, 2010, ICD-9-CM procedure code 35.97 (Percutaneous mitral valve repair with implant) was created to identify and describe the MitraClip® technology.

To address the first component of the manufacturer's request, we conducted an analysis of claims data from the December 2013 update of the FY 2013 MedPAR file for cases reporting procedure code 35.97 in MS-DRGs 250 and 251. The table below shows our findings.

MS-DRG | Number of cases | Average length of stay | Average costs
MS-DRG 250—All cases | 9,174 | 6.90 | $21,319
MS-DRG 250—Cases with procedure code 35.97 | 67 | 8.48 | $39,103
MS-DRG 251—All cases | 26,331 | 3.01 | $14,614
MS-DRG 251—Cases with procedure code 35.97 | 127 | 3.94 | $25,635

As displayed in the table above, the data demonstrate that, for MS-DRG 250, there were a total of 9,174 cases with an average length of stay of 6.90 days and average costs of $21,319. The number of cases reporting the ICD-9-CM procedure code 35.97 in MS-DRG 250 totaled 67 with an average length of stay of 8.48 days and average costs of $39,103. For MS-DRG 251, there were a total of 26,331 cases with an average length of stay of 3.01 days and average costs of $14,614. There were 127 cases found in MS-DRG 251 reporting the procedure code 35.97 with an average length of stay of 3.94 days and average costs of $25,635. We recognize that the cases reporting procedure code 35.97 have a longer length of stay and higher average costs in comparison to all the cases within MS-DRGs 250 and 251. However, as stated in prior rulemaking (77 FR 53309), it is a fundamental principle of an averaged payment system that half of the procedures in a group will have above average costs. It is expected that there will be higher cost and lower cost subsets, especially when a subset has low numbers.

We also evaluated the claims data from the December 2013 update of the FY 2013 MedPAR file for MS-DRGs 216 through 221. Our findings are shown in the table below.

MS-DRG | Number of cases | Average length of stay | Average costs
MS-DRG 216—All cases | 10,131 | 15.41 | $65,478
MS-DRG 217—All cases | 5,374 | 9.51 | $44,695
MS-DRG 218—All cases | 882 | 6.88 | $39,470
MS-DRG 219—All cases | 17,856 | 11.63 | $54,590
MS-DRG 220—All cases | 21,059 | 7.13 | $38,137
MS-DRG 221—All cases | 4,586 | 5.32 | $34,310

The data in our findings did not warrant reassignment of cases reporting use of the MitraClip®. We stated in the proposed rule that if we were to propose reassignment of cases reporting procedure code 35.97 to MS-DRGs 216 through 221, they would be significantly overpaid, as the average costs range from $34,310 to $65,478 for those MS-DRGs. In addition, our clinical advisors did not support reassigning these cases. They noted that the current MS-DRG assignment is appropriate for the reasons stated in the FY 2013 IPPS/LTCH PPS final rule (77 FR 53309). To reiterate, our clinical advisors noted that the current MS-DRG assignment is reasonable because the operating room resource utilizations of percutaneous procedures, such as those found in MS-DRGs 250 and 251, tend to group together, and are generally less costly than open procedures, such as those found in MS-DRGs 216 through 221. Percutaneous procedures by organ system represent groups that are reasonably clinically coherent. More significantly, our clinical advisors stated that postoperative resource utilization is significantly higher for open procedures with much greater morbidity and consequent recovery needs. Because the equipment, technique, staff, patient populations, and physician specialty all tend to group by type of procedure (percutaneous or open), separately grouping percutaneous procedures and open procedures is more clinically consistent. Therefore, in the FY 2015 IPPS/LTCH PPS proposed rule, we did not propose to modify the current MS-DRG assignment for cases reporting procedure code 35.97 from MS-DRGs 250 and 251 to MS-DRGs 216 through 221 for FY 2015. We invited public comments on our proposal to not make any modifications to the current MS-DRG logic for these cases.

Comment: Several commenters supported the proposal to maintain cases reporting procedure code 35.97 in MS-DRGs 250 and 251, stating it was reasonable given the data and information provided.

Response: We acknowledge and appreciate the commenters' support.

Comment: Some commenters suggested that cases utilizing the MitraClip® should be compensated similarly to mitral valve procedures that are performed with an open approach due to the time, staff and resources involved. Commenters reported that this novel technology has improved the quality of life for patients suffering from congestive heart failure. However, the commenters indicated that due to inadequate payment, their respective facilities are not able to offer the MitraClip® to the entire population that is eligible for it. The commenters also indicated that patients do not have access to this life-saving technology not only due to the lack of adequate payment to providers but also due to the cost of the device. Another commenter reported that “the price of the device should be reduced to a level that is feasible for both sponsor and hospital.” Commenters also suggested that congestive heart failure readmissions would be reduced if patients could be treated with the MitraClip®.

Response: As explained in the FY 2015 IPPS/LTCH PPS proposed rule, our clinical advisors believe that the current MS-DRG assignment for the MitraClip® is reasonable because the operating room resource utilizations of percutaneous procedures, such as those found in MS-DRGs 250 and 251, tend to group together, and are generally less costly than open procedures. In addition, the data do not support reassignment. We stated in the proposed rule that if we were to propose reassignment of cases reporting procedure code 35.97 to MS-DRGs 216 through 221, they would be significantly overpaid, as the average costs range from $34,310 to $65,478 for those MS-DRGs and the average costs for cases reporting procedure code 35.97 are $30,286 for MS-DRGs 250 and 251.

Comment: One commenter suggested an alternative option regarding MS-DRG reassignment for the MitraClip® and requested that CMS reassign cases reporting procedure code 35.97 from MS-DRGs 250 and 251 to MS-DRGs 237 and 238 (Major Cardiovascular Procedures with MCC and without MCC, respectively) with concurrent approval of the new technology add-on payment application. The commenter stated that this would allow the MitraClip® to be recognized in MS-DRGs involving a major cardiovascular procedure with an implantable device.

Response: We did not propose to reassign cases reporting procedure code 35.97 from MS-DRGs 250 and 251 to MS-DRGs 237 and 238. Therefore, we consider this comment to be outside of the scope of the FY 2015 IPPS/LTCH PPS proposed rule. We note that, as referenced in section II.G.1.b. of the preamble of this final rule, we encourage input from our stakeholders concerning the annual IPPS updates when that input is made available to us by December 7 of the year prior to the next annual proposed rule update. For example, to be considered for any updates or changes in FY 2016, comments and suggestions should be submitted by December 7, 2014.

We note that the MitraClip® technology is discussed in section II.I. of the preamble of this final rule under the new technology add-on payment policy.

After consideration of the public comments we received, we are finalizing our proposal to not modify the current MS-DRG assignment for cases reporting procedure code 35.97 from MS-DRGs 250 and 251 to MS-DRGs 216 through 221 for FY 2015.

As indicated above, the second component of the manufacturer's request involved the creation of a new base MS-DRG for transcatheter valve therapies. We also received a similar request from another manufacturer recommending that CMS create a new MS-DRG for procedures referred to as endovascular cardiac valve replacement procedures. We reviewed each of these requests using the same data analysis, as set forth below. The discussion for endovascular cardiac valve replacement procedures is included in section II.G.4.c. of the preamble of this final rule and includes findings from the analysis and our proposals and final policies for each of these similar, but distinct requests.

c. Endovascular Cardiac Valve Replacement Procedures

As noted in the previous section related to the MitraClip® technology, we received two requests to create a new base MS-DRG for what was referred to as “transcatheter valve therapies” by one manufacturer and “endovascular cardiac valve replacement” procedures by another manufacturer. Below we summarize the details of each request and review results of the data analysis that was performed.

Transcatheter Valve Therapies

The request related to transcatheter valve therapies consisted of creating a new MS-DRG that would include the MitraClip® technology (ICD-9-CM procedure code 35.97 (Percutaneous mitral valve repair with implant)), along with the following list of ICD-9-CM procedure codes that identify the various types of valve replacements performed by an endovascular or transcatheter technique:

  • 35.05 (Endovascular replacement of aortic valve);
  • 35.06 (Transapical replacement of aortic valve);
  • 35.07 (Endovascular replacement of pulmonary valve);
  • 35.08 (Transapical replacement of pulmonary valve); and
  • 35.09 (Endovascular replacement of unspecified valve).

We performed analysis of claims data from the December 2013 update of the FY 2013 MedPAR file for both the percutaneous mitral valve repair and the transcatheter/endovascular cardiac valve replacement codes in their respective MS-DRGs. The percutaneous mitral valve repair with implant (MitraClip®) procedure code is currently assigned to MS-DRGs 250 and 251, while the transcatheter/endovascular cardiac valve replacement procedure codes are currently assigned to MS-DRGs 216, 217, 218, 219, 220, and 221. As illustrated in the table below, the data demonstrate that, for MS-DRGs 250 and 251, there were a total of 194 cases reporting procedure code 35.97, with an average length of stay of 5.5 days and average costs of $30,286.

MS-DRG | Number of cases | Average length of stay | Average costs
MS-DRGs 250 through 251—Cases with procedure code 35.97 | 194 | 5.5 | $30,286

Upon analysis of cases in MS-DRGs 216 through 221 reporting the cardiac valve replacement procedure codes, we found a total of 7,287 cases with an average length of stay of 8.1 days and average costs of $53,802, as shown in the table below.

MS-DRG | Number of cases | Average length of stay | Average costs
MS-DRGs 216 through 221—Cases with procedure codes 35.05, 35.06, 35.07, 35.08 and 35.09 | 7,287 | 8.1 | $53,802
MS-DRGs 216 through 221—Cases without procedure codes 35.05, 35.06, 35.07, 35.08 and 35.09 | 52,601 | 10.1 | $47,177

The data clearly demonstrate that the volume of cases for the transcatheter/endovascular cardiac valve replacement procedures is much higher in comparison to the volume of cases for the percutaneous mitral valve repair (MitraClip®) procedure (7,287 compared to 194). In addition, the average costs of the transcatheter/endovascular cardiac valve replacement procedures are significantly higher than the average costs of the percutaneous mitral valve repair with implant ($53,802 compared to $30,286).

Our clinical advisors did not support grouping a percutaneous valve repair procedure with transcatheter/endovascular valve replacement procedures. They do not believe that these procedures are clinically coherent or similar in terms of resource consumption because the MitraClip® technology identified by procedure code 35.97 is utilized for a percutaneous mitral valve repair, while the other technologies, identified by procedure codes 35.05 through 35.09, are utilized for transcatheter/endovascular cardiac valve replacements. Consequently, the data analysis and our clinical advisors did not support the creation of a new MS-DRG. Therefore, for FY 2015, we did not propose to create a new MS-DRG to group cases reporting the percutaneous mitral valve repair (MitraClip®) procedure with transcatheter/endovascular cardiac valve replacement procedures. We invited public comments on our proposal.

Comment: One commenter recommended reassignment of procedure code 35.97 to a more appropriate MS-DRG. However, the commenter did not offer a specific recommendation as to which MS-DRG would be more appropriate.

Response: We appreciate the commenter's recommendation. However, as the commenter did not provide a specific MS-DRG to which procedure code 35.97 should be reassigned, we were unable to evaluate the recommendation. As we noted earlier, and as referenced in section II.G.1.b. of the preamble of this final rule, we encourage input from our stakeholders concerning the annual IPPS updates when that input is made available to us by December 7 of the year prior to the next annual proposed rule update. For example, to be considered for any updates or changes in FY 2016, comments and suggestions should be submitted by December 7, 2014.

Comment: One commenter urged CMS to reassign procedure code 35.97 from its current assignment in MS-DRGs 250 and 251 to a more appropriate MS-DRG that would better recognize case complexity as a major cardiovascular procedure with a permanent implant. This commenter specifically recommended the inclusion of transcatheter mitral valve repair (TMVR) within the newly proposed MS-DRGs 266 and 267, and to subsequently retitle these MS-DRGs, “Endovascular Transcatheter Valve Therapy with Implant.”

Response: As stated in the FY 2015 IPPS/LTCH PPS proposed rule, our analysis did not support including cases reporting procedure code 35.97 for percutaneous mitral valve repair procedures together with transcatheter/endovascular cardiac valve replacement procedures in a new MS-DRG. The average costs of the transcatheter/endovascular cardiac valve replacement procedures are significantly higher than the average costs of the percutaneous mitral valve repair procedures with implant ($53,802 compared to $30,286).

In addition, our clinical advisors did not support grouping a percutaneous valve repair procedure with transcatheter/endovascular valve replacement procedures. They do not believe that these procedures are clinically coherent or similar in terms of resource consumption because the MitraClip® technology identified by procedure code 35.97 is utilized for a percutaneous mitral valve repair, while the other technologies, identified by procedure codes 35.05 through 35.09, are utilized for transcatheter/endovascular cardiac valve replacements.

Comment: One commenter disagreed with the CMS analysis that transcatheter mitral valve repair (TMVR) is significantly different than transcatheter aortic valve replacement (TAVR). The commenter asserted that “unlike alternative open repair and replacement procedures, a heart valve prosthesis is being manipulated/modified from a Transcatheter approach; whether the prosthesis serves to `replace' or `repair' an existing valve is irrelevant in regards to resource consumption.” The commenter urged CMS to consider all transcatheter valve procedures equally with respect to DRG assignment.

Response: We disagree with the commenter that TMVR and TAVR are not significantly different. As explained in the FY 2015 IPPS/LTCH PPS proposed rule, our analysis of the claims data and the recommendation from our clinical advisors do not support treating TMVR and all transcatheter valve procedures equally with respect to MS-DRG assignment. As noted previously, the average costs of the transcatheter/endovascular cardiac valve replacement procedures are significantly higher than the average costs of the percutaneous mitral valve repair procedures with implant ($53,802 compared to $30,286).

After consideration of the public comments we received, we are finalizing our proposal to not create a new MS-DRG to group cases reporting the percutaneous mitral valve repair (MitraClip®) procedure with transcatheter/endovascular cardiac valve replacement procedures.

Endovascular Cardiac Valve Replacement

The similar but separate request relating to endovascular cardiac valve replacement procedures consisted of creating a new MS-DRG that would only include the various types of cardiac valve replacements performed by an endovascular or transcatheter technique. In other words, this request specifically did not include the MitraClip® technology (ICD-9-CM procedure code 35.97 (Percutaneous mitral valve repair with implant)) and only included the list of ICD-9-CM procedure codes that identify the various types of valve replacements performed by an endovascular or transcatheter technique (ICD-9-CM procedure codes 35.05 through 35.09) as described earlier in this section.

The human heart contains four major valves—the aortic, mitral, pulmonary, and tricuspid valves. These valves function to keep blood flowing through the heart. When conditions such as stenosis or insufficiency/regurgitation occur in one or more of these valves, valvular heart disease may result. Cardiac valve replacement surgery is performed in an effort to correct these diseased or damaged heart valves. The endovascular or transcatheter technique presents a viable option for high-risk patients who are not candidates for the traditional open surgical approach.

We reviewed the claims data from the December 2013 update of the FY 2013 MedPAR file for cases in MS-DRGs 216 through 221. Our findings are shown in the chart below. The data analysis shows that cardiac valve replacements performed by an endovascular or transcatheter technique represent a total of 7,287 of the cases in MS-DRGs 216 through 221, with an average length of stay of 8.1 days and higher average costs ($53,802 compared to $47,177) in comparison to all of the cases in MS-DRGs 216 through 221.

MS-DRG | Number of cases | Average length of stay | Average costs
MS-DRGs 216 through 221—Cases with procedure codes 35.05, 35.06, 35.07, 35.08 and 35.09 | 7,287 | 8.1 | $53,802
MS-DRGs 216 through 221—Cases without procedure codes 35.05, 35.06, 35.07, 35.08 and 35.09 | 52,601 | 10.1 | $47,177

Because the data appear to indicate support for the creation of a new base MS-DRG, based on our evaluation of resource consumption, patient characteristics, volume, and costs for the cardiac valve replacements performed by an endovascular or transcatheter technique as compared to the open surgical technique, we then applied our established criteria to determine whether these cases would meet the requirements to create subgroups. We use five criteria established in the FY 2008 IPPS final rule (72 FR 47169) to review requests involving the creation of a new CC or an MCC subgroup within a base MS-DRG. As outlined in the FY 2012 IPPS proposed rule (76 FR 25819), the original criteria were based on average charges but were later converted to average costs. In order to warrant creation of a CC or an MCC subgroup within a base MS-DRG, the subgroup must meet all of the following five criteria (an illustrative sketch of such a check follows the list):

  • A reduction in variance of costs of at least 3 percent.
  • At least 5 percent of the patients in the MS-DRG fall within the CC or the MCC subgroup.
  • At least 500 cases are in the CC or the MCC subgroup.
  • There is at least a 20-percent difference in average costs between subgroups.
  • There is a $2,000 difference in average costs between subgroups.
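
For illustration only, the following is a minimal sketch, in Python, of how the five criteria could be checked against per-case cost data for a proposed CC/MCC subgroup and the remaining cases of a base MS-DRG. The rule does not spell out the exact formulas for the variance-reduction or 20-percent calculations, so the computations flagged as assumptions in the code comments reflect one plausible reading (the 20-percent test is measured here against the higher-cost subgroup's average costs, consistent with the threshold figure discussed later in this section for MS-DRGs 483 and 484); the function name and structure are hypothetical.

from statistics import mean, pvariance

def check_severity_subgroup_criteria(costs_cc_mcc_subgroup, costs_remaining):
    # Per-discharge costs for the proposed CC/MCC subgroup and for the remaining
    # cases of the base MS-DRG (illustrative inputs, not CMS data structures).
    all_costs = costs_cc_mcc_subgroup + costs_remaining
    n_sub, n_all = len(costs_cc_mcc_subgroup), len(all_costs)
    avg_sub, avg_rest = mean(costs_cc_mcc_subgroup), mean(costs_remaining)
    cost_difference = abs(avg_sub - avg_rest)

    # Criterion 1 (assumed formula): variance of costs across all cases compared
    # with the case-weighted average of the within-subgroup variances.
    total_variance = pvariance(all_costs)
    within_variance = (n_sub * pvariance(costs_cc_mcc_subgroup)
                       + (n_all - n_sub) * pvariance(costs_remaining)) / n_all
    variance_reduction = (total_variance - within_variance) / total_variance

    return {
        "reduction in variance of costs of at least 3 percent": variance_reduction >= 0.03,
        "at least 5 percent of patients in the CC/MCC subgroup": n_sub / n_all >= 0.05,
        "at least 500 cases in the CC/MCC subgroup": n_sub >= 500,
        # Criterion 4 (assumed base): difference measured against the higher average costs.
        "at least a 20-percent difference in average costs": cost_difference >= 0.20 * max(avg_sub, avg_rest),
        "at least a $2,000 difference in average costs": cost_difference >= 2000,
    }

The sketch returns each criterion separately because the criteria are discussed individually in this section and all five must be met before a subgroup is warranted.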

In applying the five criteria, we found that the data support the creation of a new MS-DRG subdivided into two severity levels. We also consulted with our clinical advisors. Our clinical advisors stated that patients receiving endovascular cardiac valve replacements are significantly different from those patients who undergo an open chest cardiac valve replacement. They noted that patients receiving endovascular cardiac valve replacements are not eligible for open chest cardiac valve procedures because of a variety of health constraints. This highlights the fact that peri-operative complications and post-operative morbidity have significantly different profiles for open chest procedures compared with endovascular interventions. This is also substantiated by the different average lengths of stay demonstrated by the two cohorts. Our clinical advisors further noted that separately grouping these endovascular valve replacement procedures provides greater clinical cohesion for this subset of high-risk patients.

In the FY 2015 IPPS/LTCH PPS proposed rule, we proposed to create the following MS-DRGs for endovascular cardiac valve replacements:

  • Proposed new MS-DRG 266 (Endovascular Cardiac Valve Replacement with MCC); and
  • Proposed new MS-DRG 267 (Endovascular Cardiac Valve Replacement without MCC).

Proposed new MS-DRGs for endovascular cardiac valve replacement | Number of cases | Average length of stay | Average costs
Proposed New MS-DRG 266 with MCC | 3,516 | 10.6 | $61,891
Proposed New MS-DRG 267 without MCC | 3,771 | 5.7 | $46,259

We invited public comments on our proposal to create these new MS-DRGs for FY 2015.

Comment: Several commenters supported the proposal to create new MS-DRGs for endovascular cardiac valve replacement procedures. One commenter noted that “the endovascular or transcatheter approach presents a viable option for high-risk patients who are not candidates for the traditional open chest surgical approach. The proposed MS-DRGs better align the more extensive cardiac valve procedures based on clinical coherence and similar resource costs.” Another commenter stated that, by establishing these new MS-DRGs, “CMS will continue to be able to collect the necessary information that will help assure appropriate payment in the future as these technologies evolve.” Other commenters supported creation of the new MS-DRGs, noting it was reasonable given the data and information provided. Another commenter applauded CMS for proposing the two new MS-DRGs, noting that “this decision will allow patients, particularly women, to have increased access to innovative therapies that will ease their suffering from the debilitating effects of severe aortic stenosis.”

Response: We appreciate the commenters' support.

Comment: One commenter commended CMS for proposing new MS-DRGs to identify endovascular/transcatheter valve procedures. However, the commenter suggested that CMS reconsider the title of the proposed MS-DRGs. The commenter noted that the accepted nomenclature is “transcatheter” and not “endovascular”.

Response: We acknowledge that many individuals prefer the use of the term “transcatheter”, such as occurs in the frequently used acronym TAVR (transcatheter aortic valve replacement). However, we note that this nomenclature is by no means universal. “Endovascular” is also used to describe these procedures. The current ICD-9-CM procedure code for TAVR, for example, is 35.05 (Endovascular replacement of aortic valve). Recognizing that universal agreement on medical nomenclature is still an unachievable goal at the present time, we have elected to retain the term “endovascular” to maintain consistency with the current ICD-9-CM terminology.

After consideration of the public comments we received, we are finalizing our proposal to create new MS-DRG 266 (Endovascular Cardiac Valve Replacement with MCC) and MS-DRG 267 (Endovascular Cardiac Valve Replacement without MCC).

d. Abdominal Aorta Graft

We received a request that we change the MS-DRG assignment for procedure code 39.71 (Endovascular implantation of other graft in abdominal aorta), which is assigned to MS-DRGs 237 and 238 (Major Cardiovascular Procedures with MCC and without MCC, respectively). The requestor asked that we reassign procedure code 39.71 to MS-DRGs 228, 229, and 230 (Other Cardiothoracic Procedures with MCC, with CC, and without CC/MCC, respectively). The requestor stated that the average cost of endovascular abdominal aorta graft implantation cases is significantly higher than other cases in MS-DRGs 237 and 238. The requestor stated that the average cost of endovascular abdominal aorta graft implantation cases is closer to those in MS-DRGs 228, 229, and 230.

The requestor stated that the goal of endovascular repair for abdominal aneurysm is to isolate the diseased, aneurysmal portion of the aorta and common iliac arteries from continued exposure to systemic blood pressure. The procedure involves the delivery and deployment of endovascular prostheses, also referred to as grafts, as required to isolate the aneurysm above and below the extent of the disease. The requestor stated that this significantly reduces patient morbidity and death caused by leakage and/or sudden rupture of an untreated aneurysm.

We examined claims data from the December 2013 update of the FY 2013 MedPAR file for cases of endovascular abdominal aorta graft implantations. The following table shows our findings.

MS-DRG | Number of cases | Average length of stay | Average costs
MS-DRG 237—All cases | 17,813 | 9.66 | $35,642
MS-DRG 237—Cases with procedure code 39.71 | 2,093 | 8.30 | $44,898
MS-DRG 238—All cases | 33,644 | 3.73 | $24,511
MS-DRG 238—Cases with procedure code 39.71 | 15,483 | 2.30 | $28,484
MS-DRG 228—All cases | 1,543 | 13.48 | $52,315
MS-DRG 229—All cases | 2,003 | 7.47 | $32,070
MS-DRG 230—All cases | 493 | 4.95 | $29,281

As this table shows, endovascular abdominal aorta graft implantation cases have higher average costs and shorter lengths of stay than all cases within MS-DRGs 237 and 238. The average cost for endovascular abdominal aorta graft implantation cases in MS-DRG 237 is $9,256 greater than that for all cases in MS-DRG 237 ($44,898 compared to $35,642). The average cost for endovascular abdominal aorta graft implantation cases in MS-DRG 238 is $3,973 higher than that for all cases in MS-DRG 238 ($28,484 compared to $24,511). Cases in MS-DRG 228 have average costs that are $7,417 higher than the endovascular abdominal aorta graft implantation cases in MS-DRG 237 ($52,315 compared to $44,898). MS-DRG 228 and MS-DRG 237 both contain cases with MCCs. Cases in MS-DRG 229, which contain a CC, have average costs that are $3,586 higher than average costs of the endovascular abdominal aorta graft implantation cases in MS-DRG 238, which do not contain an MCC ($32,070 compared to $28,484). Cases in MS-DRG 230, which have neither an MCC nor a CC, have average costs that are $797 higher than the endovascular abdominal aorta graft implantation cases in MS-DRG 238 ($29,281 compared to $28,484). While the average costs were higher for endovascular abdominal aorta graft implantation cases compared to all cases within MS-DRGs 237 and 238, each MS-DRG has some cases that are higher and some cases that are lower than the average costs for the entire MS-DRG. MS-DRGs were developed to capture cases that are clinically consistent with similar overall average resource requirements. This results in some cases within an MS-DRG having costs that are higher than the overall average and other cases having costs that are lower than the overall average. This may be due to specific types of cases included within the MS-DRGs or to the fact that some cases will simply require additional resources on a specific admission. However, taken as a whole, the hospital will be paid an appropriate amount for the group of cases that are assigned to the MS-DRG. We believe the endovascular abdominal aorta graft implantation cases are appropriately grouped with other procedures within MS-DRGs 237 and 238.

Our clinical advisors reviewed this issue and determined that the endovascular abdominal aorta graft implantation cases are appropriately classified within MS-DRGs 237 and 238 because they are clinically similar to the other procedures in MS-DRGs 237 and 238, which include other procedures on the aorta. While the endovascular abdominal aorta graft implantation cases have higher average costs than the average for all cases within MS-DRGs 237 and 238, our clinical advisors do not believe this justifies moving the cases to MS-DRGs 228, 229 and 230, which involve a different set of cardiothoracic surgeries.

As we stated in the FY 2015 IPPS/LTCH PPS proposed rule, based on the results of examination of the claims data and the recommendations of our clinical advisors, we did not believe that proposing to reclassify endovascular abdominal aorta graft implantation cases from MS-DRGs 237 and 238 was warranted. We proposed to maintain the current MS-DRG assignments for endovascular abdominal aorta graft implantation cases. We invited public comments on our proposal.

Comment: A number of commenters supported CMS' proposal to maintain the current MS-DRG assignments for endovascular abdominal aorta graft implantation cases. The commenters stated that the proposal was reasonable given the data and information provided. One commenter disagreed with the proposal and stated that endovascular abdominal aorta graft implantation cases should be reassigned to MS-DRGs 228, 229, and 230. The commenter stated that neither MS-DRGs 237 and 238 nor MS-DRGs 228, 229, and 230 have absolute clinical coherence and that there is a mix of procedures in both sets of MS-DRGs. The commenter also expressed concern that CMS was prioritizing clinical coherence over total resource cost in deciding not to approve this request to assign procedure code 39.71 to MS-DRGs 228, 229, and 230. The commenter stated that, if CMS is concerned about the perception regarding clinical coherence of the MS-DRG assignment for procedures represented by code 39.71, CMS should change the titles for these five MS-DRGs to accommodate the evolution of these procedures while also allowing for new indications of various types of grafts in the aorta and its branches. The commenter did not suggest specific new titles for MS-DRGs 228, 229, 230, 237, and 238.

Response: We appreciate the commenters' support for our proposal to maintain the current assignments for endovascular abdominal aorta graft implantation cases in MS-DRGs 237 and 238. We are not accepting the commenter's suggestion that we modify the titles of MS-DRGs 228, 229, 230, 237, and 238 in order to justify the reassignment of abdominal aorta graft procedures to MS-DRGs 228, 229, and 230. Our clinical advisors reviewed this issue and disagree with the commenter's statement that CMS puts too high a priority on the clinical coherence of the MS-DRGs. MS-DRGs were developed based on clinical similarities of groups of medical and surgical patients. We also consider the average costs of these patients in evaluating the need to make modifications to the MS-DRGs. However, for the reasons described previously, we do not believe that the higher average costs for the endovascular abdominal aorta graft implantation cases as compared to the average for all cases within MS-DRGs 237 and 238 warrant reassigning these cases to MS-DRGs 228, 229, and 230. We will continue to evaluate the need to make updates to the MS-DRGs to better capture procedures of the aorta and its branches. We welcome any specific recommendations for refinements to better capture changes in medical treatment. Any requests for MS-DRG updates must be received by December 7, 2014, in order to be considered for the FY 2016 proposed rule.

After consideration of the public comments we received, we are finalizing our proposal to maintain the current assignments for endovascular abdominal aorta graft implantation cases in MS-DRGs 237 and 238.

5. MDC 8 (Diseases and Disorders of the Musculoskeletal System and Connective Tissue)

a. Shoulder Replacement Procedures

We received a request to change the MS-DRG assignment for shoulder replacement procedures. This request involved the following two procedure codes:

  • 81.88 (Reverse total shoulder replacement); and
  • 81.97 (Revision of joint replacement of upper extremity).

With respect to procedure code 81.88, the requestor asked that reverse total shoulder replacements be reassigned from MS-DRGs 483 and 484 (Major Joint/Limb Reattachment Procedure of Upper Extremities with CC/MCC and without CC/MCC, respectively) to MS-DRG 483 only. The reassignment of procedure code 81.88 from MS-DRGs 483 and 484 was discussed previously in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50534 through 50536). The result of reassigning reverse shoulder replacements from MS-DRGs 483 and 484 to MS-DRG 483 only would be that this procedure would be assigned to MS-DRG 483 whether or not the case had a CC or an MCC. The requestor stated that reverse shoulder replacement procedures are more clinically cohesive with higher severity MS-DRGs due to the complexity and resource consumption of these procedures. We refer readers to the FY 2014 IPPS/LTCH PPS final rule (78 FR 50534 through 50536) for a discussion of the reverse total shoulder replacement.

The requestor also recommended that we reassign what it described as another shoulder procedure involving procedure code 81.97, which is assigned to MS-DRGs 515, 516, and 517 (Other Musculoskeletal System and Connective Tissue O.R. Procedures with MCC, with CC, and without CC/MCC, respectively), to MS-DRG 483. We point out that MS-DRG 483 contains upper joint replacements, including shoulder replacements. MS-DRG 483 does not contain any joint revision procedures. Similar to the request for reassignment of procedure code 81.88, this would mean that procedure code 81.97 would be assigned to MS-DRG 483 whether or not the case had a CC or an MCC. If CMS did not support this recommendation for moving procedure code 81.97 to MS-DRG 483, the requestor recommended an alternative reassignment to MS-DRG 515 (Other Musculoskeletal System and Connective Tissue O.R. procedures with MCC) even if the case had no MCC.

We point out that, while the requestor refers to procedure code 81.97 as a shoulder procedure, the code description actually includes revisions of joint replacements of a variety of upper extremity joints, including those in the elbow, hand, shoulder, and wrist.

As stated earlier, reverse shoulder replacements are assigned to MS-DRGs 483 and 484. Revisions of upper joint replacements are assigned to MS-DRGs 515, 516, and 517. We examined claims data from the December 2013 update of the FY 2013 MedPAR file for MS-DRGs 483 and 484. The following table shows our findings of cases of reverse shoulder replacement.

MS-DRG | Number of cases | Average length of stay | Average costs
MS-DRG 483—All cases | 14,220 | 3.20 | $18,807
MS-DRG 483—Cases with procedure code 81.88 | 7,086 | 3.19 | $20,699
MS-DRG 484—All cases | 23,183 | 1.95 | $16,354
MS-DRG 484—Cases with procedure code 81.88 | 9,633 | 2.03 | $18,719
Proposed Revised MS-DRG 483 with all severity levels included | 37,403 | 2.4 | $17,287

As the above table shows, MS-DRG 484 reverse shoulder replacement cases have similar average costs to those in MS-DRG 483 ($18,719 for reverse shoulder replacements in MS-DRG 484 compared to $18,807 for all cases in MS-DRG 483). However, in reviewing the data, we observed that the claims data no longer support two severity levels for MS-DRGs 483 and 484.

We use the five criteria established in FY 2008 (72 FR 47169) to review requests involving the creation of a new CC or MCC subgroup within a base MS-DRG. As outlined in the FY 2012 IPPS/LTCH PPS proposed rule (76 FR 25819), the original criteria were based on average charges but were later converted to average costs. In order to warrant creation of a CC or an MCC subgroup within a base MS-DRG, the subgroup must meet all of the following five criteria:

  • A reduction in variance of costs of at least 3 percent.
  • At least 5 percent of the patients in the MS-DRG fall within the CC or MCC subgroup.
  • At least 500 cases are in the CC or MCC subgroup.
  • There is at least a 20-percent difference in average costs between subgroups.
  • There is a $2,000 difference in average costs between subgroups.

We found through our examination of the claims data from the December 2013 update of the FY 2013 MedPAR file that the two severity subgroups of MS-DRGs 483 and 484 no longer meet the fourth criterion of at least a 20-percent difference in average costs between subgroups. We found that there is a $2,453 difference in average costs between MS-DRG 483 and MS-DRG 484. The difference in average costs would need to be $3,761 to meet the fourth criterion. Therefore, our claims data support collapsing MS-DRGs 483 and 484 into a single MS-DRG. Our clinical advisors reviewed this issue and agreed that there is no longer enough difference between the two severity levels to justify separate severity subgroups for MS-DRGs 483 and 484, which include a variety of upper joint replacements. Therefore, our clinical advisors supported our recommendation to collapse MS-DRGs 483 and 484 into a single MS-DRG.
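
As a usage illustration of the fourth criterion as applied here, the following short calculation (again assuming, as in the sketch earlier in this section, that the 20-percent test is measured against the higher-cost subgroup's average costs) reproduces the $2,453 difference and the approximately $3,761 threshold from the MS-DRG 483 and 484 average costs shown in the table above:

avg_483, avg_484 = 18807, 16354                       # average costs from the table above
observed_difference = avg_483 - avg_484               # 2,453
required_difference = 0.20 * max(avg_483, avg_484)    # 3,761.4 under the assumed 20-percent base
meets_fourth_criterion = observed_difference >= required_difference  # False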

In the FY 2015 IPPS/LTCH PPS proposed rule, based on the results of examination of the claims data and the advice of our clinical advisors, we proposed to collapse MS-DRGs 483 and 484 into a single MS-DRG by deleting MS-DRG 484 and revising the title of MS-DRG 483 to read “Major Joint/Limb Reattachment Procedure of Upper Extremities”.

The following table shows our findings of cases of revisions of upper joint replacement from the December 2013 update of the FY 2013 MedPAR file.

MS-DRG | Number of cases | Average length of stay | Average costs
MS-DRG 515—All cases | 3,407 | 9.22 | $22,191
MS-DRG 515—Cases with procedure code 81.97 | 88 | 5.66 | $22,085
MS-DRG 516—All cases | 8,502 | 5.34 | $14,356
MS-DRG 516—Cases with procedure code 81.97 | 799 | 2.84 | $18,214
MS-DRG 517—All cases | 5,794 | 3.28 | $12,172
MS-DRG 517—Cases with procedure code 81.97 | 1,256 | 2.07 | $15,920
MS-DRG 483—All cases | 14,220 | 3.20 | $18,807

Cases identified by code 81.97 in MS-DRGs 515, 516, and 517 have lower average costs and shorter lengths of stay than all cases in MS-DRG 515. The average costs of cases in MS-DRG 515 are $3,977 higher than the average costs of the cases with procedure code 81.97 in MS-DRG 516 ($22,191 compared to $18,214). The average costs of cases in MS-DRG 515 are $6,271 higher than cases with procedure code 81.97 in MS-DRG 517 ($22,191 compared to $15,920).

The table above shows that the average costs of cases in MS-DRG 483 are $3,278 lower than the average costs of cases with procedure code 81.97 in MS-DRG 515 ($18,807 compared to $22,085). The average costs of cases in MS-DRG 483 are $593 higher than the average costs of cases with procedure code 81.97 in MS-DRG 516 ($18,807 compared to $18,214). The average costs of cases in MS-DRG 483 are $2,887 higher than the average costs of cases with procedure code 81.97 in MS-DRG 517 ($18,807 compared to $15,920).

The claims data did not support moving all procedure code 81.97 cases to MS-DRG 515 or MS-DRG 483, whether or not there is a CC or an MCC. We also pointed out once again that procedure code 81.97 is a nonspecific code that captures revisions to not only the shoulder, but also a variety of upper extremity joints including those in the elbow, hand, shoulder, and wrist. Therefore, we have no way of determining how many cases reporting procedure code 81.97 were actually shoulder procedures as opposed to procedures on the elbow, hand, or wrist.

Our clinical advisors reviewed this issue and determined that the revisions of upper joint replacement procedures are appropriately classified within MS-DRGs 515, 516, and 517, which include other joint revision procedures. They did not support moving revisions of upper joint replacement procedures to MS-DRG 515, whether or not there is an MCC. They supported the current classification, which bases the severity level on the presence of a CC or an MCC. They also did not support moving revisions of upper joint replacement procedures to MS-DRG 483, whether or not there is a CC or an MCC, because these revisions are not joint replacements. Based on the results of our examination and the advice of our clinical advisors, in the FY 2015 IPPS/LTCH PPS proposed rule, we did not propose moving revisions of upper joint replacement procedures to MS-DRG 515 or MS-DRG 483, whether or not there is a CC or an MCC.

In summation, we proposed to collapse MS-DRGs 483 and 484 into a single MS-DRG by deleting MS-DRG 484 and revising the title of MS-DRG 483 to read “Major Joint/Limb Reattachment Procedure of Upper Extremities”. We proposed to maintain the current MS-DRG assignments for revisions of upper joint replacement procedures in MS-DRGs 515, 516, and 517. We invited public comments on our proposals.

Comment: A number of commenters supported the proposal to collapse MS-DRGs 483 and 484 into a single MS-DRG by deleting MS-DRG 484 and revising the title of MS-DRG 483 to read “Major Joint/Limb Reattachment Procedure of Upper Extremities.” The commenters stated that the proposal was reasonable given the data and information provided.

One commenter stated that collapsing the two MS-DRGs is supported by claims data indicating little cost difference between cases in the current two severity levels. Several commenters stated that the new, single MS-DRG represented clinically cohesive procedures with similar complexity and resource consumption.

Response: We appreciate the commenters' support for our proposal to collapse MS-DRGs 483 and 484 into a single MS-DRG by deleting MS-DRG 484 and revising the title of MS-DRG 483 to read “Major Joint/Limb Reattachment Procedure of Upper Extremities”.

After consideration of the public comments we received, we are adopting as final, without modification, our proposal to collapse MS-DRGs 483 and 484 into a single MS-DRG by deleting MS-DRG 484 and revising the title of MS-DRG 483 to read “Major Joint/Limb Reattachment Procedure of Upper Extremities”.

Comment: A number of commenters supported the proposal to maintain the MS-DRG assignment for code 81.97 in MS-DRGs 515, 516, and 517. The commenters stated that the recommendation was reasonable given the data and information provided. One commenter disagreed with the proposal and stated that code 81.97 would be more accurately classified in MS-DRG 483 (Major Joint/Limb Reattachment Procedure of Upper Extremities with CC/MCC) because MS-DRG 483 includes upper extremity procedures.

Response: We appreciate the commenters' support for our proposal to maintain the current MS-DRG assignment for code 81.97 in MS-DRGs 515, 516, and 517. We disagree with the commenter that code 81.97 is similar to other procedures currently assigned to MS-DRG 483. MS-DRG 483 contains replacements, not revisions, of the wrist, shoulder, and elbow as well as reattachments of the forearm. Revision of the joint could include a variety of procedures to joints of the upper extremity. Procedure code 81.97 is a nonspecific code that captures revisions to not only the shoulder, but also a variety of upper extremity joints including those in the elbow, hand, shoulder, and wrist. Therefore, we have no way of determining how many cases reporting procedure code 81.97 were actually shoulder procedures as opposed to procedures on the elbow, hand, or wrist.

Our clinical advisors reviewed this issue and continue to advise that code 81.97 not be reassigned to MS-DRG 483 because the procedure is neither a replacement nor a reattachment procedure, as are the current procedures within MS-DRG 483. In addition, the code captures a variety of joint revisions of the upper extremities and is not clinically similar to the replacement and reattachment procedures in MS-DRG 483. Our clinical advisors recommend that code 81.97 continue to be assigned to MS-DRGs 515, 516, and 517.

After consideration of the public comments we received, we are finalizing our proposal to maintain the current assignment of code 81.97 in MS-DRGs 515, 516, and 517.

b. Ankle Replacement Procedures

We received a request to change the MS-DRG assignment for two ankle replacement procedures. The request involved the following two procedure codes:

  • 81.56 (Total ankle replacement); and
  • 81.59 (Revision of joint replacement of lower extremity, not elsewhere classified).

The reassignment of procedure code 81.56 from MS-DRGs 469 and 470 (Major Joint Replacement or Reattachment of Lower Extremity with MCC and without MCC, respectively) to a new MS-DRG or, alternatively, to MS-DRG 469 was discussed in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50536 through 50537). We refer readers to this final rule for a discussion of ankle replacement procedures. The requestor asked that we again evaluate reassigning total ankle replacement procedures. The requestor also asked that we reassign what it referred to as another ankle replacement revision procedure captured by procedure code 81.59 (Revision of joint replacement of lower extremity, not elsewhere classified), which is assigned to MS-DRGs 515, 516, and 517 (Other Musculoskeletal System and Connective Tissue O.R. Procedures with MCC, with CC, and without CC/MCC, respectively).

The requestor asked that we reassign procedure code 81.56 from MS-DRGs 469 and 470 to MS-DRG 483 (Major Joint/Limb Reattachment Procedure of Upper Extremities with CC/MCC) and rename the MS-DRG to better capture the additional lower extremity cases. The requestor stated that the result would be assignment of lower joint procedures to an MS-DRG that currently captures only upper extremity cases and assignment to the highest severity level even if the case did not have a CC or an MCC. If CMS did not find this acceptable, the requestor made an alternative recommendation of assigning procedure code 81.56 to MS-DRG 469 and renaming the MS-DRG to better capture the additional cases. Cases would be assigned to the highest severity level whether or not the case had an MCC.

The requestor also recommended that procedure code 81.59, which is assigned to MS-DRGs 515, 516, and 517, be reassigned to MS-DRG 483 and that the MS-DRG be given a new title to better capture the additional lower extremity cases. The requestor stated that the result would be assignment of lower joint procedures to an MS-DRG that currently captures only upper extremity cases and assignment to the highest severity level even if the patient did not have a CC or an MCC. If CMS did not support this recommendation, the requestor suggested two additional recommendations. One was to move procedure code 81.59 to MS-DRG 515 even when the case had no MCC. The other was to move procedure code 81.59 to MS-DRG 469, whether or not the case had an MCC.

We point out that while the requestor refers to procedure code 81.59 as a revision of an ankle replacement, the code actually includes revisions of joint replacements of a variety of lower extremity joints including the ankle, foot, and toe.

The following table shows the number of total ankle replacement cases, average length of stay, and average costs for procedure code 81.56 in MS-DRGs 469 and 470 found in claims data from the December 2013 update of the FY 2013 MedPAR file compared to all cases within MS-DRGs 469, 470, and 483.

MS-DRG | Number of cases | Average length of stay | Average costs
MS-DRG 469—All cases | 25,916 | 7.22 | $22,548
MS-DRG 469—Cases with procedure code 81.56 | 32 | 6.19 | $27,419
MS-DRG 470—All cases | 406,344 | 3.25 | $15,119
MS-DRG 470—Cases with procedure code 81.56 | 1,379 | 2.13 | $19,332
MS-DRG 483—All cases | 14,220 | 3.20 | $18,807

In summary, the requestor asked us to reassign procedure code 81.56 in MS-DRGs 469 and 470 to one of the following two options: MS-DRG 483 (highest severity level); or MS-DRG 469 (highest severity level).

As the table for total ankle replacement above shows, the average cost of cases with procedure code 81.56 in MS-DRG 469 is $27,419 and $19,332 in MS-DRG 470. This compares with the average costs of all cases in MS-DRGs 469 and 470 of $22,548 and $15,119, respectively. While the average cost of cases reporting procedure code 81.56 in MS-DRG 469 is $4,871 higher than the average cost for all cases in MS-DRG 469, we point out that there were only 32 cases. The relatively small number of cases may have been impacted by other factors such as complications or comorbidities. Several expensive cases could impact the average costs for a very small number of patients. The average cost of cases reporting procedure code 81.56 in MS-DRG 470 is $4,213 higher than the average cost for all cases in MS-DRG 470. While the average costs are higher, within all MS-DRGs, some cases have higher and some cases have lower average costs. MS-DRGs are groups of clinically similar cases that have similar overall costs. Within a group of cases, one would expect that some cases have costs that are higher than the overall average and some cases have costs that are lower than the overall average.

MS-DRG 469 ankle replacement cases have average costs that are $8,612 higher than the average costs of all cases in MS-DRG 483 ($27,419 compared to $18,807). Moving these cases (procedure code 81.56) to MS-DRG 483 would result in payment below average costs compared to the current MS-DRG assignment in MS-DRG 469. Furthermore, as noted earlier, moving total ankle replacement cases to MS-DRG 483 would result in a lower extremity procedure being added to what is now an upper extremity MS-DRG. This would significantly disrupt the clinical cohesion of MS-DRG 483.
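The dollar differences cited in the two preceding paragraphs can be reproduced directly from the table. The short Python sketch below is offered only as an illustrative aid; the dictionary values restate the figures from the table above and are not a separate data source.

```python
# Illustrative sketch only: restate the table figures above and recompute the
# dollar differences cited in the discussion of total ankle replacements.
avg_costs = {
    ("469", "all cases"): 22548,
    ("469", "code 81.56 cases"): 27419,
    ("470", "all cases"): 15119,
    ("470", "code 81.56 cases"): 19332,
    ("483", "all cases"): 18807,
}

# 81.56 cases versus all cases within the same MS-DRG.
diff_469 = avg_costs[("469", "code 81.56 cases")] - avg_costs[("469", "all cases")]
diff_470 = avg_costs[("470", "code 81.56 cases")] - avg_costs[("470", "all cases")]

# MS-DRG 469 ankle replacement cases versus all MS-DRG 483 cases.
diff_483 = avg_costs[("469", "code 81.56 cases")] - avg_costs[("483", "all cases")]

print(f"MS-DRG 469, 81.56 cases vs. all cases: ${diff_469:,}")   # $4,871
print(f"MS-DRG 470, 81.56 cases vs. all cases: ${diff_470:,}")   # $4,213
print(f"MS-DRG 469 81.56 cases vs. MS-DRG 483: ${diff_483:,}")   # $8,612
```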

The average costs of all cases in MS-DRG 469 are $3,216 higher than the average costs of those cases with procedure code 81.56 in MS-DRG 470 ($22,548 compared to $19,332). The data did not support moving procedure code 81.56 cases to MS-DRG 483 or 469 because it would not result in payments that more accurately reflect their current average costs. Our clinical advisors reviewed this issue and determined that the ankle replacement cases are appropriately classified within MS-DRGs 469 and 470 with the severity level leading to the MS-DRG assignment. They did not support moving these cases to MS-DRG 483 because ankle replacements, which are lower joint procedures, are not clinically similar to upper joint replacement procedures. Based on the results of examination of the claims data, the issue of clinical cohesion, and the recommendations from our clinical advisors, in the FY 2015 IPPS/LTCH PPS proposed rule, we did not propose to move total ankle procedures to MS-DRG 483 or MS-DRG 469 when there is no MCC. We proposed to maintain the current MS-DRG assignments for ankle replacement cases. We invited public comments on our proposal.

Comment: A number of commenters supported the proposal to maintain the current MS-DRG assignments for ankle replacement cases. The commenters stated that the proposal was reasonable given the data and information provided. Several other commenters urged CMS to reconsider its decision and to create a new MS-DRG for total ankle replacements for FY 2015 that is more appropriate both in terms of resource utilization and clinical cohesiveness, and reassign ICD-9-CM procedure code 81.56 to the new MS-DRG. The commenters stated that, despite evidence that the current Medicare assignment results in payments to hospitals below the average costs for total ankle replacement procedures, and the greater clinical complexity of total ankle replacements relative to other procedures that map to these same MS-DRGs, CMS proposed to maintain the current MS-DRG assignment for total ankle replacement procedures. The commenters stated that total ankle replacement is a complex surgical procedure involving the replacement of the damaged parts of three bones (talus, tibia, and fibula) that make up the articulations of the ankle, as compared to two bones in most other total joint replacement procedures, including hips and knees. The commenters stated that the resources involved with total ankle replacement procedures are not comparable to other procedures in the major joint MS-DRG and that failure to establish a new MS-DRG that more appropriately reflects the higher cost will likely compromise patient access to this procedure.

One commenter acknowledged that there is a relatively small volume of total ankle replacement procedures compared to total hip and total knee replacements. However, the commenter suggested that this imbalance in case volume of total ankle replacements compared to total hip and knee replacements dampens the influence of actual hospital cost data for the total ankle replacements. The commenter recommended that all total ankle replacements be assigned to MS-DRG 469 even if the case does not have an MCC. This commenter acknowledged that the average cost of cases with procedure code 81.56 in MS-DRG 470 is $19,332 compared to the average cost of $22,548 for all cases in MS-DRG 469. However, the commenter suggested that moving all total ankle replacements to MS-DRG 469 was more appropriate than having cases assigned to MS-DRGs 469 and 470 based on the presence of an MCC. The commenter also acknowledged CMS' statement that under the MS-DRG system in general, some cases will have average costs higher than the overall average costs for the MS-DRG, while other cases will have lower average costs. However, the commenter stated that this was an insufficient rationale to apply to total ankle replacements. The commenter disagreed with the determination of the CMS clinical advisors that ankle replacement cases are appropriately classified within MS-DRGs 469 and 470, based on severity level. The commenter stated that total ankle replacement is a complicated surgery that involves the replacement of the damaged parts of the three bones that make up the ankle joint, as compared to two bones in hip and knee replacement procedures. The commenter stated that this surgery requires a specialized skill set, operative technique, and level of operating room resource utilization that is vastly dissimilar from that of total hip and total knee replacements. The commenter recommended that CMS create a new MS-DRG for total ankle replacements or move all total ankle replacements to MS-DRG 469.

Response: We appreciate the commenters' support for our proposal to maintain the current MS-DRG assignment for total ankle replacements. We are not accepting the commenter's recommendation to create a new MS-DRG for total ankle replacements or to move all cases to MS-DRG 469. We point out that there were only 1,411 total ankle replacements with 32 cases in MS-DRG 469 and 1,379 cases in MS-DRG 470. Creating a new MS-DRG for this single procedure would not be appropriate. MS-DRGs were created to provide payment to hospitals for groups of clinically similar conditions and procedures. MS-DRGs were not created to provide payment for each single procedure. MS-DRGs 469 and 470 contain replacement and reattachment procedures of the lower extremity, including those of the hip, knee, ankle, foot, lower leg, and thigh. Within each MS-DRG, there will be cases with costs higher than the average costs and others with costs below the average costs. Basing a new MS-DRG on a small number of cases could lead to distortions in the relative payment weights for the MS-DRG because several expensive cases could impact the overall relative payment weight. Having larger clinically cohesive groups within an MS-DRG provides greater stability for annual updates to the relative payment weights. We also point out that combining total ankle replacements into a single new MS-DRG would result in the same payment for cases with an MCC as those without an MCC. As indicated above, total ankle replacements with MCCs have average costs of $27,419 and those without MCCs have average costs of $19,332. Combining all total ankle replacements into a single, newly created MS-DRG would reduce the payment accuracy of cases with different severity levels.

We also disagree with the recommendation to move all total ankle replacements to MS-DRG 469. As stated earlier, total ankle replacements with MCCs have average costs of $27,419 and those without MCCs have average costs of $19,332. The average cost of all cases in MS-DRG 469 (which includes cases with MCCs) is $22,548. We point out again that, under the MS-DRGs, some cases will have average costs higher than the overall average costs for the MS-DRG while other cases will have lower average costs. The total ankle replacements are appropriately assigned to MS-DRGs 469 and 470 based on the presence of an MCC.

Our clinical advisors reviewed the public comments and clinical data and continue to support maintaining the current MS-DRG assignment for total ankle replacements. They advised that total ankle replacements are appropriately assigned to MS-DRGs 469 and 470 along with other major joint replacement and reattachment procedures of the lower extremities because they are all replacement and reattachment procedures of the lower extremities. Our clinical advisors noted that, while they consider average cost as one element of the decision, they expect the average cost of any subset to be different from the average cost of the MS-DRG, as that is inherent in a system of averages. They note that average length of stay, another metric of resource usage, is lower than the MS-DRG average for this subgroup. Even more importantly, they further noted that leaving these procedures in an MS-DRG with other lower extremity procedures promotes greater clinical consistency than could be achieved by moving the ankle procedures into an upper extremity MS-DRG. They noted that, for the inpatient prospective payment system, clinical consistency includes not just technical considerations of the surgery or device costs but also consideration of pre- and post-operative patient care needs, medications, and care for common comorbid conditions, among other factors. Finally, our clinical advisors also pointed out that creating a new MS-DRG for total ankle replacements would result in combining cases with an average length of stay of 6.19 days for cases with an MCC and 2.13 days for cases without an MCC. The cases are more appropriately assigned to MS-DRGs 469 and 470 with the two severity levels. Our clinical advisors do not support creating a new MS-DRG that would contain only total ankle replacements.

After consideration of the public comments we received, we are finalizing our proposal to maintain the current MS-DRG assignment for total ankle replacements in MS-DRGs 469 and 470.

The following table shows our findings from examination of the claims data from the December 2013 update of the FY 2013 MedPAR file for the number of cases reporting procedure code 81.59 in MS-DRGs 515, 516, and 517 (revision of joint replacement of lower extremity) and their average length of stay and average costs as compared to all cases within MS-DRGs 515, 516, and 517 (where procedure code 81.59 is currently assigned), as well as data for MS-DRGs 469 and 483.

MS-DRG | Number of cases | Average length of stay | Average costs
MS-DRG 515—All cases | 3,407 | 9.22 | $22,191
MS-DRG 515—Cases with procedure code 81.59 | 5 | 6.00 | $16,988
MS-DRG 516—All cases | 8,502 | 5.34 | $14,356
MS-DRG 516—Cases with procedure code 81.59 | 16 | 3.00 | $16,998
MS-DRG 517—All cases | 5,794 | 3.28 | $12,172
MS-DRG 517—Cases with procedure code 81.59 | 40 | 1.80 | $13,704
MS-DRG 483—All cases | 14,220 | 3.20 | $18,807
MS-DRG 469—All cases | 25,916 | 7.22 | $22,548

The requestor asked that all cases with procedure code 81.59 in MS-DRGs 515, 516, and 517 be assigned to one of the following three choices:

  • MS-DRG 483 (highest severity level);
  • MS-DRG 515 (highest severity level) whether or not there is an MCC; or
  • MS-DRG 469 (highest severity level).

Our review of data from the above revision of joint replacement of lower extremity table shows that cases in MS-DRG 483 have average costs that are $1,819 higher than the average costs of cases with procedure code 81.59 in MS-DRG 515; $1,809 greater than those in MS-DRG 516; and $5,103 greater than those in MS-DRG 517 ($18,807 compared to $16,988; $18,807 compared to $16,998; and $18,807 compared to $13,704, respectively). As mentioned earlier, MS-DRG 483 is currently composed of only upper extremity procedures. Moving lower extremity procedures into this MS-DRG would disrupt the clinical cohesiveness of MS-DRG 483.

The average costs of all cases in MS-DRG 469 are $22,548, compared to average costs of $16,988, $16,998, and $13,704 for procedure code 81.59 cases in MS-DRGs 515, 516, and 517, respectively. The data did not support moving all procedure code 81.59 cases to MS-DRG 469 even when there is no MCC. We also point out that moving cases with procedure code 81.59 to MS-DRG 469 would disrupt the clinical cohesiveness of MS-DRG 469, which currently captures major joint replacement or reattachment procedures of the lower extremity. Procedure code 81.59 includes revisions of joint replacements of a variety of lower extremity joints including the ankle, foot, and toe. This nonspecific code would not be considered a major joint procedure. The code captures revisions of an ankle replacement as well as a more minor revision of the toe.

Our clinical advisors reviewed this issue and determined that the revision of joint replacement of lower extremity cases are appropriately classified within MS-DRGs 515, 516, and 517, where revisions of other joint replacements are captured. They supported the current severity levels in MS-DRGs 515, 516, and 517, which allow the presence of a CC or an MCC to determine the severity level assignment. They did not support moving these cases to MS-DRG 483, which applies to upper extremity procedures, because those procedures are not clinically consistent with revisions of lower extremity joint procedures. They also did not support moving these cases to MS-DRG 469 when there is no MCC because these procedures are not joint replacement procedures. Based on the findings of our examination of the claims data, the issue of clinical cohesion, and the recommendations from our clinical advisors, in the FY 2015 IPPS/LTCH PPS proposed rule, we did not propose to move the revision of joint replacement of lower extremity cases to MS-DRGs 483 or 469, whether or not there is an MCC. We proposed to maintain the current MS-DRG assignments for revision of joint replacement of lower extremity cases.

In summary, we proposed to maintain the current MS-DRG assignment for total ankle replacements in MS-DRGs 469 and 470 and revision of joint replacement of lower extremity procedures in MS-DRGs 515, 516, and 517. We invited public comments on our proposals.

Comment: A number of commenters supported the proposal to maintain the current MS-DRG assignment for code 81.59. One commenter agreed with this proposal given the lack of specificity for this code, which does not identify the specific joint being revised. The commenter recommended that CMS create the following new ICD-9-CM procedure code: 81.58 (Revision of ankle replacement, not otherwise specified). The commenter further recommended that, once created, this new code be assigned to MS-DRGs 466, 467, and 468 and that these MS-DRGs be renamed “Revision of Hip, Knee or Ankle” (with MCC, with CC, and without CC/MCC, respectively).

Response: We appreciate the commenters' support for our proposal not to change the MS-DRG assignment for code 81.59. We agree with the commenter who pointed out that code 81.59 does not identify the joint being revised and, therefore, code 81.59 should continue to be assigned to MS-DRGs 515, 516, and 517. ICD-10-PCS codes provide greater detail than do ICD-9-CM codes and provide the ability to identify the joint being revised. As mentioned earlier, the Secretary announced plans to release an interim final rule in the near future that will include a new compliance date to require the use of ICD-10 beginning October 1, 2015. The interim final rule will also require HIPAA covered entities to continue to use ICD-9-CM through September 30, 2015. Given this timeline, it will not be possible to create a new ICD-9-CM procedure code for the next annual update on October 1, 2015 because ICD-10 will be implemented on that date. However, ICD-10-PCS will provide the necessary level of detail.

After consideration of the public comments we received, we are finalizing our proposal to maintain the current MS-DRG assignment for total ankle replacements in MS-DRGs 469 and 470 and revision of joint replacement of lower extremity procedures in MS-DRGs 515, 516, and 517.

c. Back and Neck Procedures

We received a request to reassign cases identified with a complication or comorbidity (CC) in MS-DRG 490 (Back & Neck Procedures Except Spinal Fusion with CC/MCC or Disc Device/Neurostimulator) to MS-DRG 491 (Back & Neck Procedures Except Spinal Fusion without CC/MCC or Disc Device/Neurostimulator). The requestor suggested that we create a new MS-DRG that would be subdivided based solely on the “with MCC or Disc Device/Neurostimulator” and the “without MCC” (and no device) criteria.

For the FY 2008 rulemaking cycle, we performed a comprehensive analysis of all the spinal DRGs as we proposed (72 FR 24731 through 24735) and finalized (72 FR 47226 through 47232) adoption of the MS-DRGs. With the revised spinal MS-DRGs, we were better able to identify a patient's level of severity, complexity of service, and utilization of resources. This was primarily attributed to the new structure for the severity level designations of “with MCC,” “with CC,” and “non-CC” (or without CC/MCC). Another contributing factor was that we incorporated specific procedures and technologies into the GROUPER logic for some of those spinal MS-DRGs. Specifically, as noted above, in the title of MS-DRG 490, we accounted for disc devices and neurostimulators because the data demonstrated that the procedures utilizing those technologies were more complex and required greater utilization of resources.

According to the requestor, since that time, concerns have been expressed in the provider community regarding inadequate payment for MS-DRG 490 when these technologies are utilized. An analysis conducted by the requestor alleged that the subset of patients identified in the “with MCC or disc device/neurostimulator” group is different with regard to resource use from the “without CC/MCC” (and no device) patient group.

We examined claims data from the December 2013 update of the FY 2013 MedPAR file for MS-DRGs 490 and 491. The table below shows our findings.

MS-DRG | Number of cases | Average length of stay | Average costs
MS-DRG 490—All cases | 16,930 | 4.53 | $13,727
MS-DRG 491—All cases | 25,778 | 2.20 | $8,151

As shown in the table above, there were a total of 16,930 cases in MS-DRG 490 with an average length of stay of 4.53 days and average costs of $13,727. For MS-DRG 491, there were a total of 25,778 cases with an average length of stay of 2.20 days and average costs of $8,151.

We then analyzed the data for MS-DRGs 490 and 491 by subdividing cases based on the “with MCC or Disc Device/Neurostimulator” and the “without MCC” (and no device) criteria. We found a total of 3,379 cases with an average length of stay of 6.6 days and average costs of $21,493 in the “with MCC or Disc Device/Neurostimulator” group and a total of 39,329 cases with an average length of stay of 2.8 days and average costs of $9,405 in the “without MCC” and no device group. Due to the wide range in the volume of cases, length of stay, and average costs between these two subgroups, we concluded that further analysis of the data using a separate “with CC” (and no device) subset of patients was warranted.

Therefore, we evaluated the data using a three-way severity level split that consisted of the three subgroups shown in the table below.

Additional Analysis for Back & Neck Procedures Except Spinal Fusion: Disc Device/Neurostimulator

Severity level split | Number of cases | Average length of stay | Average costs
With MCC or disc device/neurostimulator | 3,379 | 6.6 | $21,493
With CC | 13,551 | 3.9 | $11,791
Without CC/MCC | 25,778 | 2.2 | $8,151

For the first subgroup, “with MCC or Disc Device/Neurostimulator,” we found a total of 3,379 cases with an average length of stay of 6.6 days and average costs of $21,493. In the second subgroup, “with CC” (no device), we found a total of 13,551 cases with an average length of stay of 3.9 days and average costs of $11,791. In the third subgroup, “without CC/MCC” (no device), we found a total of 25,778 cases with an average length of stay of 2.2 days and average costs of $8,151.

The results of this additional data analysis demonstrate a better distribution of cases with regard to length of stay and average costs. Our clinical advisors agreed that a patient's severity of illness is captured more appropriately with this subdivision. The data also meet the established criteria for creating subgroups within a base MS-DRG as discussed earlier.
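The subdivision described above amounts to grouping each claim by the presence of an MCC, a CC, or a disc device/neurostimulator and then summarizing each subgroup. The Python sketch below illustrates only that grouping logic; the claim records and field names are hypothetical, and the actual analysis is performed against the MedPAR file using the GROUPER's CC/MCC and procedure logic.

```python
# Hypothetical sketch of the three-way severity split described above; the
# claim records below are made up for illustration only.
from statistics import mean

claims = [
    # (has_mcc, has_cc, has_disc_device_or_neurostimulator, length_of_stay, cost)
    (True,  False, False, 7, 20500.0),
    (False, False, True,  6, 22800.0),
    (False, True,  False, 4, 11900.0),
    (False, False, False, 2,  8100.0),
]

def severity_group(has_mcc, has_cc, has_device):
    """Assign a claim to one of the three proposed subgroups."""
    if has_mcc or has_device:
        return "With MCC or disc device/neurostimulator"
    if has_cc:
        return "With CC"
    return "Without CC/MCC"

groups = {}
for has_mcc, has_cc, has_device, los, cost in claims:
    groups.setdefault(severity_group(has_mcc, has_cc, has_device), []).append((los, cost))

for name, cases in groups.items():
    alos = mean(los for los, _ in cases)
    avg_cost = mean(cost for _, cost in cases)
    print(f"{name}: {len(cases)} cases, ALOS {alos:.1f}, average cost ${avg_cost:,.0f}")
```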

As the subdivision of the claims data based on these subgroups better captures a patient's severity level and utilization of resources and is supported by our clinical advisors, in the FY 2015 IPPS/LTCH PPS proposed rule, we proposed to create three new MS-DRGs and to delete MS-DRGs 490 and 491. We proposed that these new MS-DRGs be titled as follows and be effective as of October 1, 2014:

  • Proposed new MS-DRG 518 (Back & Neck Procedures Except Spinal Fusion with MCC or Disc Device/Neurostimulator);
  • Proposed new MS-DRG 519 (Back & Neck Procedures Except Spinal Fusion with CC); and
  • Proposed new MS-DRG 520 (Back & Neck Procedures Except Spinal Fusion without CC/MCC).

We invited public comments on our proposal to create these proposed new MS-DRGs for FY 2015.

Comment: Several commenters supported the proposal to delete MS-DRGs 490 and 491 and to create three new MS-DRGs that better account for a patient's severity of illness and utilization of resources when disc devices and neurostimulators are involved. One commenter stated that the new MS-DRGs would enable CMS to assess utilization of resources for these services and ensure that “important innovation in device dependent neurosurgical procedures is adequately accounted for and reimbursed appropriately.” Another commenter expressed its appreciation for CMS' careful data analysis that resulted in the development of the proposal. This commenter noted “that the data presented by CMS make a compelling case for the proposed three subdivisions, because it would more appropriately compensate hospitals for the costs associated with implantation of a disc device or neurostimulator than the current two-division framework.” Another commenter applauded CMS' past efforts to assure MS-DRGs 490 and 491 reflect the most appropriate payment amounts for these procedures. This commenter stated “the proposed three-way split of cases in current MS-DRGs 490 and 491 demonstrates a better distribution of cases with regard to resource use. CMS should proceed with its proposed change to this MS-DRG category to improve the accuracy of the payments, consistent with its criteria for establishing severity levels within the MS-DRGs.” Another commenter noted that “subdividing the code set into three distinct MS-DRGs is not only a more accurate representation of the clinical condition experienced by the patient, but also better categorizes the resources expended by the facility, as evidenced by the supporting claims data.”

Response: We thank the commenters for their support. As noted in the FY 2015 IPPS/LTCH PPS proposed rule, the additional data analysis demonstrated a better distribution of cases with regard to length of stay and average costs. Our clinical advisors also agreed that a patient's severity of illness is captured more appropriately with this subdivision. Lastly, the data also meet the established criteria for creating subgroups within a base MS-DRG as discussed earlier.

After consideration of the public comments we received, for FY 2015 we are adopting as final our proposal to create new MS-DRG 518 (Back & Neck Procedures Except Spinal Fusion with MCC or Disc Device/Neurostimulator); MS-DRG 519 (Back & Neck Procedures Except Spinal Fusion with CC); and MS-DRG 520 (Back & Neck Procedures Except Spinal Fusion without CC/MCC).

6. MDC 10 (Endocrine, Nutritional and Metabolic Diseases and Disorders): Disorders of Porphyrin Metabolism

We received a comment on the FY 2014 IPPS/LTCH PPS proposed rule that we considered out of scope for the proposed rule. We stated in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50550) that we would consider this issue in future rulemaking as part of our annual review process. The request was for the creation of a new MS-DRG to better identify cases involving patients with disorders of porphyrin metabolism, to recognize the resource requirements in caring for these patients, to ensure appropriate payment for these cases, and to preserve patient access to necessary treatments. This issue was discussed previously in the FY 2013 IPPS/LTCH PPS proposed rule (77 FR 27904 and 27905) and final rule (77 FR 53311 through 53313).

Porphyria is defined as a group of rare disorders (“porphyrias”) that interfere with the production of hemoglobin that is needed for red blood cells. While some of these disorders are genetic (inborn) and others can be acquired, they all result in the abnormal accumulation of hemoglobin building blocks, called porphyrins, which can be deposited in the tissues where they particularly interfere with the functioning of the nervous system and the skin. Treatment for patients suffering from disorders of porphyrin metabolism consists of an intravenous injection of Panhematin® (hemin for injection). In 1984, this pharmaceutical agent became the first approved drug for a rare disease to be designated under the Orphan Drug Act. The requestor stated that it is the only FDA-approved prescription treatment for acute intermittent porphyria. ICD-9-CM diagnosis code 277.1 (Disorders of porphyrin metabolism) describes these cases, which are currently assigned to MS-DRG 642 (Inborn and Other Disorders of Metabolism).

We analyzed claims data from the December 2013 update of the FY 2013 MedPAR file for cases assigned to MS-DRG 642. Our findings are shown in the table below.

MS-DRG | Number of cases | Average length of stay | Average costs
MS-DRG 642—All cases | 1,486 | 4.61 | $8,151
MS-DRG 642—Cases with principal diagnosis code 277.1 | 299 | 5.98 | $13,303

As shown in the table above, we found a total of 1,486 cases in MS-DRG 642, with an average length of stay of 4.61 days and average costs of $8,151. We then analyzed the data for cases reporting diagnosis code 277.1 as the principal diagnosis in this same MS-DRG. We found a total of 299 cases, with an average length of stay of 5.98 days and average costs of $13,303.

While the data show that the average costs for the 299 cases reporting a principal diagnosis code of 277.1 were higher than the average costs for all cases in MS-DRG 642 ($13,303 compared to $8,151), the number of cases is small. In the FY 2015 IPPS/LTCH PPS proposed rule, we stated that, given the small number of porphyria cases, we did not believe there is justification for creating a new MS-DRG. Basing a new MS-DRG on such a small number of cases could lead to distortions in the relative payment weights for the MS-DRG because several expensive cases could impact the overall relative payment weight. Having larger clinically cohesive groups within an MS-DRG provides greater stability for annual updates to the relative payment weights. In addition, as discussed earlier, one of the criteria we apply in evaluating whether to create new severity subgroups within an MS-DRG is whether there are at least 500 cases in the CC or MCC subgroup. While this criterion is used to evaluate whether to create a severity subgroup within an MS-DRG, applying it here suggests that creating a new MS-DRG for cases reporting a principal diagnosis of code 277.1 would not be appropriate. Our clinical advisors reviewed this issue and recommended no MS-DRG change for porphyria cases because they fit clinically within MS-DRG 642.

In summary, in the FY 2015 IPPS/LTCH PPS proposed rule, we did not propose to create a new MS-DRG for porphyria cases. We invited public comments on our proposal to maintain porphyria cases in MS-DRG 642.

Comment: Several commenters supported the proposal to maintain porphyria cases in MS-DRG 642 and to not create a new MS-DRG for these cases.

Response: We appreciate the commenters' support.

After consideration of the public comments we received, we are finalizing our proposal to maintain porphyria cases in MS-DRG 642 and to not create a new MS-DRG for these cases.

7. MDC 15 (Newborns and Other Neonates With Conditions Originating in the Perinatal Period)

We received a request to evaluate the MS-DRG assignment of seven ICD-9-CM diagnosis codes in MS-DRG 794 (Neonate with Other Significant Problems) under MDC 15. The requestor stated that these codes have no bearing on the infant and are not representative of a neonate with a significant problem. The requestor recommended that we change the MS-DRG logic so that the following seven ICD-9-CM codes would not lead to assignment of MS-DRG 794. The requestor recommended that the diagnoses be added to the “only secondary diagnosis” list under MS-DRG 795 (Normal Newborn) so that the case would be assigned to MS-DRG 795.

  • V17.0 (Family history of psychiatric condition)
  • V17.2 (Family history of other neurological diseases)
  • V17.49 (Family history of other cardiovascular diseases)
  • V18.0 (Family history of diabetes mellitus)
  • V18.19 (Family history of other endocrine and metabolic diseases)
  • V18.8 (Family history of infectious and parasitic diseases)
  • V50.3 (Ear piercing)

Under the current GROUPER logic, a newborn case with one of these diagnosis codes reported as a secondary diagnosis would be assigned to MS-DRG 794. The requestor believed that a newborn case reporting any of these seven diagnosis codes (noted above) as a secondary diagnosis should be assigned to MS-DRG 795 instead of MS-DRG 794.

Our clinical advisors reviewed this request and concurred with the requestor that the seven ICD-9-CM diagnosis codes noted above should not continue to be assigned to MS-DRG 794, as those codes report no clinically usable information identifying significant problems. Therefore, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28017), we proposed to reassign the following seven diagnoses to the “only secondary diagnosis” list under MS-DRG 795 so that the case would be assigned to MS-DRG 795.

  • V17.0 (Family history of psychiatric condition)
  • V17.2 (Family history of other neurological diseases)
  • V17.49 (Family history of other cardiovascular diseases)
  • V18.0 (Family history of diabetes mellitus)
  • V18.19 (Family history of other endocrine and metabolic diseases)
  • V18.8 (Family history of infectious and parasitic diseases)
  • V50.3 (Ear piercing)

We invited public comments on this proposal.
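The effect of the proposal can be pictured with a greatly simplified sketch. The significant-problem codes and the assignment function below are hypothetical stand-ins for illustration only; the actual MDC 15 GROUPER logic evaluates many more diagnoses and assignment rules.

```python
# Greatly simplified, hypothetical sketch of the proposed change; the real
# MDC 15 GROUPER logic evaluates many more diagnoses and assignment rules.

# Hypothetical stand-ins for secondary diagnoses that indicate a significant
# neonatal problem and therefore lead to MS-DRG 794.
SIGNIFICANT_PROBLEM_CODES = {"problem_code_1", "problem_code_2"}

# The seven codes proposed for the MS-DRG 795 "only secondary diagnosis" list.
ONLY_SECONDARY_DX_795 = {"V17.0", "V17.2", "V17.49", "V18.0", "V18.19", "V18.8", "V50.3"}

def assign_newborn_ms_drg(secondary_diagnoses):
    """Return '794' or '795' for a newborn case, per this simplified sketch."""
    for code in secondary_diagnoses:
        if code in ONLY_SECONDARY_DX_795:
            continue  # under the proposal, these codes no longer trigger MS-DRG 794
        if code in SIGNIFICANT_PROBLEM_CODES:
            return "794"  # Neonate with Other Significant Problems
    return "795"  # Normal Newborn

print(assign_newborn_ms_drg(["V18.0"]))                    # 795 under the proposal
print(assign_newborn_ms_drg(["V18.0", "problem_code_1"]))  # 794
```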

Comment: Several commenters supported the proposal to reassign the identified seven diagnoses to the “only secondary diagnosis” list under MS-DRG 795 so that the case would be assigned to MS-DRG 795.

Response: We appreciate the commenters' support.

After consideration of the public comments we received, we are finalizing our proposal to reassign the following seven diagnoses to the “only secondary diagnosis” list under MS-DRG 795 so that the case would be assigned to MS-DRG 795:

  • V17.0 (Family history of psychiatric condition)
  • V17.2 (Family history of other neurological diseases)
  • V17.49 (Family history of other cardiovascular diseases)
  • V18.0 (Family history of diabetes mellitus)
  • V18.19 (Family history of other endocrine and metabolic diseases)
  • V18.8 (Family history of infectious and parasitic diseases)
  • V50.3 (Ear piercing)

8. Medicare Code Editor (MCE) Changes

The Medicare Code Editor (MCE) is a software program that detects and reports errors in the coding of Medicare claims data. Patient diagnoses, procedure(s), and demographic information are entered into the Medicare claims processing systems and are subjected to a series of automated screens. The MCE screens are designed to identify cases that require further review before classification into an MS-DRG.

As discussed in section II.G.1.a. of the preamble of this final rule, we developed an ICD-10 version of the current MS-DRGs, which are based on ICD-9-CM codes. We refer to this version of the MS-DRGs as the ICD-10 MS-DRGs Version 31.0-R. In November 2013, we also posted a Definitions of Medicare Code Edits Manual of the ICD-10 MCE Version 31.0 on the ICD-10 MS-DRG Conversion Project Web site at: http://www.cms.gov/Medicare/Coding/ICD10/ICD-10-MS-DRG-Conversion-Project.html. We produced mainframe and computer software for Version 31.0 of the MS-DRG GROUPER with Medicare Code Editor, which was made available to the public in December 2013. Information on ordering the mainframe and computer software through NTIS was posted on the CMS Web site at: http://www.cms.hhs.gov/Medicare/Coding/ICD10/ICD-10-MS-DRG-Conversion-Project.html under the “Related Links” section. This ICD-10 MS-DRG GROUPER with Medicare Code Editor Version 31.0 computer software facilitated additional review of the ICD-10 MS-DRGs conversion. We encouraged the public to submit to CMS any comments on areas where they believed the ICD-10 MS-DRG GROUPER and MCE did not accurately reflect the logic and edits found in the ICD-9-CM MS-DRG GROUPER and MCE Version 31.0.

We also have posted an ICD-10 version of the current MCE, which is based on ICD-9-CM codes, and refer to that version of the MCE as the ICD-10 MCE Version 31.0-R. Both of these documents are posted on our ICD-10 MS-DRG Conversion Project Web site at: http://www.cms.hhs.gov/Medicare/Coding/ICD10/ICD-10-MS-DRG-Conversion-Project.html. We will continue to share ICD-10 MS-DRG and MCE conversion activities with the public through this Web site.

In the FY 2015 IPPS/LTCH PPS proposed rule, for FY 2015, we proposed to remove extracranial-intracranial (EC-IC) bypass surgery from the “Noncovered Procedure” edit code list for Version 32.0 of the MCE. This procedure is identified by ICD-9-CM procedure code 39.28 (Extracranial-intracranial (EC-IC) vascular bypass).

Because of the complexity of appropriately classifying the circumstances under which the EC-IC bypass surgery may, or may not, be considered reasonable and necessary for certain conditions, we proposed to remove the MCE “Noncovered Procedure” edit for EC-IC bypass surgery from the “Noncovered Procedure” edit code list for Version 32.0 of the MCE. We invited public comments on this proposal.
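As a rough illustration of how an automated edit screen of this kind operates, the Python sketch below flags any claim whose procedure codes appear on an excerpt of the “Noncovered Procedure” edit list. The edit list shown is a hypothetical excerpt built around code 39.28 only; the real MCE applies many additional edit types defined in the Definitions of Medicare Code Edits manual.

```python
# Rough, hypothetical sketch of an MCE-style edit screen. The edit list below
# is an illustrative excerpt only; the real MCE applies many edit types.

# "Noncovered Procedure" edit list excerpt: code 39.28 (EC-IC vascular bypass)
# appears on the list in Version 31.0 and is removed for Version 32.0.
NONCOVERED_EDIT_V31 = {"39.28"}
NONCOVERED_EDIT_V32 = NONCOVERED_EDIT_V31 - {"39.28"}

def screen_claim(procedure_codes, noncovered_edit_list):
    """Return the edit messages triggered by the claim's procedure codes."""
    return [f"Noncovered procedure edit: code {code}"
            for code in procedure_codes if code in noncovered_edit_list]

claim_procedures = ["39.28"]
print(screen_claim(claim_procedures, NONCOVERED_EDIT_V31))  # edit fires under Version 31.0
print(screen_claim(claim_procedures, NONCOVERED_EDIT_V32))  # no edit under Version 32.0
```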

Comment: Several commenters supported the proposal to remove the MCE “Noncovered Procedure” edit for EC-IC bypass surgery (procedure code 39.28) from the “Noncovered Procedure” edit code list for Version 32.0 of the MCE. The commenters stated that the proposal was reasonable given the information that was provided. Commenters also agreed that because of the complexity of appropriately classifying the circumstances under which the EC-IC bypass surgery may be considered reasonable and necessary for certain conditions, the Medicare noncovered procedure edit for EC-IC bypass surgery should be removed.

Response: We appreciate the commenters' support.

After consideration of the public comments we received, we are finalizing our proposal to remove procedure code 39.28 (Extracranial-intracranial (EC-IC) vascular bypass) from the noncovered procedure edit effective FY 2015.

9. Changes to Surgical Hierarchies

Some inpatient stays entail multiple surgical procedures, each one of which, occurring by itself, could result in assignment of the case to a different MS-DRG within the MDC to which the principal diagnosis is assigned. Therefore, it is necessary to have a decision rule within the GROUPER by which these cases are assigned to a single MS-DRG. The surgical hierarchy, an ordering of surgical classes from most resource-intensive to least resource-intensive, performs that function. Application of this hierarchy ensures that cases involving multiple surgical procedures are assigned to the MS-DRG associated with the most resource-intensive surgical class.

Because the relative resource intensity of surgical classes can shift as a function of MS-DRG reclassification and recalibrations, for FY 2015, we reviewed the surgical hierarchy of each MDC, as we have for previous reclassifications and recalibrations, to determine if the ordering of classes coincides with the intensity of resource utilization.

A surgical class can be composed of one or more MS-DRGs. For example, in MDC 11, the surgical class “kidney transplant” consists of a single MS-DRG (MS-DRG 652) and the class “major bladder procedures” consists of three MS-DRGs (MS-DRGs 653, 654, and 655). Consequently, in many cases, the surgical hierarchy has an impact on more than one MS-DRG. The methodology for determining the most resource-intensive surgical class involves weighting the average resources for each MS-DRG by frequency to determine the weighted average resources for each surgical class. For example, assume surgical class A includes MS-DRGs 001 and 002 and surgical class B includes MS-DRGs 003, 004, and 005. Assume also that the average costs of MS-DRG 001 are higher than those of MS-DRG 003, but the average costs of MS-DRGs 004 and 005 are higher than the average costs of MS-DRG 002. To determine whether surgical class A should be higher or lower than surgical class B in the surgical hierarchy, we would weight the average costs of each MS-DRG in the class by frequency (that is, by the number of cases in the MS-DRG) to determine average resource consumption for the surgical class. The surgical classes would then be ordered from the class with the highest average resource utilization to that with the lowest, with the exception of “other O.R. procedures” as discussed below.
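The frequency-weighted comparison in the example above can be expressed as a short sketch. The class composition, case counts, and costs below are purely illustrative, mirroring the hypothetical surgical classes A and B rather than actual GROUPER or MedPAR data.

```python
# Sketch of the frequency-weighted ordering described above, using hypothetical
# surgical classes A and B; the case counts and average costs are illustrative.

surgical_classes = {
    # class name: list of (ms_drg, case_count, average_cost)
    "Surgical class A": [("001", 400, 30000.0), ("002", 600, 18000.0)],
    "Surgical class B": [("003", 200, 28000.0), ("004", 500, 22000.0), ("005", 300, 21000.0)],
}

def weighted_average_cost(drgs):
    """Weight each MS-DRG's average cost by its case count."""
    total_cases = sum(count for _, count, _ in drgs)
    return sum(count * cost for _, count, cost in drgs) / total_cases

# Order classes from most to least resource intensive. (The "other O.R.
# procedures" class, not shown here, would be placed last regardless of cost.)
hierarchy = sorted(surgical_classes,
                   key=lambda name: weighted_average_cost(surgical_classes[name]),
                   reverse=True)
for name in hierarchy:
    print(f"{name}: weighted average cost ${weighted_average_cost(surgical_classes[name]):,.0f}")
```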

This methodology may occasionally result in assignment of a case involving multiple procedures to the lower-weighted MS-DRG (in the highest, most resource-intensive surgical class) of the available alternatives. However, given that the logic underlying the surgical hierarchy provides that the GROUPER search for the procedure in the most resource-intensive surgical class, in cases involving multiple procedures, this result is sometimes unavoidable.

We note that, notwithstanding the foregoing discussion, there are a few instances when a surgical class with a lower average cost is ordered above a surgical class with a higher average cost. For example, the “other O.R. procedures” surgical class is uniformly ordered last in the surgical hierarchy of each MDC in which it occurs, regardless of the fact that the average costs for the MS-DRG or MS-DRGs in that surgical class may be higher than those for other surgical classes in the MDC. The “other O.R. procedures” class is a group of procedures that are only infrequently related to the diagnoses in the MDC, but are still occasionally performed on patients with cases assigned to the MDC with these diagnoses. Therefore, assignment to these surgical classes should only occur if no other surgical class more closely related to the diagnoses in the MDC is appropriate.

A second example occurs when the difference between the average costs for two surgical classes is very small. We have found that small differences generally do not warrant reordering of the hierarchy because, as a result of reassigning cases on the basis of the hierarchy change, the average costs are likely to shift such that the higher-ordered surgical class has lower average costs than the class ordered below it.

Based on the changes that we proposed to make for FY 2015, as discussed in sections II.G.4.c., II.G.5.a., and II.G.5.c. of the preamble of the FY 2015 IPPS/LTCH PPS proposed rule, we proposed to revise the surgical hierarchy for MDC 5 (Diseases and Disorders of the Circulatory System) and MDC 8 (Diseases and Disorders of the Musculoskeletal System and Connective Tissue) as follows:

In MDC 5, we proposed to sequence proposed new MS-DRG 266 (Endovascular Cardiac Valve Replacement with MCC) and proposed new MS-DRG 267 (Endovascular Cardiac Valve Replacement without MCC) above MS-DRG 222 (Cardiac Defibrillator Implant with Cardiac Catheterization with AMI/HF/Shock with MCC).

In MDC 8, we proposed to delete MS-DRGs 490 (Back & Neck Procedures Except Spinal Fusion with CC/MCC or Disc Device/Neurostimulator) and MS-DRG 491 (Back & Neck Procedures Except Spinal Fusion without CC/MCC or Disc Device/Neurostimulator) from the surgical hierarchy. We proposed to sequence proposed new MS-DRG 518 (Back & Neck Procedure Except Spinal Fusion with MCC or Disc Device/Neurostimulator), proposed new MS-DRG 519 (Back & Neck Procedure Except Spinal Fusion with CC), and proposed new MS-DRG 520 (Back & Neck Procedure Except Spinal Fusion without CC/MCC) above MS-DRG 492 (Lower Extremity and Humerus Procedure Except Hip, Foot, Femur with MCC).

We invited public comments on our proposals.

Comment: We did not receive any public comments opposing our proposals for the surgical hierarchy. Commenters expressed general support for the proposals, noting they were reasonable given the information that was provided.

Response: We appreciate the commenters' support.

After consideration of the public comments we received, we are finalizing our proposal for MDC 5 to sequence new MS-DRG 266 (Endovascular Cardiac Valve Replacement with MCC) and new MS-DRG 267 (Endovascular Cardiac Valve Replacement without MCC) above MS-DRG 222 (Cardiac Defibrillator Implant with Cardiac Catheterization with AMI/HF/Shock with MCC). We also are finalizing our proposal for MDC 8 to delete MS-DRG 490 (Back & Neck Procedures Except Spinal Fusion with CC/MCC or Disc Device/Neurostimulator) and MS-DRG 491 (Back & Neck Procedures Except Spinal Fusion without CC/MCC or Disc Device/Neurostimulator) from the surgical hierarchy. We are sequencing new MS-DRG 518 (Back & Neck Procedure Except Spinal Fusion with MCC or Disc Device/Neurostimulator), new MS-DRG 519 (Back & Neck Procedure Except Spinal Fusion with CC), and new MS-DRG 520 (Back & Neck Procedure Except Spinal Fusion without CC/MCC) above MS-DRG 492 (Lower Extremity and Humerus Procedure Except Hip, Foot, Femur with MCC), effective FY 2015.

10. Changes to the MS-DRG Diagnosis Codes for FY 2015

a. Major Complications or Comorbidities (MCCs) and Complications or Comorbidities (CC) Severity Levels for FY 2015

A complete updated MCC, CC, and Non-CC Exclusion List is available via the Internet on the CMS Web site at: http://cms.hhs.gov/Medicare/Medicare-Fee-for-Service-Payment/AcuteInpatientPPS/index.html as follows:

  • Table 6I (Complete MCC list);
  • Table 6J (Complete CC list); and
  • Table 6K (Complete list of CC Exclusions).

b. Coronary Atherosclerosis Due to Calcified Coronary Lesion

We received a request that we change the severity level for ICD-9-CM diagnosis code 414.4 (Coronary atherosclerosis due to calcified coronary lesion) from a non-CC to an MCC. This issue was previously discussed in the FY 2014 IPPS/LTCH PPS proposed rule (78 FR 27522) and the FY 2014 IPPS/LTCH PPS final rule (78 FR 50541 through 50542).

We examined claims data from the December 2013 update of the FY 2013 MedPAR file for ICD-9-CM diagnosis code 414.4. The following chart shows our findings.

Code | Diagnosis description | CC level | Cnt 1 | Cnt 1 impact | Cnt 2 | Cnt 2 impact | Cnt 3 | Cnt 3 impact
414.4 | Coronary atherosclerosis due to calcified lesion | Non-CC | 1,796 | 1.16 | 3,056 | 2.18 | 2,835 | 3.01

We ran the above data as described in the FY 2008 IPPS final rule with comment period (72 FR 47158 through 47161). The C1 value reflects a patient with no other secondary diagnosis or with all other secondary diagnoses that are non-CCs. The C2 value reflects a patient with at least one other secondary diagnosis that is a CC, but none that is an MCC. The C3 value reflects a patient with at least one other secondary diagnosis that is an MCC.

The chart above shows that the C1 finding is 1.16. A value close to 1.0 in the C1 field suggests that the diagnosis produces the same expected value as a non-CC. A value close to 2.0 suggests the condition is more like a CC than a non-CC, but not as significant in resource usage as an MCC. A value close to 3.0 suggests the condition is expected to consume resources more similar to an MCC than a CC or a non-CC. The C2 finding was 2.18. A C2 value close to 2.0 suggests the condition is more like a CC than a non-CC, but not as significant in resource usage as an MCC when there is at least one other secondary diagnosis that is a CC but none that is an MCC. While the C1 value of 1.16 is above the 1.0 value for a non-CC, it does not support reclassification to an MCC. As stated earlier, a value close to 3.0 suggests the condition is expected to consume resources more similar to an MCC than a CC or a non-CC. The C2 finding of 2.18 also does not support reclassifying this diagnosis code to an MCC. Our clinical advisors reviewed the data and evaluated this condition. They recommended that we not change the severity level of diagnosis code 414.4 from a non-CC to an MCC. They did not believe that this diagnosis would increase the severity level of patients. They pointed out that a similar code, diagnosis code 414.2 (Chronic total occlusion of coronary artery), is a non-CC. Our clinical advisors believe that diagnosis code 414.4 represents patients who are less severe than diagnosis code 414.2. Considering the C1 and C2 ratings of diagnosis code 414.4 and the input from our clinical advisors, in the FY 2015 IPPS/LTCH PPS proposed rule, we did not propose to reclassify diagnosis code 414.4 to an MCC; the diagnosis code would continue to be considered a non-CC.
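As a rough aid to reading the chart, the sketch below maps an impact value to the nearest of the benchmarks described above (values near 1.0 behave like a non-CC, near 2.0 like a CC, and near 3.0 like an MCC). The mapping is an illustrative simplification only and does not reproduce the full FY 2008 evaluation methodology.

```python
# Illustrative simplification only: map C1/C2 impact values to the nearest of
# the rough benchmarks described above (1.0 ~ non-CC, 2.0 ~ CC, 3.0 ~ MCC).

def nearest_severity(impact_value):
    """Return the severity benchmark closest to the given impact value."""
    benchmarks = {"non-CC": 1.0, "CC": 2.0, "MCC": 3.0}
    return min(benchmarks, key=lambda level: abs(benchmarks[level] - impact_value))

# Values for diagnosis code 414.4 from the chart above.
c1, c2 = 1.16, 2.18
print("C1 of 1.16 is closest to:", nearest_severity(c1))  # non-CC
print("C2 of 2.18 is closest to:", nearest_severity(c2))  # CC, not MCC
```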

Therefore, based on the data and clinical analysis, we proposed to maintain diagnosis code 414.4 as a non-CC. We invited public comments on our proposal.

Comment: Several commenters supported the proposal to keep diagnosis code 414.4 as a non-CC. One commenter requested that diagnosis code 414.4, when present as a secondary diagnosis, be included on the MCC list. The commenter believed that treating calcified coronary lesions with atherectomy is underpaid by the Medicare program for patients requiring percutaneous coronary intervention when calcified coronary lesions prevent successful angioplasty and placement of coronary stents. The commenter further stated that coronary calcification is significantly more difficult to treat, requires more time and equipment, and has clinical outcomes that are much worse compared to treating noncalcified or mildly calcified coronary obstructions. Consequently, the commenter believed it costs hospitals more to treat patients with calcified coronary lesions and that hospitals should be compensated for their expense to treat coronary atherosclerosis in Medicare beneficiaries. The commenter recognized the opinion of our clinical advisors that patients with a code 414.4 diagnosis are less severe than those with a code 414.2 diagnosis, but disagreed with that opinion. The commenter believed that both disease states add substantial treatment time and costs to the providers, health care systems, and society and both are worthy of classification as an MCC.

Response: We appreciate the commenters' support for our proposal to maintain code 414.4 as a non-CC. We are not accepting the commenter's recommendation to change this code to an MCC because our clinical data do not support such a change. The data continue to support keeping diagnosis code 414.4 as a non-CC and do not support changing the code to an MCC, for the reasons described above.

We examined claims data from the December 2013 update of the FY 2013 MedPAR file for ICD-9-CM diagnosis code 414.2. The following chart shows our findings.

Code | Diagnosis description | CC level | Cnt 1 | Cnt 1 impact | Cnt 2 | Cnt 2 impact | Cnt 3 | Cnt 3 impact
414.2 | Chronic total occlusion of coronary artery | Non-CC | 15,814 | 1.25 | 21,483 | 2.09 | 19,955 | 3.04

The chart above for diagnosis code 414.2 shows that the C1 finding is 1.25. A value close to 1.0 in the C1 field suggests that the diagnosis produces the same expected value as a non-CC. A value close to 2.0 suggests the condition is more like a CC than a non-CC, but not as significant in resource usage as an MCC. A value close to 3.0 suggests the condition is expected to consume resources more similar to an MCC than a CC or a non-CC. The C2 finding was 2.09. A C2 value close to 2.0 suggests the condition is more like a CC than a non-CC, but not as significant in resource usage as an MCC when there is at least one other secondary diagnosis that is a CC but none that is an MCC. While the C1 value of 1.25 is above the 1.0 value for a non-CC, it does not support reclassification to an MCC. As stated earlier, a value close to 3.0 suggests the condition is expected to consume resources more similar to an MCC than a CC or a non-CC. The C2 finding of 2.09 also does not support reclassifying this diagnosis code to an MCC.

Our clinical advisors reviewed the data and evaluated the severity level for both diagnosis code 414.4 and 414.2. They continue to recommend that we not change the severity level of diagnosis code 414.4 from a non-CC to an MCC. Furthermore, they recommend that we not change the severity level for diagnosis code 414.2. They do not believe that the diagnosis represented by either code would increase the severity level of patients. After reviewing the commenter's justification for changing diagnosis code 414.4 from a non-CC to an MCC, our clinical advisors continue to recommend that we not change the severity level of diagnosis code 414.4 from a non-CC to an MCC. They again pointed out that diagnosis code 414.2 is a similar code and is a non-CC. As noted, they also recommend maintaining diagnosis code 414.2 as a non-CC. Our clinical advisors continue to believe that diagnosis code 414.4 represents patients who are less severe than diagnosis code 414.2.

After consideration of the public comments we received, the C1 and C2 ratings in our claims data, and the input from our clinical advisors, we are finalizing our proposal to not reclassify diagnosis code 414.4 from a non-CC to an MCC; the diagnosis code will continue to be considered a non-CC.

11. Complications or Comorbidity (CC) Exclusions List

a. Background of the CC List and the CC Exclusions List

Under the IPPS MS-DRG classification system, we have developed a standard list of diagnoses that are considered CCs. Historically, we developed this list using physician panels that classified each diagnosis code based on whether the diagnosis, when present as a secondary condition, would be considered a substantial complication or comorbidity. A substantial complication or comorbidity was defined as a condition that, because of its presence with a specific principal diagnosis, would cause an increase in the length of stay by at least 1 day in at least 75 percent of the patients. However, depending on the principal diagnosis of the patient, some diagnoses on the basic list of complications and comorbidities may be excluded if they are closely related to the principal diagnosis. In FY 2008, we evaluated each diagnosis code to determine its impact on resource use and to determine the most appropriate CC subclassification (non-CC, CC, or MCC) assignment. We refer readers to sections II.D.2. and 3. of the preamble of the FY 2008 IPPS final rule with comment period for a discussion of the refinement of CCs in relation to the MS-DRGs we adopted for FY 2008 (72 FR 47152 through 47171).

b. CC Exclusions List for FY 2015

In the September 1, 1987 final notice (52 FR 33143) concerning changes to the DRG classification system, we modified the GROUPER logic so that certain diagnoses included on the standard list of CCs would not be considered valid CCs in combination with a particular principal diagnosis. We created the CC Exclusions List for the following reasons: (1) To preclude coding of CCs for closely related conditions; (2) to preclude duplicative or inconsistent coding from being treated as CCs; and (3) to ensure that cases are appropriately classified between the complicated and uncomplicated DRGs in a pair. As we indicated above, we developed a list of diagnoses, using physician panels, to include those diagnoses that, when present as a secondary condition, would be considered a substantial complication or comorbidity. In previous years, we have made changes to the list of CCs, either by adding new CCs or deleting CCs already on the list.

In the May 19, 1987 proposed notice (52 FR 18877) and the September 1, 1987 final notice (52 FR 33154), we explained that the excluded secondary diagnoses were established using the following five principles:

  • Chronic and acute manifestations of the same condition should not be considered CCs for one another;
  • Specific and nonspecific (that is, not otherwise specified (NOS)) diagnosis codes for the same condition should not be considered CCs for one another;
  • Codes for the same condition that cannot coexist, such as partial/total, unilateral/bilateral, obstructed/unobstructed, and benign/malignant, should not be considered CCs for one another;
  • Codes for the same condition in anatomically proximal sites should not be considered CCs for one another; and
  • Closely related conditions should not be considered CCs for one another.

The creation of the CC Exclusions List was a major project involving hundreds of codes. We have continued to review the remaining CCs to identify additional exclusions and to remove diagnoses from the master list that have been shown not to meet the definition of a CC.

We refer readers to the FY 1989 final rule (53 FR 38485, September 30, 1988) for the revision made for the discharges occurring in FY 1989; the FY 1990 final rule (54 FR 36552, September 1, 1989) for the FY 1990 revision; the FY 1991 final rule (55 FR 36126, September 4, 1990) for the FY 1991 revision; the FY 1992 final rule (56 FR 43209, August 30, 1991) for the FY 1992 revision; the FY 1993 final rule (57 FR 39753, September 1, 1992) for the FY 1993 revision; the FY 1994 final rule (58 FR 46278, September 1, 1993) for the FY 1994 revisions; the FY 1995 final rule (59 FR 45334, September 1, 1994) for the FY 1995 revisions; the FY 1996 final rule (60 FR 45782, September 1, 1995) for the FY 1996 revisions; the FY 1997 final rule (61 FR 46171, August 30, 1996) for the FY 1997 revisions; the FY 1998 final rule (62 FR 45966, August 29, 1997) for the FY 1998 revisions; the FY 1999 final rule (63 FR 40954, July 31, 1998) for the FY 1999 revisions; the FY 2001 final rule (65 FR 47064, August 1, 2000) for the FY 2001 revisions; the FY 2002 final rule (66 FR 39851, August 1, 2001) for the FY 2002 revisions; the FY 2003 final rule (67 FR 49998, August 1, 2002) for the FY 2003 revisions; the FY 2004 final rule (68 FR 45364, August 1, 2003) for the FY 2004 revisions; the FY 2005 final rule (69 FR 49848, August 11, 2004) for the FY 2005 revisions; the FY 2006 final rule (70 FR 47640, August 12, 2005) for the FY 2006 revisions; the FY 2007 final rule (71 FR 47870) for the FY 2007 revisions; the FY 2008 final rule (72 FR 47130) for the FY 2008 revisions; the FY 2009 final rule (73 FR 48510); the FY 2010 final rule (74 FR 43799); the FY 2011 final rule (75 FR 50114); the FY 2012 final rule (76 FR 51542); the FY 2013 final rule (77 FR 53315); and the FY 2014 final rule (78 FR 50541). In the FY 2000 final rule (64 FR 41490, July 30, 1999), we did not modify the CC Exclusions List because we did not make any changes to the ICD-9-CM codes for FY 2000.

In the FY 2015 IPPS/LTCH PPS proposed rule, for FY 2015, we did not propose any changes to the CC Exclusions List. Therefore, we did not develop or publish Table 6G (Additions to the CC Exclusions List) or Table 6H (Deletions from the CC Exclusions List). We developed Table 6K (Complete List of CC Exclusions), which is available only via the Internet on the CMS Web site at: http://www.cms.gov/Medicare/Medicare-Fee-for-Service-Payment/AcuteInpatientPPS/index.html. Because of the length of Table 6K, we are not publishing it in the Addendum to this final rule. In Table 6K, each principal diagnosis code for which there is a CC exclusion is shown with an asterisk, and the conditions that will not count as a CC are listed in an indented column immediately following the affected principal diagnosis. Beginning with discharges on or after October 1 of each year, the indented diagnoses are not recognized by the GROUPER as valid CCs for the asterisked principal diagnoses.

A complete updated MCC, CC, and Non-CC Exclusions List is available via the Internet on the CMS Web site at: http://www.cms.gov/Medicare/Medicare-Fee-for-Service-Payment/AcuteInpatientPPS/index.html.

Because there were no proposed new, revised, or deleted diagnosis or procedure codes for FY 2015, we did not develop Table 6A (New Diagnosis Codes), Table 6B (New Procedure Codes), Table 6C (Invalid Diagnosis Codes), Table 6D (Invalid Procedure Codes), Table 6E (Revised Diagnosis Code Titles), or Table 6F (Revised Procedure Codes) for this final rule, and these tables are not published as part of this final rule.

We did not propose any additions or deletions to the MS-DRG MCC List for FY 2015 nor any additions or deletions to the MS-DRG CC List for FY 2015. Therefore, as we proposed, for this final rule, we have not developed Tables 6I.1 (Additions to the MCC List), 6I.2 (Deletions to the MCC List), 6J.1 (Additions to the CC List), and 6J.2 (Deletions to the CC List), and they are not published as part of this final rule.

Alternatively, the complete documentation of the GROUPER logic, including the current CC Exclusions List, is available from 3M/Health Information Systems (HIS), which, under contract with CMS, is responsible for updating and maintaining the GROUPER program. The current MS-DRG Definitions Manual, Version 31.0, is available on a CD for $225.00. This manual may be obtained by writing 3M/HIS at the following address: 100 Barnes Road, Wallingford, CT 06492; or by calling (203) 949-0303, or by obtaining an order form at the Web site: http://www.3MHIS.com. Please specify the revision or revisions requested. Version 32.0 of this manual, which includes the final FY 2015 MS-DRG changes, is available on a CD for $225.00. This manual may be obtained by writing 3M/HIS at the address provided above; or by calling (203) 949-0303; or by obtaining an order form at the Web site: http://www.3MHIS.com. Please specify the revision or revisions requested.

12. Review of Procedure Codes in MS-DRGs 981 Through 983; 984 Through 986; and 987 Through 989

Each year, we review cases assigned to former CMS DRG 468 (Extensive O.R. Procedure Unrelated to Principal Diagnosis), CMS DRG 476 (Prostatic O.R. Procedure Unrelated to Principal Diagnosis), and CMS DRG 477 (Nonextensive O.R. Procedure Unrelated to Principal Diagnosis) to determine whether it would be appropriate to change the procedures assigned among these CMS DRGs. Under the MS-DRGs that we adopted for FY 2008, CMS DRG 468 was split three ways and became MS-DRGs 981, 982, and 983 (Extensive O.R. Procedure Unrelated to Principal Diagnosis with MCC, with CC, and without CC/MCC, respectively). CMS DRG 476 became MS-DRGs 984, 985, and 986 (Prostatic O.R. Procedure Unrelated to Principal Diagnosis with MCC, with CC, and without CC/MCC, respectively). CMS DRG 477 became MS-DRGs 987, 988, and 989 (Nonextensive O.R. Procedure Unrelated to Principal Diagnosis with MCC, with CC, and without CC/MCC, respectively).

MS-DRGs 981 through 983, 984 through 986, and 987 through 989 (formerly CMS DRGs 468, 476, and 477, respectively) are reserved for those cases in which none of the O.R. procedures performed are related to the principal diagnosis. These MS-DRGs are intended to capture atypical cases, that is, those cases not occurring with sufficient frequency to represent a distinct, recognizable clinical group. MS-DRGs 984 through 986 (previously CMS DRG 476) are assigned to those discharges in which one or more of the following prostatic procedures are performed and are unrelated to the principal diagnosis:

  • 60.0 (Incision of prostate);
  • 60.12 (Open biopsy of prostate);
  • 60.15 (Biopsy of periprostatic tissue);
  • 60.18 (Other diagnostic procedures on prostate and periprostatic tissue);
  • 60.21 (Transurethral prostatectomy);
  • 60.29 (Other transurethral prostatectomy);
  • 60.61 (Local excision of lesion of prostate);
  • 60.69 (Prostatectomy, not elsewhere classified);
  • 60.81 (Incision of periprostatic tissue);
  • 60.82 (Excision of periprostatic tissue);
  • 60.93 (Repair of prostate);
  • 60.94 (Control of (postoperative) hemorrhage of prostate);
  • 60.95 (Transurethral balloon dilation of the prostatic urethra);
  • 60.96 (Transurethral destruction of prostate tissue by microwave thermotherapy);
  • 60.97 (Other transurethral destruction of prostate tissue by other thermotherapy); and
  • 60.99 (Other operations on prostate).

All remaining O.R. procedures are assigned to MS-DRGs 981 through 983 and 987 through 989, with MS-DRGs 987 through 989 assigned to those discharges in which the only procedures performed are nonextensive procedures that are unrelated to the principal diagnosis.

The original list of the ICD-9-CM procedure codes for the procedures we consider nonextensive procedures, if performed with an unrelated principal diagnosis, was published in Table 6C in section IV. of the Addendum to the FY 1989 final rule (53 FR 38591). As part of the FY 1991 final rule (55 FR 36135), the FY 1992 final rule (56 FR 43212), the FY 1993 final rule (57 FR 23625), the FY 1994 final rule (58 FR 46279), the FY 1995 final rule (59 FR 45336), the FY 1996 final rule (60 FR 45783), the FY 1997 final rule (61 FR 46173), and the FY 1998 final rule (62 FR 45981), we moved several other procedures from DRG 468 to DRG 477, and some procedures from DRG 477 to DRG 468. No procedures were moved in FY 1999, FY 2000, FY 2001, or FY 2002, as noted in the FY 1999 final rule (63 FR 40962), the FY 2000 final rule (64 FR 41496), the FY 2001 final rule (65 FR 47064), and the FY 2002 final rule (66 FR 39852). In the FY 2003 final rule (67 FR 49999), we did not move any procedures from DRG 477. However, we did move procedure codes from DRG 468 and placed them in more clinically coherent DRGs. In the FY 2004 final rule (68 FR 45365), we moved several procedures from DRG 468 to DRGs 476 and 477 because the procedures are nonextensive. In the FY 2005 final rule (69 FR 48950), we moved one procedure from DRG 468 to DRG 477. In addition, we added several existing procedures to DRGs 476 and 477. In FY 2006 (70 FR 47317), we moved one procedure from DRG 468 and assigned it to DRG 477. In FY 2007, we moved one procedure from DRG 468 and assigned it to DRGs 479, 553, and 554. In FYs 2008 through 2014, no procedures were moved, as noted in the FY 2008 final rule with comment period (72 FR 46241), the FY 2009 final rule (73 FR 48513), the FY 2010 final rule (74 FR 43796), the FY 2011 final rule (75 FR 50122), the FY 2012 final rule (76 FR 51549), the FY 2013 final rule (77 FR 53321), and the FY 2014 final rule (78 FR 50545).

Our review of MedPAR claims data showed that there were no cases that merited movement or that should logically be assigned to any of the other MDCs. Therefore, for FY 2015, we did not propose to change the procedures assigned among these MS-DRGs.

We did not receive any public comments on our proposal. Therefore, as we proposed, we are not making any changes to the procedures assigned to MS-DRGs 981 through 983, MS-DRGs 984 through 986, and MS-DRGs 987 through 989 for FY 2015.

a. Moving Procedure Codes From MS-DRGs 981 Through 983 or MS-DRGs 987 Through 989 Into MDCs

We annually conduct a review of procedures producing assignment to MS-DRGs 981 through 983 (Extensive O.R. procedure unrelated to principal diagnosis with MCC, with CC, and without CC/MCC, respectively) or MS-DRGs 987 through 989 (Nonextensive O.R. procedure unrelated to principal diagnosis with MCC, with CC, and without CC/MCC, respectively) on the basis of volume, by procedure, to see if it would be appropriate to move procedure codes out of these MS-DRGs into one of the surgical MS-DRGs for the MDC into which the principal diagnosis falls. The data are arrayed in two ways for comparison purposes. We look at a frequency count of each major operative procedure code. We also compare procedures across MDCs by volume of procedure codes within each MDC.

We identify those procedures occurring in conjunction with certain principal diagnoses with sufficient frequency to justify adding them to one of the surgical MS-DRGs for the MDC in which the diagnosis falls. As noted above, there were no cases that merited movement or that should logically be assigned to any of the other MDCs. Therefore, for FY 2015, we did not propose to remove any procedures from MS-DRGs 981 through 983 or MS-DRGs 987 through 989 into one of the surgical MS-DRGs for the MDC into which the principal diagnosis is assigned.

We did not receive any public comments on our proposal. Therefore, as we proposed, we are not removing any procedures from MS-DRGs 981 through 983 or MS-DRGs 987 through 989 into one of the surgical MS-DRGs into which the principal diagnosis is assigned for FY 2015.

b. Reassignment of Procedures Among MS-DRGs 981 Through 983, 984 Through 986, and 987 Through 989

We also annually review the list of ICD-9-CM procedures that, when in combination with their principal diagnosis code, result in assignment to MS-DRGs 981 through 983, 984 through 986 (Prostatic O.R. procedure unrelated to principal diagnosis with MCC, with CC, or without CC/MCC, respectively), and 987 through 989, to ascertain whether any of those procedures should be reassigned from one of these three MS-DRGs to another of the three MS-DRGs based on average costs and the length of stay. We look at the data for trends such as shifts in treatment practice or reporting practice that would make the resulting MS-DRG assignment illogical. If we find these shifts, we would propose to move cases to keep the MS-DRGs clinically similar or to provide payment for the cases in a similar manner. Generally, we move only those procedures for which we have an adequate number of discharges to analyze the data.

There were no cases representing shifts in treatment practice or reporting practice that would make the resulting MS-DRG assignment illogical, or that merited movement so that cases should logically be assigned to any of the other MDCs. Therefore, for FY 2015, we did not propose to move any procedure codes among these MS-DRGs.

We did not receive any public comments on our proposal. Therefore, as we proposed, we are not moving any procedure codes among these MS-DRGs for FY 2015.

c. Adding Diagnosis or Procedure Codes to MDCs

Based on the review of cases in the MDCs, as described above in sections II.G.2. through 7. of the preamble of this final rule, we did not propose to add any diagnosis or procedure codes to MDCs for FY 2015. We did not receive any public comments on our proposal. Therefore, as we proposed, we are not adding any diagnosis or procedure codes to MDCs for FY 2015.

13. Changes to the ICD-9-CM System

a. ICD-10 Coordination and Maintenance Committee

In September 1985, the ICD-9-CM Coordination and Maintenance Committee was formed. This is a Federal interdepartmental committee, co-chaired by the National Center for Health Statistics (NCHS), the Centers for Disease Control and Prevention, and CMS, charged with maintaining and updating the ICD-9-CM system. The final update to ICD-9-CM codes was to be made on October 1, 2013. Thereafter, the name of the Committee was changed to the ICD-10 Coordination and Maintenance Committee, effective with the March 19-20, 2014 meeting. The ICD-10 Coordination and Maintenance Committee will address updates to the ICD-10-CM, ICD-10-PCS, and ICD-9-CM coding systems. The Committee is jointly responsible for approving coding changes, and developing errata, addenda, and other modifications to the coding systems to reflect newly developed procedures and technologies and newly identified diseases. The Committee is also responsible for promoting the use of Federal and non-Federal educational programs and other communication techniques with a view toward standardizing coding applications and upgrading the quality of the classification system.

The official list of ICD-9-CM diagnosis and procedure codes by fiscal year can be found on the CMS Web site at: http://cms.hhs.gov/Medicare/Coding/ICD9ProviderDiagnosticCodes/codes.html. The official list of ICD-10-CM and ICD-10-PCS codes can be found on the CMS Web site at: http://www.cms.gov/Medicare/Coding/ICD10/index.html.

The NCHS has lead responsibility for the ICD-10-CM and ICD-9-CM diagnosis codes included in the Tabular List and Alphabetic Index for Diseases, while CMS has lead responsibility for the ICD-10-PCS and ICD-9-CM procedure codes included in the Tabular List and Alphabetic Index for Procedures.

The Committee encourages participation in the above process by health-related organizations. In this regard, the Committee holds public meetings for discussion of educational issues and proposed coding changes. These meetings provide an opportunity for representatives of recognized organizations in the coding field, such as the American Health Information Management Association (AHIMA), the American Hospital Association (AHA), and various physician specialty groups, as well as individual physicians, health information management professionals, and other members of the public, to contribute ideas on coding matters. After considering the opinions expressed at the public meetings and in writing, the Committee formulates recommendations, which then must be approved by the agencies.

The Committee presented proposals for coding changes for implementation in FY 2015 at a public meeting held on September 18-19, 2013, and finalized the coding changes after consideration of comments received at the meetings and in writing by November 15, 2013.

The Committee held its 2014 meeting on March 19-20, 2014. It was announced at this meeting that any new ICD-10-CM/PCS codes for which there was consensus of public support and for which complete tabular and indexing changes would be made by May 2014 would be included in the October 1, 2014 update to ICD-10-CM/ICD-10-PCS. For FY 2015, there are no new, revised, or deleted ICD-10-CM diagnosis codes or ICD-10-PCS procedure codes, and no new, revised, or deleted ICD-9-CM diagnosis or procedure codes.

Copies of the minutes of the procedure codes discussions at the Committee's September 18-19, 2013 meeting and March 19-20, 2014 meeting can be obtained from the CMS Web site at: http://cms.hhs.gov/Medicare/Coding/ICD9ProviderDiagnosticCodes/index.html?redirect=/icd9ProviderDiagnosticCodes/03_meetings.asp. The minutes of the diagnosis codes discussions at the September 18-19, 2013 meeting and March 19-20, 2014 meeting are found at: http://www.cdc.gov/nchs/icd/icd9cm.html. These Web sites also provide detailed information about the Committee, including information on requesting a new code, attending a Committee meeting, and timeline requirements and meeting dates.

We encourage commenters to address suggestions on coding issues involving diagnosis codes to: Donna Pickett, Co-Chairperson, ICD-10 Coordination and Maintenance Committee, NCHS, Room 2402, 3311 Toledo Road, Hyattsville, MD 20782. Comments may be sent by email to: dfp4@cdc.gov.

Questions and comments concerning the procedure codes should be addressed to: Patricia Brooks, Co-Chairperson, ICD-10 Coordination and Maintenance Committee, CMS, Center for Medicare Management, Hospital and Ambulatory Policy Group, Division of Acute Care, C4-08-06, 7500 Security Boulevard, Baltimore, MD 21244-1850. Comments may be sent by email to: patricia.brooks2@cms.hhs.gov.

In the September 7, 2001 final rule implementing the IPPS new technology add-on payments (66 FR 46906), we indicated we would attempt to include proposals for procedure codes that would describe new technology discussed and approved at the Spring meeting as part of the code revisions effective the following October.

Section 503(a) of Public Law 108-173 included a requirement for updating ICD-9-CM codes twice a year instead of a single update on October 1 of each year. This requirement was included as part of the amendments to the Act relating to recognition of new technology under the IPPS. Section 503(a) amended section 1886(d)(5)(K) of the Act by adding a clause (vii) which states that the “Secretary shall provide for the addition of new diagnosis and procedure codes on April 1 of each year, but the addition of such codes shall not require the Secretary to adjust the payment (or diagnosis-related group classification) . . . until the fiscal year that begins after such date.” This requirement improves the recognition of new technologies under the IPPS system by providing information on these new technologies at an earlier date. Data will be available 6 months earlier than would be possible with updates occurring only once a year on October 1.

While section 1886(d)(5)(K)(vii) of the Act states that the addition of new diagnosis and procedure codes on April 1 of each year shall not require the Secretary to adjust the payment, or DRG classification, under section 1886(d) of the Act until the fiscal year that begins after such date, we have to update the DRG software and other systems in order to recognize and accept the new codes. We also publicize the code changes and the need for a mid-year systems update by providers to identify the new codes. Hospitals also have to obtain the new code books and encoder updates, and make other system changes in order to identify and report the new codes.

The ICD-10 (previously the ICD-9-CM) Coordination and Maintenance Committee holds its meetings in the spring and fall in order to update the codes and the applicable payment and reporting systems by October 1 of each year. Items are placed on the agenda for the Committee meeting if the request is received at least 2 months prior to the meeting. This requirement allows time for staff to review and research the coding issues and prepare material for discussion at the meeting. It also allows time for the topic to be publicized in meeting announcements in the Federal Register as well as on the CMS Web site. The public decides whether or not to attend the meeting based on the topics listed on the agenda. Final decisions on code title revisions are currently made by March 1 so that these titles can be included in the IPPS proposed rule. A complete addendum describing details of all diagnosis and procedure coding changes, both tabular and index, is published on the CMS and NCHS Web sites in May of each year. Publishers of coding books and software use this information to modify their products that are used by health care providers. This 5-month time period has proved to be necessary for hospitals and other providers to update their systems.

A discussion of this timeline and the need for changes are included in the December 4-5, 2005 ICD-9-CM Coordination and Maintenance Committee Meeting minutes. The public agreed that there was a need to hold the fall meetings earlier, in September or October, in order to meet the new implementation dates. The public provided comment that additional time would be needed to update hospital systems and obtain new code books and coding software. There was considerable concern expressed about the impact this new April update would have on providers.

In the FY 2005 IPPS final rule, we implemented section 1886(d)(5)(K)(vii) of the Act, as added by section 503(a) of Public Law 108-173, by developing a mechanism for approving, in time for the April update, diagnosis and procedure code revisions needed to describe new technologies and medical services for purposes of the new technology add-on payment process. We also established the following process for making these determinations. Topics considered during the Fall ICD-10 (previously ICD-9-CM) Coordination and Maintenance Committee meeting are considered for an April 1 update if a strong and convincing case is made by the requester at the Committee's public meeting. The request must identify the reason why a new code is needed in April for purposes of the new technology process. The participants at the meeting and those reviewing the Committee meeting summary report are provided the opportunity to comment on this expedited request. All other topics are considered for the October 1 update. Participants at the Committee meeting are encouraged to comment on all such requests. There were no requests approved for an expedited April 1, 2014 implementation of a code at the September 18-19, 2013 Committee meeting. Therefore, there were no new codes implemented on April 1, 2014.

ICD-9-CM addendum and code title information is published on the CMS Web site at: http://www.cms.hhs.gov/Medicare/Coding/ICD9ProviderDiagnosticCodes/index.html?redirect=/icd9ProviderDiagnosticCodes/01overview.asp#TopofPage. ICD-10-CM and ICD-10-PCS addendum and code title information is published on the CMS Web site at http://www.cms.gov/Medicare/Coding/ICD10/index.html. Information on ICD-10-CM diagnosis codes, along with the Official ICD-10-CM Coding Guidelines, can also be found on the CDC Web site at: http://www.cdc.gov/nchs/icd/icd10cm.html. Information on new, revised, and deleted ICD-10-CM/ICD-10-PCS codes is also provided to the AHA for publication in the Coding Clinic for ICD-10. AHA also distributes information to publishers and software vendors.

CMS also sends copies of all ICD-9-CM coding changes to its Medicare contractors for use in updating their systems and providing education to providers.

The code titles are adopted as part of the ICD-10 (previously ICD-9-CM) Coordination and Maintenance Committee process. Therefore, although we publish the code titles in the IPPS proposed and final rules, they are not subject to comment in the proposed or final rules.

b. Code Freeze

In the January 16, 2009 ICD-10-CM and ICD-10-PCS final rule (74 FR 3340), there was a discussion of the need for a partial or total freeze in the annual updates to both ICD-9-CM and ICD-10-CM and ICD-10-PCS codes. The public comment addressed in that final rule stated that the annual code set updates should cease 1 year prior to the implementation of ICD-10. The commenters stated that this freeze of code updates would allow for instructional and/or coding software programs to be designed and purchased early, without concern that an upgrade would take place immediately before the compliance date, necessitating additional updates and purchases.

HHS responded to comments in the ICD-10 final rule that the ICD-9-CM Coordination and Maintenance Committee has jurisdiction over any action impacting the ICD-9-CM and ICD-10 code sets. Therefore, HHS indicated that the issue of consideration of a moratorium on updates to the ICD-9-CM, ICD-10-CM, and ICD-10-PCS code sets in anticipation of the adoption of ICD-10-CM and ICD-10-PCS would be addressed through the Committee at a future public meeting.

The code freeze was discussed at multiple meetings of the ICD-9-CM Coordination and Maintenance Committee and public comment was actively solicited. The Committee evaluated all comments from participants attending the Committee meetings as well as written comments that were received. The Committee also considered the delay in implementation of ICD-10 until October 1, 2014. There was an announcement at the September 19, 2012 ICD-9-CM Coordination and Maintenance Committee meeting that a partial freeze of both ICD-9-CM and ICD-10 codes will be implemented as follows:

  • The last regular annual update to both ICD-9-CM and ICD-10 code sets was made on October 1, 2011.
  • On October 1, 2012 and October 1, 2013, there will be only limited code updates to both ICD-9-CM and ICD-10 code sets to capture new technology and new diseases.
  • On October 1, 2014, there were to be only limited code updates to ICD-10 code sets to capture new technology and diagnoses as required by section 503(a) of Public Law 108-173. There were to be no updates to ICD-9-CM on October 1, 2014.
  • On October 1, 2015, one year after the originally scheduled implementation of ICD-10, regular updates to ICD-10 were to begin.

On May 15, 2014, CMS posted an updated Partial Code Freeze schedule on the CMS Web site at: http://www.cms.gov/Medicare/Coding/ICD10/ICD-9-CM-Coordination-and-Maintenance-Committee-Meetings.html. This updated schedule provided information on the extension of the partial code freeze until 1 year after the implementation of ICD-10. As stated earlier, on April 1, 2014, the Protecting Access to Medicare Act of 2014 (PAMA) (Pub. L. 113-93) was enacted, which specified that the Secretary may not adopt ICD-10 prior to October 1, 2015. On May 1, 2014, the Department announced that it expects to release an interim final rule in the near future that will include a new compliance date to require the use of ICD-10 beginning October 1, 2015. The rule will also require HIPAA covered entities to continue to use ICD-9-CM through September 30, 2015. Accordingly, the updated schedule for the partial code freeze is as follows:

  • The last regular annual updates to both ICD-9-CM and ICD-10 code sets were made on October 1, 2011.
  • On October 1, 2012, October 1, 2013, and October 1, 2014, there will be only limited code updates to both the ICD-9-CM and ICD-10 code sets to capture new technologies and diseases as required by section 1886(d)(5)(K) of the Act.
  • On October 1, 2015, there will be only limited code updates to ICD-10 code sets to capture new technologies and diagnoses as required by section 1886(d)(5)(K) of the Act. There will be no updates to ICD-9-CM, as it will no longer be used for reporting.
  • On October 1, 2016 (1 year after implementation of ICD-10), regular updates to ICD-10 will begin.

The ICD-10 (previously ICD-9-CM) Coordination and Maintenance Committee announced that it would continue to meet twice a year during the freeze. At these meetings, the public will be encouraged to comment on whether requests for new diagnosis and procedure codes should be created based on the need to capture new technology and new diseases. Any code requests that do not meet the criteria will be evaluated for implementation once the partial freeze ends, one year after the implementation of ICD-10.

Complete information on the partial code freeze and discussions of the issues at the Committee meetings can be found on the ICD-10 Coordination and Maintenance Committee Web site at: http://www.cms.hhs.gov/Medicare/Coding/ICD9ProviderDiagnosticCodes/meetings.html. A summary of the September 19, 2012 Committee meeting, along with both written and audio transcripts of this meeting, is posted on the Web site at: http://www.cms.hhs.gov/Medicare/Coding/ICD9ProviderDiagnosticCodes/ICD-9-CM-C-and-M-Meeting-Materials-Items/2012-09-19-MeetingMaterials.html.

This partial code freeze has dramatically decreased the number of codes created each year as shown by the following information.

Total Number of Codes and Changes in Total Number of Codes per Fiscal Year

ICD-9-CM codes:

Fiscal year | Diagnoses | Change | Procedures | Change
FY 2009 (October 1, 2008) | 14,025 | 348 | 3,824 | 56
FY 2010 (October 1, 2009) | 14,315 | 290 | 3,838 | 14
FY 2011 (October 1, 2010) | 14,432 | 117 | 3,859 | 21
FY 2012 (October 1, 2011) | 14,567 | 135 | 3,877 | 18
FY 2013 (October 1, 2012) | 14,567 | 0 | 3,878 | 1
FY 2014 (October 1, 2013) | 14,567 | 0 | 3,882 | 4
FY 2015 (October 1, 2014) | 14,567 | 0 | 3,882 | 0

ICD-10-CM and ICD-10-PCS codes:

Fiscal year | ICD-10-CM | Change | ICD-10-PCS | Change
FY 2009 | 68,069 | +5 | 72,589 | −14,327
FY 2010 | 69,099 | +1,030 | 71,957 | −632
FY 2011 | 69,368 | +269 | 72,081 | +124
FY 2012 | 69,833 | +465 | 71,918 | −163
FY 2013 | 69,832 | −1 | 71,920 | +2
FY 2014 | 69,823 | −9 | 71,924 | +4
FY 2015 | 69,823 | 0 | 71,924 | 0

As mentioned earlier, the public is provided the opportunity to comment on any requests for new diagnosis or procedure codes discussed at the ICD-10 Coordination and Maintenance Committee meeting. The public has supported only a limited number of new codes during the partial code freeze, as can be seen from the data shown above. We have gone from creating several hundred new codes each year to creating only a limited number of new ICD-9-CM and ICD-10 codes.

At the September 18-19, 2013 and March 19-20, 2014 Committee meetings, we discussed any requests we had received for new ICD-10-CM diagnosis and ICD-10-PCS procedure codes that were to be implemented on October 1, 2014. We did not discuss ICD-9-CM codes. The public was given the opportunity to comment on whether or not new ICD-10-CM and ICD-10-PCS codes should be created, based on the partial code freeze criteria. The public was asked to apply the criterion of whether the codes were needed to capture new diagnoses or new technologies. If a code did not meet that criterion for implementation during the partial code freeze, consideration was to be given as to whether the code should be created after the partial code freeze ends, one year after the implementation of ICD-10-CM/PCS. We invited public comments on any code requests discussed at the September 18-19, 2013 and March 19-20, 2014 Committee meetings for implementation as part of the October 1, 2014 update. The deadline for commenting on code proposals discussed at the September 18-19, 2013 Committee meeting was November 15, 2013. The deadline for commenting on code proposals discussed at the March 19-20, 2014 Committee meeting was April 18, 2014.

14. Public Comments on Issues Not Addressed in the Proposed Rule

We received three public comments regarding MS-DRG issues that were outside of the scope of the proposals included in the FY 2015 IPPS/LTCH PPS proposed rule. Below we summarize these public comments. However, because we consider these public comments to be outside of the scope of the proposed rule, we are not responding to them in this final rule. As stated in section II.G.1.b. of the preamble of this final rule, we encourage individuals with comments about MS-DRG classifications to submit these comments no later than December 7 of each year so they can be considered for possible inclusion in the annual proposed rule and, if included, may be subjected to public review and comment. We will consider these public comments for possible proposals in future rulemaking as part of our annual review process.

a. Request for Review and MS-DRG Reassignment for ICD-9-CM Diagnosis Code 784.7 Reported With Procedure Codes 39.75 and 39.76

One commenter expressed concern regarding specific procedure codes that are assigned to MS-DRGs 981 through 983; 984 through 986; and 987 through 989 in relation to our discussion of the annual review of these MS-DRGs in section II.G.12. of the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28020). The commenter noted that the endovascular embolization of the arteries of the branches of the internal maxillary artery is frequently performed for intractable posterior epistaxis. The commenter stated that, currently, diagnosis code 784.7 (Epistaxis) reported with procedure codes 39.75 (Endovascular embolization or occlusion of vessel(s) of head or neck using bare coils) and 39.76 (Endovascular embolization or occlusion of vessel(s) of head or neck using bioactive coils) groups to MS-DRG 981 (Extensive O.R. Procedure Unrelated to Principal Diagnosis with MCC), MS-DRG 982 (Extensive O.R. Procedure Unrelated to Principal Diagnosis with CC), and MS-DRG 983 (Extensive O.R. Procedure Unrelated to Principal Diagnosis without CC/MCC). The commenter indicated that it also found this grouping with ICD-10 diagnosis code R04.0 (Epistaxis) reported with artery occlusion procedure codes. The commenter requested that CMS review these groupings and consider the possibility of reassigning these procedure codes into a more specific MS-DRG.

We consider this public comment to be outside of the scope of the FY 2015 IPPS/LTCH PPS proposed rule and therefore are not addressing it in this final rule. However, we will consider this public comment for possible proposals in future rulemaking as part of our annual review process.

b. Coding for Extracorporeal Membrane Oxygenation Procedures (ECMO)

Several commenters expressed concern that hospitals may not be correctly reporting extracorporeal membrane oxygenation (ECMO) and percutaneous cardiopulmonary bypass procedures. The commenters requested that CMS inform hospitals that they should appropriately code each procedure separately because each code captures different procedures.

We consider this coding issue to be outside of the scope of the FY 2015 IPPS/LTCH PPS proposed rule. We refer commenters to the American Hospital Association's Central Office on Coding, which has responsibility for providing coding advice on such specific coding issues through its publication Coding Clinic.

c. Adding Severity Levels to MS-DRGs 245 through 251

One commenter recommended including additional severity levels under MS-DRG 245 (AICD Generator Procedures); MS-DRG 246 (Percutaneous Cardiovascular Procedure with Drug-Eluting Stent with MCC or 4+ Vessels/Stents); MS-DRG 247 (Percutaneous Cardiovascular Procedure with Drug-Eluting Stent without MCC); MS-DRG 248 (Percutaneous Cardiovascular Procedure with Non-Drug-Eluting Stent with MCC or 4+ Vessels/Stents); MS-DRG 249 (Percutaneous Cardiovascular Procedure with Non-Drug-Eluting Stent without MCC); MS-DRG 250 (Percutaneous Cardiovascular Procedure without Coronary Artery Stent with MCC); and MS-DRG 251 (Percutaneous Cardiovascular Procedure without Coronary Artery Stent without MCC).

We consider this public comment to be outside of the scope of the FY 2015 IPPS/LTCH PPS proposed rule, and therefore are not addressing it in this final rule. However, we will consider the comment for possible proposals in future rulemaking as part of our annual review process.

H. Recalibration of the FY 2015 MS-DRG Relative Weights

1. Data Sources for Developing the Relative Weights

In developing the FY 2015 system of weights, we used two data sources: claims data and cost report data. As in previous years, the claims data source is the MedPAR file. This file is based on fully coded diagnostic and procedure data for all Medicare inpatient hospital bills. The FY 2013 MedPAR data used in this final rule include discharges occurring from October 1, 2012, through September 30, 2013, based on bills received by CMS through March 31, 2014, from all hospitals subject to the IPPS and short-term, acute care hospitals in Maryland (which at that time were under a waiver from the IPPS under section 1814(b)(3) of the Act). The FY 2013 MedPAR file used in calculating the relative weights includes data for approximately 10,090,385 Medicare discharges from IPPS providers.

Discharges for Medicare beneficiaries enrolled in a Medicare Advantage managed care plan are excluded from this analysis. These discharges are excluded when the MedPAR “GHO Paid” indicator field on the claim record is equal to “1” or when the MedPAR DRG payment field, which represents the total payment for the claim, is equal to the MedPAR “Indirect Medical Education (IME)” payment field, indicating that the claim was an “IME only” claim submitted by a teaching hospital on behalf of a beneficiary enrolled in a Medicare Advantage managed care plan. In addition, the March 31, 2014 update of the FY 2013 MedPAR file complies with version 5010 of the X12 HIPAA Transaction and Code Set Standards and includes a variable called “claim type.” Claim type “60” indicates that the claim was an inpatient claim paid as fee-for-service. Claim types “61,” “62,” “63,” and “64” relate to encounter claims, Medicare Advantage IME claims, and HMO no-pay claims. Therefore, the calculation of the relative weights for FY 2015 also excludes claims with claim type values not equal to “60.” The data exclude CAHs, including hospitals that subsequently became CAHs after the period from which the data were taken. We note that the FY 2015 relative weights are based on the ICD-9-CM diagnosis and procedure codes from the MedPAR claims data, grouped through the ICD-9-CM version of the FY 2015 GROUPER (Version 32).

The second data source used in the cost-based relative weighting methodology is the Medicare cost report data files from the HCRIS. Normally, we use the HCRIS dataset that is 3 years prior to the IPPS fiscal year. Specifically, we used cost report data from the March 31, 2014 update of the FY 2012 HCRIS for calculating the FY 2015 cost-based relative weights.
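
To illustrate the claim exclusions described above, the following is a minimal Python sketch, not CMS production code; the record field names (gho_paid, drg_payment, ime_payment, and claim_type) are hypothetical stand-ins for the corresponding MedPAR fields.

    def keep_for_weight_setting(claim):
        # Exclude Medicare Advantage (managed care) discharges.
        if claim["gho_paid"] == "1":
            return False
        # Exclude "IME only" claims, identified when the total DRG payment
        # equals the IME payment amount.
        if claim["drg_payment"] == claim["ime_payment"]:
            return False
        # Keep only inpatient claims paid as fee-for-service (claim type "60").
        return claim["claim_type"] == "60"

    sample_claims = [
        {"gho_paid": "0", "drg_payment": 12000.0, "ime_payment": 900.0, "claim_type": "60"},
        {"gho_paid": "1", "drg_payment": 900.0, "ime_payment": 900.0, "claim_type": "62"},
    ]
    retained = [c for c in sample_claims if keep_for_weight_setting(c)]  # keeps only the first record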

2. Methodology for Calculation of the Relative Weights

As we explain in section II.E.2. of the preamble of this final rule, we are calculating the FY 2015 relative weights based on 19 CCRs, as we did for FY 2014. The methodology we used to calculate the FY 2015 MS-DRG cost-based relative weights based on claims data in the FY 2013 MedPAR file and data from the FY 2012 Medicare cost reports is as follows:

  • To the extent possible, all the claims were regrouped using the FY 2015 MS-DRG classifications discussed in sections II.B. and II.G. of the preamble of this final rule.
  • The transplant cases that were used to establish the relative weights for heart and heart-lung, liver and/or intestinal, and lung transplants (MS-DRGs 001, 002, 005, 006, and 007, respectively) were limited to those Medicare-approved transplant centers that have cases in the FY 2013 MedPAR file. (Medicare coverage for heart, heart-lung, liver and/or intestinal, and lung transplants is limited to those facilities that have received approval from CMS as transplant centers.)
  • Organ acquisition costs for kidney, heart, heart-lung, liver, lung, pancreas, and intestinal (or multivisceral organs) transplants continue to be paid on a reasonable cost basis. Because these acquisition costs are paid separately from the prospective payment rate, it is necessary to subtract the acquisition charges from the total charges on each transplant bill that showed acquisition charges before computing the average cost for each MS-DRG and before eliminating statistical outliers.
  • Claims with total charges or total lengths of stay less than or equal to zero were deleted. Claims that had an amount in the total charge field that differed by more than $10.00 from the sum of the routine day charges, intensive care charges, pharmacy charges, special equipment charges, therapy services charges, operating room charges, cardiology charges, laboratory charges, radiology charges, other service charges, labor and delivery charges, inhalation therapy charges, emergency room charges, blood charges, and anesthesia charges were also deleted.
  • At least 92.2 percent of the providers in the MedPAR file had charges for 14 of the 19 cost centers. All claims of providers that did not have charges greater than zero for at least 14 of the 19 cost centers were deleted. In other words, a provider must have no more than five blank cost centers. If a provider did not have charges greater than zero in more than five cost centers, the claims for the provider were deleted. (We refer readers to the FY 2014 IPPS/LTCH PPS final rule (78 FR 50551) for the edit threshold related to FY 2014 and prior fiscal years).
  • Statistical outliers were eliminated by removing all cases that were beyond 3.0 standard deviations from the geometric mean of the log distribution of both the total charges per case and the total charges per day for each MS-DRG.
  • Effective October 1, 2008, because hospital inpatient claims include a POA indicator field for each diagnosis present on the claim, only for purposes of relative weight-setting, the POA indicator field was reset to “Y” for “Yes” for all claims that otherwise have an “N” (No) or a “U” (documentation insufficient to determine if the condition was present at the time of inpatient admission) in the POA field.

Under current payment policy, the presence of specific HAC codes, as indicated by the POA field values, can generate a lower payment for the claim. Specifically, if the particular condition is present on admission (that is, a “Y” indicator is associated with the diagnosis on the claim), it is not a HAC, and the hospital is paid for the higher severity (and, therefore, the higher weighted MS-DRG). If the particular condition is not present on admission (that is, an “N” indicator is associated with the diagnosis on the claim) and there are no other complicating conditions, the DRG GROUPER assigns the claim to a lower severity (and, therefore, the lower weighted MS-DRG) as a penalty for allowing a Medicare inpatient to contract a HAC. While the POA reporting meets policy goals of encouraging quality care and generates program savings, it presents an issue for the relative weight-setting process. Because cases identified as HACs are likely to be more complex than similar cases that are not identified as HACs, the charges associated with HAC cases are likely to be higher as well. Therefore, if the higher charges of these HAC claims are grouped into lower severity MS-DRGs prior to the relative weight-setting process, the relative weights of these particular MS-DRGs would become artificially inflated, potentially skewing the relative weights. In addition, we want to protect the integrity of the budget neutrality process by ensuring that, in estimating payments, no increase to the standardized amount occurs as a result of lower overall payments in a previous year that stem from using weights and case-mix that are based on lower severity MS-DRG assignments. If this would occur, the anticipated cost savings from the HAC policy would be lost.

To avoid these problems, we reset the POA indicator field to “Y” only for relative weight-setting purposes for all claims that otherwise have an “N” or a “U” in the POA field. This resetting “forced” the more costly HAC claims into the higher severity MS-DRGs as appropriate, so that the relative weights calculated for each MS-DRG more closely reflect the true costs of those cases. A minimal illustrative sketch of these claim-level edits and the POA resetting follows.
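
The following Python sketch illustrates, under simplified and hypothetical data structures, the claim-level edits described above: the $10.00 charge reconciliation test, the requirement that a provider report charges greater than zero in at least 14 of the 19 cost centers, the 3.0 standard deviation trim on the log distribution, and the resetting of the POA indicator for weight-setting purposes. It is an illustration of the logic only, not the actual MedPAR processing.

    import math
    from statistics import mean, stdev

    COST_CENTERS = 19  # number of cost center charge fields on each claim

    def passes_charge_reconciliation(claim, tolerance=10.00):
        # Total charges must be within $10.00 of the sum of the component
        # cost center charges listed above.
        return abs(claim["total_charges"] - sum(claim["cost_center_charges"])) <= tolerance

    def provider_reports_enough_cost_centers(provider_claims, minimum=14):
        # A provider must have charges greater than zero in at least 14 of
        # the 19 cost centers (that is, no more than 5 blank cost centers).
        nonzero = [any(c["cost_center_charges"][i] > 0 for c in provider_claims)
                   for i in range(COST_CENTERS)]
        return sum(nonzero) >= minimum

    def trim_statistical_outliers(values, sd_limit=3.0):
        # Remove cases beyond 3.0 standard deviations from the mean of the
        # log distribution (applied per MS-DRG to total charges per case and
        # total charges per day).
        logs = [math.log(v) for v in values if v > 0]
        mu, sigma = mean(logs), stdev(logs)
        return [v for v in values
                if v > 0 and abs(math.log(v) - mu) <= sd_limit * sigma]

    def reset_poa_for_weight_setting(diagnosis):
        # For relative weight-setting only, treat "N" and "U" POA indicators
        # as "Y" so that HAC cases group to the higher severity MS-DRG.
        if diagnosis["poa"] in ("N", "U"):
            return dict(diagnosis, poa="Y")
        return diagnosis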

Once the MedPAR data were trimmed and the statistical outliers were removed, the charges for each of the 19 cost groups for each claim were standardized to remove the effects of differences in area wage levels, IME and DSH payments, and for hospitals located in Alaska and Hawaii, the applicable cost-of-living adjustment. Because hospital charges include charges for both operating and capital costs, we standardized total charges to remove the effects of differences in geographic adjustment factors, cost-of-living adjustments, and DSH payments under the capital IPPS as well. Charges were then summed by MS-DRG for each of the 19 cost groups so that each MS-DRG had 19 standardized charge totals. These charges were then adjusted to cost by applying the national average CCRs developed from the FY 2012 cost report data.
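
As a simplified illustration of this step, the sketch below standardizes each cost center charge by a single combined adjustment factor and accumulates the standardized charges by MS-DRG and cost group. The combined factor and the field names are hypothetical placeholders for the wage index, IME, DSH, and cost-of-living adjustments described above.

    from collections import defaultdict

    def standardize_charge(charge, combined_adjustment_factor):
        # Remove the estimated effects of area wage levels, IME and DSH
        # payments, and the cost-of-living adjustment (placeholder factor).
        return charge / combined_adjustment_factor

    def sum_standardized_charges(claims):
        # totals[ms_drg][cost_group] = sum of standardized charges
        totals = defaultdict(lambda: defaultdict(float))
        for claim in claims:
            for cost_group, charge in claim["charges_by_group"].items():
                totals[claim["ms_drg"]][cost_group] += standardize_charge(
                    charge, claim["combined_adjustment_factor"])
        return totals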

The 19 cost centers that we used in the relative weight calculation are shown in the following table. The table shows the lines on the cost report and the corresponding revenue codes that we used to create the 19 national cost center CCRs.

Cost center group name (19 total) | MedPAR charge field | Revenue codes contained in MedPAR charge field | Cost report line description | Cost from HCRIS (Worksheet C, Part 1, Column 5 and line number) Form CMS-2552-10 | Charges from HCRIS (Worksheet C, Part 1, Column 6 & 7 and line number) Form CMS-2552-10 | Medicare charges from HCRIS (Worksheet D-3, Column & line number) Form CMS-2552-10
Routine Days | Private Room Charges | 011X and 014X | Adults & Pediatrics (General Routine Care) | C_1_C5_30 | C_1_C6_30 | D3_HOS_C2_30
| Semi-Private Room Charges | 012X, 013X and 016X-019X | | | |
| Ward Charges | 015X | | | |
Intensive Days | Intensive Care Charges | 020X | Intensive Care Unit | C_1_C5_31 | C_1_C6_31 | D3_HOS_C2_31
| Coronary Care Charges | 021X | Coronary Care Unit | C_1_C5_32 | C_1_C6_32 | D3_HOS_C2_32
| | | Burn Intensive Care Unit | C_1_C5_33 | C_1_C6_33 | D3_HOS_C2_33
| | | Surgical Intensive Care Unit | C_1_C5_34 | C_1_C6_34 | D3_HOS_C2_34
| | | Other Special Care Unit | C_1_C5_35 | C_1_C6_35 | D3_HOS_C2_35
Drugs | Pharmacy Charges | 025X, 026X and 063X | Intravenous Therapy | C_1_C5_64 | C_1_C6_64, C_1_C7_64 | D3_HOS_C2_64
| | | Drugs Charged To Patient | C_1_C5_73 | C_1_C6_73, C_1_C7_73 | D3_HOS_C2_73
Supplies and Equipment | Medical/Surgical Supply Charges | 0270, 0271, 0272, 0273, 0274, 0277, 0279, and 0621, 0622, 0623 | Medical Supplies Charged to Patients | C_1_C5_71 | C_1_C6_71, C_1_C7_71 | D3_HOS_C2_71
| Durable Medical Equipment Charges | 0290, 0291, 0292 and 0294-0299 | DME-Rented | C_1_C5_96 | C_1_C6_96, C_1_C7_96 | D3_HOS_C2_96
| Used Durable Medical Charges | 0293 | DME-Sold | C_1_C5_97 | C_1_C6_97, C_1_C7_97 | D3_HOS_C2_97
Implantable Devices | | 0275, 0276, 0278, 0624 | Implantable Devices Charged to Patients | C_1_C5_72 | C_1_C6_72, C_1_C7_72 | D3_HOS_C2_72
Therapy Services | Physical Therapy Charges | 042X | Physical Therapy | C_1_C5_66 | C_1_C6_66, C_1_C7_66 | D3_HOS_C2_66
| Occupational Therapy Charges | 043X | Occupational Therapy | C_1_C5_67 | C_1_C6_67, C_1_C7_67 | D3_HOS_C2_67
| Speech Pathology Charges | 044X and 047X | Speech Pathology | C_1_C5_68 | C_1_C6_68, C_1_C7_68 | D3_HOS_C2_68
Inhalation Therapy | Inhalation Therapy Charges | 041X and 046X | Respiratory Therapy | C_1_C5_65 | C_1_C6_65, C_1_C7_65 | D3_HOS_C2_65
Operating Room | Operating Room Charges | 036X | Operating Room | C_1_C5_50 | C_1_C6_50, C_1_C7_50 | D3_HOS_C2_50
| | 071X | Recovery Room | C_1_C5_51 | C_1_C6_51, C_1_C7_51 | D3_HOS_C2_51
Labor & Delivery | Operating Room Charges | 072X | Delivery Room and Labor Room | C_1_C5_52 | C_1_C6_52, C_1_C7_52 | D3_HOS_C2_52
Anesthesia | Anesthesia Charges | 037X | Anesthesiology | C_1_C5_53 | C_1_C6_53, C_1_C7_53 | D3_HOS_C2_53
Cardiology | Cardiology Charges | 048X and 073X | Electro-cardiology | C_1_C5_69 | C_1_C6_69, C_1_C7_69 | D3_HOS_C2_69
Cardiac Catheterization | | 0481 | Cardiac Catheterization | C_1_C5_59 | C_1_C6_59, C_1_C7_59 | D3_HOS_C2_59
Laboratory | Laboratory Charges | 030X, 031X, and 075X | Laboratory | C_1_C5_60 | C_1_C6_60, C_1_C7_60 | D3_HOS_C2_60
| | | PBP Clinic Laboratory Services | C_1_C5_61 | C_1_C6_61, C_1_C7_61 | D3_HOS_C2_61
| | 074X, 086X | Electro-Encephalography | C_1_C5_70 | C_1_C6_70, C_1_C7_70 | D3_HOS_C2_70
Radiology | Radiology Charges | 032X, 040X | Radiology—Diagnostic | C_1_C5_54 | C_1_C6_54, C_1_C7_54 | D3_HOS_C2_54
| | 028X, 0331, 0332, 0333, 0335, 0339, 0342 | Radiology—Therapeutic | C_1_C5_55 | C_1_C6_55 | D3_HOS_C2_55
| | 0343 and 344 | Radioisotope | C_1_C5_56 | C_1_C6_56, C_1_C7_56 | D3_HOS_C2_56
Computed Tomography (CT) Scan | CT Scan Charges | 035X | Computed Tomography (CT) Scan | C_1_C5_57 | C_1_C6_57, C_1_C7_57 | D3_HOS_C2_57
Magnetic Resonance Imaging (MRI) | MRI Charges | 061X | Magnetic Resonance Imaging (MRI) | C_1_C5_58 | C_1_C6_58, C_1_C7_58 | D3_HOS_C2_58
Emergency Room | Emergency Room Charges | 045X | Emergency | C_1_C5_91 | C_1_C6_91, C_1_C7_91 | D3_HOS_C2_91
Blood and Blood Products | Blood Charges | 038X | Whole Blood & Packed Red Blood Cells | C_1_C5_62 | C_1_C6_62, C_1_C7_62 | D3_HOS_C2_62
| Blood Storage/Processing | 039X | Blood Storing, Processing, & Transfusing | C_1_C5_63 | C_1_C6_63, C_1_C7_63 | D3_HOS_C2_63
Other Services | Other Service Charge | 0002-0099, 022X, 023X, 024X, 052X, 053X, 055X-060X, 064X-070X, 076X-078X, 090X-095X and 099X | | | |
| Renal Dialysis | 0800X | Renal Dialysis | C_1_C5_74 | C_1_C6_74, C_1_C7_74 | D3_HOS_C2_74
| ESRD Revenue Setting Charges | 080X and 082X-088X | | | |
| | | Home Program Dialysis | C_1_C5_94 | C_1_C6_94, C_1_C7_94 | D3_HOS_C2_94
| Outpatient Service Charges | 049X | ASC (Non Distinct Part) | C_1_C5_75 | C_1_C6_75, C_1_C7_75 | D3_HOS_C2_75
| Lithotripsy Charge | 079X | | | |
| | | Other Ancillary | C_1_C5_76 | C_1_C6_76, C_1_C7_76 | D3_HOS_C2_76
| Clinic Visit Charges | 051X | Clinic | C_1_C5_90 | C_1_C6_90, C_1_C7_90 | D3_HOS_C2_90
| | | Observation Beds | C_1_C5_92.01 | C_1_C6_92.01, C_1_C7_92.01 | D3_HOS_C2_92.01
| Professional Fees Charges | 096X, 097X, and 098X | Other Outpatient Services | C_1_C5_93 | C_1_C6_93, C_1_C7_93 | D3_HOS_C2_93
| Ambulance Charges | 054X | Ambulance | C_1_C5_95 | C_1_C6_95, C_1_C7_95 | D3_HOS_C2_95
| | | Rural Health Clinic | C_1_C5_88 | C_1_C6_88, C_1_C7_88 | D3_HOS_C2_88
| | | FQHC | C_1_C5_89 | C_1_C6_89, C_1_C7_89 | D3_HOS_C2_89

We refer readers to the FY 2009 IPPS/LTCH PPS final rule (73 FR 48462) for a discussion of the revenue codes included in the Supplies and Equipment CCR and the Implantable Devices CCR.

3. Development of National Average CCRs

We developed the national average CCRs as follows:

Using the FY 2012 cost report data, we removed CAHs, Indian Health Service hospitals, all-inclusive rate hospitals, and cost reports that represented time periods of less than 1 year (365 days). We included hospitals located in Maryland because we include their charges in our claims database. We then created CCRs for each provider for each cost center (see the prior table for line items used in the calculations) and removed any CCRs that were greater than 10 or less than 0.01. We normalized the departmental CCRs by dividing the CCR for each department by the total CCR for the hospital for the purpose of trimming the data. We then took the logs of the normalized cost center CCRs and removed any cost center CCRs where the log of the cost center CCR was more than 3 standard deviations above or below the mean log for that cost center CCR. Once the cost report data were trimmed, we calculated a Medicare-specific CCR. The Medicare-specific CCR was determined by taking the Medicare charges for each line item from Worksheet D-3 and deriving the Medicare-specific costs by applying the hospital-specific departmental CCRs to those charges. Once each hospital's Medicare-specific costs were established, we summed the total Medicare-specific costs and divided by the sum of the total Medicare-specific charges to produce national average, charge-weighted CCRs.
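
A minimal Python sketch of the trimming and charge-weighting steps described above follows. The dictionaries and field names are hypothetical; the actual computation draws the cost and charge amounts from Worksheets C and D-3 of the Medicare cost report.

    import math
    from statistics import mean, stdev

    def trim_extreme_ccrs(dept_ccrs):
        # Remove departmental CCRs greater than 10 or less than 0.01.
        return {dept: ccr for dept, ccr in dept_ccrs.items() if 0.01 <= ccr <= 10}

    def log_trim(ccrs_by_provider, sd_limit=3.0):
        # For one cost center, drop providers whose log CCR is more than
        # 3 standard deviations from the mean log CCR (the rule applies this
        # to CCRs normalized by each hospital's total CCR).
        logs = {p: math.log(v) for p, v in ccrs_by_provider.items() if v > 0}
        mu, sigma = mean(logs.values()), stdev(logs.values())
        return {p: v for p, v in ccrs_by_provider.items()
                if p in logs and abs(logs[p] - mu) <= sd_limit * sigma}

    def national_average_ccr(hospitals, cost_center):
        # Charge-weighted national CCR: total Medicare-specific costs divided
        # by total Medicare-specific charges across all remaining hospitals.
        total_cost = sum(h["dept_ccr"][cost_center] * h["medicare_charges"][cost_center]
                         for h in hospitals)
        total_charges = sum(h["medicare_charges"][cost_center] for h in hospitals)
        return total_cost / total_charges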

After we multiplied the total charges for each MS-DRG in each of the 19 cost centers by the corresponding national average CCR, we summed the 19 “costs” across each MS-DRG to produce a total standardized cost for the MS-DRG. The average standardized cost for each MS-DRG was then computed as the total standardized cost for the MS-DRG divided by the transfer-adjusted case count for the MS-DRG. The average cost for each MS-DRG was then divided by the national average standardized cost per case to determine the relative weight.

The FY 2015 cost-based relative weights were then normalized by an adjustment factor of 1.645837 so that the average case weight after recalibration was equal to the average case weight before recalibration. The normalization adjustment is intended to ensure that recalibration by itself neither increases nor decreases total payments under the IPPS, as required by section 1886(d)(4)(C)(iii) of the Act.
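
The weight calculation and normalization described above can be summarized in the following illustrative Python sketch; the inputs are hypothetical, and the normalization factor shown is the FY 2015 value stated above.

    def relative_weights(standardized_charges, national_ccrs, case_counts,
                         normalization_factor=1.645837):
        # standardized_charges[ms_drg][cost_group]: standardized charges summed
        #   over the trimmed claims for the MS-DRG.
        # national_ccrs[cost_group]: the 19 national average CCRs.
        # case_counts[ms_drg]: transfer-adjusted case counts.
        avg_cost = {}
        for ms_drg, groups in standardized_charges.items():
            total_cost = sum(charges * national_ccrs[group]
                             for group, charges in groups.items())
            avg_cost[ms_drg] = total_cost / case_counts[ms_drg]

        # National average standardized cost per case across all MS-DRGs.
        total_cases = sum(case_counts[d] for d in avg_cost)
        national_avg = sum(avg_cost[d] * case_counts[d] for d in avg_cost) / total_cases

        # Relative weight, then normalization so that the average case weight
        # after recalibration equals the average case weight before it.
        return {d: (avg_cost[d] / national_avg) * normalization_factor
                for d in avg_cost}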

The 19 national average CCRs for FY 2015 are as follows:

Group | CCR
Routine Days | 0.489
Intensive Days | 0.407
Drugs | 0.192
Supplies & Equipment | 0.292
Implantable Devices | 0.349
Therapy Services | 0.344
Laboratory | 0.128
Operating Room | 0.212
Cardiology | 0.123
Cardiac Catheterization | 0.133
Radiology | 0.165
MRIs | 0.087
CT Scans | 0.043
Emergency Room | 0.195
Blood and Blood Products | 0.360
Other Services | 0.405
Labor & Delivery | 0.398
Inhalation Therapy | 0.181
Anesthesia | 0.114

Since FY 2009, the relative weights have been based on 100 percent cost weights based on our MS-DRG grouping system.

When we recalibrated the DRG weights for previous years, we set a threshold of 10 cases as the minimum number of cases required to compute a reasonable weight. In the FY 2015 IPPS/LTCH PPS proposed rule, we proposed to use that same case threshold in recalibrating the MS-DRG relative weights for FY 2015. Using data from the FY 2013 MedPAR file, there were 8 MS-DRGs that contained fewer than 10 cases. Under the MS-DRGs, we have fewer low-volume DRGs than under the CMS DRGs because we no longer have separate DRGs for patients aged 0 to 17 years. With the exception of newborns, we previously separated some DRGs based on whether the patient was age 0 to 17 years or age 17 years and older. Other than the age split, cases grouping to these DRGs are identical. The DRGs for patients aged 0 to 17 years generally have very low volumes because children are typically ineligible for Medicare. In the past, we have found that the low volume of cases for the pediatric DRGs could lead to significant year-to-year instability in their relative weights. Although we have always encouraged non-Medicare payers to develop weights applicable to their own patient populations, we have received frequent complaints from providers about the use of the Medicare relative weights in the pediatric population. We believe that eliminating this age split in the MS-DRGs will provide more stable payment for pediatric cases by determining their payment using adult cases that are much higher in total volume. Newborns are unique and require separate MS-DRGs that are not mirrored in the adult population. Therefore, it remains necessary to retain separate MS-DRGs for newborns. All of the low-volume MS-DRGs listed below are for newborns. In FY 2015, because we do not have sufficient MedPAR data to set accurate and stable cost relative weights for these low-volume MS-DRGs, we proposed to compute relative weights for the low-volume MS-DRGs by adjusting their final FY 2014 relative weights by the percentage change in the average weight of the cases in other MS-DRGs. The crosswalk table is shown below (a brief illustrative sketch of this adjustment follows the crosswalk table):

Low-volume MS-DRG | MS-DRG title | Crosswalk to MS-DRG
768 | Vaginal Delivery with O.R. Procedure Except Sterilization and/or D&C | Final FY 2014 relative weight (adjusted by percent change in average weight of the cases in other MS-DRGs).
789 | Neonates, Died or Transferred to Another Acute Care Facility | Final FY 2014 relative weight (adjusted by percent change in average weight of the cases in other MS-DRGs).
790 | Extreme Immaturity or Respiratory Distress Syndrome, Neonate | Final FY 2014 relative weight (adjusted by percent change in average weight of the cases in other MS-DRGs).
791 | Prematurity with Major Problems | Final FY 2014 relative weight (adjusted by percent change in average weight of the cases in other MS-DRGs).
792 | Prematurity without Major Problems | Final FY 2014 relative weight (adjusted by percent change in average weight of the cases in other MS-DRGs).
793 | Full-Term Neonate with Major Problems | Final FY 2014 relative weight (adjusted by percent change in average weight of the cases in other MS-DRGs).
794 | Neonate with Other Significant Problems | Final FY 2014 relative weight (adjusted by percent change in average weight of the cases in other MS-DRGs).
795 | Normal Newborn | Final FY 2014 relative weight (adjusted by percent change in average weight of the cases in other MS-DRGs).
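
The adjustment in the crosswalk table can be illustrated with the following minimal Python sketch; the numbers in the example are made up and are not the actual FY 2014 or FY 2015 weights.

    def low_volume_weight(final_fy2014_weight,
                          avg_case_weight_other_drgs_fy2014,
                          avg_case_weight_other_drgs_fy2015):
        # Carry forward the final FY 2014 relative weight, adjusted by the
        # percent change in the average case weight of the other MS-DRGs.
        percent_change = (avg_case_weight_other_drgs_fy2015
                          / avg_case_weight_other_drgs_fy2014)
        return final_fy2014_weight * percent_change

    # Hypothetical example: if the average case weight of the other MS-DRGs
    # increased by 1 percent, a FY 2014 weight of 1.5000 becomes about 1.5150.
    print(round(low_volume_weight(1.5000, 1.0000, 1.0100), 4))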

We did not receive any public comments on this proposal and, therefore, are finalizing it for FY 2015 as proposed.

4. Bundled Payments for Care Improvement (BPCI) Initiative

The Bundled Payments for Care Improvement (BPCI) initiative, developed under the authority of section 3021 of the Affordable Care Act (codified at section 1115A of the Act), comprises four broadly defined models of care, which link payments for multiple services beneficiaries receive during an episode of care. Under the BPCI initiative, organizations enter into payment arrangements that include financial and performance accountability for episodes of care. On January 31, 2013, CMS announced the health care organizations selected to participate in the BPCI initiative. For additional information on the BPCI initiative, we refer readers to the CMS Center for Medicare and Medicaid Innovation's Web site at http://innovation.cms.gov/initiatives/Bundled-Payments/index.html and to section IV.H.4. of the preamble of the FY 2013 IPPS/LTCH PPS final rule (77 FR 53341 through 53343) for a discussion of the BPCI initiative.

In the FY 2013 IPPS/LTCH PPS final rule, for FY 2013 and subsequent fiscal years, we finalized a policy to treat hospitals that participate in the BPCI initiative the same as prior fiscal years for the IPPS payment modeling and ratesetting process without regard to a hospital's participation within these bundled payment models (that is, as if a hospital were not participating in those models under the BPCI initiative). Therefore, for FY 2015, we proposed to continue to include all applicable data from subsection (d) hospitals participating in BPCI Models 1, 2, and 4 in our IPPS payment modeling and ratesetting calculations. We refer readers to the FY 2013 IPPS/LTCH PPS final rule for a complete discussion on our final policy for the treatment of hospitals participating in the BPCI initiative in our ratesetting process.

Comment: One commenter was concerned about the policy to treat all providers that participate in the BPCI initiative the same as in prior fiscal years for the IPPS payment modeling and ratesetting process, without regard to a hospital's participation in these bundled payment models. The commenter stated that, while the policy is unlikely to have a demonstrable effect in FY 2015, the BPCI initiative has just begun and has few participants compared to the total number of PPS hospitals. The commenter further stated that the cohort is expected to expand dramatically, given the additional round of applications, and it expected participants to focus their cost reduction activities in select MS-DRGs, which could skew specific weights and inappropriately shift payments to other MS-DRGs. The commenter added that providers that are not part of the initiative cannot be expected to reach the same performance levels without the same tools available within the BPCI. The commenter recommended that CMS reconsider removing BPCI participants from the IPPS relative weight setting process.

Response: As the commenter stated, the BPCI initiative is unlikely to have a demonstrable effect for FY 2015. Accordingly, we are finalizing our proposal to continue to include all applicable data from subsection (d) hospitals participating in BPCI Models 1, 2, and 4 in our IPPS payment modeling and ratesetting calculations for FY 2015. However, we will monitor the possible impact that hospitals enrolled in the BPCI initiative may have on the MS-DRG relative weights in future fiscal years.

I. Add-On Payments for New Services and Technologies

1. Background

Sections 1886(d)(5)(K) and (L) of the Act establish a process of identifying and ensuring adequate payment for new medical services and technologies (sometimes collectively referred to in this section as “new technologies”) under the IPPS. Section 1886(d)(5)(K)(vi) of the Act specifies that a medical service or technology will be considered new if it meets criteria established by the Secretary after notice and opportunity for public comment. Section 1886(d)(5)(K)(ii)(I) of the Act specifies that a new medical service or technology may be considered for new technology add-on payment if, “based on the estimated costs incurred with respect to discharges involving such service or technology, the DRG prospective payment rate otherwise applicable to such discharges under this subsection is inadequate.” We note that beginning with discharges occurring in FY 2008, CMS transitioned from CMS-DRGs to MS-DRGs.

The regulations at 42 CFR 412.87 implement these provisions and specify three criteria for a new medical service or technology to receive the additional payment: (1) The medical service or technology must be new; (2) the medical service or technology must be costly such that the DRG rate otherwise applicable to discharges involving the medical service or technology is determined to be inadequate; and (3) the service or technology must demonstrate a substantial clinical improvement over existing services or technologies. Below we highlight some of the major statutory and regulatory provisions relevant to the new technology add-on payment criteria as well as other information. For a complete discussion on the new technology add-on payment criteria, we refer readers to the FY 2012 IPPS/LTCH PPS final rule (76 FR 51572 through 51574).

Under the first criterion, as reflected in § 412.87(b)(2), a specific medical service or technology will be considered “new” for purposes of new medical service or technology add-on payments until such time as Medicare data are available to fully reflect the cost of the technology in the MS-DRG weights through recalibration. We note that we do not consider a service or technology to be new if it is substantially similar to one or more existing technologies. That is, even if a technology receives a new FDA approval, it may not necessarily be considered “new” for purposes of new technology add-on payments if it is “substantially similar” to a technology that was approved by FDA and has been on the market for more than 2 to 3 years. In the FY 2006 IPPS final rule (70 FR 47351) and the FY 2010 IPPS/RY 2010 LTCH PPS final rule (74 FR 43813 and 43814), we explained our policy regarding substantial similarity in detail.

Under the second criterion, § 412.87(b)(3) further provides that, to be eligible for the add-on payment for new medical services or technologies, the MS-DRG prospective payment rate otherwise applicable to the discharge involving the new medical services or technologies must be assessed for adequacy. Under the cost criterion, to assess the adequacy of payment for a new technology paid under the applicable MS-DRG prospective payment rate, we evaluate whether the charges for cases involving the new technology exceed certain threshold amounts. Table 10 that was released with the FY 2014 IPPS/LTCH PPS final rule contains the final thresholds that we use to evaluate applications for new technology add-on payments for FY 2015. We refer readers to the CMS Web site at: http://www.cms.gov/Medicare/Medicare-Fee-for-Service-Payment/AcuteInpatientPPS/FY2014-IPPS-Final-Rule-Home-Page.html for a complete viewing of Table 10 from the FY 2014 IPPS/LTCH PPS final rule.
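As a rough illustration of the threshold comparison described above, the sketch below computes a case-weighted average standardized charge and compares it to a case-weighted average of the applicable thresholds. The MS-DRG numbers, charge amounts, and threshold values are hypothetical placeholders, not figures from Table 10 or the MedPAR file.

```python
# Hypothetical cases involving a new technology: (MS-DRG, standardized charges per case).
cases = [(871, 61_000), (871, 58_500), (872, 40_200)]

# Hypothetical cost thresholds by MS-DRG (in practice these come from Table 10).
thresholds = {871: 55_000, 872: 43_000}

# Case-weighted average standardized charge per case.
avg_charge = sum(charge for _, charge in cases) / len(cases)

# Case-weighted average threshold (each case contributes its MS-DRG's threshold).
avg_threshold = sum(thresholds[drg] for drg, _ in cases) / len(cases)

# The cost criterion is met when the average charge exceeds the average threshold.
print(avg_charge, avg_threshold, avg_charge > avg_threshold)
```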

In the September 7, 2001 final rule that established the new technology add-on payment regulations (66 FR 46917), we discussed the issue of whether the Health Insurance Portability and Accountability Act (HIPAA) Privacy Rule at 45 CFR Parts 160 and 164 applies to claims information that providers submit with applications for new technology add-on payments. We refer readers to the FY 2012 IPPS/LTCH PPS final rule (76 FR 51573) for complete information on this issue.

Under the third criterion, § 412.87(b)(1) of our existing regulations provides that a new technology is an appropriate candidate for an additional payment when it represents “an advance that substantially improves, relative to technologies previously available, the diagnosis or treatment of Medicare beneficiaries.” For example, a new technology represents a substantial clinical improvement when it reduces mortality, decreases the number of hospitalizations or physician visits, or reduces recovery time compared to the technologies previously available. (We refer readers to the September 7, 2001 final rule for a more detailed discussion of this criterion (66 FR 46902).)

The new medical service or technology add-on payment policy under the IPPS provides additional payments for cases with relatively high costs involving eligible new medical services or technologies while preserving some of the incentives inherent under an average-based prospective payment system. The payment mechanism is based on the cost to hospitals for the new medical service or technology. Under § 412.88, if the costs of the discharge (determined by applying cost-to-charge ratios (CCRs) as described in § 412.84(h)) exceed the full DRG payment (including payments for IME and DSH, but excluding outlier payments), Medicare will make an add-on payment equal to the lesser of: (1) 50 percent of the estimated costs of the new technology (if the estimated costs for the case including the new technology exceed Medicare's payment); or (2) 50 percent of the difference between the full DRG payment and the hospital's estimated cost for the case. Unless the discharge qualifies for an outlier payment, the additional Medicare payment is limited to the full MS-DRG payment plus 50 percent of the estimated costs of the new technology.
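The lesser-of calculation described above can be summarized in a short sketch. The function and dollar figures below are hypothetical illustrations of the § 412.88 mechanics, not CMS claims-processing code.

```python
def new_tech_add_on_payment(case_cost, full_drg_payment, new_tech_cost):
    """Illustrative sketch of the add-on calculation described in Sec. 412.88.

    case_cost:        estimated cost of the case (charges reduced to cost using CCRs)
    full_drg_payment: full MS-DRG payment, including IME and DSH, excluding outliers
    new_tech_cost:    estimated cost of the new technology for the case
    """
    if case_cost <= full_drg_payment:
        # The case cost does not exceed the full DRG payment, so no add-on is made.
        return 0.0
    # The add-on is the lesser of 50 percent of the technology's cost or 50 percent
    # of the amount by which the case cost exceeds the full DRG payment.
    return min(0.5 * new_tech_cost, 0.5 * (case_cost - full_drg_payment))

# Hypothetical example: the case cost exceeds the DRG payment by $6,000, so the
# add-on is the lesser of $5,000 (half the technology cost) and $3,000.
print(new_tech_add_on_payment(case_cost=26_000, full_drg_payment=20_000, new_tech_cost=10_000))  # 3000.0
```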

Section 503(d)(2) of Public Law 108-173 provides that there shall be no reduction or adjustment in aggregate payments under the IPPS due to add-on payments for new medical services and technologies. Therefore, in accordance with section 503(d)(2) of Public Law 108-173, add-on payments for new medical services or technologies for FY 2005 and later years have not been subjected to budget neutrality.

In the FY 2009 IPPS final rule (73 FR 48561 through 48563), we modified our regulations at § 412.87 to codify our longstanding practice of how CMS evaluates the eligibility criteria for new medical service or technology add-on payment applications. That is, we first determine whether a medical service or technology meets the newness criterion, and only if so, do we then make a determination as to whether the technology meets the cost threshold and represents a substantial clinical improvement over existing medical services or technologies. We also amended § 412.87(c) to specify that all applicants for new technology add-on payments must have FDA approval or clearance for their new medical service or technology by July 1 of each year prior to the beginning of the fiscal year that the application is being considered.

The Council on Technology and Innovation (CTI) at CMS oversees the agency's cross-cutting priority on coordinating coverage, coding and payment processes for Medicare with respect to new technologies and procedures, including new drug therapies, as well as promoting the exchange of information on new technologies between CMS and other entities. The CTI, composed of senior CMS staff and clinicians, was established under section 942(a) of Public Law 108-173. The Council is co-chaired by the Director of the Center for Clinical Standards and Quality (CCSQ) and the Director of the Center for Medicare (CM), who is also designated as the CTI's Executive Coordinator.

The specific processes for coverage, coding, and payment are implemented by CM, CCSQ, and the local claims-payment contractors (in the case of local coverage and payment decisions). The CTI supplements, rather than replaces, these processes by working to assure that all of these activities reflect the agency-wide priority to promote high-quality, innovative care. At the same time, the CTI also works to streamline, accelerate, and improve coordination of these processes to ensure that they remain up to date as new issues arise. To achieve its goals, the CTI works to streamline and create a more transparent coding and payment process, improve the quality of medical decisions, and speed patient access to effective new treatments. It is also dedicated to supporting better decisions by patients and doctors in using Medicare-covered services through the promotion of better evidence development, which is critical for improving the quality of care for Medicare beneficiaries.

To improve the understanding of CMS' processes for coverage, coding, and payment and how to access them, the CTI has developed an “Innovator's Guide” to these processes. The intent is to consolidate this information, much of which is already available in a variety of CMS documents and in various places on the CMS Web site, in a user-friendly format. This guide was published in August 2008 and is available on the CMS Web site at: http://www.cms.gov/CouncilonTechInnov/Downloads/InnovatorsGuide5_10_10.pdf.

As we indicated in the FY 2009 IPPS final rule (73 FR 48554), we invite any product developers or manufacturers of new medical technologies to contact the agency early in the process of product development if they have questions or concerns about the evidence that would be needed later in the development process for the agency's coverage decisions for Medicare.

The CTI aims to provide useful information on its activities and initiatives to stakeholders, including Medicare beneficiaries, advocates, medical product manufacturers, providers, and health policy experts. Stakeholders with further questions about Medicare's coverage, coding, and payment processes, or who want further guidance about how they can navigate these processes, can contact the CTI at CTI@cms.hhs.gov.

We note that applicants for add-on payments for new medical services or technologies for FY 2016 must submit a formal request, including a full description of the clinical applications of the medical service or technology and the results of any clinical evaluations demonstrating that the new medical service or technology represents a substantial clinical improvement, along with a significant sample of data to demonstrate that the medical service or technology meets the high-cost threshold. Complete application information, along with final deadlines for submitting a full application, will be posted as it becomes available on the CMS Web site at: http://www.cms.gov/Medicare/Medicare-Fee-for-Service-Payment/AcuteInpatientPPS/newtech.html. To allow interested parties to identify the new medical services or technologies under review before the publication of the proposed rule for FY 2016, the CMS Web site also will post the tracking forms completed by each applicant.

2. Public Input Before Publication of a Notice of Proposed Rulemaking on Add-On Payments

Section 1886(d)(5)(K)(viii) of the Act, as amended by section 503(b)(2) of Public Law 108-173, provides for a mechanism for public input before publication of a notice of proposed rulemaking regarding whether a medical service or technology represents a substantial clinical improvement or advancement. The process for evaluating new medical service and technology applications requires the Secretary to—

  • Provide, before publication of a proposed rule, for public input regarding whether a new service or technology represents an advance in medical technology that substantially improves the diagnosis or treatment of Medicare beneficiaries;
  • Make public and periodically update a list of the services and technologies for which applications for add-on payments are pending;
  • Accept comments, recommendations, and data from the public regarding whether a service or technology represents a substantial clinical improvement; and
  • Provide, before publication of a proposed rule, for a meeting at which organizations representing hospitals, physicians, manufacturers, and any other interested party may present comments, recommendations, and data regarding whether a new medical service or technology represents a substantial clinical improvement to the clinical staff of CMS.

In order to provide an opportunity for public input regarding add-on payments for new medical services and technologies for FY 2015 prior to publication of the FY 2015 IPPS/LTCH PPS proposed rule, we published a document in the Federal Register on November 29, 2013 (78 FR 71555 through 71557), and held a town hall meeting at the CMS Headquarters Office in Baltimore, MD, on February 12, 2014. In the announcement notice for the meeting, we stated that the opinions and alternatives provided during the meeting would assist us in our evaluations of applications by allowing public discussion of the substantial clinical improvement criterion for each of the FY 2015 new medical service and technology add-on payment applications before the publication of the FY 2015 proposed rule.

Approximately 91 individuals registered to attend the town hall meeting in person, while additional individuals listened over an open telephone line. We also live-streamed the town hall meeting and posted the town hall on the CMS YouTube Web page at: http://www.youtube.com/watch?v=WXyR_TILfKo&list=TLiu1B_AxXsinTW6EEn4BVUdR4iEM61eV4. We considered each applicant's presentation made at the town hall meeting, as well as written comments submitted on the applications that were received by the due date of January 21, 2014, in our evaluation of the new technology add-on payment applications for FY 2015 in the proposed rule.

In response to the published document and the New Technology Town Hall meeting, we received written comments regarding the applications for FY 2015 new technology add-on payments. We summarized these comments in the preamble of the proposed rule or, if applicable, indicated that there were no comments received, at the end of each discussion of the individual applications in the proposed rule.

A number of attendees at the New Technology Town Hall meeting provided comments that were unrelated to the “substantial clinical improvement” criterion. As explained above and in the Federal Register document announcing the New Technology Town Hall meeting (78 FR 71555 through 71557), the purpose of the meeting was specifically to discuss the substantial clinical improvement criterion in regard to pending new technology add-on payment applications for FY 2015. Therefore, we did not summarize those comments in the proposed rule. Commenters were informed that they were welcome to resubmit these comments during the comment period in response to proposals presented in the proposed rule. We summarize and respond to these comments under the applicable discussions within this final rule.

We also received public comments in response to the proposed rule relating to topics such as marginal cost factors for new technology add-on payments, mapping new technologies to the appropriate MS-DRG, deeming a new technology a substantial clinical improvement if it receives Humanitarian Device Exemption (HDE) approval from the FDA, and the use of external data in determining the cost threshold. Because we neither requested public comments on nor proposed any changes to these issues, we are not summarizing or responding to these public comments in this final rule.

Another commenter asked CMS to consider the implications of the new technology add-on payment policy on antibiotics that fall under the current IPPS and, in particular, the Hospital VBP Program, for which the inclusion of the MRSA bacteremia measure and the C. difficile measure is proposed. The commenter was concerned that current payment policy will be inadequate and place further financial pressure on hospitals. The commenter stated that CMS must consider the evolving payment paradigm facing inpatient facilities (IQR, HAC, and VBP) and ensure that these various policies do not have competing goals. Although we agree with the commenter that CMS should consider the evolving payment paradigm facing inpatient facilities regarding payment reductions under the Hospital IQR Program, the HAC Reduction Program, and the Hospital VBP Program and ensure that these various policies do not have competing goals, we are not providing a detailed response because we did not present any policy proposals concerning these issues.

Comment: One commenter expressed concern that services identified as appropriate for new technology add-on payments do not receive the new technology add-on payment even when the claims for these services are correctly submitted to the Medicare administrative contractors (MACs). The commenter stated that the MACs are often unable to explain the reason for the failure to include the new technology add-on payment or answer inquiries regarding this issue. The commenter recommended that CMS provide additional education to the MACs regarding CMS regulations related to services available for new technology add-on payments.

Response: We encourage providers to work with their MACs to ensure that the new technology add-on payments are accurately and appropriately made. If MACs are having any issues, they can contact the CMS Central Office for further assistance. Also, the regulations at § 412.88 explain how the new technology add-on payments are made. We note that, under certain conditions, even if an approved new technology was billed on the claim, a new technology add-on payment may not be made, such as if the total payment for the claim without the new technology add-on payment exceeds the costs of the case. In addition, each year after the final rule, CMS issues a transmittal to the MACs listing the eligibility and maximum add-on payment for each approved new technology.

3. FY 2015 Status of Technologies Approved for FY 2014 Add-On Payments

a. Glucarpidase (Trade Brand Voraxaze®)

BTG International, Inc. submitted an application for new technology add-on payments for Glucarpidase (trade brand Voraxaze®) for FY 2013. Glucarpidase is used in the treatment of patients who have been diagnosed with toxic methotrexate (MTX) concentrations as a result of renal impairment. The administration of Glucarpidase causes a rapid and sustained reduction of toxic MTX concentrations.

Voraxaze® was approved by the FDA on January 17, 2012. Beginning in 1993, certain patients could obtain expanded access to Voraxaze® for treatment use as an investigational drug. Since 2007, the applicant has been authorized to recover the costs of making Voraxaze® available through its expanded access program. We describe expanded access for treatment use of investigational drugs and authorization to recover certain costs of investigational drugs in the FY 2013 IPPS/LTCH PPS final rule (77 FR 53346 through 53350). Voraxaze® was available on the market in the United States as a commercial product to the larger population as of April 30, 2012. In the FY 2013 IPPS/LTCH PPS proposed rule (77 FR 27936 through 27939), we expressed concerns about whether Voraxaze® could be considered new for FY 2013. After consideration of all of the public comments received, in the FY 2013 IPPS/LTCH PPS final rule, we stated that we considered Voraxaze® to be “new” as of April 30, 2012, which is the date of market availability.

After evaluation of the newness, costs, and substantial clinical improvement criteria for new technology payments for Voraxaze® and consideration of the public comments we received in response to the FY 2013 IPPS/LTCH PPS proposed rule, we approved Voraxaze® for new technology add-on payments for FY 2013. Cases of Voraxaze® are identified with ICD-9-CM procedure code 00.95 (Injection or infusion of glucarpidase). The cost of Voraxaze® is $22,500 per vial. The applicant stated that an average of four vials is used per Medicare beneficiary. Therefore, the average cost per case for Voraxaze® is $90,000 ($22,500 × 4). Under § 412.88(a)(2), new technology add-on payments are limited to the lesser of 50 percent of the average cost of the technology or 50 percent of the costs in excess of the MS-DRG payment for the case. As a result, the maximum new technology add-on payment for Voraxaze® is $45,000 per case.

As stated above, the new technology add-on payment regulations provide that a medical service or technology may be considered new within 2 or 3 years after the point at which data begin to become available reflecting the ICD-9-CM code assigned to the new service or technology (§ 412.87(b)(2)). Our practice has been to begin and end new technology add-on payments on the basis of a fiscal year, and we have generally followed a guideline that uses a 6-month window before and after the start of the fiscal year to determine whether to extend the new technology add-on payment for an additional fiscal year. In general, we extend add-on payments for an additional year only if the 3-year anniversary date of the product's entry on the market occurs in the latter half of the fiscal year (70 FR 47362).
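The 6-month window guideline can be restated as a simple date comparison. The helper below is a hypothetical sketch of that guideline as described above, assuming the fiscal year runs from October 1 through September 30 and that its latter half begins on April 1; it is not CMS policy code.

```python
from datetime import date

def continue_add_on_payments(market_entry: date, fiscal_year: int) -> bool:
    """Hypothetical sketch of the 3-year anniversary guideline (70 FR 47362).

    Add-on payments generally continue for a fiscal year when the 3-year
    anniversary of the product's U.S. market entry falls on or after April 1
    of that fiscal year (the start of its latter half); they are generally
    discontinued when the anniversary falls earlier.
    """
    anniversary = date(market_entry.year + 3, market_entry.month, market_entry.day)
    latter_half_start = date(fiscal_year, 4, 1)
    return anniversary >= latter_half_start

# Example: a product that entered the U.S. market on April 30, 2012 has its 3-year
# anniversary (April 30, 2015) in the latter half of FY 2015, so add-on payments
# would continue for FY 2015.
print(continue_add_on_payments(date(2012, 4, 30), 2015))  # True
```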

With regard to the newness criterion for Voraxaze®, as stated above, we consider the beginning of the newness period to commence when Voraxaze® was first available on the market on April 30, 2012. Because the 3-year anniversary date for Voraxaze® will occur in the latter half of FY 2015 (April 30, 2015), we proposed to continue new technology add-on payments for this technology for FY 2015.

In the FY 2015 IPPS/LTCH PPS proposed rule, we invited public comments on this proposal.

Comment: Several public commenters supported the proposal to continue new technology add-on payments for Voraxaze® for FY 2015.

Response: We appreciate the commenters' support. Because the 3-year anniversary date for Voraxaze® will occur in the latter half of FY 2015 (April 30, 2015), we are finalizing our proposal to continue to make new technology add-on payments for Voraxaze® for FY 2015.

b. DIFICIDTM (Fidaxomicin) Tablets

Optimer Pharmaceuticals, Inc. submitted an application for new technology add-on payments for FY 2013 for the use of DIFICIDTM tablets. As indicated on the labeling submitted to the FDA, the applicant noted that Fidaxomicin is an oral antibiotic taken as a 200 mg tablet twice daily (400 mg per day). The applicant asserted that Fidaxomicin provides potent bactericidal activity against C. difficile and moderate bactericidal activity against certain other gram-positive organisms, such as enterococcus and staphylococcus. The applicant noted that, unlike other antibiotics used to treat Clostridium difficile-associated diarrhea (CDAD), Fidaxomicin preserves bacteroides organisms in the fecal flora, which are markers of normal anaerobic microflora. The applicant asserted that this helps prevent pathogen introduction or persistence, which potentially inhibits the re-emergence of C. difficile and reduces the likelihood of overgrowth of vancomycin-resistant Enterococcus (VRE). Because of this narrow spectrum of activity, the applicant asserted that Fidaxomicin does not alter the native intestinal microflora.

In the FY 2013 IPPS/LTCH PPS proposed rule (77 FR 27939 through 27941), we expressed concern that DIFICIDTM may not be eligible for new technology add-on payments because eligibility is limited to new technologies associated with procedures described by ICD-9-CM codes. We further stated that drugs that are only taken orally (such as DIFICIDTM) may not be eligible for consideration for new technology add-on payments because there is no procedure associated with these drugs and, therefore, no ICD-9-CM code(s). In the FY 2013 IPPS/LTCH PPS final rule (77 FR 53350 through 53358), after consideration of the public comments received, we revised our policy to allow the use of National Drug Codes (NDCs) to identify oral medications that have no inpatient procedure for the purposes of new technology add-on payments. The revised policy is effective for payments for discharges occurring on or after October 1, 2012. We refer readers to the FY 2013 IPPS/LTCH PPS final rule for a complete discussion on this issue.

With regard to the newness criterion, Fidaxomicin was approved by the FDA on May 27, 2011, for the treatment of CDAD in adult patients, 18 years of age and older. In the FY 2013 IPPS/LTCH PPS final rule, we established that the beginning of the newness period for this technology is its FDA approval date of May 27, 2011.

After evaluation of the newness, costs, and substantial clinical improvement criteria for new technology add-on payments for DIFICIDTM and consideration of the public comments we received in response to the FY 2013 IPPS/LTCH PPS proposed rule, we approved DIFICIDTM for new technology add-on payments for FY 2013. Cases of DIFICIDTM are identified with ICD-9-CM diagnosis code 008.45 (Intestinal infection due to Clostridium difficile) in combination with NDC code 52015-0080-01. Providers must report the NDC on the 837i Health Care Claim Institutional form (in combination with ICD-9-CM diagnosis code 008.45) in order to receive the new technology add-on payment. According to the applicant, the cost of DIFICIDTM is $2,800 for a 10-day dosage. The average cost per day for DIFICIDTM is $280 ($2,800/10). Cases of DIFICIDTM within the inpatient setting typically incur an average dosage of 6.2 days, which results in an average cost per case for DIFICIDTM of $1,736 ($280 × 6.2). Under § 412.88(a)(2), new technology add-on payments are limited to the lesser of 50 percent of the average cost of the technology or 50 percent of the costs in excess of the MS-DRG payment for the case. As a result, the maximum new technology add-on payment for DIFICIDTM is $868.

As stated above, the new technology add-on payment regulations provide that a medical service or technology may be considered new within 2 or 3 years after the point at which data begin to become available reflecting the ICD-9-CM code assigned to the new service or technology (§ 412.87(b)(2)).

Prior to the publication of the proposed rule, the manufacturer submitted a letter to CMS requesting that CMS extend eligibility for a third year of new technology add-on payments for DIFICIDTM in FY 2015. The manufacturer maintained that the technology still meets all three criteria for new technology add-on payments. Regarding the substantial clinical improvement criterion, the applicant stated that DIFICIDTM remains the only FDA-approved treatment to demonstrate substantial clinical improvement over existing therapies. No new treatments for CDAD have been approved by the FDA since DIFICIDTM. The applicant further stated that a third year of new technology add-on payments for DIFICIDTM would continue to reduce access barriers in the acute care hospital inpatient setting, which would support the appropriate use of DIFICIDTM, a treatment that offers a substantial clinical improvement over existing therapies.

With respect to the cost criterion, the applicant stated that DIFICIDTM continues to meet the cost criterion. Using claims data from the FY 2012 MedPAR file, the applicant provided updated data from the two analyses described in the FY 2013 IPPS/LTCH PPS final rule (77 FR 53350 through 53358), and demonstrated that the average case-weighted standardized charge per case exceeded the average case-weighted thresholds under both analyses. The applicant stated that the new technology add-on payment is intended to offer additional payments to support patient access and appropriate use of new technologies for a period of time until the MS-DRGs are adjusted to reflect the cost of the new technology. The applicant believed that the analyses conducted with the most recent MedPAR claims data available demonstrate that the MS-DRG recalibrations are insufficient to accommodate the cost associated with CDAD and new technologies to treat CDAD under the IPPS within the allotted timeframe of 2 years. According to the applicant, these payment amounts remain an obstacle for the appropriate use of new technologies for CDAD that demonstrate substantial clinical improvement over existing treatments, such as DIFICIDTM. The applicant concluded that a third year of new technology add-on payments for DIFICIDTM is needed to allow sufficient data for future MS-DRG recalibration analyses.

With regard to the newness criterion, the manufacturer commented that it believed the technology still meets that criterion for the following reason: § 412.87(b)(2) states that “A medical service or technology may be considered new within 2 or 3 years after the point at which data begin to become available reflecting the International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) code assigned to the new service or technology (depending on when a new code is assigned and data on the new service or technology become available for DRG recalibration). After CMS has recalibrated the DRGs, based on available data, to reflect the costs of an otherwise new medical service or technology, the medical service or technology will no longer be considered `new' under the criterion of this section.” The manufacturer noted that DIFICIDTM was not assigned an ICD-9-CM procedure code and that DIFICIDTM is the first product with no associated inpatient procedure to receive a new technology add-on payment since the implementation of the new technology add-on payment policy.

The manufacturer also cited the FY 2013 IPPS/LTCH PPS final rule (77 FR 53352), which indicated that “Hospitals currently code and report procedures and more invasive services such as surgeries, infusion of drugs, and specialized procedures such as cardiac catheterizations. Hospitals neither code nor report self-administered drugs.” Therefore, the manufacturer contended that, as an oral therapy, neither DIFICIDTM nor its administration was assigned an ICD-9-CM procedure code and, therefore, the technology should still be eligible for the new technology add-on payments.

The manufacturer further noted that, in the FY 2013 IPPS/LTCH PPS final rule, because an ICD-9-CM procedure code for the administration of an oral medication did not exist and hospitals had no other mechanism to report the use of DIFICIDTM, for FY 2013, CMS instructed hospitals to report the DIFICIDTM NDC on hospital inpatient claims to receive the new technology add-on payment for DIFICIDTM. Prior to October 1, 2012, hospitals did not use NDCs on hospital inpatient claims, which prevented CMS from isolating DIFICIDTM cases and their associated costs. The manufacturer further stated that the NDC methodology was a bold change in policy and inpatient billing processes, and it stands to reason that, because of hospitals' unfamiliarity with reporting NDCs on inpatient claims, hospitals' use of the DIFICIDTM NDC would greatly lag behind the traditional use of ICD-9-CM procedure codes. As such, the manufacturer reasoned that any lag in hospital reporting would directly impact CMS' ability to track and analyze the cost data associated with DIFICIDTM cases.

The manufacturer also noted that on August 31, 2012, CMS issued Transmittal 2539, which is a change request for MACs concerning updates for the upcoming fiscal year. The manufacturer stated that because the new technology add-on heading was omitted in the transmittal, this change request did not highlight the NDC billing approach to ensure that hospitals recognized the important change, which may have caused hospitals to overlook the claim reporting instructions for DIFICIDTM.

The manufacturer added that Transmittal 2539 and a Medicare Learning Network® Matters (MLN) article were rescinded and replaced by Transmittal 2627 on January 4, 2013. The manufacturer noted that one of CMS' reasons for replacing the transmittal was to insert the omitted new technology add-on section heading. The manufacturer stated that, although the original transmittal further supports that collection of DIFICIDTM-specific data did not begin until at least October 1, 2012, CMS' reissuance of the claims processing instructions, and the missing header in the initial instructions, effectively delayed implementation of the new technology add-on payments for 3 months past the October 2012 beginning date. The manufacturer also believed that the need to replace the transmittal underlies hospitals' difficulties instituting claims reporting instructions to receive new technology add-on payments for DIFICIDTM at the hospital level.

The manufacturer noted that anecdotal feedback from hospitals, which was shared with CMS during a meeting in June 2013, suggests that some hospitals faced challenges implementing the appropriate billing and coding processes. The manufacturer was concerned that these challenges were, in part, caused by the missing header, and that these challenges may have impacted whether eligible cases were properly billed and coded to receive the new technology add-on payment for DIFICIDTM. The manufacturer was further concerned that the effects of any lag or delay caused by unfamiliarity with reporting NDCs and the missing header would also impact the data available to CMS to recalibrate the MS-DRGs and, separately, to evaluate the impact of the new technology add-on payment for DIFICIDTM. The manufacturer further explained that, while DIFICIDTM was available to hospitals after its launch in July 2011, hospitals had no experience reporting NDCs until October 2012, and may not have recognized the opportunity to do so, or understood the mechanism for doing so, until after January 2013. For the purposes of inpatient data collection and ratesetting, the manufacturer believed that this meant that 2 complete years of DIFICIDTM costs would not be fully reflected in the Medicare claims data for the FY 2015 MS-DRG recalibrations.

The manufacturer also analyzed the 100 percent sample of the Standard Analytical File (SAF) for CY 2012, which contained first quarter claims data for FY 2013, the first 3 months that DIFICIDTM was eligible for the new technology add-on payments. The manufacturer found a total of 43,608 cases with a diagnosis of C. difficile infection (CDI). Of these 43,608 cases, the manufacturer found 38 cases across 26 hospitals that reported new technology add-on payments for DIFICIDTM on submitted claims. The manufacturer stated that these preliminary data suggest that the number of cases available for MS-DRG recalibrations for FY 2015 is limited. The manufacturer stated that it is currently attempting to secure FY 2013 MedPAR claims data and that these data will likely provide further insights on these issues.

In addition, the manufacturer noted that prior new technology add-on payment application approvals have involved technologies with much narrower patient populations compared to DIFICIDTM, allowing the costs of those technologies to influence the MS-DRG relative payment weights for the small number of MS-DRGs with which they are associated. The manufacturer explained that, unlike other technologies approved for new technology add-on payments, which are assigned to a relatively small number of MS-DRGs, DIFICIDTM, while limited to patients with CDAD, is used in patients across a wide range of MS-DRGs because CDAD is reported as a secondary diagnosis in two-thirds of the cases. For example, cases involving the Spiration IBV® Valve System, which was granted approval for new technology add-on payments in FY 2010, primarily mapped to three MS-DRGs: 163 (Major Chest Procedures with MCC); 164 (Major Chest Procedures with CC); and 165 (Major Chest Procedures without CC/MCC). In its analysis of the FY 2012 MedPAR data for the cost criterion, the manufacturer found cases using DIFICIDTM mapped to 544 unique MS-DRGs. Under the 100 percent sample of the SAF for CY 2012, the 38 cases mentioned above mapped to 20 different MS-DRGs. The manufacturer maintained that, because DIFICIDTM cases map diffusely to many MS-DRGs, an extension of the newness period is required for the costs to be adequately reflected in the MS-DRG relative payment weights. In the unique case of DIFICIDTM for the treatment of CDAD, the manufacturer stated that 2 years of new technology add-on payments is insufficient to allow the 544 MS-DRGs to be recalibrated to sufficiently reflect the cost of the use of DIFICIDTM, a treatment that offers significant clinical improvement over existing therapies.

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28032 through 28033), we responded to the comments above. Specifically, with regard to the technology's newness, as discussed in the FY 2005 IPPS final rule (69 FR 49003), the timeframe that a new technology can be eligible to receive new technology add-on payments begins when data become available. Section 412.87(b)(2) clearly states that a medical service or technology may be considered new within 2 or 3 years after the point at which data begin to become available reflecting the ICD-9-CM code assigned to the new service or technology (depending on when a new code is assigned and data on the new service or technology become available for DRG recalibration). Section 412.87(b)(2) also states that after CMS has recalibrated the DRGs, based on available data, to reflect the costs of an otherwise new medical service or technology, the medical service or technology will no longer be considered “new” under the criterion of this section. Therefore, regardless of whether a technology can be individually identified by a separate ICD-9-CM code or whether it can only be identified using a NDC code, if the costs of the technology are included in the charge data, and the MS-DRGs have been recalibrated using that data, then the technology can no longer be considered “new” for the purposes of this provision. We further stated in that final rule that the period of newness does not necessarily start with the approval date for the medical service or technology, and does not necessarily start with the issuance of a distinct code. Instead, it begins with availability of the product on the U.S. market, which is when data become available. We have consistently applied this standard, and believe that it is most consistent with the purpose of new technology add-on payments.

In addition, similar to our discussion in the FY 2006 IPPS final rule (70 FR 47349), we do not believe that case volume is a relevant consideration for making the determination as to whether a product is “new.” Consistent with the statute, a technology no longer qualifies as “new” once it is more than 2 to 3 years old, irrespective of how frequently it has been used in the Medicare population. Similarly, this same determination is applicable no matter how many MS-DRGs the technology is spread across. Therefore, if a product is more than 2 to 3 years old, we consider its costs to be included in the MS-DRG relative weights whether its use in the Medicare population has been frequent or infrequent. We recognize that using an NDC was a novel billing practice under the IPPS. Nevertheless, even though hospitals may not have coded all uses of DIFICIDTM with the NDC, hospital bills would still include charges for all items and services furnished to a Medicare patient, including use of DIFICIDTM. Therefore, even though we may not be able to identify all uses of DIFICIDTM in the Medicare charge data, hospital charges for the MS-DRGs would continue to reflect use of this technology.

With respect to the Transmittal 2539 omitting the header referenced above, as noted above, CMS corrected this issue as soon as possible by rescinding and reissuing this transmittal. Additionally, as noted by the manufacturer, this transmittal was meant for MACs and not hospitals. We believe the guidance issued in Transmittal 2539 clearly described to MACs how hospitals were to report the NDC on the inpatient claim in order to identify cases using DIFICIDTM for purposes of new technology add-on payments. Additionally, the MLN article that the manufacturer referred to above (MLN articles are typically a summary of transmittals for the general public) clearly indicated that DIFICIDTM was new for FY 2013 new technology add-on payments and clearly described how to properly code DIFICIDTM on the inpatient bill in order to receive the new technology add-on payment for FY 2013. The MLN article can be downloaded from the CMS Web site at: http://www.cms.gov/Outreach-and-Education/Medicare-Learning-Network-MLN/MLNMattersArticles/downloads/MM8041.pdf.

After considering the manufacturer's comments above, as we explained in the FY 2015 IPPS/LTCH PPS proposed rule, we continue to consider the beginning of the newness period to commence when DIFICIDTM was first approved by the FDA on May 27, 2011. Because the 3-year anniversary date of the product's entry on the U.S. market occurred in the second half of the fiscal year (after April 1, 2014), we continued new technology add-on payments for DIFICIDTM for FY 2014. However, for FY 2015, the 3-year anniversary date of the product's entry on the U.S. market occurred on May 27, 2014, which is prior to the beginning of FY 2015. Therefore, we proposed to discontinue new technology add-on payments for DIFICIDTM for FY 2015.

In the FY 2015 IPPS/LTCH PPS proposed rule, we invited public comments on this proposal.

Comment: One commenter stated that CMS has the authority to grant a third year of new technology add-on payments for DIFICIDTM. The commenter stated that if Congress intended for the Secretary to begin the data collection period described in the statute based on the date of FDA approval, Congress would have done so. The commenter added that it agrees that, as a threshold matter, a product must be “new.” Specifically, the commenter reasoned that Congress did not intend to make available the new technology add-on payment for technologies that have been approved for years and received a unique code years later. The commenter believed that once a product is deemed “new,” the statute requires that data are to be collected for 2 to 3 years from the date of the ICD-9-CM code assignment. The commenter believed that CMS has the authority to first deem a product new and then collect data two to three years from the date of the inpatient code assignment. The commenter explained that sections 1886(d)(5)(K)(i) and 1886(d)(5)(K)(ii) of the Act mandate two separate legal requirements. The commenter further stated that this policy would mitigate the effect of older technologies that receive ICD-9-CM codes many years after their FDA approval date being eligible for new technology add-on payments. Therefore, the commenter stated that, under this policy, DIFICIDTM is eligible for a third year of new technology add-on payments.

The commenter also quoted the FY 2005 IPPS final rule (69 FR 49002 through 49003) where CMS stated the following: “Using the ICD-9-CM code alone is not an appropriate test of newness because technologies that are new to the market are automatically placed into the closest ICD-9-CM category when they first come on the market, unless the manufacturer requests the assignment of a new ICD-9-CM code because existing codes do not adequately reflect or describe the medical service or device. The services and technologies that have been placed into existing ICD-9-CM codes have been paid for using those descriptors.” The commenter believed that this policy is not relevant to oral drugs because hospitals do not typically code for oral medications. Therefore, the commenter stated that CMS must make a special exception for oral drugs and rely on the statutory authority to measure the length of time for data collection for new technology add-on payments based on the date of the “hospital inpatient code.”

Response: As discussed above, and as we stated in the FY 2005 IPPS final rule (69 FR 49003), the timeframe that a new technology can be eligible to receive new technology add-on payments begins when data become available. We have consistently applied this standard, and believe that it is most consistent with the purpose of new technology add-on payments. We refer readers to the discussion above and the FY 2005 IPPS final rule (69 FR 49002 through 49003) for further details regarding this issue. For these reasons, we disagree with the commenter that DIFICIDTM is eligible for a third year of new technology add-on payments.

With respect to the second comment, while oral drugs are not typically coded by hospitals, we maintain, as we stated in the FY 2005 IPPS final rule, that the services and technologies that have been assigned existing ICD-9-CM codes have been paid for using those descriptors. Although DIFICIDTM did not receive a specific ICD-9-CM code, it can be described or identified through additional ICD-9-CM procedure or diagnosis codes (such as diagnosis code 008.45, Intestinal infection due to Clostridium difficile). Moreover, as we noted above and in the proposed rule, hospital charges would include charges for all items and services furnished to a Medicare beneficiary, including use of DIFICIDTM. Therefore, we disagree with the commenter and continue to believe that DIFICIDTM is no longer new and that no special exception is warranted.

Comment: Several commenters reiterated the arguments made by the manufacturer as explained above and in the proposed rule that DIFICIDTM should be eligible for new technology add-on payments in FY 2015.

Response: After considering these comments, for the reasons stated above and in the proposed rule, we consider the beginning of the newness period to commence when DIFICIDTM was first approved by the FDA on May 27, 2011. The 3-year anniversary date of the product's entry on the U.S. market occurred on May 27, 2014, which is prior to the beginning of FY 2015. Therefore, we are finalizing our proposal to discontinue new technology add-on payments for DIFICIDTM for FY 2015.

c. Zenith® Fenestrated Abdominal Aortic Aneurysm (AAA) Endovascular Graft

Cook® Medical submitted an application for new technology add-on payments for the Zenith® Fenestrated Abdominal Aortic Aneurysm (AAA) Endovascular Graft (Zenith® F. Graft) for FY 2013. The applicant stated that the current treatment for patients who have had an AAA is an endovascular graft. The applicant explained that the Zenith® F. Graft is an implantable device designed to treat patients who have an AAA and who are anatomically unsuitable for treatment with currently approved AAA endovascular grafts because of the length of the infrarenal aortic neck. The applicant noted that, currently, an AAA is treated through an open surgical repair or medical management for those patients not eligible for currently approved AAA endovascular grafts.

With respect to newness, the applicant stated that FDA approval for the use of the Zenith® F. Graft was granted on April 4, 2012. In the FY 2013 IPPS/LTCH PPS final rule (77 FR 53360 through 53365), we stated that because the Zenith® F. Graft was approved by the FDA on April 4, 2012, we believed that the Zenith® F. Graft met the newness criterion as of that date.

After evaluation of the newness, costs, and substantial clinical improvement criteria for new technology add-on payments for the Zenith® F. Graft and consideration of the public comments we received in response to the FY 2013 IPPS/LTCH PPS proposed rule, we approved the Zenith® F. Graft for new technology add-on payments for FY 2013. Cases involving the Zenith® F. Graft that are eligible for new technology add-on payments are identified by ICD-9-CM procedure code 39.78 (Endovascular implantation of branching or fenestrated graft(s) in aorta). In the application, the applicant provided a breakdown of the costs of the Zenith® F. Graft. The total cost of the Zenith® F. Graft utilizing bare metal (renal) alignment stents was $17,264. Of the $17,264 in costs for the Zenith® F. Graft, $921 is for components that are used in a standard Zenith AAA Endovascular Graft procedure. Because the costs for these components are already reflected within the MS-DRGs (and are no longer “new”), in the FY 2013 IPPS/LTCH PPS final rule, we stated that we do not believe it is appropriate to include these costs in our calculation of the maximum cost to determine the maximum add-on payment for the Zenith® F. Graft. Therefore, the total maximum cost for the Zenith® F. Graft is $16,343 ($17,264 − $921). Under § 412.88(a)(2), new technology add-on payments are limited to the lesser of 50 percent of the average cost of the device or 50 percent of the costs in excess of the MS-DRG payment for the case. As a result, the maximum add-on payment for a case involving the Zenith® F. Graft is $8,171.50.

As stated above, the new technology add-on payment regulations provide that “a medical service or technology may be considered new within 2 or 3 years after the point at which data begin to become available reflecting the ICD-9-CM code assigned to the new service or technology” (§ 412.87(b)(2)). With regard to the newness criterion for the Zenith® F. Graft, as stated above, we consider the beginning of the newness period to commence when the Zenith® F. Graft was approved by the FDA on April 4, 2012. Because the 3-year anniversary date of the entry of the Zenith® F. Graft on the U.S. market will occur in the second half of the fiscal year (April 4, 2015), we proposed to continue new technology add-on payments for this technology for FY 2015.

In the FY 2015 IPPS/LTCH PPS proposed rule, we invited public comments on this proposal.

Comment: Several commenters supported the proposal to continue new technology add-on payments for the Zenith® F. Graft for FY 2015.

Response: We appreciate the commenters' support. Because the 3-year anniversary date for Zenith® F. Graft will occur in the latter half of FY 2015 (April 4, 2015), we are finalizing our proposal to continue to make new technology add-on payments for the Zenith® F. Graft for FY 2015.

d. KcentraTM

CSL Behring submitted an application for new technology add-on payments for KcentraTM for FY 2014. KcentraTM is a replacement therapy for fresh frozen plasma (FFP) for patients with an acquired coagulation factor deficiency due to warfarin and who are experiencing a severe bleed. KcentraTM contains the Vitamin K dependent coagulation factors II, VII, IX and X, together known as the prothrombin complex, and antithrombotic proteins C and S. Factor IX is the lead factor for the potency of the preparation. The product is a heat-treated, non-activated, virus filtered and lyophilized plasma protein concentrate made from pooled human plasma. KcentraTM is available as a lyophilized powder that needs to be reconstituted with sterile water prior to administration via intravenous infusion. The product is dosed based on Factor IX units. Concurrent Vitamin K treatment is recommended to maintain blood clotting factor levels once the effects of KcentraTM have diminished.

KcentraTM was approved by the FDA on April 29, 2013. In the FY 2014 IPPS/LTCH PPS final rule, we approved new ICD-9-CM procedure code 00.96 (Infusion of 4-Factor Prothrombin Complex Concentrate), which uniquely identifies KcentraTM.

In the FY 2014 IPPS/LTCH PPS proposed rule (78 FR 27538), we noted that we were concerned that KcentraTM may be substantially similar to FFP and/or Vitamin K therapy. In the FY 2014 IPPS/LTCH PPS final rule, in response to comments submitted by the manufacturer, we stated that we agree that KcentraTM may be used in a patient population that has an acquired coagulation factor deficiency due to warfarin and is experiencing a severe bleed but is ineligible for FFP, particularly IgA-deficient patients and other patient populations that have no other treatment option to resolve severe bleeding in the context of an acquired Vitamin K deficiency. In addition, FFP is limited because it requires special storage conditions, while KcentraTM is stable for up to 36 months at room temperature, thus allowing hospitals that otherwise would not have access to FFP (for example, small rural hospitals, as discussed by the applicant in its comments) to keep a supply of KcentraTM and treat patients who would possibly have no access to FFP. We noted that FFP is considered perishable and can be scarce by nature (due to production and other market limitations), thus making some hospitals unable to store FFP, which limits access to certain patient populations in certain locations. Therefore, we stated that we believe that KcentraTM provides a therapeutic option for a new patient population and is not substantially similar to FFP. Also, we gave credence to the information presented by the manufacturer that KcentraTM provides simple and rapid repletion relative to FFP and reduces the risk of a transfusion reaction relative to FFP because it does not contain ABO antibodies and does not require ABO typing. As a result, we concluded that KcentraTM is not substantially similar to FFP, and that it meets the newness criterion.

After evaluation of the newness, cost, and substantial clinical improvement criteria for new technology add-on payments for KcentraTM and consideration of the public comments we received in response to the FY 2014 IPPS/LTCH PPS proposed rule, we approved KcentraTM for new technology add-on payments for FY 2014 (78 FR 50575 through 50580). Cases involving KcentraTM that are eligible for new technology add-on payments are identified by ICD-9-CM procedure code 00.96. In the application, the applicant estimated that the average Medicare beneficiary would require an average dosage of 2500 International Units (IU). Vials contain 500 IU at a cost of $635 per vial. Therefore, cases of KcentraTM would incur an average cost per case of $3,175 ($635 x 5). Under § 412.88(a)(2), new technology add-on payments are limited to the lesser of 50 percent of the average cost of the technology or 50 percent of the costs in excess of the MS-DRG payment for the case. As a result, the maximum add-on payment for a case of KcentraTM is $1,587.50 for FY 2014.

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50579), we stated that new technology add-on payments for KcentraTM would not be available with respect to discharges for which the hospital received an add-on payment for a blood clotting factor administered to a Medicare beneficiary with hemophilia who is a hospital inpatient. Under section 1886(d)(1)(A)(iii) of the Act, the national adjusted DRG prospective payment rate is “the amount of the payment with respect to the operating costs of inpatient hospital services (as defined in subsection (a)(4) of this section)” for discharges on or after April 1, 1988. Section 1886(a)(4) of the Act excludes from the term “operating costs of inpatient hospital services” the costs with respect to administering blood clotting factors to individuals with hemophilia. The costs of administering a blood clotting factor to a Medicare beneficiary who has hemophilia and is a hospital inpatient are paid separately from the IPPS. (For information on how the blood clotting factor add-on payment is made, we refer readers to Section 20.7.3 of Chapter Three of the Medicare Claims Processing Manual, which can be downloaded from the CMS Web site at: http://cms.gov/Regulations-and-Guidance/Guidance/Manuals/Downloads/clm104c03.pdf.) In addition, we stated that if KcentraTM is approved by the FDA as a blood clotting factor, we believed that it may be eligible for blood clotting factor add-on payments when administered to Medicare beneficiaries with hemophilia. We make an add-on payment for KcentraTM for such discharges in accordance with our policy for payment of a blood clotting factor, and the costs would be excluded from the operating costs of inpatient hospital services as set forth in section 1886(a)(4) of the Act.

Section 1886(d)(5)(K)(i) of the Act requires the Secretary to “establish a mechanism to recognize the costs of new medical services and technologies under the payment system established under this subsection” beginning with discharges on or after October 1, 2001. We believe that it is reasonable to interpret this requirement to mean that the payment mechanism established by the Secretary recognizes only costs for those items that would otherwise be paid based on the prospective payment system (that is, “the payment system established under this subsection”). As noted above, under section 1886(d)(1)(A)(iii) of the Act, the national adjusted DRG prospective payment rate is the amount of payment for the operating costs of inpatient hospital services, as defined in section 1886(a)(4) of the Act, for discharges on or after April 1, 1988. We understand this to mean that a new medical service or technology must be an operating cost of inpatient hospital services paid based on the prospective payment system, and not excluded from such costs, in order to be eligible for the new technology add-on payment. We pointed out that new technology add-on payments are based on the operating costs per case relative to the prospective payment rate as described in § 412.88. Therefore, we believe that new technology add-on payments are appropriate only when the new technology is an operating cost of inpatient hospital services and are not appropriate when the new technology is excluded from such costs.

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50579), we stated that we believe that hospitals may only receive new technology add-on payments for discharges where KcentraTM is an operating cost of inpatient hospital services. In other words, a hospital would not be eligible to receive the new technology add-on payment when it is administering KcentraTM in treating a Medicare beneficiary who has hemophilia. In those instances, KcentraTM is specifically excluded from the operating costs of inpatient hospital services in accordance with section 1886(a)(4) of the Act and paid separately from the IPPS. However, when a hospital administers KcentraTM to a Medicare beneficiary who does not have hemophilia, the hospital would be eligible for a new technology add-on payment because KcentraTM would not be excluded from the operating costs of inpatient hospital services. Therefore, discharges where the hospital receives a blood clotting factor add-on payment are not eligible for a new technology add-on payment for the blood clotting factor. We refer readers to Chapter Three, Section 20.7.3 of the Medicare Claims Processing Manual for a complete discussion on when a blood clotting factor add-on payment is made. The manual can be downloaded from the CMS Web site at: http://www.cms.gov/Regulations-and-Guidance/Guidance/Manuals/Downloads/clm104c03.pdf.

As stated above, the new technology add-on payment regulations provide that a medical service or technology may be considered new within 2 or 3 years after the point at which data begin to become available reflecting the ICD-9-CM code assigned to the new service or technology (§ 412.87(b)(2)). With regard to the newness criterion for KcentraTM, as stated above, we consider the beginning of the newness period to commence when KcentraTM was approved by the FDA on April 29, 2013. Because KcentraTM is still within the 3-year newness period, we proposed to continue new technology add-on payments for this technology for FY 2015.

In the FY 2015 IPPS/LTCH PPS proposed rule, we invited public comments on this proposal.

Comment: Several commenters supported the proposal to continue new technology add-on payments for KcentraTM for FY 2015.

Response: We appreciate the commenters' support. Because the 3-year anniversary date for KcentraTM will occur in the second half of FY 2016 (April 29, 2016), we are finalizing our proposal to continue to make new technology add-on payments for KcentraTM for FY 2015.

e. Argus® II Retinal Prosthesis System

Second Sight Medical Products, Inc. submitted an application for new technology add-on payments for the Argus® II Retinal Prosthesis System (Argus® II System) for FY 2014. The Argus® II System is an active implantable medical device that is intended to provide electrical stimulation of the retina to induce visual perception in patients who are profoundly blind due to retinitis pigmentosa (RP). These patients have bare or no light perception in both eyes. The system employs electrical signals to bypass dead photoreceptor cells and stimulate the overlying neurons according to a real-time video signal that is wirelessly transmitted from an externally worn video camera. The Argus® II implant is intended to be implanted in a single eye, typically the worse-seeing eye; the technology is not currently intended for bilateral implantation. According to the applicant, the surgical implant procedure takes approximately 4 hours and is performed under general anesthesia.

The Argus® II System consists of three primary components: (1) An implant which is an epiretinal prosthesis that is fully implanted on and in the eye (that is, there are no percutaneous leads); (2) external components worn by the user; and (3) a “fitting” system for the clinician that is periodically used to perform diagnostic tests with the system and to custom-program the external unit for use by the patient. We describe these components more fully below.

  • Implant: The retinal prosthesis implant is responsible for receiving information from the external components of the system and electrically stimulating the retina to induce visual perception. The retinal implant consists of: (a) A receiving coil for receiving information and power from the external components of the Argus® II System; (b) electronics to drive stimulation of the electrodes; and (c) an electrode array. The receiving coil and electronics are secured to the outside of the eye using a standard scleral band and sutures, while the electrode array is secured to the surface of the retina inside the eye by a retinal tack. A cable, which passes through the eye wall, connects the electronics to the electrode array. A pericardial graft is placed over the extra-ocular portion on the outside of the eye.
  • External Components: The implant receives power and data commands wirelessly from an external unit of components, which include the Argus II Glasses and Video Processing Unit (VPU). A small lightweight video camera and transmitting coil are mounted on the glasses. The telemetry coils and radio-frequency system are mounted on the temple arm of the glasses for transmitting data from the VPU to the implant. The glasses are connected to the VPU by a cable. This VPU is worn by the patient, typically on a belt or a strap, and is used to process the images from the video camera and convert the images into electrical stimulation commands, which are transmitted wirelessly to the implant.
  • “Fitting System”: To be able to use the Argus® II System, a patient's VPU needs to be custom-programmed. This process, which the applicant called “fitting”, occurs in the hospital/clinic shortly after the implant surgery and then periodically thereafter as needed. The clinician/physician also uses the “Fitting System” to run diagnostic tests (for example, to obtain electrode and impedance waveform measurements or to check the radio-frequency link between the implant and external unit). This “Fitting System” can also be connected to a “Psychophysical Test System” to evaluate patients' performance with the Argus® II System on an ongoing basis.

These three components work together to stimulate the retina and allow a patient to perceive phosphenes (spots of light), which they then need to learn to interpret. While using the Argus® II System, the video camera on the patient-worn glasses captures a video image. The video camera signal is sent to the VPU, which processes the video camera image and transforms it into electrical stimulation patterns. The electrical stimulation data are then sent to a transmitter coil mounted on the glasses. The transmitter coil sends both data and power via radio-frequency (RF) telemetry to the implanted retinal prosthesis. The implant receives the RF commands and delivers stimulation to the retina via an array of electrodes that is secured to the retina with a retinal tack.

In patients with RP, the photoreceptor cells in the retina, which normally transduce incoming light into an electro-chemical signal, have lost most of their function. The stimulation pulses delivered to the retina via the electrode array of the Argus® II System are intended to mimic the function of these degenerated photoreceptor cells. These pulses induce cellular responses in the remaining viable retinal nerve cells; the resulting signals travel through the optic nerve to the visual cortex, where they are perceived as phosphenes (spots of light). Patients learn to interpret the visual patterns produced by these phosphenes.

With respect to the newness criterion, according to the applicant, the FDA designated the Argus® II System a Humanitarian Use Device in May 2009 (HUD designation #09-0216). The applicant submitted a Humanitarian Device Exemption (HDE) application (#H110002) to the FDA in May 2011 to obtain market approval for the Argus® II System. The HDE was referred to the Ophthalmic Devices Panel of the FDA's Medical Devices Advisory Committee for review and recommendation. At the Panel's meeting held on September 28, 2012, the Panel voted 19 to 0 that the probable benefits of the Argus® II System outweigh the risks of the system for the proposed indication for use. The applicant received the HDE approval from the FDA on February 14, 2013. Currently there are no other approved treatments for patients with severe to profound RP. The Argus® II System has an IDE number of G050001 and is a Class III device. In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50580 through 50583), we approved new ICD-9-CM procedure code 14.81 (Implantation of Epiretinal Visual Prosthesis), which uniquely identifies the Argus® II System. The other two codes approved by CMS are for removal, revision, or replacement of the device. More information on these codes can be found on the CMS Web site at: http://cms.gov/Medicare/Coding/ICD9ProviderDiagnosticCodes/ICD-9-CM-C-and-M-Meeting-Materials-Items/2013-03-05-MeetingMaterials.html.

After evaluation of the new technology add-on payment application and consideration of public comments received, we concluded that the Argus® II System met all of the new technology add-on payment policy criteria. Therefore, we approved the Argus® II System for new technology add-on payments in FY 2014 (78 FR 50580 through 50583). Cases involving the Argus® II System that are eligible for new technology add-on payments are identified by ICD-9-CM procedure code 14.81. We note that section 1886(d)(5)(K)(i) of the Act requires that the Secretary establish a mechanism to recognize the costs of new medical services or technologies under the payment system established under that subsection, which establishes the system for paying for the operating costs of inpatient hospital services. The system of payment for capital costs is established under section 1886(g) of the Act, which makes no mention of any add-on payments for a new medical service or technology. Therefore, it is not appropriate to include capital costs in the add-on payments for a new medical service or technology. In the application, the applicant provided a breakdown of the costs of the Argus® II System. The total operating cost of the Argus® II System is $144,057.50. Under § 412.88(a)(2), new technology add-on payments are limited to the lesser of 50 percent of the average cost of the device or 50 percent of the costs in excess of the MS-DRG payment for the case. As a result, the maximum add-on payment for a case involving the Argus® II System for FY 2014 is $72,028.75.

As stated above, the new technology add-on payment regulations provide that a medical service or technology may be considered new within 2 or 3 years after the point at which data begin to become available reflecting the ICD-9-CM code assigned to the new service or technology (§ 412.87(b)(2)). With regard to the newness criterion for the Argus® II System, as stated above, we consider the beginning of the newness period to commence when the Argus® II System was approved by the FDA on February 14, 2013. Because the Argus® II System is still within the 3-year newness period, we proposed to continue new technology add-on payments for this technology for FY 2015.

In the FY 2015 IPPS/LTCH PPS proposed rule, we invited public comments on this proposal.

Comment: Several commenters supported the proposal to continue new technology add-on payments for the Argus® II System for FY 2015. Some commenters noted that, while the Argus® II System received FDA approval on February 14, 2013, it was not available on the U.S. market until December 20, 2013. The commenters explained that as part of this lengthy process, the manufacturer first had to submit a request to the Federal Communications Commission (FCC) for a waiver of section 15.209(a) of the FCC rules to allow the manufacturer to then apply for FCC authorization to utilize this specific RF band. The FCC granted the request for a waiver of the rules on November 30, 2011. After receiving the FCC waiver of section 15.209(a), the manufacturer was required to obtain a Grant of Equipment Authorization to utilize the specific RF band, which the FCC issued on December 20, 2013. Therefore, the commenters stated that the date the Argus® II System first became available for commercial sale in the United States was December 20, 2013.

Response: We appreciate the commenters' input and support. We agree with the commenters that due to the delay described above, the date of newness for the Argus® II System is now December 20, 2013, instead of February 14, 2013. Because the 3-year anniversary date for the Argus® II System will occur in the first half of FY 2017 (December 20, 2016), we are finalizing our proposal to continue to make new technology add-on payments for the Argus® II System for FY 2015.

f. Zilver® PTX® Drug Eluting Peripheral Stent

Cook® Medical submitted an application for new technology add-on payments for the Zilver® PTX® Drug Eluting Peripheral Stent (Zilver® PTX®) for FY 2014. The Zilver® PTX® is intended for use in the treatment of peripheral artery disease (PAD) of the above-the-knee femoropopliteal arteries (superficial femoral arteries). According to the applicant, the stent is percutaneously inserted into the artery(s), usually by accessing the common femoral artery in the groin. The applicant stated that an introducer catheter is inserted over the wire guide and into the target vessel where the lesion will first be treated with an angioplasty balloon to prepare the vessel for stenting. The applicant indicated that the stent is self-expanding, made of nitinol (nickel titanium), and is coated with the drug Paclitaxel. Paclitaxel is a drug approved for use as an anticancer agent and for use with coronary stents to reduce the risk of renarrowing of the coronary arteries after stenting procedures.

The applicant received FDA approval on November 15, 2012, for the Zilver® PTX®. The applicant maintains that the Zilver® PTX® is the first drug-eluting stent used for superficial femoral arteries. The technology is currently described by ICD-9-CM procedure code 00.60 (Insertion of drug-eluting stent(s) of the superficial femoral artery).

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50583 through 50585), after evaluation of the new technology add-on payment application and consideration of the public comments received, we approved the Zilver® PTX® for new technology add-on payments in FY 2014. Cases involving the Zilver® PTX® that are eligible for new technology add-on payments are identified by ICD-9-CM procedure code 00.60. As explained in the FY 2014 IPPS/LTCH PPS final rule, to determine the number of Zilver® PTX® stents per case, instead of using the number of stents per case based on the ICD-9-CM codes, the applicant used an average of 1.9 stents per case based on the Zilver® PTX® Global Registry Clinical Study. The applicant stated in its application that the anticipated cost per stent is approximately $1,795. Therefore, cases of the Zilver® PTX® would incur an average cost per case of $3,410.50 ($1,795 × 1.9). Under § 412.88(a)(2), new technology add-on payments are limited to the lesser of 50 percent of the average cost of the device or 50 percent of the costs in excess of the MS-DRG payment for the case. As a result, the maximum add-on payment for a case of the Zilver® PTX® is $1,705.25 for FY 2014.

As stated above, the new technology add-on payment regulations provide that “a medical service or technology may be considered new within 2 or 3 years after the point at which data begin to become available reflecting the ICD-9-CM code assigned to the new service or technology” (§ 412.87(b)(2)). With regard to the newness criterion for the Zilver® PTX®, as stated above, we consider the beginning of the newness period to commence when the Zilver® PTX® was approved by the FDA on November 15, 2012. Because the Zilver® PTX® is still within the 3-year newness period, we proposed to continue new technology add-on payments for this technology for FY 2015.

In the FY 2015 IPPS/LTCH PPS proposed rule, we invited public comments on this proposal.

Comment: Several commenters supported the proposal to continue new technology add-on payments for the Zilver® PTX® for FY 2015.

Response: We appreciate the commenters' support. Because the 3-year anniversary date for the Zilver® PTX® will occur in the first half of FY 2016 (November 15, 2015), we are finalizing our proposal to continue to make new technology add-on payments for the Zilver® PTX® for FY 2015.

4. FY 2015 Applications for New Technology Add-On Payments

We received seven applications for new technology add-on payments for FY 2015, three of which were resubmitted from FY 2014. However, one applicant withdrew its application prior to the publication of the proposed rule. In addition, the applicant for the Watchman® System withdrew its application prior to the publication of this final rule. In accordance with the regulations under § 412.87(c), applicants for new technology add-on payments must have FDA approval by July 1 prior to the beginning of the fiscal year for which the application is being considered. A discussion of the five remaining applications is presented below.

Comment: One commenter stated that CMS was critical of evidence presented by the applicants to support their claims that the new technology represents a substantial clinical improvement. The commenter explained that CMS finds fault with peer-reviewed literature, registry data, meta-analyses of clinical trials, a lack of long-term outcome data, clinical trial participants younger than Medicare beneficiaries, single-arm studies, noninferiority studies, and weak primary efficacy results. The commenter urged CMS to avoid blanket judgments on what types of evidence are considered adequate and to carefully consider the totality of the circumstances associated with a particular product. The commenter concluded that, given the list of evidence cited by CMS, it would appear that only head-to-head trials are sufficient to show substantial clinical improvement over the standard of care, but noted that, in the case of first-in-class products, such trials are not feasible.

Another commenter shared similar concerns and stated that a study may be designed to measure noninferiority when compared to conventional treatment, but the results of the study may demonstrate superiority in terms of other measures, such as reduced pain, decreased recovery time, or shorter hospitalizations. In addition, the commenter stated that study data that provide information regarding patient outcomes may be more important than whether the study was designed as a superiority trial or a noninferiority trial. The commenter concluded that a policy to require superiority studies, or at least to question noninferiority studies, could have negative consequences, including delaying patient access to innovative treatments and improved care outcomes, curtailing innovation, and discouraging competition. The commenter stated that CMS should give great weight to the totality of the evidence, including noninferiority studies and other methodological approaches, as it considers approval of applications for new technology add-on payments.

Some commenters stated that CMS has a precedent of accepting noninferiority studies to evaluate technologies under the substantial clinical improvement criterion. In particular, these commenters indicated that CMS approved new technology add-on payments for Fidaxomicin in FY 2013 (77 FR 53350 through 53358) and KcentraTM in FY 2014 (78 FR 50575 through 50580), and that the applicants for both of these technologies submitted data from clinical trials demonstrating noninferiority. One commenter stated that CMS' approval of Fidaxomicin for new technology add-on payments establishes a precedent for approving a technology that shows noninferiority for a primary endpoint, in addition to the acceptance of other clinically important secondary analyses, and that this precedent should be used to approve all technologies. Another commenter stated that CMS' approval of KcentraTM for new technology add-on payments is an example of how a technology can use data from randomized controlled trials demonstrating noninferiority to show that the technology represents a substantial clinical improvement.

One commenter stated that noninferiority trials are a well-established and appropriately accepted standard, and that noninferiority designs are the only affordable and ethical option for drug developers researching acute bacterial skin and skin structure infections. The commenter also stated that the primary focus in developing new agents targeted for patients with acute bacterial skin and skin structure infections is not to improve clinical cure rates, but to “enhance the efficiency and cost effectiveness of achieving clinical cures, ease therapeutic administration (and, therefore, improve compliance) and limit avoidable exposure to healthcare acquired infections (which, when they occur, significantly increase costs and create patient safety risks).” The commenter urged CMS to clarify that it has not suggested or proposed to adopt a blanket judgment approach against technologies studied on a noninferiority basis.

Response: We appreciate the commenters' input and support. CMS always considers the totality of the clinical evidence whenever it makes a substantial clinical improvement determination. We agree with the commenters that we approved new technology add-on payments for Fidaxomicin and KcentraTM by determining that both of these technologies not only met the newness and cost criteria for new technology add-on payments, but also represented a substantial clinical improvement in the treatment options available for Medicare beneficiaries. We also appreciate that the commenter reviewed the policies with regard to the substantial clinical improvement criterion that we established in FY 2002 (66 FR 46902) and clarified in FY 2008 (72 FR 47301). We continue to believe, as we did in FY 2008, that it is a reasonable concern that establishing specific data standards may make it more difficult for an applicant to qualify for a new technology add-on payment because such standards cannot account for the various types of new technologies that may become available in the future and the types of requirements that those novel technologies may or may not be able to meet. In other words, we clarify that we did not propose to establish, nor are we establishing, a blanket judgment approach against technologies studied on a noninferiority basis. As we stated in the final rule that appeared in the Federal Register on September 7, 2001 (referred to hereinafter as the Inpatient New Technology Add-on Payment Final Rule), one of the ways to determine if a technology meets the substantial clinical improvement criterion is for the applicant to demonstrate that use of the technology significantly improves clinical outcomes for a patient population as compared with currently available treatments (66 FR 46914). In that rule, we finalized the policy that we would require applicants to submit evidence to demonstrate this. For the purposes of seeking additional payment from Medicare under the IPPS, we believe that it is preferable, when possible, for applicants to submit evidence that demonstrates superiority of the applicant's technology as compared with currently available treatments. We note that this superiority can be derived, extrapolated, or inferred from noninferiority studies in which the results demonstrate a far greater delta than proposed in the power analysis. This belief is based on earlier experiences, which we described in the FY 2002 final rule: “[W]e would point out that various new technologies introduced over the years have been demonstrated to have been less effective than initially thought, or in some cases even potentially harmful. We believe it is in the best interest of Medicare beneficiaries to proceed very carefully with respect to the incentives created to quickly adopt new technology” (66 FR 46913). However, we point out that, in that same rule, we provided two additional ways for an applicant technology to demonstrate substantial clinical improvement: if the device offers a treatment option for a patient population unresponsive to, or ineligible for, currently available treatments; or if the device offers the ability to diagnose a medical condition in a patient population where that medical condition is currently undetectable, or offers the ability to diagnose a medical condition earlier in a patient population than allowed by currently available methods. There must also be evidence that the use of the device to make a diagnosis affects the management of the patient's care. (We refer readers to the Inpatient New Technology Add-on Payment Final Rule (66 FR 46914).) Similarly, for these two additional ways to meet the substantial clinical improvement criterion, we continue to believe that it is appropriate to require that applicants submit evidence that the technology in fact meets the criterion through one of these two ways. We do not require an applicant to meet the criterion in more than one of these ways, but we emphasize that we require evidence to support an applicant's claim. If an applicant chooses to demonstrate that use of its technology significantly improves clinical outcomes, we believe that it is appropriate for CMS to consider all of the evidence presented in determining whether there is sufficient objective clinical evidence to conclude that a new technology meets the substantial clinical improvement criterion.

a. Dalbavancin (Durata Therapeutics, Inc.)

Durata Therapeutics, Inc. submitted an application for new technology add-on payments for FY 2015 for the use of Dalbavancin. Dalbavancin is an intravenous (IV) lipoglycopeptide antibiotic administered as a once-weekly 30-minute infusion via a peripheral line for the treatment of patients with acute bacterial skin and skin structure infections, or ABSSSI. According to the applicant, Dalbavancin's unique pharmacokinetic profile demonstrates rapid bactericidal activity that is potent and sustained against serious gram-positive bacteria, including methicillin-resistant Staphylococcus aureus (MRSA).

With respect to the newness criterion, the applicant stated that Dalbavancin's once-weekly dosing, a simpler regimen than the current standard of care (Vancomycin), which requires daily or multiple-times-daily intravenous dosing, allows for the discontinuation of IV access, with its attendant risks of line-related thrombosis and infection. The applicant submitted a New Drug Application (NDA) on September 26, 2013, and, as stated in the FY 2015 IPPS/LTCH PPS proposed rule, anticipated FDA approval of Dalbavancin sometime in May of 2014. The applicant also applied for a new ICD-10-PCS code to describe the administration of Dalbavancin, which was presented at the March 19-20, 2014 ICD-10 Coordination and Maintenance Committee meeting. To date, no ICD-10-PCS code specifically describes the administration of Dalbavancin. However, if approved, the new ICD-10-PCS code will be effective on October 1, 2014. We also note in section II.G. of the preamble of this final rule that, per section 212 of the PAMA (Pub. L. 113-93), the Secretary announced plans to establish a new compliance date for ICD-10. We also discuss in that section the requests for ICD-10-PCS codes for FY 2015. We refer readers to section II.G. of the preamble of this final rule for a complete discussion of these issues.

In the FY 2015 IPPS/LTCH PPS proposed rule, we invited public comments on whether the technology meets the newness criterion. However, we did not receive any public comments regarding whether the technology meets the newness criterion. After the publication of the FY 2015 IPPS/LTCH PPS proposed rule, we were informed that the applicant received FDA approval for the use of the technology on May 23, 2014. Therefore, for purposes of consideration for FY 2015 IPPS new technology add-on payments, we believe that the technology should be considered “new” as of May 23, 2014, when the technology received FDA approval.

We note that in the FY 2010 IPPS/RY 2010 LTCH PPS final rule (74 FR 43813 through 43814), we established criteria for evaluating whether a new technology is substantially similar to an existing technology, specifically: (1) whether a product uses the same or a similar mechanism of action to achieve a therapeutic outcome; (2) whether a product is assigned to the same or a different MS-DRG; and (3) whether the new use of the technology involves the treatment of the same or similar type of disease and the same or similar patient population. If a technology meets all three of the criteria above, it would be considered substantially similar to an existing technology and would not be considered “new” for purposes of new technology add-on payments.

In evaluating the first criterion, the applicant stated that Dalbavancin's mechanism of action is unique compared to other antibiotics because it involves the interruption of cell wall synthesis, resulting in bacterial cell death. Furthermore, the applicant cited Dalbavancin's long half-life as the factor that differentiates it from existing antibacterial agents active against MRSA. With respect to the second criterion, as we stated in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28036), we believe that cases of ABSSSI that use Dalbavancin or other antibiotics for treatment would be assigned to the same MS-DRGs. Finally, with respect to the third criterion, we believe that Dalbavancin and other antibiotics used to treat cases of ABSSSI treat the same disease and patient population. Based on evaluation of the substantial similarity criteria, we stated in the FY 2015 IPPS/LTCH PPS proposed rule that it appears that Dalbavancin is not substantially similar to other antibiotics for the treatment of ABSSSI because it does not use the same or a similar mechanism of action to achieve a therapeutic outcome.

In the FY 2015 IPPS/LTCH PPS proposed rule, we invited public comments regarding whether Dalbavancin is substantially similar to existing antibiotics and whether Dalbavancin meets the newness criterion. However, we did not receive any public comments discussing whether Dalbavancin is substantially similar to existing antibiotics in the context of the newness criterion. After further evaluation of the new technology add-on payment application, we believe that Dalbavancin is not substantially similar to other antibiotics for the treatment of ABSSSI because it does not use the same or a similar mechanism of action to achieve a therapeutic outcome.

According to the applicant, Dalbavancin is indicated to treat gram-positive ABSSSIs, such as cellulitis or erysipelas, and MRSA. These conditions may be a primary diagnosis, but are often secondary to an underlying condition such as diabetes, heart failure, and pressure ulcers, among others. Therefore, the technology is eligible to be used across all MS-DRGs. To demonstrate that it meets the cost criterion, the applicant searched the FY 2012 MedPAR file (across all MS-DRGs) for cases where at least one ABSSSI ICD-9-CM code was present on the claim, including those where MRSA was present on a claim with an ABSSSI diagnosis. Specifically, the applicant searched for cases with one of the following diagnosis codes: 035 (Erysipelas); 681.00 (Cellulitis and abscess of finger, unspecified); 681.01 (Felon); 681.02 (Onychia and paronychia of finger); 681.10 (Cellulitis and abscess of toe, unspecified); 681.11 (Onychia and paronychia of toe); 681.9 (Cellulitis and abscess of unspecified digit); 682.0-682.9 (Other cellulitis and abscess of face, neck, trunk, upper arm and forearm, hand except fingers and thumb, buttock, leg except foot, foot except toes, specified sites, unspecified sites); 686.00 (Pyoderma, unspecified); 686.01 (Pyoderma gangrenosum); 686.09 (Other pyoderma); 686.1 (Pyogenic granuloma of skin and subcutaneous tissue); 686.8 (Other specified local infections of skin and subcutaneous tissue); 686.9 (Unspecified local infection of skin and subcutaneous tissue); 958.3 (Posttraumatic wound infection not elsewhere classified); 998.51 (Infected postoperative seroma); and 998.59 (Other postoperative infection). The applicant believed that these cases represent potential cases eligible for the administration of Dalbavancin.

The applicant found 570,698 cases across 682 MS-DRGs and noted that almost 25 percent of the total number of cases would map to MS-DRG 603 (Cellulitis without MCC), while the top 10 MS-DRGs accounted for almost half (or 49 percent) of the total number of cases. Of the 682 MS-DRGs, only 90 accounted for 1,000 cases or more. The applicant standardized the charges for all 570,698 cases, which equated to an average case-weighted standardized charge per case of $46,138. We note that the applicant did not inflate the charges, nor did it include charges for Dalbavancin, in the average case-weighted standardized charge per case. The applicant calculated an average case-weighted threshold of $44,255 across all MS-DRGs. The applicant asserted that the average case-weighted standardized charge per case (without inflating the charges or including charges for Dalbavancin) exceeds the average case-weighted threshold of $44,255 (as indicated in Table 10 of the FY 2014 IPPS/LTCH PPS final rule). Therefore, the applicant maintained that Dalbavancin meets the cost criterion.
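As an illustration only (the per-MS-DRG figures below are hypothetical and are not the applicant's data), the following Python sketch shows the kind of case-weighted comparison described above: charges and Table 10 thresholds are weighted by case counts across the MS-DRGs to which potential cases map, and the cost criterion is met when the average case-weighted standardized charge per case exceeds the average case-weighted threshold (for Dalbavancin, $46,138 versus $44,255).

# Sketch of the case-weighted cost criterion comparison described above.
# The per-MS-DRG case counts, standardized charges, and thresholds are hypothetical;
# only the weighting and comparison logic reflects the methodology in the text.
cases_by_ms_drg = {
    # MS-DRG: (case count, average standardized charge per case, Table 10 threshold)
    "603": (140000, 32000.00, 30500.00),
    "602": (60000, 55000.00, 52000.00),
    "871": (45000, 78000.00, 76000.00),
}

total_cases = sum(count for count, _, _ in cases_by_ms_drg.values())
avg_case_weighted_charge = sum(
    count * charge for count, charge, _ in cases_by_ms_drg.values()
) / total_cases
avg_case_weighted_threshold = sum(
    count * threshold for count, _, threshold in cases_by_ms_drg.values()
) / total_cases

# The cost criterion is met when the case-weighted charge exceeds the case-weighted
# threshold (the applicant reported $46,138 versus $44,255 across all MS-DRGs).
print(round(avg_case_weighted_charge, 2), round(avg_case_weighted_threshold, 2))
print(avg_case_weighted_charge > avg_case_weighted_threshold)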

In the FY 2015 IPPS/LTCH PPS proposed rule, we invited public comments regarding whether Dalbavancin meets the cost criterion, particularly with regard to the assumptions and methodology used in the applicant's analysis.

Comment: The applicant submitted a public comment maintaining that Dalbavancin meets the cost criterion requirement because the cost of the target cases exceeds the average case-weighted cost threshold requirement prior to accounting for an inflation factor, or including the costs of Dalbavancin. The applicant further stated that it also included the “costs of Dalbavancin in its analysis to further demonstrate that Dalbavancin exceeds the established NTAP cost threshold.”

Response: We appreciate the applicant's response. We reviewed the applicant's analysis. We note that, while the applicant's analysis included the charges associated with Dalbavancin in its final cost estimate, the applicant did not remove the charges for the current therapy for treating acute bacterial skin and skin structure infections. We agree that the applicant's analysis using data from all 570,698 cases across 682 MS-DRGs showed that the average case-weighted standardized charge per case exceeds the average case-weighted threshold, prior to the inclusion of inflation factors and the charges associated with Dalbavancin.

We note that it is unclear to what degree Dalbavancin would be used in each of these cases across the specific MS-DRGs, in part, because a procedure code has not been established to identify the technology's use in the claims data. Therefore, we reviewed the additional analyses using the claims data submitted by the applicant to substantiate that the technology meets the cost criterion. For example, in the data submitted by the applicant, the top 10 MS-DRGs ranked by case volume constitute roughly half of the cases with at least one ICD-9-CM code associated with acute bacterial skin infections. These 10 MS-DRGs include: MS-DRG 0603 (Cellulitis Without MCC); MS-DRG 0602 (Cellulitis With MCC); MS-DRG 0871 (Septicemia or Severe Sepsis Without MV 96+ Hours With MCC); MS-DRG 0863 (Postoperative & Post-Traumatic Infections Without MCC); MS-DRG 0872 (Septicemia or Severe Sepsis Without MV 96+ Hours Without MCC); MS-DRG 0300 (Peripheral Vascular Disorders With CC); MS-DRG 0292 (Heart Failure & Shock With CC); MS-DRG 0862 (Postoperative & Post-Traumatic Infections With MCC); MS-DRG 0857 (Postoperative or Post-Traumatic Infections With O.R. Procedure With CC); and MS-DRG 0853 (Infectious and Parasitic Diseases With O.R. Procedure With MCC). An average case-weighted threshold and an average case-weighted standardized charge could be calculated using these MS-DRGs and compared to determine whether the standardized charges exceed the average case-weighted threshold for these top 10 MS-DRGs.

In summary, we agree with the applicant that the technology meets the cost criterion.

With regard to substantial clinical improvement, as previously stated by the applicant, Dalbavancin is a new intravenous (IV) lipoglycopeptide antibiotic administered as a once-weekly 30-minute infusion via a peripheral line for the treatment of patients with acute bacterial skin and skin structure infections, or ABSSSI. The applicant noted that, in the setting of the continuing emergence of resistance among gram-positive pathogens worldwide, there is an increasing medical need for new antibacterial agents with enhanced gram-positive activity. The applicant cited the Infectious Diseases Society of America (IDSA), stating the need for a multi-pronged approach to address the impact of antibiotic resistance. In addition, the applicant stated that the FDA has also designated MRSA as a pathogen of special interest, which allows an antibiotic effective against this organism to be designated as a “Qualified Infectious Disease Product,” recognizing the medical need for drugs to treat infections caused by this pathogen. The applicant believed that having a medicinal agent with clinical efficacy against gram-positive pathogens, including MRSA and CA-MRSA, a favorable benefit/risk ratio, and a favorable pharmacokinetics profile allowing convenient dosing in inpatients and outpatients, with the potential for minimizing patient noncompliance, would be a valuable addition to the antibacterial armamentarium for the treatment of ABSSSI. The applicant also noted that patients taking Dalbavancin have no need for oral step-down therapy.

“Bad Bugs, No Drugs,” July 2004.

The applicant suggested that Dalbavancin offers treatment advantages over other available options for therapy for skin infections as a result of the following:

  • Improved potency against key bacterial pathogens with the concentration of Dalbavancin required to kill key target pathogens lower relative to other antibiotics commonly used to treat such pathogens;
  • Retained activity against Staphylococcus aureus resistant to other antibiotics;
  • Improved safety profile, as Dalbavancin exhibits more favorable tolerability and safety than alternative approved antibacterial drugs in areas such as no evidence of the thrombocytopenia seen with linezolid and tedizolid, superior infusion-related tolerability relative to other antibiotics, an absence or reduction of drug-specific toxicities, and once-a-week dosing of IV Dalbavancin that avoids the pitfalls of patient noncompliance with an oral medication;
  • Lack of drug interactions due to metabolic profile which minimizes risk of unexpected adverse events when co-administered with other compounds as seen with linezolid and quinupristin/dalfopristin;
  • Decreased requirement for therapeutic interventions, specifically the need for an intravenous catheter, as Dalbavancin is administered once a week, thus also reducing catheter-related infections;
  • Reduced time to patient defined recovery;
  • Reduced mortality rate as demonstrated in the combined phase of the Discover 1 and Discover 2 clinical trials;
  • The potential for avoidance of admission to the hospital as Dalbavancin allows the utilization of a weekly treatment regimen, thus potentially increasing the convenience of outpatient therapy for patients.

The applicant conducted three Phase 3 randomized, controlled, double-blinded clinical trials. The first was the pivotal VER001-9 study, with a total of 873 patients with ABSSSIs, which compared the safety and efficacy of IV Dalbavancin with possible switch to oral placebo to IV Linezolid with possible switch to oral Linezolid. According to the applicant, the primary efficacy endpoint of clinical response at the test-of-cure visit, 14 days (plus or minus 2 days) after completion of therapy, demonstrated clinical efficacy comparable to Linezolid and met the requirement of statistical demonstration of noninferiority. In the clinically evaluable population, 88.9 percent of patients who received Dalbavancin were clinical successes, compared to 91.2 percent of patients who received Vancomycin/Linezolid. The applicant also noted that Dalbavancin had an improved safety profile compared to Linezolid, as the overall incidence and percentage of adverse events and deaths were lower in the Dalbavancin group, a difference that was statistically significant.

The second and third clinical trials were the Discover 1 and Discover 2 trials, which enrolled a total of 1,312 patients with ABSSSI and compared IV Dalbavancin (given with an IV placebo every 12 hours to match Vancomycin, with a possible switch to oral Vancomycin) to IV Vancomycin (given with an IV placebo to match IV Dalbavancin, with a possible switch to oral Linezolid). The applicant reported that, in both studies, the primary efficacy outcome measure was clinical response at 48 to 72 hours after study drug initiation, and a secondary outcome measure was clinical status at the end-of-treatment visit (day 14) in the Intent to Treat (ITT) and clinically evaluable at end-of-treatment populations. Clinical status was also determined at the short-term follow-up and long-term follow-up visits.

According to the applicant, the Discover 1 trial demonstrated that 83.3 percent of patients in the ITT population who received Dalbavancin were responders at 48 to 72 hours after the start of therapy compared to 81.8 percent of patients who received Vancomycin/Linezolid. The applicant also noted that Dalbavancin was non-inferior to Vancomycin/Linezolid (absolute difference in success rates, 95 percent confidence interval: −4.6 percent to 7.9 percent).

The applicant further noted that the Discover 2 trial showed similar results to the Discover 1 trial. Specifically, the trial demonstrated that 76.8 percent of patients in the ITT population who received Dalbavancin were responders at 48 to 72 hours after the start of therapy compared to 78.3 percent of patients who received Vancomycin/Linezolid. The applicant again noted that Dalbavancin was non-inferior to Vancomycin/Linezolid (absolute difference in success rates, 95 percent confidence interval: −7.4 percent to 4.6 percent).
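For context (this is our illustration, not part of the applicant's submission), a noninferiority conclusion of this kind is typically reached by checking that the lower bound of the confidence interval for the difference in success rates does not fall below a pre-specified noninferiority margin. The 10 percentage-point margin in the Python sketch below is an assumed value used only for illustration, as the trials' pre-specified margins are not stated here.

# Illustrative check of a noninferiority conclusion from a reported confidence
# interval for the difference in success rates (treatment minus comparator).
# The -10 percentage-point margin is an assumed value for illustration only.
def is_noninferior(ci_lower_percent, margin_percent=-10.0):
    # Noninferiority is concluded when the lower CI bound stays above the margin.
    return ci_lower_percent > margin_percent

# Confidence intervals reported above, in percentage points.
trials = {"Discover 1": (-4.6, 7.9), "Discover 2": (-7.4, 4.6)}

for name, (lower, upper) in trials.items():
    print(name, "noninferior at assumed -10 point margin:", is_noninferior(lower))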

The applicant found Dalbavancin to be effective against MRSA and other gram-positive bacteria associated with ABSSSI. The applicant stated that 25 percent of patients in the study were treated without an inpatient admission.

We stated in the FY 2015 IPPS/LTCH PPS proposed rule that we were concerned with the details of the design and the primary efficacy endpoints of the trials that provided the clinical data supplied by the applicant. All of the trials were noninferiority studies, which prevent any determination as to substantial clinical improvement based on the trial data. The primary efficacy endpoint was defined as no increase in lesion size and no fever at 48 to 72 hours after drug initiation. The secondary endpoint was a >20 percent reduction in infection area at defined points in time. Neither endpoint captures the patient-oriented outcome of resolution of the infection. Given these limitations in using the efficacy data to establish substantial clinical improvement, the applicant suggested that outpatient treatment, elimination of central lines, and avoidance of hospitalization all may improve safety, avoid treatment-associated infections, and improve patient satisfaction, and that these factors demonstrate substantial clinical improvement. While the factors mentioned may be true, the applicant did not present any evidence to support its assertions.

We invited public comments on whether Dalbavancin meets the substantial clinical improvement criterion, including public comments in response to our concern that the applicant has only provided efficacy data of noninferiority, and no data for the other suggested benefits.

Comment: Several commenters stated that Dalbavancin meets the substantial clinical improvement criterion and, therefore, CMS should approve the application for new technology add-on payments in FY 2015.

Response: We appreciate the commenters' input. We considered these public comments in our determination of whether this technology represents a substantial clinical improvement in the treatment options currently available to Medicare beneficiaries.

Comment: As previously summarized, some of the commenters stated that CMS has a precedent of accepting noninferiority studies to evaluate technologies under the substantial clinical improvement criterion. In particular, these commenters indicated that CMS approved new technology add-on payments for Fidaxomicin in FY 2013 (77 FR 53350 through 53358) and KcentraTM in FY 2014 (78 FR 50575 through 50580), and that the applicants for both of these technologies submitted data from clinical trials demonstrating noninferiority. One commenter stated that CMS' approval of Fidaxomicin for new technology add-on payments establishes a precedent for approving a technology that shows noninferiority for a primary endpoint, in addition to the acceptance of other clinically important secondary analyses. The commenters believed that this precedent should be used to approve the application for new technology add-on payments for Dalbavancin. Another commenter stated that CMS' approval of KcentraTM for new technology add-on payments is an example of how a technology can use data from randomized controlled trials demonstrating noninferiority to show that the technology represents a substantial clinical improvement.

The applicant also provided additional data from its clinical trials on the degree to which patients who were improving were permitted to stop their treatment after 10 days. The data showed that patients randomized to Dalbavancin were more likely to stop therapy at 10 days, and less likely to continue treatment through 14 days. The applicant stated that by day 10 most patients were being treated on an outpatient basis on oral therapy (either with an oral placebo or oral linezolid), and that treatment was discontinued at the patient's discretion. The applicant further stated that “the implication of this finding is that, from the patient's perspective, resolution of the underlying infection was occurring more rapidly for those randomized to Dalbavancin.”

Response: We refer readers to section II.I.4. of the preamble of this final rule for our detailed response to commenters' concerns regarding noninferiority trials.

We believe that our preliminary assessment (and final determination described later in this section) with regard to Dalbavancin is consistent with prior determinations made with regard to other approved technologies, including the two technologies identified by the commenters, Fidaxomicin and KcentraTM. With regard to Fidaxomicin, we note that we stated that we believed that it represented a treatment option with the potential to decrease utilization, reduce the recurrence of Clostridium difficile-associated disease (CDAD), and improve quality of life. We also note that we considered the information the applicant provided with regard to the endpoints in its clinical trial, which, as the commenters point out, were indeed designed to demonstrate that the effects of administering Fidaxomicin were noninferior to those of administering Vancomycin. (We refer readers to the FY 2013 IPPS/LTCH PPS final rule (77 FR 53357 through 53358).) Similarly, with regard to KcentraTM, we note that we stated that we believed that it provided a rapid beneficial resolution of the patient's blood clotting factor deficiency, decreased the risk of exposure to blood-borne pathogens, and reduced the rate of transfusion-associated complications. These conclusions also were based on information the applicant provided with regard to the endpoints in its clinical trial. (We refer readers to the FY 2014 IPPS/LTCH PPS final rule (78 FR 50578 through 50579).) However, we note that, in their clinical trials, these applicants were able to show a wider margin of difference between the treatment and control groups. The small margin of difference between the groups in this study leads us to conclude that any additional analysis of the trial data would be unlikely to demonstrate superiority of the treatment group.

With regard to the additional data the applicant provided regarding days of therapy, it is our understanding that most patients in both groups were on oral therapy by day 10 and that patients in both groups were allowed to discontinue their therapy at their discretion. The treatment group was more likely to discontinue use of Dalbavancin by day 10. We believe that it is difficult to assess the degree to which this implied that resolution of the underlying infection was occurring more rapidly, or would meet our definition of substantial clinical improvement. However, in light of the data from the applicant's non-inferiority trial, which did not show a wide margin of difference between the treatment and control groups, we do not believe that this is sufficient objective evidence to determine that Dalbavancin is a substantial clinical improvement in the treatment options available for Medicare beneficiaries.

Comment: Many commenters described how they believed that Dalbavancin's administration would improve patient safety and reduce adverse events, improve medication compliance, and reduce potential additional health care utilization.

With regard to patient safety and adverse events, many commenters asserted that using Dalbavancin does not require indwelling IV access, unlike treatments using Vancomycin, and that, therefore, it is self-evident that the potential for catheter-associated infections is eliminated. Some of these commenters emphasized the importance of reducing catheter-associated infections, and noted that Dalbavancin could help achieve this goal.

In addition, with regard to patient safety and adverse events, the applicant provided references discussing the frequency of central venous catheter complications nationally. The applicant also provided data from its pivotal clinical trial showing the number and proportion of patients who died and those with adverse events, including drug-related adverse events and treatment-related serious adverse events. The applicant asserted that the data showed that fewer patients randomized to Dalbavancin died relative to the standard of care: one patient (0.2 percent) treated with Dalbavancin died, while 7 patients (1.1 percent) treated with Vancomycin/Linezolid died. Notably, while these data showed, with a p value of 0.05, that 33 percent of patients treated with Dalbavancin had an adverse event compared to 38 percent of patients treated with Vancomycin or Linezolid, the data also showed that it was difficult to distinguish between the two groups in terms of drug-related adverse events and treatment-related serious adverse events. The data showed that 12 percent of patients treated with Dalbavancin experienced a drug-related adverse event compared to 14 percent of patients treated with Vancomycin/Linezolid, with a p value of 0.45. The data also showed that 0.3 percent of patients treated with Dalbavancin experienced a treatment-related serious adverse event compared to 0.6 percent of patients treated with Vancomycin/Linezolid, with a p value of 0.41. In addition to these data, the applicant also presented data collected in its clinical program that compared the infusion-related adverse events of patients receiving Dalbavancin to those of patients receiving commonly used alternative agents. These data showed that 2.2 percent of patients treated with Dalbavancin experienced an infusion-related adverse event, while 3.1 percent of comparator agent patients experienced an infusion-related adverse event.

One commenter, having reviewed the applicant's clinical trial data, concluded that while the safety profile to date of Dalbavancin appears similar to Vancomycin, the ultimate determination of safety must await broader clinical use. The commenter noted that future clinical trials are needed to define the safety profile of Dalbavancin.

Response: We appreciate commenters' input and the additional data submitted by the applicant.

We disagree with commenters that it is self-evident that the technology eliminates the potential for catheter-associated infections, particularly with respect to indwelling catheters. It is not clear if these patients already would have had indwelling catheters in place, whether for antibiotic administration or other purposes. Therefore, it is not evident that simply having the option of an antibiotic that does not require an indwelling catheter would eliminate the potential for catheter-associated infections. We agree with the commenters that the administration of Dalbavancin could reduce the potential for these infections in patients that otherwise would not have an indwelling catheter, but note that it was not possible to discern the degree to which this potential reduction occurs based on the data and comments provided.

As previously stated, we appreciate the applicant's submission of additional data from its trials regarding safety and adverse events. We agree with the applicant that Dalbavancin appears to be associated with fewer infusion-associated adverse events and patient deaths relative to the comparator group. We note that the applicant's data showed that drug-related and treatment-related serious adverse events appeared to be less frequent for patients treated with Dalbavancin relative to the comparator group, but that it was not clear to what degree the groups actually differed because the p values were in excess of 0.4. We also agree with the commenter that stated that it would appear that more clinical use and data should be gathered to more fully develop Dalbavancin's safety profile.

Comment: Many commenters stated that they believed that Dalbavancin would improve medication compliance and reduce potential additional health care utilization. Some commenters noted that patients diagnosed with acute bacterial skin and skin structure infections are often treated as inpatients. One commenter noted that the rate of these skin and skin structure infections is higher than it has ever historically been. One commenter described these hospitalizations as unnecessary. Another commenter stated that, while Dalbavancin is not more efficacious than Vancomycin, it is easier to administer. The commenter concluded that Dalbavancin would make it possible to treat, on an outpatient basis, patients with complicated skin and skin structure infections that might otherwise require hospitalization, without compromising efficacy and without the need for either laboratory monitoring or an indwelling intravenous catheter. Several commenters noted that less pharmacist monitoring time was required for the administration of Dalbavancin relative to Vancomycin. Several commenters stated that no additional data beyond the pivotal trials are needed to show that a single infusion involves fewer administrations and requires fewer health care resources than a course of therapy that lasts a week or more. One commenter described the importance of medication compliance in the context of treating a patient population that faces socioeconomic hardships. Specifically, the commenter noted that noncompliant patients are more likely to present to the emergency department with worsening infections and that Dalbavancin's dosing profile reduces the risk of noncompliance that is typically associated with oral therapy.

Response: We appreciate the commenters' input. We agree with the commenters that there is the possibility that Dalbavancin could make it possible for certain patients to be treated on an outpatient basis rather than as inpatients of a hospital. We further agree with commenters that there is the potential for treatment benefits for Medicare beneficiaries that would help avoid hospitalizations, including avoiding potential future iatrogenic events. However, we are concerned that neither the applicant nor any of the commenters provided specific information or data regarding the reduced resource use that they believe would occur. It is common for benefits that appear to be “self-evident,” as the commenters suggested, to fail to hold up when subjected to the rigors of a clinical trial.

After consideration of the public comments we received, we do not believe that Dalbavancin meets the substantial clinical improvement criterion to qualify the technology for new technology add-on payments under the IPPS in FY 2015. In particular, we do not believe there is sufficient objective clinical evidence to determine that Dalbavancin significantly improves clinical outcomes for Medicare beneficiaries in order for the technology to qualify for new technology add-on payments. While we recognize that Dalbavancin has met FDA standards for safety and effectiveness, the new technology add-on payment application and approval process requires a demonstration of substantial clinical improvement, which is not inherent in the FDA's regulatory process. We recognize that the technology is the first drug designated as a Qualified Infectious Disease Product (QIDP) to receive FDA approval and was granted QIDP designation because it is an antibacterial or antifungal human drug intended to treat serious or life-threatening infections. We are equally committed to encouraging increased development and approval of new antibacterial drugs, providing physicians and patients with important new treatment options, and we will support this endeavor by providing payment for Dalbavancin through our prospective payment processes. However, in the case of this application, we do not believe that the technology meets the substantial clinical improvement criterion. Therefore, we are not approving new technology add-on payments for Dalbavancin for FY 2015.

b. Heli-FXTM EndoAnchor System (Aptus Endosystems, Inc.)

The Heli-FXTM EndoAnchor System is indicated for use in the treatment of patients whose endovascular grafts, implanted during treatment of aortic aneurysms, have exhibited migration or endoleaks, or in the treatment of patients who are at risk of such complications, and in whom augmented radial fixation and/or sealing is required to regain or maintain adequate aneurysm exclusion.

The Heli-FXTM EndoAnchor System consists of the following three components: (1) The EndoAnchor Implant; (2) the Heli-FXTM Applier; and (3) the Heli-FXTM Guide with Obturator. The Heli-FXTM EndoAnchor System is a mechanical fastening device that is designed to enhance the long-term durability and reduce the risk of repeat interventions in endovascular aneurysm repair (EVAR) and thoracic endovascular aneurysm repair (TEVAR). By deploying a small helical screw (the Heli-FXTM EndoAnchors) to connect the endograft to the aorta, the Heli-FXTM System seeks to provide a permanent seal and fixation, similar to the stability achieved with an open surgical anastomosis.

The original Heli-FXTM EndoAnchor System, designed for treating abdominal aortic aneurysms (AAA), was cleared by the FDA through the “de novo” 510(k) process on November 21, 2011 (reference K102333). The Heli-FXTM Thoracic System, which allows the expanded use of the Heli-FXTM EndoAnchor System technology to the treatment of thoracic aortic aneurysms (TAA), was cleared by the FDA on August 14, 2012 (reference K121168).

The applicant submitted two applications for approval for new technology add-on payment in FY 2015: one for the treatment of AAAs and the other for the treatment of TAAs. We note that, as stated in the Inpatient New Technology Add-on Payment Final Rule (66 FR 46915), two applications are necessary in this instance, because patients that may be eligible for use of the technology under the first indication are not expected to be assigned to the same MS-DRGs as patients receiving treatment using the new technology under the second indication. Specifically, patients who have endovascular grafts implanted for the treatment of AAA map to MS-DRGs 237 (Major Cardiovascular Procedures with MCC) and 238 (Major Cardiovascular Procedures without MCC), while patients who have endovascular grafts implanted for the treatment of TAA map to MS-DRGs 219 (Cardiac Valve and Other Major Cardiothoracic Procedure without Cardiac Catheter with MCC), 220 (Cardiac Valve and Other Major Cardiothoracic Procedure without Cardiac Catheter with CC), and 221 (Cardiac Valve and Other Major Cardiothoracic Procedure without Cardiac Catheter without CC/MCC). Each indication/application must also meet the cost criterion and the substantial clinical improvement criterion in order to be eligible for new technology add-on payments beginning in FY 2015. We discuss both of these applications below.

(1) Heli-FXTM EndoAnchor System for the Treatment of AAA (Heli-FXTM AAA)

As mentioned above, the original Heli-FXTM EndoAnchor System, designed for treating patients diagnosed with AAA, was cleared by the FDA through the “de novo” 510(k) process on November 21, 2011 (reference K102333). According to the applicant, the device became available to Medicare beneficiaries following the product launch at the Society of Vascular Surgery (SVS) Annual Meeting held on June 7-9, 2012. Therefore, the applicant maintained that the Heli-FXTM AAA meets the “newness” criterion because the technology was not available on the U.S. market until June 2012. The applicant explained that the delay in the general market availability of the original Heli-FXTM AAA, following initial FDA clearance, was mainly because of the regulatory uncertainty inherent in the “de novo” 510(k) process. This uncertainty prevented the manufacturer from being able to secure the venture capital funding that was necessary to prepare for commercialization before obtaining market clearance. The ability to secure venture capital through the fundraising process was dependent upon the FDA clearance. According to the applicant, funding to commercially market the technology was not obtained until June 2012. In subsequent discussions with the applicant, the applicant confirmed that the Heli-FXTM AAA was available on the U.S. market as of November 2011. Further, the applicant acknowledged that four implantations were performed on Medicare beneficiaries between November 2011 and June 2012. Therefore, the Heli-FXTM AAA is considered “new” as of November 2011 when the technology was cleared by the FDA and became available on the U.S. market.

Section 412.87(b)(2) of the regulations states that a medical service or technology may be considered new within 2 or 3 years after the point at which data begin to become available reflecting the ICD-9-CM code assigned to the new service or technology. Our past practice has been to begin and end the eligibility for new technology add-on payments on a fiscal year basis. We have generally followed a guideline that uses a 6-month window, before and after the beginning of the fiscal year, to determine whether to still consider a technology “new” and extend approved new technology add-on payments for an additional fiscal year. In general, a technology is still considered “new” (and eligible to receive new technology add-on payments) only if the 3-year anniversary date of the product's entry on the market occurs in the latter half of the fiscal year. (We refer readers to 70 FR 47362.) With regard to the newness criterion for the Heli-FXTM AAA, as stated above, we consider the newness period for the device to have begun when the technology first became available on the U.S. market in November 2011. As previously stated, the applicant acknowledged that four implantations were performed on Medicare beneficiaries between November 2011 and June 2012. Therefore, the costs of the Heli-FXTM AAA are currently reflected in the MS-DRGs, and the 3-year anniversary date under the newness criterion for the product's entry on the U.S. market will occur during November 2014 (the first half of FY 2015). As such, we do not believe that the Heli-FXTM AAA meets the newness criterion.
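For illustration only, the following minimal sketch (in Python) applies one reading of the 6-month window guideline described above. The function name and the example dates are ours, not part of the rule, and the sketch ignores calendar edge cases such as leap days.

    from datetime import date

    def still_new_for_fy(market_entry: date, fy: int) -> bool:
        # Illustrative reading of the 6-month window guideline (70 FR 47362):
        # a technology is still treated as "new" for fiscal year `fy` only if
        # the 3-year anniversary of its U.S. market entry falls on or after
        # April 1 of that fiscal year (the latter half of the fiscal year,
        # which runs from October 1 through September 30).
        anniversary = market_entry.replace(year=market_entry.year + 3)
        return anniversary >= date(fy, 4, 1)

    # Heli-FX AAA: available November 2011, so the anniversary falls in
    # November 2014, the first half of FY 2015 (criterion not met).
    print(still_new_for_fy(date(2011, 11, 21), 2015))   # False
    # Heli-FX TAA: cleared August 14, 2012, so the anniversary falls on
    # August 14, 2015, the latter half of FY 2015 (criterion met).
    print(still_new_for_fy(date(2012, 8, 14), 2015))    # True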

In the FY 2015 IPPS/LTCH PPS proposed rule, we invited public comments on whether the Heli-FXTM AAA meets the newness criterion. We note that the applicant requested an ICD-10-PCS code, and presented comments at the March 2014 ICD-10 Coordination & Maintenance Committee meeting. We also note in section II.G. of the preamble of this final rule that, per section 212 of the PAMA (Pub. L. 113-93), the Secretary announced plans to establish a new compliance date for ICD-10-PCS. We also discuss in that section requests for ICD-10-PCS codes for FY 2015. We refer readers to section II.G. of the preamble of this final rule for a complete discussion of these issues.

Comment: The applicant submitted a public comment in response to the concerns that CMS presented in the FY 2015 IPPS/LTCH PPS proposed rule regarding the newness criterion. The applicant noted that questions raised by CMS centered solely on whether the Heli-FXTM AAA was charged to Medicare prior to the product launch in June 2012. Additionally, the applicant asserted that CMS did not reference the relevance of the April 1 date for purposes of determining whether a technology meets the newness criterion.

Based on CMS' concerns presented in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28039), the applicant conducted another review of the data previously provided to CMS. As previously submitted, there were four cases where the applicant was able to determine that the Heli-FXTM AAA was implanted in Medicare beneficiaries, and where charges were submitted to Medicare, prior to the product launch. These procedures occurred on April 24, 2012, May 7, 2012, May 23, 2012, and June 4, 2012. The applicant stated that because all of these cases were completed after April 1, 2012, it believes that the Heli-FXTM AAA meets the newness criterion for FY 2015.

Response: In a further follow-up discussion to clarify the availability of the Heli-FXTM AAA, the applicant's representatives noted that, although not in large quantities, the Heli-FXTM AAA was available to patients prior to April 1, 2012. We appreciate the information the applicant provided regarding the newness criterion. As we explained in the FY 2015 IPPS/LTCH PPS proposed rule, in general, a new technology is still considered “new” (and eligible to receive new technology add-on payments) only if the 3-year anniversary date of the product's entry on the market occurs in the latter half of the fiscal year. Although the applicant has stated that the initial four implantations were after April 1, 2012, the technology was still available prior to April 1, 2012. Therefore, we continue to consider the newness period for the device to have begun when the technology first became available on the U.S. market in November 2011, which is prior to April 1, 2012. As stated in the FY 2015 IPPS/LTCH PPS proposed rule, the 3-year anniversary date under the newness criterion for the product's entry on the U.S. market will occur during November 2014 (the first half of FY 2015). As such, the Heli-FXTM AAA does not meet the newness criterion and, therefore, is not eligible for new technology add-on payments for FY 2015.

To demonstrate that the technology meets the cost criterion, the applicant researched claims data from the 100 percent sample of the 2012 Inpatient Hospital Standard Analytical File (SAF) for cases reporting either procedure code 39.71 (Endovascular implantation of other graft in abdominal aorta), or procedure code 39.79 (Other endovascular procedures on other vessels) in the first or second procedure position on the claim, in combination with one of the following primary diagnosis codes: 441.4 (Abdominal aneurysm without mention of rupture); 996.1 (Mechanical complication of other vascular device, implant, and graft); or 996.74 (Other complications due to other vascular device, implant, and graft). The applicant believed that this combination of ICD-9-CM codes identifies cases treated for AAA. We note that the 2012 SAF dataset includes all claims submitted from hospitals paid under the IPPS for calendar year 2012.

The applicant focused its analysis on MS-DRGs 237 and 238 because these are the MS-DRGs to which cases treated with the implantation of endovascular grafts for AAAs would most likely map. The applicant found a total of 8,142 cases, and noted that 9.35 percent of the total number of cases would map to MS-DRG 237, and 90.65 percent of the total number of cases would map to MS-DRG 238. The applicant standardized the charges for all 8,142 cases. Using the inflation factor of 1.047329 published in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50982), the applicant inflated the standardized charges by 14.88 percent (the applicant multiplied 1.047329 × 1.047329 × 1.047329 in order to inflate the charges from 2012 to 2015). The applicant then added the charges for the Heli-FXTM AAA to the standardized charges by dividing the cost of the Heli-FXTM AAA device by each individual hospital-specific CCR from the FY 2012 impact file. This equated to an average case-weighted inflated standardized charge per case of $111,613. The applicant noted that the average case-weighted inflated standardized charge per case did not contain additional operating room charges that relate to the Heli-FXTM AAA. Therefore, the applicant determined that it was necessary to add an additional $1,440 for operating room charges, which was based on an additional half hour of operating room time from one hospital, to the average case-weighted standardized charge per case. This resulted in an average case-weighted standardized charge per case of $113,053. The applicant calculated an average case-weighted threshold of $86,278 across both MS-DRGs 237 and 238. The applicant noted that the average case-weighted standardized charge per case, computed without including the additional operating room charges that relate to the Heli-FXTM AAA, exceeded the average case-weighted threshold of $86,278. Therefore, the applicant maintained that the technology meets the cost criterion.
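The arithmetic described above can be illustrated with the following simplified sketch (in Python). The per-case charges, cost-to-charge ratios, and device cost shown are hypothetical placeholders rather than values from the application, and the sketch collapses the case-weighting across MS-DRGs into a simple average.

    # Inflate 2012 standardized charges to 2015 and add the device charges.
    inflation_factor = 1.047329          # annual charge inflation factor (1.047329^3 = 1.1488)
    case_charges = [95_000.0, 102_000.0, 88_000.0]   # standardized charges per case (hypothetical)
    hospital_ccrs = [0.30, 0.28, 0.33]               # hospital-specific cost-to-charge ratios (hypothetical)
    device_cost = 5_000.0                            # device cost per case (hypothetical)
    or_charge_addon = 1_440.0                        # additional operating room charges, per the application

    inflated = [c * inflation_factor ** 3 for c in case_charges]
    # Convert the device cost to charges by dividing by each hospital's CCR.
    with_device = [c + device_cost / ccr for c, ccr in zip(inflated, hospital_ccrs)]
    avg_charge = sum(with_device) / len(with_device) + or_charge_addon

    threshold = 86_278.0                 # case-weighted threshold across MS-DRGs 237 and 238
    print(f"average standardized charge per case: ${avg_charge:,.0f}")
    print("meets cost criterion" if avg_charge > threshold else "does not meet cost criterion")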

The applicant also submitted claims data from the ANCHOR (Aneurysm Treatment Using the Heli-FX Aortic Securement System Global Registry) study to demonstrate that the technology meets the cost criterion. A total of 51 cases were submitted with 11.76 percent of all the cases mapping to MS-DRG 237, and 88.24 percent of all the cases mapping to MS-DRG 238. The applicant standardized the charges for all 51 cases, and determined an average case-weighted standardized charge per case of $128,196. The applicant calculated an average case-weighted threshold of $87,118 across MS-DRGs 237 and 238. Therefore, because the average case-weighted standardized charge per case exceeds the average case-weighted threshold, the applicant maintained that the technology meets the cost criterion.

In the FY 2015 IPPS/LTCH PPS proposed rule, we invited public comments on whether the Heli-FXTM AAA meets the cost criterion, particularly with regard to the assumptions and methodology used in the applicant's analyses.

Comment: Some commenters believed that the high cost of the Heli-FXTM device would deter facilities from using it.

Response: As discussed above, because the Heli-FXTM AAA does not meet the newness criterion, it is not eligible for new technology add-on payments for FY 2015. Therefore, we are not summarizing the details of this comment nor are we responding to the issues presented in this discussion. However, we do address this comment in the later discussion of the Heli-FXTM EndoAnchor System for the Treatment of Thoracic Aortic Aneurysms.

We discuss whether the Heli-FXTM EndoAnchor System (for the treatment of AAA and TAA) represents a substantial clinical improvement over other treatments used for the repair of both abdominal and thoracic aortic aneurysms in one discussion below.

(2) Heli-FXTM EndoAnchor System for the Treatment of Thoracic Aortic Aneurysms (Heli-FXTM TAA)

The Heli-FXTM TAA, which allows the expanded use of the Heli-FXTM EndoAnchor System technology to TAA repair, was cleared by the FDA on August 14, 2012 (reference K121168). The new system consists of a longer delivery device with additional tip configurations to allow the helical EndoAnchor technology to treat TAA. A line extension to the original Heli-FXTM EndoAnchor System, allowing improved treatment of AAA patients with larger aortic neck diameters, was cleared by the FDA on April 12, 2013 (reference K130677).

With regard to the newness criterion for the Heli-FXTM TAA, we consider the newness period for the device to begin when the technology was cleared by the FDA on August 14, 2012. Because the 3-year anniversary date of the product's entry on the U.S. market would occur in the second half of FY 2015 (August 14, 2015), we believe that the Heli-FXTM TAA meets the newness criterion.

In the FY 2015 IPPS/LTCH PPS proposed rule, we invited public comments on whether the Heli-FXTM TAA meets the newness criterion. As noted above, the applicant requested an ICD-10-PCS code, and presented comments at the March 2014 ICD-10 Coordination & Maintenance Committee meeting. We also note in section II.G. of the preamble of this final rule that, per section 212 of the PAMA (Pub. L. 113-93), the Secretary announced plans to establish a new compliance date for ICD-10-PCS. We also discuss in that section requests for ICD-10-PCS codes for FY 2015. We refer readers to section II.G. of the preamble of this final rule for a complete discussion of these issues. We did not receive any public comments on whether the Heli-FXTM TAA meets the newness criterion.

To demonstrate that the Heli-FXTM TAA meets the cost criterion, similar to the analysis performed for the Heli-FXTM AAA, the applicant researched claims data from the 100 percent sample of the 2012 SAF for cases reporting procedure code 39.73 (Endovascular implantation of graft in thoracic aorta) in the first or second procedure position on the claim, in combination with one of the following primary diagnosis codes: 404.93 (Hypertensive heart and chronic kidney disease, unspecified, with heart failure and chronic kidney disease stage V or end-stage renal disease); 441.01 (Dissection of aorta, thoracic); 441.03 (Dissection of aorta, thoracoabdominal); 441.2 (Thoracic aneurysm without mention of rupture); 441.4 (Abdominal aneurysm without mention of rupture); 441.7 (Thoracoabdominal aneurysm, without mention of rupture); 996.1 (Mechanical complication of other vascular device, implant, and graft); or 996.74 (Other complications due to other vascular device, implant, and graft). The applicant believed that this combination of ICD-9-CM codes identifies cases treated for TAA. We note that the 2012 SAF dataset includes all claims submitted from hospitals paid under the IPPS for CY 2012.

The applicant focused its analysis on MS-DRGs 219, 220, and 221 because these are the MS-DRGs to which cases treated with the implantation of endovascular grafts for TAA repair would most likely map. The applicant found a total of 642 cases, and noted that 27.88 percent of the total number of cases would map to MS-DRG 219, 40.50 percent of the total number of cases would map to MS-DRG 220, and 31.62 percent of the total number of cases would map to MS-DRG 221. The applicant standardized the charges for all 642 cases. Using the inflation factor of 1.047329 published in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50982), the applicant inflated the standardized charges by 14.88 percent (the applicant multiplied 1.047329 × 1.047329 × 1.047329 in order to inflate the charges from 2012 to 2015). The applicant then added the charges for the Heli-FXTM TAA to the standardized charges by dividing the cost of the Heli-FXTM TAA by each individual hospital-specific CCR from the FY 2012 impact file. This equated to an average case-weighted inflated standardized charge per case of $156,625. The applicant noted that the average case-weighted inflated standardized charge per case did not contain additional operating room charges related to the use of this technology. Therefore, the applicant determined that it was necessary to add an additional $2,160 for operating room charges, which was based on an additional 45 minutes of operating room time from one hospital, to the average case-weighted standardized charge per case. This resulted in an average case-weighted standardized charge per case of $158,785. The applicant calculated an average case-weighted threshold of $141,194 across MS-DRGs 219, 220, and 221. The applicant noted that the average case-weighted standardized charge per case, without including charges for additional operating room time, exceeded the average case-weighted threshold of $141,194. Therefore, the applicant maintained that the technology meets the cost criterion.

In the FY 2015 IPPS/LTCH PPS proposed rule, we invited public comments on whether the Heli-FXTM TAA meets the cost criterion, particularly with regard to the assumptions and methodology used in the applicant's analysis.

Comment: Some commenters stated that the high cost of the Heli-FXTM device would deter facilities from using it. Therefore, the commenters supported the approval of the Heli-FXTM TAA for new technology add-on payment in order to assist with cost coverage so that more facilities would be willing to use the device in the treatment of their patients.

Response: We appreciate the commenters' input and support. We agree with the commenters that the Heli-FXTM TAA meets the cost criterion.

(3) Evaluation of the Substantial Clinical Improvement Criterion for the Heli-FXTM EndoAnchor System for the Treatment of Abdominal and Thoracic Aortic Aneurysms

The applicant stated that the Heli-FXTM EndoAnchor System represents a substantial clinical improvement for the following reasons: the technology improves overall rates of aneurysm exclusion and long-term success after EVAR by increasing the integrity and long-term durability of the proximal seal and fixation; the technology reduces the risk and rate of secondary interventions and readmissions due to aneurysm-related complications (for example, endoleaks, migration, aneurysm enlargement) caused by failure of the proximal seal; the technology improves the general applicability of EVAR to patients with a broader spectrum of aortoiliac anatomy, including those with hostile proximal neck anatomy; and the technology reduces the rigor of life-long imaging follow-up for EVAR patients by reducing the rate of late failure and increasing the post-EVAR rates of aneurysm sac regression due to complete, endoleak-free durable aneurysm exclusion.

While current devices and capabilities are greatly improved over the first generation of devices, the applicant noted that EVAR treatments using the first generation of devices have not proven to be as durable, anatomically applicable, or complication-free as open surgery. Several critical and life-threatening limitations continue to require improvement to these devices and procedures, including the need to reduce serious early and late device- and procedure-related complications, such as loss of stability, integrity, and robustness at the proximal aortic landing zone, and to offer an alternative method of EVAR to a broader segment of the patient population.

Abbruzzese, T.A., Kwolek, C.J., Brewster, D.C., et al, “Outcomes following endovascular abdominal aortic aneurysm repair (EVAR): An anatomic and device-specific analysis,” Journal of Vascular Surgery, 2008, Vol. 48, pp. 19-28.

Dangas, G., O'Connor, D., Firwana, B., et al, “Open Versus Endovascular Stent Graft Repair of Abdominal Aortic Aneurysms: A Meta-Analysis of Randomized Trials,” JACC, 2012, Vol. 5 (10), pp. 1072-1080.

De Bruin, J.L., Baas, A.F., Buth, J., et al, “Long-Term Outcome of Open or Endovascular Repair of Abdominal Aortic Aneurysm,” New England Journal of Medicine, May 2010, Vol. 362(20), pp. 1881-1889.

Greenhalgh, R.M., Brown, L.C., Powell, J.T., et al, “Endovascular versus open repair of abdominal aortic aneurysm,” New England Journal of Medicine, May 2010, Vol. 362(20), pp. 1863-1871.

The applicant provided literature, analyses of data from the “STAPLE-2” clinical trial and the ANCHOR Registry, and a meta-analysis of EVAR trials to demonstrate that the Heli-FXTM EndoAnchor System represents a substantial clinical improvement over currently available treatments. We summarize the information provided by the applicant that supports the clinically beneficial results of using the Heli-FXTM EndoAnchor System.

The “STAPLE-2” clinical trial enrolled 155 patients at 25 U.S. centers between September 2007 and January 2009. Clinical (and imaging) data are available for 147, 139, and 125 patients at 1-year, 2-year, and 3-year follow-up, respectively, representing the complete data sets at these time points. Patients enrolled in the clinical trial and observed under the study will continue to be followed per protocol for 5 years following aneurysm repair. According to the applicant, the results of the trial and study demonstrate that the Heli-FXTM EndoAnchor System is associated with an extremely low rate of proximal neck-related issues in long-term follow-up. The applicant maintained that this results in improved outcomes for aortic aneurysm patients and a reduced rate of re-interventions, which are associated with hospital admissions, procedural risks, and reversions to increased follow-up frequency requiring more physician visits and radiographic imaging studies.

The data used for this analysis were extracted from the clinical database on February 1, 2013, and are identical to those used to generate the most recent Annual Progress Report (APR) submitted to the FDA, as required under the U.S. IDE regulations.

While the “STAPLE-2” clinical trial was conducted exclusively with the Aptus AAA endograft (which remains investigational), the applicant believed that the use of the Heli-FXTM EndoAnchor System-related data is applicable to the use of the anchor with the compatible Cook, Gore, and Medtronic manufactured endografts in treatment anatomies for AAA and TAA cases.

Through 3-year follow-up, the applicant noted that there have been no anchor fractures as observed by the core lab. Further, there have been no relative migrations of the Heli-FXTM EndoAnchor System as compared to other endografts reported by the core laboratory.

In the analysis of the “STAPLE-2” clinical trial data at 1-year follow-up, the applicant noted that the core lab observed no proximal migrations, and a single case of Type I endoleak. A single secondary intervention was required to address the Type I endoleak in a patient with a circumferentially incomplete proximal neck within the 1-year follow-up period.

The applicant further noted that no additional Type I endoleaks have been observed beyond the 1-year follow-up in any patient enrolled in the trial. In addition, there were no reported instances of aneurysm rupture, vessel perforation, vessel dissection, catheter embolization, enteric fistula, infection, Type III endoleak, conversion, allergic reactions, renal emboli, or patient death associated with the use of the Heli-FXTM EndoAnchor System. Further, there have been no reports of bleeding or hematoma at the EndoAnchor penetration locations in the aortic neck.

Beyond the 1-year follow-up, three patients have demonstrated proximal migrations less than 1 cm. None of these cases were associated with Type I endoleaks or aneurysm sac expansions.

The applicant then compared migration and Type I endoleak data from the “STAPLE-2” clinical trial to analogous data from five compatible AAA endografts that were not anchored (data taken from published SSE data obtained from the FDA's Web site). One year of data was compared because this timeframe is what is reported in a standard fashion from IDE trials of endografts. The applicant noted that the Heli-FXTM EndoAnchor System data compares favorably against the data obtained in U.S. pivotal trials of devices that did not employ discrete independent fixation means, particularly when viewed in light of the shorter average neck lengths treated in the “STAPLE-2” clinical trial versus those involving the Cook, Gore, and Medtronic manufactured endografts. According to the applicant, the number of proximal migrations was low across devices as reported in the SSE data, and an analysis using the Fisher's exact method demonstrated no statistically significant differences when compared to the anchored endografts used in the “STAPLE-2” clinical trial (all p=NS). The incidence of Type I endoleaks and the need for secondary interventions to address them were significantly lower for the Heli-FXTM EndoAnchor System endografts analyzed under the “STAPLE-2” clinical trial versus the Medtronic AneuRx and Talent endografts (p=0.026 versus AneuRx and p=0.015 versus Talent). The applicant stated that the applicability of post-hoc statistical analyses is limited. However, the applicant believed that because the data being compared under the analyses were collected through similar protocols and with the same endpoint definitions, post-hoc comparisons were appropriate. The applicant further believed that the comparison of this data demonstrates that the Heli-FXTM EndoAnchor System is associated with very low rates of Type I endoleaks and migrations.
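The type of post-hoc comparison described above can be sketched as follows (in Python, using Fisher's exact test from SciPy). The 2x2 counts shown are hypothetical placeholders and are not data from the “STAPLE-2” trial or the SSE reports.

    from scipy.stats import fisher_exact

    # Rows: anchored endograft cohort vs. comparator endograft cohort.
    # Columns: patients with a Type I endoleak vs. patients without.
    table = [[1, 146],    # hypothetical: 1 endoleak among 147 anchored patients
             [8, 142]]    # hypothetical: 8 endoleaks among 150 comparator patients

    odds_ratio, p_value = fisher_exact(table)
    print(f"odds ratio = {odds_ratio:.3f}, p = {p_value:.3f}")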

The applicant also provided data from the ANCHOR Registry, which is a post-market, prospective, observational, multi-center, international, dual-arm study designed to capture real-world data on the usage patterns and clinical results associated with the use of the Heli-FXTM EndoAnchor System as a method of treatment for patients in need of EVAR. The applicant explained that the ANCHOR Registry represents a growing body of data on the application of the Heli-FXTM EndoAnchor System used as a method of endovascular aortic aneurysm repair. The applicant noted that to its knowledge, the anatomical challenges present in the registry are greater than those in any large scale published series. The applicant further noted that, although long-term results are limited, the acute results demonstrate a high level of device safety, technical feasibility and acute success in a patient population with few viable options.

Primary safety for the ANCHOR Registry is being measured as a composite of freedom from device- or procedure-related serious adverse events through 1-year follow-up following the Heli-FXTM EndoAnchor System implantation. Primary effectiveness is being measured as a composite of acute technical success and freedom from Type Ia endoleaks and endograft migrations through 1-year follow-up. Inclusion and exclusion criteria are minimal, essentially following the instructions for use (IFU) requirements. Patients are being followed in the registry by their physician's standard of care for 5 years.

Enrollment in the ANCHOR Registry began in March 2012. Through August 2013, a total of 258 patients were enrolled at 40 participating centers (29 located in the United States and 11 located in the European Union), and data are available in the registry's database. Of these, 195 patients (76 percent) were enrolled in the primary arm, having the Heli-FXTM EndoAnchor System implanted at the time of their initial aneurysm treatment, either as a prophylactic measure or to address an acute leak seen on completion arteriography. The remaining patients (63 patients, or 24 percent) were enrolled in the revision arm, having the Heli-FXTM EndoAnchor Systems implanted at a secondary procedure to arrest migration or address endoleaks discovered on follow-up in previously implanted endografts.

The applicant noted that physicians are choosing to apply the Heli-FXTM EndoAnchor System in a subset of patients that are at a higher risk for proximal neck-related complications during follow-up. The large average sac diameter in the revision arm suggested that these patients' initial treatments were unsuccessful and, as such, they have experienced continued sac expansion post-EVAR. These patients also represent a high-risk subset of patients.

Acute results are measured in terms of technical success. In the primary arm, 193 of 194 procedures were successful, and in the revision arm, 57 of 63 procedures were successful. All technical failures were persistence of Type Ia endoleaks. There has been a single re-intervention at 69 days post-EndoAnchor implantation for a persistent Type Ia endoleak in one patient in the revision arm, in which the Heli-FXTM EndoAnchor System combined with a proximal cuff was unable to completely resolve the endoleak. There have been no device-related serious adverse events.

As mentioned above, because the “STAPLE-1” and “STAPLE-2” clinical trials were single-arm studies, no data are available from them to assess the impact of the Heli-FXTM EndoAnchor System on endograft performance. To make this assessment, a meta-analysis was conducted. The meta-analysis combined long-term AAA endograft performance measures from endografts marketed in the United States and compared these measures to those from long-term follow-up in the “STAPLE-2” trial.

Deaton, D.H., Mehta, M., Kasirajan, K., et al, “The Phase I Multi-center Trial (Staple-1) of the Aptus Endovascular Repair System: Results at 6 Months and 1 Year,” Journal of Vascular Surgery, 2009, Vol. 49, pp. 851-857 (discussion on pp. 857-858).

According to the applicant, the key findings from the meta-analysis are as follows:

  • Heli-FXTM EndoAnchors reduced the proportion of treated aneurysms with enlargement greater than 5 mm at 3 years from 12.7 percent to 3.9 percent (p=.002).
  • Heli-FXTM EndoAnchor System reduced the proportion of leaks requiring treatment at 3 years from 12 percent to 1.3 percent (p<.001).
  • Heli-FXTM EndoAnchor System reduced (all-cause) mortality at 3 years from 18.8 percent to 8.4 percent (p=.002). However, this does not appear to have been totally mediated by AAA-related mortality, which was reduced by the Heli-FXTM EndoAnchor System from 2.5 percent to 0.7 percent at 3 years (but was not statistically significant, p=.372).

According to the applicant, in general, patients in the ANCHOR Registry were similar to the patients in the AAA endograft studies. The applicant noted that the results of the analysis using the Fisher's Exact Tests were consistent between the All-Studies' comparisons and the IDE-Studies' comparisons: All-Cause Mortality, Leaks requiring Treatment, and Enlargement were all significantly lower at 3 years in the endografts implanted with the Heli-FXTM EndoAnchor System than in standard endografts.

The applicant asserted that the meta-analysis shows that there is objective evidence that the Heli-FXTM EndoAnchor System effectively reduces well-documented problems with endografts. By providing the endograft with better apposition to the native artery, the applicant noted that the Heli-FXTM EndoAnchor System reduces the rates of enlargement and endoleaks requiring treatment. The applicant further noted that these results were consistent in the All-Studies' and IDE Studies' meta-analyses. The applicant believed that lower rates of leaks requiring intervention would save payers money over the long term.

The applicant observed that, while there was no significant improvement in the rate of ruptures with the Heli-FXTM EndoAnchor System, this may be because leaks were treated, thereby preventing any ruptures. The applicant believed that the higher rate of treated endoleaks in endografts implanted without the Heli-FXTM EndoAnchor System supports this hypothesis. Also, migration did not appear to be significantly reduced by the Heli-FXTM EndoAnchor System (3.5 percent at 3 years in both groups; p=1.0).

Finally, the applicant concluded that, overall, the lower complication rates seen with the Heli-FXTM EndoAnchor System in the meta-analysis provide evidence of the clinical benefits and likely economic benefits associated with the use of the Heli-FXTM EndoAnchor System. The applicant believed that the technology may be especially helpful in patients with difficult anatomy, and that it may be reasonable to consider using the Heli-FXTM EndoAnchor System prophylactically in the treatment of all such patients.

In addition to the formal study data from the “STAPLE-2” trial, the Global ANCHOR Registry, and the meta-analysis based on these, the applicant provided published peer-reviewed literature that represents an early stage of scientific data dissemination outside of the company-sponsored clinical studies, which is commensurate with the recent market approvals of the Heli-FXTM EndoAnchor System technology. The applicant believed that these data demonstrate strong initial physician enthusiasm and resulting favorable clinical results in their experience to date. The applicant noted that the general body of scientific literature is considered meaningful and growing for this early stage of market introduction. However, the applicant asserted that the literature supports the study and meta-analysis data above, documenting that improved clinical outcomes were observed, including outcomes in a broader range of patients that are often ineligible for, or at greatest risk with, EVAR.

In the FY 2015 IPPS/LTCH PPS proposed rule, we stated that we are concerned that the three sources of data that the applicant submitted to support its application (the “STAPLE-2” clinical trial, the ANCHOR Registry, and the literature review) are not high-quality evidence. The “STAPLE-2” study was a single-arm study and only used one endograft, the registry is an observational study, and the literature review does not provide clinical data. Also, the meta-analysis of all the submitted data is only as good as the data used. While the clinical data submitted suggest that some outcomes, such as EVAR failure, are improved, we stated that we are concerned that there is not enough clinical evidence to support the substantial clinical improvement criterion.

We invited public comments on whether the submitted data demonstrate that the Heli-FXTM EndoAnchor System represents a substantial clinical improvement in the treatment of Medicare beneficiaries, particularly in regard to the concerns we identified.

Comment: Several commenters stated that the Heli-FXTM System meets the substantial clinical improvement criterion and, therefore, CMS should approve the Heli-FXTM System for new technology add-on payments in FY 2015.

Response: We appreciate the commenters' support. We considered these comments in our determination of whether the Heli-FXTM System represents a substantial clinical improvement in the treatment options available to Medicare beneficiaries.

Comment: The applicant commented in response to CMS' concerns presented in the FY 2015 IPPS/LTCH PPS proposed rule regarding the lack of sufficient high-quality evidence to support the substantial clinical improvement criterion, because the three sources of data submitted by the applicant were not considered to be high-quality evidence. Specifically, CMS stated that it believed that the meta-analysis of submitted data is only as good as the data used, that the STAPLE-2 Pivotal FDA Study was a single-arm study that used only one Endograft, that the ANCHOR Registry is an observational study, and that the literature review does not provide clinical data. The applicant first outlined some basic background information about the EVAR regulatory process.

With respect to the concerns regarding the meta-analysis of submitted data being only as good as the data used, the applicant asserted that it has not attempted to substantiate the finding of substantial clinical improvement through a single source of information. The applicant believed that the entirety of the evidence demonstrated that this criterion was met as stated in its application. Specifically, the applicant stated that the Heli-FXTM EndoAnchor System offers a treatment option for a patient population unresponsive to, or ineligible for, currently available treatments, including the primary cases with hostile necks and complex revisions (refer to the ANCHOR Registry data demonstrating 90.2 percent hostile necks in the population). The technology has shown significantly improved clinical outcomes for the short proximal aortic neck patient population when compared to currently available treatments (refer to the STAPLE-2 average neck length of 22.1 mm, shorter than in any conventional Endograft IDE study), and has been shown to reduce aneurysm-related mortality (refer to the meta-analysis results). The applicant further stated that the Heli-FXTM has also been shown to reduce proximal neck-related device complications and subsequent therapeutic interventions (refer to STAPLE-2, in which no late Type I endoleaks occurred and no proximal neck-related revisions were required), and to produce previously unseen aneurysm sac regression (refer to STAPLE-2, which showed the highest reported rate, 81.7 percent at 3 years), indicating more rapid resolution of the disease process. Based on all of the above information, the applicant stated that it believes that the Heli-FXTM EndoAnchor System has met this evidentiary threshold for the substantial clinical improvement criterion.

The applicant also addressed CMS' concerns about the quality of evidence that the Aptus single-arm STAPLE-2 study may provide, specifically, that the STAPLE-2 Pivotal FDA Study was a single-arm study and only used one Endograft. According to the applicant, the STAPLE-2 Study was a two-arm study of patients treated with the Aptus Stent Graft in conjunction with the EndoAnchors versus an historical open surgical control (SVS Lifeline database). The applicant stated that this kind of trial design is typical for U.S. pre-market IDE EVAR Studies with current Endovascular stent grafts. According to the applicant, many of the recently approved endografts in the United States used a similar study design, and the FDA has no requirement for a concurrent surgical control. The applicant noted that in no case was randomization or blinding utilized in the device regulatory approval processes for recent endografts.

The applicant also addressed CMS' concern that the STAPLE-2 Study utilized a single type of Endograft. According to the applicant, while the STAPLE-2 Study utilized a single type of Endograft, this may provide a uniquely compelling indication of substantial clinical improvement based on two aspects relating to STAPLE-2. While the Endograft was an entirely conventional design utilizing Polyester fabric supported by a Nitinol stent structure with infrarenal fixation and an unsupported main body (eliminating any contribution of columnar strength to aid in fixation), the applicant stated that this Endograft has no other means of fixation beyond the Aptus EndoAnchors. Despite this, the applicant stated that results indicated highly favorable proximal seal-related outcomes in this most challenging proximal neck anatomy patient population. In this cohort, the proximal necks in STAPLE-2 patients had the shortest average length of any cohort treated with a conventional (non-Fenestrated) Endograft evaluated in a U.S. PMA trial to date. The applicant further stated that, unlike other endografts, such as the Medtronic Endurant or the Gore Excluder, which are currently being utilized with Heli-FXTM both in the ANCHOR trial and commercially worldwide, the graft studied in STAPLE-2 has no inherent fixation, active or otherwise. The applicant explained that this is because there are no integral hooks, barbs, supra-renal fixation, “anatomical fixation,” “anchor pins,” or other means to secure the Aptus Endograft beyond the fixation provided by the Heli-FXTM EndoAnchors. In effect, because the Heli-FXTM is the only source of fixation for the graft studied, the applicant stated that it represents a “worst case” and a significant performance challenge for the clinical effectiveness of the Heli-FXTM EndoAnchors. Despite this worst-case aspect of no inherent fixation in the STAPLE-2 Endograft other than Heli-FXTM EndoAnchors for Endograft fixation and sealing to the aortic wall, the applicant reported that there were excellent clinical and technical results with respect to proximal neck seal and fixation. This was observed despite the very short proximal necks treated in the study cohort. The applicant noted that the aneurysm size regression is also among the most rapid and most frequent seen in any Endograft U.S. IDE study. The applicant stated that in the setting of an Endograft with no means of fixation beyond the Heli-FXTM EndoAnchors, this is especially meaningful and indicative of the EndoAnchor capabilities with more advanced, current-generation commercial Endografts.

With respect to CMS' concern that the ANCHOR Registry is an observational study, the applicant believed that the ANCHOR Registry provides important, highly valuable, and meaningful evidence in support of the substantial clinical improvement criterion. The applicant stated that the ANCHOR Registry is a formal, Institutional Review Board (IRB) and Ethics Committee (EC) approved Post-Market Study that utilizes a Core Lab and a Safety Medical Reviewer for aneurysm-related outcomes, anatomical adjudication for all patients at each follow-up time-point, as well as clinical outcomes acutely and in follow-up. The applicant further noted that the use of a Core Lab and a Safety Medical Reviewer in the setting of EVAR for both baseline and outcome data and the associated aneurysm anatomical aspects is extremely rare and, therefore, so far only the ANCHOR Registry has utilized this approach within the known EVAR Registries. The applicant stated that this optimizes the scientific rigor and robustness of this real-world study. The applicant further noted that there are currently 417 patients enrolled (there were 258 patients at the time of the application), with core lab analysis available for 311 subjects, and the data have continued to be highly favorable in what is now among the most hostile proximal neck populations studied in any Endograft series seen in the scientific literature. The applicant asserted that a key and applicable aspect where Heli-FXTM is having significant patient impact (including as seen in the patients' challenging proximal neck anatomy in the STAPLE-2 and ANCHOR cohorts) is offering a treatment option for a patient population ineligible for currently available treatments. While the applicant acknowledged the important and favorable aneurysm exclusion results and expanded patient applicability provided by the recently FDA-approved Cook Zenith Fenestrated Endograft system, which expanded proximal neck capabilities as low as 4 mm in length, there are situations affecting patients that limit access to this advanced Endograft technology. The applicant believed that these higher-risk situations often require physicians to utilize Heli-FXTM EndoAnchors with conventional Endografts in sub-optimal proximal neck anatomy. The applicant asserted that this is especially applicable in patients deemed unsuitable for open surgical repair.

With respect to CMS' concern that the literature review did not provide clinical data, the applicant acknowledged that the non-STAPLE-2 and non-ANCHOR related Heli-FXTM peer-reviewed scientific literature does not constitute formal clinical data in itself, but the applicant nonetheless believed that the information provided in the manuscripts highlights the applicability and utility of the Heli-FXTM in various settings, including primary and revision cases, in both AAA and TAA.

Response: We appreciate the applicant's response to our concerns presented in the proposed rule. While we recognize that the Heli-FXTM EndoAnchor System has received regulatory approval for marketing, thereby meeting FDA standards for safety and effectiveness, the new technology add-on payment process requires a demonstration of substantial clinical improvement, which is not inherent in the FDA's regulatory process. As previously stated, we believe that data used to support substantial clinical improvement should come from high-quality evidence, for example, well-designed studies that compare the new technology to the other similar services that the applicant contends will be replaced by the new technology. We did not suggest that the comparator should have been an open surgical procedure. The substantial clinical improvement criterion requires that technologies demonstrate substantial clinical improvement over existing technologies. In this case, we would have liked to have seen a randomized trial comparing the use of Heli-FXTM anchors with other fixation methods, such as hooks, barbs, supra-renal fixation, anatomical fixation, or anchor pins, using the same brands of endografts. Those data, if positive, would have been sufficient to demonstrate substantial clinical improvement over existing technologies.

Further, we believe that the alternatives just mentioned (hooks, barbs, supra-renal fixation, anatomical fixation, or anchor pins) are alternatives to the Heli-FXTM System, and the data submitted do not support the assertion that patients have no other alternatives. Therefore, based on the reasoning above, we do not believe that the Heli-FXTM System meets the substantial clinical improvement criterion.

After consideration of the public comments we received, and as discussed above, we conclude that the Heli-FXTM AAA does not meet the newness criterion and, therefore, the technology is not eligible for new technology add-on payments for FY 2015. The Heli-FXTM TAA meets the newness and cost criteria. However, as discussed above, the Heli-FXTM AAA and TAA do not meet the substantial clinical improvement criterion. Therefore, we are not approving new technology add-on payments for the Heli-FXTM TAA because the technology does not meet the substantial clinical improvement criterion.

c. CardioMEMSTM HF (Heart Failure) Monitoring System

CardioMEMS, Inc. submitted an application for new technology add-on payment for FY 2015 for the CardioMEMSTM HF (Heart Failure) Monitoring System, which is an implantable hemodynamic monitoring system comprised of an implantable sensor/monitor placed in the distal pulmonary artery. Pulmonary artery hemodynamic monitoring is used in the management of heart failure. The CardioMEMSTM HF Monitoring System measures multiple pulmonary artery pressure parameters in an ambulatory patient and transmits the data via a wireless sensor to a secure Web site.

The CardioMEMSTM HF Monitoring System utilizes radiofrequency (RF) energy to power the sensor and to measure pulmonary artery (PA) pressure and consists of three components: an Implantable Sensor with Delivery Catheter, an External Electronics Unit, and a Pulmonary Artery Pressure Database. The system provides the physician with the patient's PA pressure waveform (including systolic, diastolic, and mean pressures) as well as heart rate. The sensor is permanently implanted in the distal pulmonary artery using transcatheter techniques in the catheterization laboratory, where it is calibrated using a Swan-Ganz catheter. PA pressures are transmitted by the patient at home while lying supine on a padded antenna and pushing one button, which records an 18-second continuous waveform. The data also can be recorded from the hospital, physician's office, or clinic.

The hemodynamic data, including a detailed waveform, are transmitted to a secure Web site that serves as the Pulmonary Artery Pressure Database, so that information regarding PA pressure is available to the physician or nurse at any time via the Internet. Interpretation of trend data allows the clinician to make adjustments to therapy and can be used along with heart failure signs and symptoms to adjust medications.

The applicant believed that a large majority of patients receiving the sensor would be admitted as an inpatient to a hospital with a diagnosis of acute or chronic heart failure, which is typically described by ICD-9-CM diagnosis code 428.43 (Acute on chronic combined systolic and diastolic heart failure), and that the sensor would be implanted during the inpatient stay. The applicant stated that, for safety considerations, a small portion of these patients may be discharged and the sensor would be implanted at a future date in the hospital outpatient setting. In addition, there would likely be a group of patients diagnosed with chronic heart failure who are not currently hospitalized, but who have been hospitalized in the past few months, for whom the treating physician believes that regular pulmonary artery pressure readings are necessary to optimize patient management. Depending on the patient's status, the applicant stated that these patients may have the sensor implanted in the hospital inpatient or outpatient setting.

The applicant received FDA approval on May 28, 2014. The CardioMEMSTM HF Monitoring System is currently described by ICD-9-CM procedure code 38.26 (Insertion of implantable pressure sensor without lead for intracardiac or great vessel hemodynamic monitoring).

In the FY 2015 IPPS/LTCH PPS proposed rule, we invited public comments regarding how the CardioMEMSTM HF System meets the newness criterion. We did not receive any public comments concerning how the CardioMEMSTM HF Monitoring System meets the newness criterion. Therefore, after evaluation of the information provided by the applicant, we believe that the CardioMEMSTM HF Monitoring System meets the newness criterion, and we consider the technology to be “new” as of May 28, 2014, when the technology received FDA approval.

With respect to the cost criterion, the applicant submitted actual claims from the CHAMPION clinical trial. Of the 550 patients enrolled in the trial, the applicant received 310 hospital bills. The applicant excluded claims with incomplete or missing procedure codes, claims with incomplete charge information, and claims that were statistical outliers (more than three standard deviations from the geometric mean). This resulted in a final cohort of 138 claims. The applicant noted that cases treated with the CardioMEMSTM HF Monitoring System would typically map to MS-DRG 264 (Other Circulatory System Operating Room Procedures). Using the 138 clinical trial claims, the applicant standardized the charges and added charges for the CardioMEMSTM HF Monitoring System (because the clinical trial claims did not contain charges for the CardioMEMSTM HF Monitoring System). This resulted in an average case-weighted standardized charge per case of $79,218.
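One plausible reading of the outlier exclusion described above (dropping bills more than three standard deviations from the geometric mean) is a trim performed on the logarithmic scale, as in the following sketch (in Python). The function and its threshold are our illustration, not the applicant's stated method, and the charge amounts shown are hypothetical placeholders rather than CHAMPION trial data.

    import math
    from typing import List

    def exclude_charge_outliers(charges: List[float], n_sd: float = 3.0) -> List[float]:
        # Drop claims whose charges lie more than n_sd standard deviations from
        # the geometric mean, that is, n_sd standard deviations on the log scale.
        logs = [math.log(c) for c in charges]
        mean_log = sum(logs) / len(logs)
        sd_log = math.sqrt(sum((x - mean_log) ** 2 for x in logs) / (len(logs) - 1))
        return [c for c, x in zip(charges, logs) if abs(x - mean_log) <= n_sd * sd_log]

    # Hypothetical usage on placeholder charge amounts:
    trimmed = exclude_charge_outliers([42_000.0, 55_000.0, 61_000.0, 48_000.0, 58_000.0])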

Abraham WT, Adamson PB, Bourge RC, Aaron MF, Costanzo MR, Stevenson LW, Strickland W, Neelagaru S, Raval N, Krueger S, Weiner S, Shavelle D, Jeffries B, Yadav JS; for the CHAMPION Trial Study Group. Wireless pulmonary artery hemodynamic monitoring in chronic heart failure: a randomized controlled trial, Lancet, February 19, 2011, Vol. 377(9766), pp. 658-666.

Using the FY 2014 Table 10 thresholds, the threshold for MS-DRG 264 is $60,172. Because the average case-weighted standardized charge per case exceeded the threshold amount, the applicant maintained that the CardioMEMSTM HF Monitoring System would meet the cost criterion.

In the FY 2015 IPPS/LTCH PPS proposed rule, we invited public comments on whether or not the CardioMEMSTM HF System meets the cost criterion. We did not receive any public comments regarding whether or not the CardioMEMSTM HF System meets the cost criterion. Based on the analysis above, we believe the CardioMEMSTM HF System meets the cost criterion.

With regard to substantial clinical improvement, the applicant asserted that elevated PA pressures occur prior to signs and symptoms of heart failure and changes in PA pressures provide a sound physiologic basis for its management. The applicant also contended that, until the creation of the CardioMEMS wireless PA implant, knowledge of PA pressure was only feasible in the hospital with the performance of a right heart catheterization. According to the applicant, the CardioMEMS HF Monitoring System provides physicians knowledge of PA pressure while the patient is at home, allowing proactive management to prevent heart failure decompensation and hospitalization.

The applicant cited clinical data from the CHAMPION trial. The trial is a prospective, multicenter, randomized, single-blinded clinical trial conducted in the United States, designed to evaluate the safety and efficacy of the CardioMEMSTM HF Monitoring System in reducing heart failure-related hospitalizations in a subset of subjects suffering from heart failure. The applicant shared several major findings from the CHAMPION trial as described below.

The primary efficacy endpoint of the CHAMPION trial was the rate of HF hospitalizations during the first 6 months of randomized access. There were 84 heart failure hospitalizations in the treatment group compared with 120 heart failure hospitalizations in the control group. This difference between the groups represented a 28-percent reduction in the rate of hospitalization for heart failure in the treatment group (0.32 hospitalizations per patient in the treatment group versus 0.44 hospitalizations per patient in the control group, p=0.0002). Although not a primary endpoint, the rate of HF hospitalizations after 18 months was 33 percent lower in the treatment group than in the control group.

According to the applicant, secondary endpoints of the CHAMPION trial are changes in pulmonary artery pressures, proportion of subjects hospitalized, days alive outside of the hospital, quality of life (QOL), and heart failure management which demonstrated the following results:

  • Pulmonary Artery Pressures: At baseline, both treatment and control patients had similar PA mean pressures. The change in pressure over the first 6 months was evaluated by integrating the area under the pressure curve (AUC). At 6 months of follow-up, the treatment group had a significantly greater reduction in AUC of −155.7 mmHg-days compared to the control group, which had an increase in AUC of +33.1 mmHg-days (p=0.0077).
  • Proportion of Subjects Hospitalized: During the 6-month follow-up period, the proportion of subjects hospitalized for 1 or more HF hospitalizations was significantly lower in the treatment group (55 out of 270 patients) than in the control group (80 out of 280 patients) (20.4 percent versus 28.6 percent; p=0.0292).
  • Days Alive Outside of the Hospital: At 6 months, treatment patients had a statistically significant but clinically modest increase in days alive outside of the hospital (174.4 versus 172.1 days; p=0.0280) and fewer average days in the hospital (2.2 versus 3.8 days; p=0.0246) compared to control patients.
  • Quality of Life: Heart failure-specific quality of life was assessed with the Minnesota Living with Heart Failure Questionnaire (MLHFQ) total score at 6 months. The average total score in the treatment group (45.2 ± 26.4) was significantly better than the average total score in the control group (50.6 ± 24.8) (p=0.0236). The difference in total quality of life was primarily due to the physical domain. The average physical score for the treatment group (19.8 ± 11.2) was significantly better than that for the control group (22.4 ± 10.9) (p=0.0096). There was also a significant difference in the emotional domain, with an average score of 9.5 ± 8.1 for the treatment group and 11.0 ± 7.7 for the control group (p=0.0398).
  • Heart Failure Management: Physicians responded to patients' elevated PA pressures by making medication changes to lower PA pressures and reduce the risk for HF hospitalization. Physicians documented all medication changes for all patients and indicated whether the change was made in response to PA pressures or standard of care information. During the 6-month follow-up period, physicians made approximately one additional HF medication change per patient per month in the treatment group when compared to the control group. Specifically, treatment patients had 1.55 medication changes per month on average, compared to 0.65 medication changes per month for control patients (p<0.0001). The difference in HF management between the treatment and control groups was due to HF medication changes made in response to PA pressures.

The study met the two primary safety endpoints: (1) freedom from device/system-related complications (DSRC); and (2) freedom from sensor failure. The protocol pre-specified objective performance criteria (OPC) were that at least 80 percent of patients were to be free from DSRC and at least 90 percent were to be free from pressure sensor failure. Of the 575 patients in the safety population, 567 (98.6 percent) were free from DSRC at 6 months (lower confidence limit 97.3 percent, p<0.0001). This lower limit of 97.3 percent is greater than the pre-specified OPC of 80 percent. There were no sensor explants or repeat implants, and all sensors were operational at 6 months, for a freedom from sensor failure rate of 100 percent (lower confidence limit 99.3 percent, p<0.0001). This lower limit of 99.3 percent is greater than the pre-specified OPC of 90 percent.

The applicant also noted that the CardioMEMS™ HF System reduces the occurrence of HF hospitalizations in NYHA Class III heart failure patients. According to the applicant, the device had very few device- and system-related complications occurring over the course of the clinical trial. All primary and secondary study endpoints were successfully achieved. In addition, the CHAMPION trial suggests that the safety and effectiveness of the device were maintained during longer-term follow-up.

After reviewing the information provided by the applicant, we stated in the FY 2015 IPPS/LTCH PPS proposed rule that we had the following concerns. The applicant did not discuss long-term outcomes, specifically death. We stated that we believe additional long-term outcome information and information regarding how the technology changes long-term outcomes would further assist in our determination of whether the technology represents a substantial clinical improvement. With regard to the clinical trial, information from the randomized access period and the open access period did not include the total number of deaths in each group. While the data support a reduction in total hospitalizations, the rate of hospitalization in each group (0.32 versus 0.44) does not appear to be clinically meaningful. This is supported by total days alive out of the hospital being virtually identical in both groups. Finally, we stated that we were concerned about the cause of the significant dropouts in the Kaplan-Meier curves, which further demonstrates a lack of impact on survival.

In the FY 2015 IPPS/LTCH PPS proposed rule, we invited public comments on whether or not the CardioMEMS™ HF Monitoring System technology represents a substantial clinical improvement in the treatment options available to Medicare beneficiaries.

Comment: Several commenters, including various physicians, supported the approval of new technology add-on payments for the CardioMEMS™ HF Monitoring System.

Response: We appreciate the commenters' support. We considered these comments in our determination of whether the CardioMEMS™ HF Monitoring System represents a substantial clinical improvement.

Comment: The applicant submitted a public comment, which included responses to each of CMS' concerns presented in the proposed rule. CMS' major concern outlined in the FY 2015 IPPS/LTCH PPS proposed rule was the lack of mortality data to support the improvement seen in the specified endpoint, hospitalizations. The applicant provided information that the Randomized Access Period includes approximately 800 patient-years of follow-up, with an average patient follow-up of 18 months. The primary endpoint of the CHAMPION trial was HF hospitalizations because HF hospitalization remains a major clinical and public health problem that is inadequately addressed by current treatment options. Although the trial was not powered to assess mortality, the applicant stated that the data showed strong favorable trends for reduced mortality, and a highly significant reduction for HF hospitalization or mortality. During the first 6 months of follow-up, the applicant stated that the proportion of patients who died that were enrolled in the treatment group (n=15, 5.6 percent) was lower than the proportion of patients who died that were enrolled in the control group (n=20, 7.1 percent), with a nonsignificant but favorable relative risk reduction rate of 23 percent (HR 0.77, 95 percent CI 0.40-1.51, p=0.4484). During the entire Randomized Access Period, the applicant stated that the proportion of patients who died that were enrolled in the treatment group (n=50, 18.5 percent) was lower than the proportion of patients who died that were enrolled in the control group (n=64, 22.9 percent), with a nonsignificant but favorable relative risk reduction rate of 20 percent (HR 0.80, 95 percent CI 0.55-1.15, p=0.2303).

The applicant further stated that in measuring the combined impact of mortality and HF hospitalizations on the study population, analysis of the time to death or first HF hospitalization is frequently used. During the first 6 months of the Randomized Access Period, the applicant noted that the proportion of patients who died or that had at least one HF hospitalization that were enrolled in the treatment group (n=63, 23.3 percent) was lower than the proportion of patients who died or that had at least one HF hospitalization that were enrolled in the control group (n=91, 32.5 percent), with a significant relative risk reduction rate of 31 percent (HR 0.69, 95 percent CI 0.50-0.95; p=0.0239). During the entire Randomized Access Period, the applicant noted that the proportion of patients who died or had at least one HF hospitalization that were enrolled in the treatment group (n=121, 44.8 percent) was lower than the proportion of patients who died or had at least one HF hospitalization that were enrolled in the control group (n=145, 51.8 percent), with a significant relative risk reduction rate of 23 percent (HR 0.77, 95 percent CI 0.60-0.98, p=0.0330). The applicant further noted that, in addition to time-to-event analyses, other endpoints include event rate analyses for repeat events, such as HF hospitalization rates (the primary efficacy endpoint) and all-cause hospitalization rates. The applicant also indicated that event rate analyses for composite events are frequently used to assess the impact of both mortality and HF hospitalizations (combined deaths and HF hospitalization rates) and total morbidity and mortality (combined deaths and all-cause hospitalization rates). According to the applicant, the large treatment effect size on long-term outcomes and the low number needed to treat to prevent hospitalizations and deaths demonstrated that the CardioMEMS™ HF Monitoring System represents a substantial clinical improvement.

CMS also was concerned that while the data supported a reduction in total hospitalizations, the rate of hospitalization in each group (0.32 versus 0.44) does not appear to be clinically meaningful. The applicant stated in response that the days alive outside of the hospital (DAOH) endpoint was a secondary endpoint in the CHAMPION trial. The applicant further stated that the endpoint is used in clinical trials as an alternative measure for evaluating the combined impact of mortality and hospitalizations on the study population. Endpoints that are traditionally used to measure this combined effect include time-to-event analyses (for example, time to death or first HF hospitalization) and composite event rate analyses (for example, rate of death and repeat HF hospitalizations). The applicant noted that, for many HF drug and device trials, these more traditional analyses are frequently used as the primary or co-primary efficacy endpoints. The applicant further stated that the DAOH endpoint is susceptible to many influences, including variable follow-up time (that is, patients with longer follow-up time have the potential for more DAOH than patients with shorter follow-up time), the length of the study duration interval for which the DAOH endpoint is being analyzed, and differences in the proportion of patients experiencing a mortality or hospitalization event relative to the proportion of patients not experiencing a mortality or hospitalization event (that is, a shorter duration interval will have a greater proportion of patients without any events when compared to a longer duration interval, where the proportion of patients experiencing events increases over time). In response to CMS' concerns regarding the numerical similarity of DAOH between the treatment and control groups, which is based on the shorter follow-up interval of 6 months, the applicant stated that during this shorter follow-up interval, approximately 70 percent of the patients did not experience a mortality or HF hospitalization event. The applicant stated that this skews the dataset because these patients contribute the maximum possible DAOH. Despite this fact, the applicant stated that there was a statistically significant difference of 2.3 days in favor of the treatment group. The applicant asserted that a treatment effect that increases the number of DAOH by 2.3 days over a 6-month period is clinically meaningful to this patient population, as evidenced by the improved quality of life of the patients that were enrolled in the treatment group. DAOH rates were also analyzed over a longer period of follow-up during the Randomized Access Period. To reduce the effects of variable follow-up time and to have a consistent study duration interval, DAOH was analyzed over the first 12 months of follow-up. Patients enrolled in the treatment group being managed using the CardioMEMS™ HF Monitoring System experienced 6.1 more DAOH than the patients that were enrolled in the control group after 12 months of follow-up. The applicant believed that this increase represents a substantial clinical improvement with respect to current treatment options available to Medicare beneficiaries.

In regard to CMS' concern about the cause of the significant dropouts in the Kaplan-Meier curves, which further demonstrates a lack of impact on survival, the applicant provided the following information in response. According to the applicant, the dropout rates in the CHAMPION trial were low; the patients transitioning from Randomized to Open Access are being misconstrued as dropouts. The applicant reported that CHAMPION enrolled 550 patients from September 2007 to October 2009. In addition, all of the patients remained in their randomized groups until the last patient enrolled in the CHAMPION trial completed at least 6 months of follow-up. As a result of this enrollment over time, the applicant stated that the average patient follow-up in the Randomized Access Period was significantly longer, at 18 months. The applicant further indicated that patients with a lower enrollment number and implanted earlier in 2008 had the potential for longer follow-up times in the Randomized Access Period than patients with a higher enrollment number and implanted later in 2009. As a result, the applicant believed that these patients are being construed as dropouts on the Kaplan-Meier curve, but actually are patients being censored at the time of their transition to the Open Access Period. According to the applicant, because the maximum follow-up for the Randomized Access Period was already achieved, patients in this category were not eligible or “at risk” for the longer follow-up periods represented in the Kaplan-Meier curve because that follow-up time is now part of the Open Access Period.

In response to CMS' invitation for public comments on whether or not the CardioMEMS™ HF Monitoring System technology represents a substantial clinical improvement in the Medicare population, the applicant stated that heart failure is a significant clinical burden to Medicare beneficiaries, their caregivers, and hospitals throughout the U.S. health care system. The applicant believed that rising HF hospitalization rates, the increasing cost of care for Medicare beneficiaries diagnosed with HF, and the detrimental effect the condition is having on the U.S. health care system are not sustainable.

The applicant believed that the CardioMEMS™ HF Monitoring System technology represents a substantial clinical improvement in the treatment options available to Medicare beneficiaries. In the CHAMPION trial, 245 patients (45 percent) were 65 years or older at the time of sensor implantation (120 in the treatment group and 125 in the control group). Patients who were enrolled in the treatment group and managed on the basis of PA pressure information obtained from the CardioMEMS™ HF Monitoring System had a significantly reduced HF hospitalization rate (0.34 events/patient-year) compared to patients who were enrolled in the control group (0.67 events/patient-year) and managed according to best available practices (HR 0.51, 95 percent CI 0.37-0.70, p<0.0001).

Response: We appreciate the applicant's response to each of CMS' concerns and the additional data provided. In addition to the data indicating that the primary endpoint of reduced hospitalizations was met, the additional longer-term data demonstrated improved mortality. Therefore, we believe that the data indicate that the CardioMEMS™ HF Monitoring System meets the substantial clinical improvement criterion.

After consideration of the public comments we received, we believe that the CardioMEMS™ HF Monitoring System meets all of the new technology add-on payment policy criteria. Therefore, we are approving the CardioMEMS™ HF Monitoring System for new technology add-on payments in FY 2015. Cases involving the CardioMEMS™ HF Monitoring System that are eligible for new technology add-on payments will be identified by ICD-9-CM procedure code 38.26 (Insertion of implantable wireless pressure sensor for intracardiac or great vessel hemodynamic monitoring), which was effective October 1, 2011. In its new technology add-on payment application, the applicant stated that the total operating cost of the CardioMEMS™ HF Monitoring System is $17,750. Under § 412.88(a)(2), new technology add-on payments are limited to the lesser of 50 percent of the average cost of the device or 50 percent of the costs in excess of the MS-DRG payment for the case. As a result, the maximum add-on payment for a case involving the CardioMEMS™ HF Monitoring System is $8,875 for FY 2015.
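
The payment limitation in § 412.88(a)(2) can be restated as a simple formula: the add-on payment for a case is the lesser of 50 percent of the average cost of the technology and 50 percent of the costs of the case in excess of the MS-DRG payment. The Python sketch below illustrates that formula; the average device cost of $17,750 is taken from the application discussed above, while the per-case cost and MS-DRG payment amounts are hypothetical.

```python
def new_tech_add_on_payment(avg_device_cost: float,
                            case_cost: float,
                            ms_drg_payment: float) -> float:
    """Lesser of 50% of the average device cost or 50% of the case's
    costs in excess of the MS-DRG payment (see § 412.88(a)(2)); never negative."""
    excess = max(case_cost - ms_drg_payment, 0.0)
    return min(0.5 * avg_device_cost, 0.5 * excess)

# The $17,750 average device cost is from the application; the case cost and
# MS-DRG payment below are hypothetical illustrations.
payment = new_tech_add_on_payment(avg_device_cost=17_750.00,
                                  case_cost=40_000.00,
                                  ms_drg_payment=18_000.00)
print(f"Add-on payment for this case: ${payment:,.2f}")  # capped at $8,875 in this example
```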

d. MitraClip® System

Abbott Vascular submitted an application for new technology add-on payments for the MitraClip® System for FY 2015. (We note that the applicant submitted an application for new technology add-on payments for FY 2014 but failed to receive FDA approval by the July 1 deadline.) The MitraClip® System is a transcatheter mitral valve repair system that includes a MitraClip® device implant, a Steerable Guide Catheter, and a Clip Delivery System. It is designed to perform reconstruction of the insufficient mitral valve for high-risk patients who are not candidates for conventional open mitral valve repair surgery.

Mitral regurgitation (MR), also referred to as mitral insufficiency or mitral incompetence, occurs when the mitral valve fails to close completely, causing blood to leak or flow backwards (regurgitate) into the left ventricle. If the amount of blood that leaks backwards into the left ventricle is minimal, then intervention is usually not necessary. However, if the amount of blood that is regurgitated becomes significant, this can cause the left ventricle to work harder to meet the body's need for oxygenated blood. Severity levels of MR can range from grade 1+ through grade 4+. If left untreated, severe MR can lead to heart failure and death. The American College of Cardiology (ACC) and the American Heart Association (AHA) issued practice guidelines in 2006 that recommended intervention for moderate/severe or severe MR (grade 3+ to 4+). The applicant stated that the MitraClip® System is “indicated for percutaneous reduction of significant mitral regurgitation . . . in patients who have been determined to be at prohibitive risk for mitral valve surgery by a heart team, which includes a cardiac surgeon experienced in mitral valve surgery and a cardiologist experienced in mitral valve disease and in whom existing comorbidities would not preclude the expected benefit from correction of the mitral regurgitation.”

The MitraClip® System mitral valve repair procedure is based on the double-orifice surgical repair technique that has been used as a surgical technique in open chest, arrested-heart surgery for the treatment of MR since the early 1990s. According to the applicant, in utilizing “the double-orifice technique, a portion of the anterior leaflet is sutured to the corresponding portion of the posterior leaflet using standard techniques and forceps and suture, creating a point of permanent coaptation (“approximation”) of the two leaflets. When the suture is placed in the middle of the valve, the valve will have a functional double orifice during diastole.”

With regard to the newness criterion, the MitraClip® System received a premarket approval from the FDA on October 24, 2013. The MitraClip® System is indicated “for the percutaneous reduction of significant symptomatic mitral regurgitation (MR ≥ 3+) due to primary abnormality of the mitral apparatus (degenerative MR) in patients who have been determined to be at prohibitive risk for mitral valve surgery by a heart team, which includes a cardiac surgeon experienced in mitral valve surgery and a cardiologist experienced in mitral valve disease, and in whom existing comorbidities would not preclude the expected benefit from reduction of the mitral regurgitation.” The MitraClip® System became immediately available on the U.S. market following FDA approval. The MitraClip® System is a Class III device, and has an investigational device exemption (IDE) for the EVEREST study (Endovascular Valve Edge-to-Edge Repair Study) (IDE G030061), and for the COAPT study (Cardiovascular Outcomes Assessment of the MitraClip Percutaneous Therapy for Heart Failure Patients with Functional Mitral Regurgitation) (IDE G120024). Effective October 1, 2010, ICD-9-CM procedure code 35.97 (Percutaneous mitral valve repair with implant) was created to identify and describe the MitraClip® System technology.

CMS received a formal National Coverage Determination (NCD) request from the Society of Thoracic Surgeons (STS), the American College of Cardiology Foundation (ACCF), the Society for Cardiovascular Angiography and Interventions (SCAI), and the American Association for Thoracic Surgery (AATS) jointly asking that CMS cover Transcatheter Mitral Valve Repair (TMVR) procedures using a system that has received FDA premarket approval (PMA) for the treatment of MR when performed according to an FDA-approved indication. We refer readers to the CMS Web site at: http://www.cms.gov/medicare-coverage-database/details/nca-tracking-sheet.aspx?NCAId=273 for information related to this ongoing NCD. The tracking sheet for this National Coverage Analysis (NCA) indicates an expected NCA completion date of August 13, 2014, which is after the FY 2015 IPPS/LTCH PPS final rule is scheduled to be published. The processes for evaluation and determination of an NCD and the processes for evaluation and approval of an application for new technology add-on payments are conducted independently of each other. However, any payment made under the Medicare program for services provided to a beneficiary would be contingent on CMS' coverage of the item, and any restrictions on the coverage would apply.

In the FY 2015 IPPS/LTCH PPS proposed rule, we invited public comments on how the MitraClip® System meets the newness criterion for purposes of new technology add-on payments and the issues that may arise from concurrent NCD requests and new technology add-on payment application review and approval processes.

Comment: The applicant stated that the technology is a first in kind and is not substantially similar to any FDA-approved technology on the market. Therefore, the applicant believed that the technology meets the newness criterion. Several other commenters also believed that the MitraClip® System meets the newness criterion.

Response: We appreciate the commenters' input. After consideration of the application, we agree with the commenters that the MitraClip® System meets the newness criterion. Therefore, for purposes of determining eligibility for FY 2015 IPPS new technology add-on payments, we consider the technology to be “new” as of October 24, 2013, and will use ICD-9-CM procedure code 35.97 (Percutaneous mitral valve repair with implant) to identify the technology for new technology add-on payments.

Comment: One commenter noted that the request for an NCD was not made by the applicant, as stated in the proposed rule. Rather, the commenter stated that this request was made by a coalition of four national physician specialty societies that specialize in treating patients diagnosed with valve disease.

Response: We appreciate the commenter's input concerning this clarification.

With regard to the cost criterion, the applicant conducted two analyses. The applicant noted that, while ICD-9-CM procedure code 35.97 maps to MS-DRGs 246 (Percutaneous Cardiovascular Procedure with Drug-Eluting Stent with Major Complication or Comorbidity (MCC) or 4+ Vessels/Stents), 247 (Percutaneous Cardiovascular Procedure with Drug-Eluting Stent without MCC), 248 (Percutaneous Cardiovascular Procedure with Non-Drug-Eluting Stent with MCC or 4+ Vessels/Stents), 249 (Percutaneous Cardiovascular Procedure with Non-Drug-Eluting Stent without MCC), 250 (Percutaneous Cardiovascular Procedure without Coronary Artery Stent or AMI with MCC), and 251 (Percutaneous Cardiovascular Procedure without Coronary Artery Stent or AMI without MCC), clinical experience with the MitraClip® System device has demonstrated that it is extremely rare for a patient to receive stents concurrently during procedures using the MitraClip® System device. The applicant further cited the FY 2013 IPPS/LTCH PPS final rule (77 FR 53308) which stated, “According to the Food and Drug Administration's (FDA's) terms of the clinical trial for MitraClip® System, the device is to be implanted in patients without any additional surgeries performed. Therefore, based on these terms, we stated that while the procedure code is assigned to MS-DRGs 246 through 251, the most likely MS-DRG assignments would be MS-DRGs 250 and 251.” As a result, the applicant stated that it conducted its analyses solely for MS-DRGs 250 and 251 to demonstrate that the cases involving the MitraClip® System device meet the incremental cost thresholds provided in Table 10 for those MS-DRGs.

The applicant researched the FY 2012 MedPAR file for claims for cases reporting ICD-9-CM procedure code 35.97. Under the first analysis and methodology, the applicant noted that this search yielded actual claims for cases in which the MitraClip® System device was used in procedures performed in an IDE study type setting, and hospitals obtained the MitraClip® System device at a reduced investigational price. The applicant further stated that it is likely that hospitals did not report the charges for the investigational device, or submitted claims for charges that were significantly less than the actual device acquisition costs (we refer readers to the explanation below). The applicant found 57 cases in MS-DRG 250 (29.38 percent of the total number of cases), and 137 cases in MS-DRG 251 (70.61 percent of the total number of cases), which resulted in an average case-weighted non-standardized charge per case of $232,670.

The applicant standardized the charges using the FY 2014 IPPS final rule impact file, and inflated the result using three different inflation factors. We note that, since the applicant used FY 2012 MedPAR data, we believe it is appropriate to use comparable data for standardization. Therefore, we believe use of the FY 2012 final rule impact file is more appropriate than the FY 2014 final rule impact file. The first methodology under the first analysis used an inflation factor of 4.57 percent, which was based on data from the BLS' non-seasonally adjusted CPI for all urban consumers between January 2011 and January 2013. This resulted in an average case-weighted standardized charge per case of $94,517. The second methodology under the first analysis used an inflation factor of 9.92 percent, which was based on the 2-year charge inflation factor listed in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50982). This resulted in an average case-weighted standardized charge per case of $96,199. The third methodology under the first analysis used an inflation factor of 4.63 percent, which was based on the Medicare Economic Index (MEI) from the IPPS market basket update between the third quarter of 2012 projected through the third quarter of 2014. This resulted in an average case-weighted standardized charge per case of $91,570. The applicant noted that all three methodologies used under the first analysis to determine each respective average case-weighted standardized charge per case were calculated without any adjustments to reflect the reduced investigational price, or inadequate hospital claim reporting and billing.

Using the FY 2014 IPPS Table 10 thresholds, the average case-weighted threshold for MS-DRGs 250 and 251 is $71,467 (all calculations above were performed using unrounded numbers). Because the average case-weighted standardized charge per case for the applicable MS-DRGs calculated under each methodology under the first analysis discussed above exceeds the average case-weighted threshold amount, the applicant maintained that the technology meets the cost criterion.
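
The case-weighted comparison described in the preceding paragraphs can be sketched as follows. This Python fragment is illustrative only: the case counts (57 and 137) and the 4.57 percent inflation factor are taken from the first analysis above, while the individual MS-DRG 250 and 251 thresholds and per-case standardized charges are hypothetical placeholders, because only the weighted results are reported above.

```python
# Sketch of a case-weighted threshold comparison, assuming hypothetical
# per-MS-DRG thresholds and charges; the case counts and the 4.57% inflation
# factor are taken from the first analysis described above.

cases = {"MS-DRG 250": 57, "MS-DRG 251": 137}                        # case counts from the FY 2012 MedPAR search
thresholds = {"MS-DRG 250": 80_000.00, "MS-DRG 251": 68_000.00}      # hypothetical Table 10 thresholds
avg_std_charges = {"MS-DRG 250": 98_000.00, "MS-DRG 251": 88_000.00} # hypothetical standardized charges

total_cases = sum(cases.values())
weighted_threshold = sum(thresholds[d] * n for d, n in cases.items()) / total_cases
weighted_charge = sum(avg_std_charges[d] * n for d, n in cases.items()) / total_cases

inflation_factor = 0.0457  # CPI-based factor from the first methodology above
inflated_charge = weighted_charge * (1 + inflation_factor)

print(f"Case-weighted threshold:       ${weighted_threshold:,.2f}")
print(f"Inflated case-weighted charge: ${inflated_charge:,.2f}")
print(f"Meets cost criterion: {inflated_charge > weighted_threshold}")
```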

Under the second analysis, which used the same premise as the first analysis, the applicant researched the FY 2012 MedPAR file for claims for cases reporting procedure code 35.97 that mapped to MS-DRGs 250 and 251, except that the applicant excluded charges related to the MitraClip® System by removing all charges from the claim that would map to the implantable cost center on the cost report. The applicant then standardized the charges, inflated the result using the three inflation factors above, and added a fixed amount of commercial charges based on post-FDA approval pricing. This resulted in an average case-weighted standardized charge per case of $139,536 under the first inflation factor (4.57 percent), $142,364 under the second inflation factor (9.2 percent), and $139,568 under the third inflation factor (4.63 percent).

Using the FY 2014 IPPS Table 10 thresholds, the average case-weighted threshold for MS-DRGs 250 and 251 is $71,467 (all calculations above were performed using unrounded numbers). Because the average case-weighted standardized charge per case for the applicable MS-DRGs calculated under all three methodologies discussed above exceeds the average case-weighted threshold amount, the applicant maintained that the MitraClip® System meets the cost criterion.

In the FY 2015 IPPS/LTCH PPS proposed rule, we invited public comments on whether or not the MitraClip® System meets the cost criterion. In addition, we invited public comments on the methodologies used by the applicant in its two analyses.

Comment: In response to CMS' statement in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28049) that we believed use of the FY 2012 final rule impact file is more appropriate than the FY 2014 final rule impact file for standardization, the applicant submitted the following supplemental data updating its cost analyses.

With regard to the second analysis, the applicant submitted revised data using the FY 2012 MedPAR file and the FY 2012 impact file to standardize the charges. We note that in the proposed rule we inadvertently listed $232,670 as the average case-weighted standardized charge per case. This amount is the average case-weighted non-standardized charge per case. Based on the revised data, the corrected average case-weighted standardized charge per case is $151,111.

Using the same methodology described above and the FY 2012 impact file, under the second analysis, the applicant determined an inflated average case-weighted standardized charge per case of $136,479 under the first inflation factor (4.57 percent), $139,151 under the second inflation factor (9.2 percent), and $139,509 under the third inflation factor (4.63 percent). The applicant compared these amounts to the average case-weighted threshold of $71,467 for MS-DRGs 250 and 251 (all calculations above were performed using unrounded numbers). Because the inflated average case-weighted standardized charge per case for the applicable MS-DRGs calculated under all three methodologies discussed above exceeds the average case-weighted threshold amount of $71,467, the applicant maintained that the MitraClip® System meets the cost criterion.

The applicant also revised the second analysis using the FY 2013 MedPAR file and the FY 2013 impact file. Similar to the analysis above, the applicant searched the FY 2013 MedPAR file for claims for cases reporting ICD-9-CM procedure code 35.97. The applicant found 43 cases in MS-DRG 250 (28.66 percent of the total number of cases), and 107 cases in MS-DRG 251 (71.33 percent of the total number of cases), which resulted in an average case-weighted standardized charge per case of $149,725.

The first methodology used an inflation factor of 3.20 percent, which was based on data from the BLS' non-seasonally adjusted CPI for all urban consumers between January 2012 and January 2013. This resulted in an inflated average case-weighted standardized charge per case of $152,945 (which included a fixed amount of commercial charges based on post-FDA approval pricing). The second methodology used an inflation factor of 11.46 percent (second quarter of FY 2012 through first quarter of FY 2014), which was based on the outlier inflation factor in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28321). This resulted in an inflated average case-weighted standardized charge per case of $158,425 (which included a fixed amount of commercial charges based on post-FDA approval pricing). The third methodology used an inflation factor of 4.53 percent, which was based on the MEI from the IPPS market basket update between the third quarter of 2013 projected through the third quarter of 2015. This resulted in an average case-weighted standardized charge per case of $153,827 (which included a fixed amount of commercial charges based on post-FDA approval pricing).

Using the FY 2014 IPPS Table 10 thresholds, the average case-weighted threshold for MS-DRGs 250 and 251 is $75,772 (all calculations above were performed using unrounded numbers). Because the inflated average case-weighted standardized charge per case for the applicable MS-DRGs calculated under each methodology under this analysis discussed above exceeds the average case-weighted threshold amount, the applicant maintained that the technology meets the cost criterion.

Several other commenters believed that the MitraClip® System meets the cost criterion.

Response: We appreciate the applicant's submission of the supplemental data. We agree with the commenters that the MitraClip® System meets the cost criterion. We note that in section II.I.4.b. of the preamble of this final rule, we denied the applicant's request to reassign cases reporting a TMVR using the MitraClip® System from MS-DRGs 250 and 251 to MS-DRGs 216 (Cardiac Valve & Other Major Cardiothoracic Procedures with Cardiac Catheterization with MCC), 217 (Cardiac Valve & Other Major Cardiothoracic Procedures with Cardiac Catheterization with CC), 218 (Cardiac Valve & Other Major Cardiothoracic Procedures with Cardiac Catheterization without CC/MCC), 219 (Cardiac Valve & Other Major Cardiothoracic Procedures without Cardiac Catheterization with MCC), 220 (Cardiac Valve & Other Major Cardiothoracic Procedures without Cardiac Catheterization with CC), and 221 (Cardiac Valve & Other Major Cardiothoracic Procedures without Cardiac Catheterization without CC/MCC). We also denied the applicant's request to create a new base MS-DRG for transcatheter valve therapies. We refer readers to section II.G. for a complete discussion on these requests.

The applicant asserted that the MitraClip® System meets the substantial clinical improvement criterion. Severe MR is associated with significant morbidity and mortality rates, and is a progressive condition. For symptomatic patients diagnosed with significant MR, surgical repair or replacement is considered the gold standard, offering improvements in symptoms and longer survival. However, the applicant explained that studies have indicated that a significant proportion of patients are not eligible for mitral valve repair and/or replacement surgery because of risk factors, including reduced left ventricular function, significant comorbidities, and advanced age. As a result, the applicant stated that there is a significant unmet clinical need for patients diagnosed with severe MR who are too high-risk for surgery and who are receiving only palliative medical management.

The applicant also stated that the MitraClip® System meets the substantial clinical improvement criterion based on clinical studies that have consistently shown that procedures performed using the MitraClip® System device lead to a significant reduction of MR; improvements in left ventricular (LV) function including LV volumes and dimensions; improved patient outcomes as measured by improvements in New York Heart Association (NYHA) functional class, improvement in health-related quality of life measures, and reductions in heart-failure related hospitalizations; and significantly lower mortality rates than predicted surgical mortality rates.

Feldman, et al., “Percutaneous Repair or Surgery for Mitral Regurgitation,” New England Journal of Medicine, 2011, Vol. 364, pp. 1395-1406.

Foster, et al., “Percutaneous Mitral Valve Repair in the Initial EVEREST Cohort: Evidence of Reverse Left Ventricular Remodeling,” Circulation: Cardiovascular Imaging, July 2013, Vol. 6(4), pp. 522-530.

Grayburn, et al., “The Relationship between the Magnitude of Reduction in Mitral Regurgitation Severity and Left Ventricular and Left Atrial Reverse Remodeling after MitraClip Therapy,” Circulation: Cardiovascular Imaging, September 2013, epub, September 6, 2013.

Lim, et al., “Improved Functional Status and Quality of Life in Prohibitive Surgical Risk Patients With Degenerative Mitral Regurgitation Following Transcatheter Mitral Valve Repair with the MitraClip® System,” Journal of American College of Cardiology, 2013, In Press, Accepted Manuscript, Available online, October 31, 2013.

Maisano, F., et al., “Percutaneous Mitral Valve Interventions in the Real World: Early and One Year Results From the ACCESS-EU, a Prospective, Multicenter, Non-Randomized Post-Approval Study of the MitraClip Therapy in Europe,” Journal of American College of Cardiology, 2013, doi: 10.1016/j.jacc.2013.02.094.

Mauri, et al., “4-Year Results of a Randomized Controlled Trial of Percutaneous Repair Versus Surgery for Mitral Regurgitation,” Journal of American College of Cardiology, 2013, Vol. 62(4), pp. 317-328.

Munkholm, et al., “A Systematic Review on the Safety and Efficacy of Percutaneous Edge-to-Edge Mitral Valve Repair with the MitraClip System for High Surgical Risk Candidates,” Heart, June 27, 2013.

Reichenspurner, H., et al., “Clinical Outcomes Through 12 Months in Patients With Degenerative Mitral Regurgitation Treated With the MitraClip Device in the ACCESS-Europe Phase I Trial,” European Journal of Cardio-Thoracic Surgery, 2013, Vol. 15, pp. 919-927.

Whitlow, et al., “Acute and 12-Month Results With Catheter-Based Mitral Valve Leaflet Repair: The EVEREST II (Endovascular Valve Edge-to-Edge Repair) High Risk Study,” Journal of American College of Cardiology, 2012, Vol. 59, pp. 130-139.

The applicant cited clinical data from the EVEREST II High-Risk Study and the EVEREST II (REALISM) Continued Access Study/Registry. The applicant also cited clinical data from a high-risk cohort of patients (the EVEREST II High-Risk Cohort), which is an integrated analysis of the following: (1) patients within the EVEREST II High-Risk Study who met eligibility criteria for being too high-risk to undergo mitral valve repair surgery; and (2) patients within the EVEREST II (REALISM) Continued Access Study/Registry who were too high-risk for surgery using identical eligibility inclusion criteria. The applicant also cited data from the Prohibitive Risk Degenerative Mitral Regurgitation (DMR) Cohort, which is an analysis of retrospectively evaluated high-risk patients diagnosed with DMR enrolled in the EVEREST II studies that had 1-year follow-up available.

In addition to the published clinical experience from the EVEREST studies, the applicant cited data on the use of the MitraClip® System device in a “real-world” setting published recently by a select number of European centers as part of their individual and/or multi-center commercial experience or enrollment in the MitraClip® System device group of the ACCESS-EU post-approval clinical trial in Europe. The European use of the MitraClip® System device is focused on patients who are too high-risk for surgery, and patients who are selected for therapy using a multi-disciplinary “heart team” approach.

The applicant stated that published reports on the MitraClip® System device and the procedures in which the device was used have consistently demonstrated a significant reduction in MR that has been durable out to 1, 2, 3, and 4 years. The applicant cited the EVEREST II High-Risk Study (an analysis of 78 patients diagnosed with degenerative or functional MR enrolled in the trial), which stated that “objective measures of MR grade improved in the MitraClip™ group, including MR grade of ≤2+ in 78 percent of surviving patients at 1 year. These patients also experienced clinically significant improvements in left ventricular volume measurements. The clinical significance of these improvements is reflected in the NYHA class improvements. At baseline, 89 percent of patients were NYHA III/IV, improving to Class I/II in 74 percent of surviving patients at 12 months. Quality of life scores also improved significantly. Finally, the number of admissions for heart failure was significantly reduced compared to the year prior to MitraClip™ therapy.”

The applicant cited clinical outcomes from the Prohibitive Risk DMR cohort. These results are the basis of the FDA premarket approval. Major effectiveness endpoints evaluated at 12 months demonstrated clinically important improvements in MR severity, with MR severity grades of 3+/4+ decreasing from 90.4 percent at baseline to 16.7 percent at 1 year; NYHA Class III/IV decreasing from 86.6 percent at baseline to 13.1 percent at 1 year; and the SF-36 Physical/Mental scale measuring 33.4/46.6 at baseline increasing to 39.4/52.2 at 1 year.

The applicant stated in its new technology add-on payment application that, “Heart failure hospitalizations were reduced by 73 percent in the 12 months post MitraClip™ procedure from the 12 month pre-MitraClip™ procedure . . .,” and “the primary safety analysis indicated low procedural (30-day) mortality (6.3 percent) after MitraClip™ in comparison with the STS predicted surgical mortality risk score for these patients (13.2 percent).”

The applicant discussed published results “assessing the relationship between the magnitude of reduction in MR and left ventricular (LV) and left atrial (LA) remodeling after the MitraClip™ therapy.” In this study of patients diagnosed with significant (grade 3+ or 4+) DMR or functional MR (FMR), the authors found that, “even reduction of MR severity to moderate (2+) is associated with LV and LA reverse remodeling. In both DMR and FMR, reduction in left ventricular end-diastolic volume (LVEDV) and LA volumes were improved proportionally to the degree of MR reduction at one year.”

Grayburn, et al., “The Relationship between the Magnitude of Reduction in Mitral Regurgitation Severity and Left Ventricular and Left Atrial Reverse Remodeling after MitraClip Therapy,” Circulation: Cardiovascular Imaging, September 2013, epub, September 6, 2013.

In conclusion, the applicant cited data from the ACCESS-EU study, which noted improvement in disease-specific quality of life measures, including the Minnesota Living with Heart Failure Questionnaire and Six-Minute Walk Test. The applicant also provided data supporting the overall safety and effectiveness of the MitraClip® System device in European “real-world” outcome studies.

We stated in the FY 2015 IPPS/LTCH PPS proposed rule that, as noted in the FY 2014 IPPS/LTCH PPS proposed rule (78 FR 27547 through 27552), we are concerned that the applicant revised its initial FDA request for the use of the MitraClip® System device in all patients diagnosed with significant MR, after learning that the FDA expressed concern that the initial study, EVEREST II, demonstrated that, while the MitraClip® System device had clinically meaningful improvements in LV volume and QOL, the surgical option had better outcomes than the MitraClip® System device in surgical candidates. The FDA then required a second trial focused on high surgical risk patients. We noted that the data evaluated by the FDA and presented by the applicant in its application for new technology add-on payments included information from the following:

  • EVEREST I feasibility trial; enrollment 2003-2006; 55 patients.
  • EVEREST II RCT; enrollment 2005-2008; 279 patients.
  • EVEREST II High-Risk Study; enrollment 2007-2008; 78 patients. (A comparator group of 36 patients was identified from patients who were screened for the study, but did not meet the mitral valve anatomic criteria for placement of the device.)
  • EVEREST (REALISM) Continued Access Study and compassionate use; enrollment 2009-2013; 49 patients.

The applicant provided comparisons of various outcomes prior to the procedure using the MitraClip® System device and outcomes 12 months later. MR severity, LV end diastolic volume, NYHA Class, SF-36 Physical/Mental scale, and heart failure hospitalization rates all had clinically meaningful improvements. For the EVEREST II HRS, the applicant provided an analysis demonstrating a significant survival benefit over the comparator group (76 percent versus 55 percent; p<0.047).

We stated in the FY 2015 IPPS/LTCH PPS proposed rule that in our review of the clinical trials' data, we have the following key points of concern:

  • Post-hoc analyses of pooled data sets retain all of the individual shortcomings of the individual data sets;
  • Pooling does not enhance the utility and scientific value of uncontrolled single-arm registries with no comparators; and
  • Inappropriate pooling introduces additional confounders.

We stated that it is also unclear whether the appropriate target population for the MitraClip® System device has been identified because the clinical trials conducted by the applicant included patients diagnosed with both DMR and FMR. This makes it difficult to determine which group of patients may benefit more, or less, from the new technology. For example, in a subgroup analysis of the EVEREST II RCT, the authors concluded that older patients and those patients diagnosed with FMR or abnormal left ventricular function had results more comparable to surgical repair. Data from 2 years of the EVEREST II RCT also demonstrated that surgery reduced MR more than the procedures performed using the percutaneous MitraClip® System device. However, both the surgical patients and the patients who were treated using the MitraClip® System device showed comparable results for improved left ventricular function, NYHA functional class, and quality of life.

In the FY 2015 IPPS/LTCH PPS proposed rule, we invited public comments on whether this technology meets the substantial clinical improvement criterion, particularly in comparison to other surgical therapies, such as mitral valve repair or replacement, and the appropriate target population for this technology.

Comment: A number of commenters agreed with the applicant that the MitraClip® System meets the substantial clinical improvement criterion. The commenters also recommended the approval of the MitraClip® System for new technology add-on payments in FY 2015. One commenter, an association of thoracic surgeons, expressed support for the approval of the MitraClip® System for new technology add-on payments. The commenter explained that the MitraClip® System provides a treatment option to Medicare beneficiaries that represents a substantial clinical improvement for patients who are too high risk for surgical mitral valve repair or replacement. Other commenters indicated that they had experience using the MitraClip® System.

Response: We appreciate the commenters' support. Many of the commenters described their positive experiences using the MitraClip® System, which improved the clinical outcome of the patients treated. Furthermore, the commenters believed that most, if not all, of the cases treated using the MitraClip® System would have had no other treatment option available. In addition, the commenters asserted that the MitraClip® System helped to provide improvements to the quality of life of the patients treated with the technology. We considered the commenters' positive experiences using the MitraClip® System in our determination of whether the MitraClip® System represents a substantial clinical improvement in the treatment options available to Medicare beneficiaries.

Comment: The applicant submitted a public comment that stated peer-reviewed evidence supported the belief that the MitraClip® System meets the substantial clinical improvement criterion. The applicant further noted that in previous rulemaking, CMS has indicated that new technologies represent a substantial clinical improvement if “the device offers a treatment option for a patient population unresponsive to, or ineligible for, currently available treatment.” The commenter believed that the MitraClip® System meets this criterion when used in accordance with the FDA-approved indication for the treatment of prohibitive risk degenerative mitral regurgitation (DMR). Specifically, the applicant stated that for those patients who are ineligible for surgery due to prohibitive surgical risk, the MitraClip® System offers the first available option to mechanically correct their mitral valve disease and, therefore, improve cardiac functioning and functional status and quality of life, while decreasing heart failure related hospitalizations and potentially reducing mortality.

The applicant reiterated the opinion that the clinical evidence demonstrated that the technology represents a substantial clinical improvement in the treatment options available to Medicare beneficiaries for the following reasons:

Lim, et al., “Improved Functional Status and Quality of Life in Prohibitive Surgical Risk Patients With Degenerative Mitral Regurgitation Following Transcatheter Mitral Valve Repair with the MitraClip® System,” Journal of American College of Cardiology, 2013, In Press, Accepted Manuscript, Available online, October 31, 2013.

MitraClip® Clip Delivery System Instructions for Use, at abbottvascular.com/ifu.

  • A majority of patients experience MR reduction from 3+/4+ to ≤2+ after the procedure. This improvement is sustained in 83 percent of patients at 12 months. Results at 2 years demonstrated that 82.5 percent of surviving patients remained at ≤2+, indicating no evidence of deterioration of MR severity between 1-year and 2-year follow-up.
  • Reduction in MR with the MitraClip therapy to ≤2+ has been shown to provide significant symptomatic DMR patients with meaningful clinical benefits including reduction of left ventricular volumes.
  • Patients experienced clinically important improvement in NYHA Functional Class at 12 months; roughly 87 percent of patients experienced NYHA Class III or Class IV symptoms at baseline, which improved to less than 15 percent at 12 months.
  • Despite the elderly and highly comorbid nature of the population, quality of life scores improved. The improvements in both the Physical Component Summary and Mental Component Summary scores exceeded the 2-3 point threshold generally considered to represent a minimum clinically important difference.
  • Heart failure hospitalizations were reduced by 73 percent in the 12 months post-MitraClip procedure from the 12 months pre-MitraClip procedure.

The commenter concluded that, in recognition of these benefits, the 2014 AHA/ACC Guidelines for the Management of Patients with Valvular Heart Disease recommended the MitraClip therapy as a treatment option for the FDA-approved indication. The commenter noted that the guidelines state that TMVR may be considered for severely symptomatic patients (NYHA Class III to Class IV) with chronic severe primary MR (stage D) who have favorable anatomy for the repair procedure and a reasonable life expectancy, but who have a prohibitive surgical risk because of severe comorbidities and remain severely symptomatic despite optimal guideline-directed medical therapy (GDMT) for HF.

The applicant also addressed CMS' concerns presented in the proposed rule. Specifically, with respect to the concern regarding the appropriate target population for this technology, the commenter believed that the target population has been clearly defined in the FDA approval indication and associated labeling for the MitraClip® System. The applicant noted that since the publication of the proposed rule, as stated above, the AHA/ACC has reviewed the MitraClip® System evidence and updated their guidelines to recommend consideration for the use of the MitraClip® System for patients meeting the FDA-approved indication. In addition, the applicant indicated that the CMS Coverage and Analysis Group has also reviewed the MitraClip® evidence and issued a proposed decision memorandum to extend coverage for the FDA-approved indication at highly experienced centers of excellence meeting specific criteria. Further, the applicant noted that detailed multi-society requirements have been published specifying operator and institutional criteria for performing the MitraClip® System procedure, and these have been incorporated by CMS into the proposed decision memorandum. Finally, the applicant stated that it has worked together with national societies and CMS to establish a new mitral module of the national TVT registry to systematically track adherence to these requirements by all health care centers using the MitraClip® System and to collect data on patient outcomes with linkage to the CMS claims database.

With respect to CMS' concerns regarding how the MitraClip® System compares to other surgical therapies, such as mitral valve repair or replacement, the applicant stated that clinical outcomes from the prohibitive risk DMR Cohort were determined by the FDA to adequately establish the safety, effectiveness, and positive benefit-risk profile of the MitraClip® System for the indicated population, and these data are the basis for Premarket Approval Application (PMA) approval. In conclusion, the applicant stated that the FDA concluded that the totality of clinical evidence demonstrated the reasonable assurance of safety and effectiveness of the MitraClip® System to reduce MR and provide patient benefit in this discrete and specific patient population.

The applicant also commented that the prohibitive risk DMR Cohort, on which FDA approval was granted, included 127 consecutively-enrolled patients who completed 12 months of follow-up after treatment with the MitraClip® System device. The applicant explained that this Cohort included 25 patients from the EVEREST II High Risk Registry (HRR) study, 98 patients from the high risk arm of the REALISM Continued Access study, and 4 Compassionate Use patients. The applicant further explained that the four Compassionate Use patients are included for analysis in the Prohibitive Risk DMR Cohort because they meet the definition of prohibitive risk and all valve anatomic criteria for eligibility. For inclusion in this Cohort, three physicians (two experienced mitral valve surgeons and one experienced mitral valve cardiologist) had to concur that the patient met the definition of prohibitive risk.

The applicant further stated that patients in the prohibitive risk DMR Cohort were all enrolled under a highly-rigorous IDE clinical trial protocol that included pre-specified eligibility criteria and adjudicated endpoints. The applicant stated that pooling of the EVEREST II Continued Access Study (REALISM) data with EVEREST II HRR was intended and pre-specified in the REALISM protocol. The applicant noted that one of the REALISM protocol's stated objectives was to gather additional safety and effectiveness data to support the PMA. The applicant further stated that the same device design was used, and care was taken to ensure the two studies had identical entry criteria, data collection, monitoring, and analysis methods. In addition, the applicant stated that the REALISM protocol defined the evaluation of poolability and specified clinically important baseline variables to be compared. The applicant stated that the majority (10/13) of these baseline characteristics, especially high-risk characteristics/comorbidities, was similar in REALISM and HRR, resulting in comparable average STS predicted mortality risk scores.

The applicant stated that the findings from the prohibitive risk DMR Cohort were highly consistent with real-world evidence from a large number of published European studies that included similar groups of high-risk patients.

The applicant concluded that, despite some limitations in evaluating evidence from pooled datasets, all available evidence on the MitraClip® System consistently indicates that the use of this technology provides both mechanistic and clinical benefit for these high surgical risk patients.

Response: We appreciate the applicant's subsequent analysis of data. With respect to the substantial clinical improvement represented by this technology, we considered all of the case-specific clinical information presented by the applicant and the public to determine whether there is evidence to support a conclusion that the use of the MitraClip® System represents a substantial clinical improvement in the treatment options available to Medicare beneficiaries. Specifically, we considered the peer-reviewed medical literature, clinical studies, and the clinically accepted use of the device. We believe that it is important that the MitraClip® System be used in the treatment of the appropriate target population and that the NCD will establish the appropriate Medicare patient population for this procedure. We agree with the applicant that the MitraClip® System offers a treatment option for a patient population unresponsive to, or ineligible for, currently available treatment; specifically, those patients who have been determined to be at prohibitive risk for mitral valve surgery (per the FDA indications). In addition, we received positive comments from a major cardiovascular society and a major thoracic society and from many physicians who indicated that the MitraClip® System helped to produce positive clinical outcomes by providing a treatment option for patients with no other available options, as well as resolving MR. Furthermore, the MitraClip® System is the only device currently available to mechanically correct mitral valve disease. Without the availability of this device, patients with DMR might otherwise receive only general treatment to maintain their condition, which would eventually result in death, rather than treatment to resolve their condition. Also, the MitraClip® System can be an effective treatment option that improves quality of life and reduces heart failure symptoms and hospitalizations. Therefore, after reviewing the totality of the evidence, we believe that the MitraClip® System represents a substantial clinical improvement over existing therapies. We remain interested in whether the clinical evidence will continue to show that the MitraClip® System is effective. We will continue to monitor the clinical data as the data become available.

After consideration of the public comments we received, we are approving the MitraClip® System for new technology add-on payments in FY 2015. As noted above, any payment made under the Medicare program for services provided to a beneficiary is contingent upon CMS' coverage of the item, and any restrictions on the coverage apply. This approval is based on use of the MitraClip® System consistent with any coverage decision that will be issued by CMS after the publication of this final rule. Subject to any coverage determinations made by CMS regarding the MitraClip® System, cases involving the MitraClip® System that are eligible for the new technology add-on payments will be identified by ICD-9-CM procedure code 35.97. The average cost of the MitraClip® System is reported as $30,000. Under § 412.88(a)(2) of the regulations, new technology add-on payments are limited to the lesser of 50 percent of the average cost of the device or 50 percent of the costs in excess of the MS-DRG payment for the case. As a result, the maximum add-on payment for a case involving the MitraClip® System is $15,000 for FY 2015.
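To illustrate the payment limit described in the preceding paragraph, the following is a minimal sketch, in Python, of the § 412.88(a)(2) calculation. The case cost and MS-DRG payment figures in the example are hypothetical placeholders; only the $30,000 average cost of the MitraClip® System and the resulting $15,000 cap come from the discussion above.

```python
# A minimal sketch of the new technology add-on payment limit described in
# Sec. 412.88(a)(2): the add-on payment is the lesser of 50 percent of the
# average cost of the technology or 50 percent of the costs of the case in
# excess of the MS-DRG payment. The case cost and MS-DRG payment below are
# hypothetical placeholders used only for illustration.

def new_tech_add_on_payment(case_cost: float, ms_drg_payment: float,
                            avg_technology_cost: float) -> float:
    """Return the new technology add-on payment for a single case."""
    excess_cost = max(case_cost - ms_drg_payment, 0.0)
    return min(0.5 * avg_technology_cost, 0.5 * excess_cost)

# Example: for a hypothetical case whose costs exceed the MS-DRG payment by
# more than the $30,000 average device cost, the add-on payment is capped at
# 50 percent of that average cost, or $15,000.
print(new_tech_add_on_payment(case_cost=75_000.0,
                              ms_drg_payment=40_000.0,
                              avg_technology_cost=30_000.0))  # 15000.0
```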

e. Responsive Neurostimulator (RNS®) System

NeuroPace, Inc. submitted an application for new technology add-on payments for FY 2015 for the use of the RNS® System. (We note that the applicant submitted an application for new technology add-on payments for FY 2014, but failed to receive FDA approval prior to the July 1 deadline.) Seizures occur when brain function is disrupted by abnormal electrical activity. Epilepsy is a brain disorder characterized by recurrent, unprovoked seizures. According to the applicant, the RNS® System is the first implantable medical device (developed by NeuroPace, Inc.) for treating persons diagnosed with epilepsy whose partial onset seizures have not been adequately controlled with antiepileptic medications. The applicant further stated that the RNS® System is the first closed-loop, responsive system to treat partial onset seizures. Responsive electrical stimulation is delivered directly to the seizure focus in the brain when abnormal brain activity is detected. A cranially implanted programmable neurostimulator senses and records brain activity through one or two electrode-containing leads that are placed at the patient's seizure focus/foci. The neurostimulator detects electrographic patterns previously identified by the physician as abnormal, and then provides brief pulses of electrical stimulation through the leads to interrupt those patterns. Stimulation is delivered only when abnormal electrocorticographic activity is detected. The typical patient is treated with a total of 5 minutes of stimulation a day. The RNS® System incorporates remote monitoring, which allows patients to share information with their physicians remotely.

With respect to the newness criterion, the applicant stated that some patients diagnosed with partial onset seizures that cannot be controlled with antiepileptic medications may be candidates for the vagus nerve stimulator (VNS) or for surgical removal of the seizure focus. According to the applicant, these treatments are not appropriate for, or helpful to, all patients. Therefore, the applicant believed that there is an unmet clinical need for additional therapies for partial onset seizures. The applicant further stated that the RNS® System addresses this unmet clinical need by providing a novel treatment option for treating persons diagnosed with medically intractable partial onset seizures. The applicant received FDA premarket approval in November 2013. The following ICD-9-CM procedure codes are used to identify this technology: 01.20 (Cranial implantation or replacement of neurostimulator pulse generator); 01.29 (Removal of cranial neurostimulator pulse generator); and 02.93 (Implantation or replacement of intracranial neurostimulator lead(s)).

In the FY 2015 IPPS/LTCH PPS proposed rule, we invited public comments on whether the technology meets the newness criterion. However, we did not receive any public comments in response to the proposed rule regarding whether the technology meets the newness criterion. The applicant received FDA premarket approval on November 14, 2013. Therefore, for purposes of determining eligibility for FY 2015 IPPS new technology add-on payments, we consider this technology to be “new” as of November 14, 2013, and we will use the following ICD-9-CM procedure codes to identify the technology for purposes of new technology add-on payments: 01.20 (Cranial implantation or replacement of neurostimulator pulse generator); 01.29 (Removal of cranial neurostimulator pulse generator); and 02.93 (Implantation or replacement of intracranial neurostimulator lead(s)).

With regard to the cost criterion, the applicant stated that substantially all cases eligible for the RNS® System would map to MS-DRG 024 (Craniotomy with Major Device Implant/Acute Complex Central Nervous System Principal Diagnosis without MCC). The applicant further stated that, while it is possible for some cases to occur in MS-DRG 023 (Craniotomy with Major Device Implant/Acute Complex Central Nervous System Principal Diagnosis with MCC or Chemotherapy Implant), it would be extremely rare because the applicant believed that such major complications and/or comorbidities would probably preclude a patient from receiving treatment using the RNS® System, given that implantation of the technology is an elective procedure.

The applicant submitted two analyses to demonstrate that the technology meets the cost criterion. For the first analysis, the applicant used clinical trial claims data collected in the RNS® System Pivotal Clinical Investigation to calculate the anticipated average case-weighted standardized charge per case. The applicant maintained that this analysis best represents the anticipated charges for the technology because it is based on actual cases treated using this technology. The applicant analyzed 163 claims from 28 hospitals participating in the clinical trial. Five claims from one hospital were excluded because no hospital-specific information regarding standardization was available. The resulting 158 claims included dates of service ranging from May 2006 through May 2009. The average case-weighted standardized charge per case for these 158 claims was $54,691.

The applicant then standardized the charges for each claim. The applicant noted that it was not necessary to remove any charges from these claims because the technology was provided at no charge in the trial. After standardizing the charges for each claim, the applicant inflated the charges reported on each claim using the BLS' CPI-IP data covering the same period. Specifically, because the publicly available FY 2012 MedPAR data do not identify the month of the discharge on inpatient claims, but do identify the calendar quarter, the applicant used a mid-month convention to determine the relevant monthly CPI-IP for each calendar quarter. The applicant then calculated the percentage change from the relevant quarter to the quarter of the most recently available CPI-IP, which was the August 2013 CPI-IP. Specifically, the applicant used the following assumptions:

FY 2012 calendar quarter | Midpoint of quarter | CPI-IP | Percent change to August 2013
Q4 2011 | Nov-11 | 242.672 | 7.93
Q1 2012 | Feb-12 | 245.721 | 6.59
Q2 2012 | May-12 | 247.646 | 5.76
Q3 2012 | Aug-12 | 248.856 | 5.25
Most recent as of application | Aug-13 | 261.915 |
Source as cited by applicant: Bureau of Labor Statistics' Web site, accessed October 13, 2013; Base Period: December 1996 = 100.
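The inflation step summarized in the table can be expressed as a short computational sketch. It assumes only the CPI-IP values shown above and reproduces the percent changes to the August 2013 CPI-IP under the mid-month convention the applicant described.

```python
# A small sketch of the applicant's inflation step, assuming the CPI-IP values
# shown in the table above (Base Period: December 1996 = 100). Each FY 2012
# calendar quarter is assigned the CPI-IP for its middle month (the mid-month
# convention), and charges are trended forward by the percent change from that
# month to the August 2013 CPI-IP.

CPI_IP_MID_MONTH = {
    "Q4 2011": ("Nov-11", 242.672),
    "Q1 2012": ("Feb-12", 245.721),
    "Q2 2012": ("May-12", 247.646),
    "Q3 2012": ("Aug-12", 248.856),
}
CPI_IP_AUG_2013 = 261.915  # most recent value cited in the application

def inflate_to_aug_2013(charge: float, quarter: str) -> float:
    """Trend a standardized charge from its discharge quarter to August 2013."""
    _, cpi = CPI_IP_MID_MONTH[quarter]
    return charge * (CPI_IP_AUG_2013 / cpi)

for quarter, (month, cpi) in CPI_IP_MID_MONTH.items():
    pct_change = (CPI_IP_AUG_2013 / cpi - 1.0) * 100.0
    print(f"{quarter} ({month}): {pct_change:.2f} percent")  # 7.93, 6.59, 5.76, 5.25
```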

After inflating the charges, the applicant estimated charges for the RNS® System by marking up the device cost to the hospital by an anticipated 100 percent (or, equivalently, by dividing the device cost by a CCR of 0.50). The applicant based its estimated CCR on four analyses. First, the applicant reviewed the 2007 and 2008 reports prepared by RTI for CMS on charge compression, which reported national aggregate CCRs for devices and implants of 0.43 and 0.467, respectively. Second, the applicant queried hospitals participating in the RNS® System Pivotal trial, and these queries yielded a mean and median CCR for implantable devices of 0.37 and 0.36, respectively. Third, the applicant reviewed data from the (All Payor) Premier database for cases performed during 2000 through 2010 that reported ICD-9-CM procedure codes 02.93 and/or 86.95 on a claim, and calculated a mean and median CCR for implanted leads and neurostimulators of 0.50 and 0.44, respectively. Fourth, the applicant reviewed discussions of past new technology add-on payment applications published in the Federal Register, and noted that other applicants had used lower CCRs (that is, higher markups) for implanted devices than the CCR of 0.50 used in the applicant's analyses.
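The relationship between a CCR and the hospital markup it implies, which underlies the applicant's choice of a 0.50 CCR, can be summarized in a brief sketch. The CCR values below are those cited in the paragraph above; the sketch is illustrative only and is not part of the applicant's submission.

```python
# A brief sketch of the CCR-to-markup relationship: if charge = cost / CCR,
# then the markup over cost implied by a CCR is (1 / CCR - 1). A CCR of 0.50
# therefore corresponds to a 100 percent markup, as assumed by the applicant.

def implied_markup_pct(ccr: float) -> float:
    """Return the markup over cost, in percent, implied by a given CCR."""
    return (1.0 / ccr - 1.0) * 100.0

# CCR values cited above: the RTI reports, the trial-hospital queries, the
# Premier database analysis, and the 0.50 CCR used in the applicant's analyses.
for ccr in (0.43, 0.467, 0.37, 0.36, 0.44, 0.50):
    print(f"CCR {ccr:.3f} -> markup of {implied_markup_pct(ccr):.0f} percent")
```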

Using this approach, the applicant added the anticipated hospital charge for the implantable RNS® System to the average case-weighted standardized charge per case, and determined a final average case-weighted standardized charge per case of $128,723. The anticipated hospital charge for the implantable RNS® System is $73,900. Using the FY 2014 IPPS Table 10 thresholds, the threshold for MS-DRG 024 is $91,197. Because the final average case-weighted standardized charge per case of $128,723 for MS-DRG 024 exceeds the average case-weighted threshold amount, the applicant maintained that the RNS® System meets the cost criterion.
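The cost criterion comparison for the applicant's first analysis can be sketched as follows. The $73,900 device charge, the $128,723 final average case-weighted standardized charge per case, and the $91,197 Table 10 threshold are the figures cited above; the base charge in the sketch is simply back-calculated from the first two figures for illustration.

```python
# A minimal sketch of the cost criterion check for the first analysis. The
# device charge of $73,900 corresponds to a $36,950 device cost marked up at
# a 0.50 CCR, and the final average case-weighted standardized charge per
# case is compared to the FY 2014 Table 10 threshold for MS-DRG 024.

TABLE_10_THRESHOLD_MS_DRG_024 = 91_197.0
DEVICE_CHARGE = 73_900.0                     # anticipated charge for the RNS System

final_charge_per_case = 128_723.0            # figure cited above
base_charge_per_case = final_charge_per_case - DEVICE_CHARGE   # 54,823.0, implied

meets_cost_criterion = final_charge_per_case > TABLE_10_THRESHOLD_MS_DRG_024
print(base_charge_per_case, meets_cost_criterion)              # 54823.0 True
```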

In the second analysis, which the applicant characterized as supplementary, the applicant researched the FY 2012 MedPAR file for cases reporting the following combinations of ICD-9-CM procedure codes: 02.93 and 86.95, or procedure codes 02.93 and 01.20, that mapped to MS-DRG 024. The applicant found 383 claims for cases reporting the combination of ICD-9-CM procedure codes 02.93 and 01.20, and pointed out that these cases were coded with procedure code 01.20 in error because no new RNS® System implantations occurred after May 2009. The applicant analyzed these 383 claims, and found that more than 90 percent of these cases had a primary or secondary diagnosis of Parkinson's disease, essential tremor, or dystonia. These diagnoses are FDA-approved indications for deep brain stimulation (DBS). In addition, the applicant noted that the total covered charges for these cases were less than the estimated charges for a full DBS system, and hypothesized that these cases did not represent implantation of a full DBS system, but did represent the implantation of leads only. The applicant contacted two hospitals that reported claims for cases where total covered charges were less than the charges for a full DBS system, and the hospitals confirmed that their claims represented lead implantations only. Therefore, for the second analysis, the applicant included all of the cases assigned to MS-DRG 024 reporting a combination of ICD-9-CM procedure codes 02.93 and 86.95, and all of the cases assigned to MS-DRG 024 reporting a combination of ICD-9-CM procedure codes 02.93 and 01.20 where the covered charges were greater than, or equal to, the estimated charges of a full DBS system. The applicant maintained that 374 claims from 106 providers met this criterion, and that the data represented claims from the fourth calendar quarter of 2011 through the third calendar quarter of 2012. Based on these criteria, the applicant calculated an average case-weighted standardized charge per case of $65,555.

The applicant then removed DBS charges from the average case-weighted standardized charge per case. The applicant estimated charges for a full DBS system, and maintained that the average cost for a full DBS system is $25,979. Similar to its first analysis, the applicant assumed a CCR of 0.50, or 100 percent markup, which resulted in estimated charges for a full DBS system of $51,958. After removing the DBS system charges, the applicant inflated the charges to the current period using the same methodology in the first analysis, added charges for the RNS® System, and determined a final average case-weighted standardized charge per case of $130,233. As noted above, the anticipated hospital charge for the implantable RNS® System is $73,900. Using the FY 2014 IPPS Table 10 thresholds, the average case-weighted threshold for MS-DRG 024 is $91,197. Because the final average standardized charge per case of $130,233 for MS-DRG 024 exceeds the threshold amount, the applicant maintained that the RNS® System meets the cost criterion.

Under either analysis, the applicant maintained that the final average case-weighted standardized charge per case would exceed the average case-weighted threshold.

In the FY 2015 IPPS/LTCH PPS proposed rule, we invited public comments on whether the RNS® System meets the cost criterion, particularly based on the assumptions and methodology used in the applicant's analyses. However, we did not receive any public comments in response to the proposed rule regarding whether this technology meets the cost criterion. After further evaluation of the new technology add-on payment application, we believe that the technology meets the cost criterion.

With regard to substantial clinical improvement, as previously stated, some patients diagnosed with partial onset seizures may not be able to control their seizures with antiepileptic medications, VNS, or with surgical removal of the seizure focus. The applicant stated that the RNS® System provides treatment for those patients diagnosed with partial onset seizures who fail treatment with antiepileptic medications or VNS therapy, and who are ineligible for resective surgery because of the extent and/or location of the seizure focus, or who do not elect surgery. According to the applicant, the RNS® System clinical trials provide Class I evidence that treatment using the RNS® System substantially reduces disabling seizures in patients diagnosed with severe epilepsy, who have tried and failed treatment with antiepileptic medications, and in many cases, VNS or epilepsy surgery. The applicant maintained that the results from their clinical trials demonstrate significant and sustained improvements in health outcomes over the controlled period and over the long term. The applicant conducted a feasibility trial, which was designed to demonstrate adequate safety of the treatment and to provide evidence of effectiveness to support commencement of a randomized, double-blinded pivotal trial. In addition, the applicant has an ongoing long-term treatment clinical investigation trial (LTT trial) to assess, for an additional seven years, the long-term safety and effectiveness of the treatment in patients who have completed either the Feasibility trial or the RNS® System Pivotal trial. The LTT trial started in April 2006, and the final patient is expected to complete the trial in 2018. The applicant noted that patients enrolled in the LTT trial continued to experience a reduction in seizures over several years of follow-up, further demonstrating that the positive effect of responsive stimulation from the RNS® System is durable.

The applicant stated that their pivotal trial met its primary effectiveness endpoint by demonstrating a statistically significantly greater reduction in seizures in the treatment group compared to the control group (p = 0.012). Significant improvements at 1 and 2 years post-implant included:

  • A significant reduction in disabling seizures of 44 percent and 53 percent at 1 and 2 years, respectively;
  • Fifty-five percent of patients who reached 2 years post-implant experienced a 50 percent or greater reduction in seizures; and
  • Significant improvements in overall quality of life, as well as individual quality of life measures including memory, language, attention, concentration and medication effects.

The applicant asserted that there was no negative effect of treatment using the RNS® System on neuropsychological function (including verbal functioning, visual spatial processing, and memory) or mood. The applicant concluded that the RNS® System Pivotal trial provides Class I evidence that responsive cortical stimulation is effective in significantly reducing seizure frequency in adults with one or two seizure foci who have failed two or more antiepileptic medication trials. The applicant stated that experience across all of the RNS® System trials demonstrates that the reduction in the frequency of disabling partial onset seizures improves over time. In addition, the applicant noted that sustained improvements were also seen in quality of life. Finally, the applicant noted that safety and tolerability measures compare favorably to alternative treatments, such as antiepileptic medications, VNS, and epilepsy surgery.

With regard to the substantial clinical improvement criterion, we stated in the proposed rule that we are concerned that the average age of the patients enrolled in the applicant's trials was 35 years. Although the applicant maintained that 31 percent of the patients enrolled in the pivotal trial were Medicare beneficiaries, we are unsure of the extent to which this technology would be used by Medicare beneficiaries because of the relatively young age of the majority of the patients enrolled in the pivotal trial. We also were concerned that the applicant did not provide further clarification on how the RNS® System compares to other neurostimulation treatments.

Because the applicant included claims with DBS charges in one of its cost analyses, we believe that the similarities and differences between DBS and the RNS® System may also be relevant under the substantial clinical improvement criterion. In addition, we stated in the proposed rule that we are concerned that the time period in the clinical trial may not be sufficient to confirm durability. In the RNS® System Pivotal Clinical Investigation, the primary effectiveness endpoint considered seizure frequency over the last 3 months of the blinded period of the trial. We note that the applicant is currently conducting a 5-year study.

We invited public comments on whether the RNS® System meets the substantial clinical improvement criterion, particularly in regard to the degree to which the technology would be used by Medicare beneficiaries, the comparison to other neurostimulation treatments, and its durability.

Comment: Commenters stated that the technology is currently used and will continue to be used in the treatment of Medicare beneficiaries who have been diagnosed with epilepsy. One commenter noted that 31 percent of individuals in the RNS® System clinical trial were Medicare beneficiaries, and all of these individuals were enrolled in the Medicare program because of a disability as opposed to being enrolled in the Medicare program because of their age. In addition, the commenter provided an analysis of data obtained from publicly available databases, specifically using the Premier Perspective all payor database for the time period from 2008 through 2013 and the CMS MedPAR database for FY 2012 and FY 2013. This analysis showed that, for Medicare beneficiaries who have been diagnosed with medically intractable partial epilepsy, 72 to 77 percent of the Medicare claims were submitted for payment of services provided to patients who were under the age of 65. The commenter also queried the public Web sites of the healthcare centers that participated in the RNS® System Pivotal trial, which included data on patients who have participated in specific programs directed by 120 adult comprehensive epilepsy centers, and found that these centers reported that 33 percent of their patients who have been diagnosed with epilepsy were enrolled in the Medicare program and 76 percent of these Medicare beneficiaries were under the age of 65. Several other commenters asserted that patients who have been diagnosed with epilepsy and receive treatment using this technology would be eligible for Medicare based on a disabling condition. These commenters provided examples of the types of patients that they have treated who are younger than the age of 65, but who are insured through the Medicare program based on a disabling condition.

Response: We appreciate the information detailed within the commenter's analysis. We agree with the commenters that this technology will be available for use by Medicare beneficiaries.

Comment: Commenters provided comparison analyses for this technology and VNS therapy, DBS, surgical resection, and other medications, and also conducted assessments of the durability of the RNS® System. (We further discuss the results of the comparison analyses and assessments conducted by these commenters below.) Many of these commenters pointed out that this technology is capable of capturing and storing information regarding seizure activity, which could enable the use of this technology to initiate possible changes in medical management of patients treated with an implant over time.

In comparison to VNS therapy, commenters stated that the RNS® System is a closed loop system that provides electrical stimulation in response to brain activity, while VNS therapy is an open loop system that provides electrical stimulation continuously or intermittently at programmed intervals. In addition, commenters stated that the RNS® System can be applied directly to the seizure focus or foci in the brain, while VNS therapy provides stimulation to the vagus nerve. The commenters noted that this distinction represents an improvement relative to VNS therapy because patients receive less stimulation using the RNS® System. The commenters also pointed out that the side effects of VNS therapy, such as hoarseness, coughing, and throat pain, are distressing and uncomfortable for patients and can make VNS therapy difficult to tolerate. These commenters also noted that these side effects do not emerge with the use of the RNS® System. One commenter provided data from the clinical trials for VNS therapy, which showed that more than half of the patients treated with VNS therapy “perceived” stimulation. The commenter also provided data from clinical trials for VNS therapy that showed that the side effects for VNS therapy included voice alteration, increased coughing, pharyngitis, dyspnea, dyspepsia, nausea, and laryngismus. The commenter compared these findings with data from the RNS® System trials, which indicated that there were no patients with ongoing complaints related to “perception of stimulation,” although some patients experienced symptoms such as flashing lights or focal muscle twitching. The commenter stated that stimulation with the RNS® System was adjusted for patients experiencing these symptoms, such that the symptoms became imperceptible. Many commenters stated that they were able to use the RNS® System to reduce the frequency of seizures in patients who have been diagnosed with epilepsy for whom VNS therapy did not reduce seizures. One commenter provided clinical trial data regarding VNS therapy that showed that, in the blinded periods of two studies, VNS therapy reduced median seizures per day by 6 to 23 percent, and that over 3 years VNS therapy reduced median seizures per day by 31 to 41 percent. The commenter also provided clinical trial data regarding the RNS® System that showed, in the blinded period, a 28 percent reduction in median seizures per day, compared to 19 percent for the control group. In addition, the commenter provided clinical trial data regarding the RNS® System that showed that over 3 years the RNS® System reduced median seizures by 44 to 60 percent. The commenter also pointed out that 34 percent of patients enrolled in the RNS® System trial were previously treated with VNS therapy, but experienced positive outcomes with the RNS® System.

In comparison to DBS, commenters stated that DBS has not been approved by the FDA for the treatment of epilepsy and is not considered to be the standard of care for the treatment of epilepsy by the American Academy of Neurology or the American Epilepsy Society. The commenters stated that they did not have experience with DBS to compare with the RNS® System because DBS is not typically used for, or approved for, treating patients diagnosed with epilepsy. One commenter noted that DBS is only available to patients on an experimental or investigational basis for the treatment of epilepsy. Another commenter stated that no direct comparison trial has been conducted between DBS and the RNS® System. The commenter reviewed data from a clinical trial that studied the use of DBS treatment of the anterior nucleus of the thalamus in subjects with medically intractable partial seizures. While the commenter stated that some of the data appeared to be comparable to the results of the RNS® System trials in terms of seizure reduction and quality of life, differences existed in the construction of the trials, including inclusion and exclusion criteria and primary efficacy endpoints. The commenter also stated that, similar to VNS therapy, DBS provides continuous or intermittent stimulation at programmed intervals, resulting in more stimulation being delivered than is delivered using the RNS® System.

In comparison to surgical resection, commenters noted that the RNS® System can be used when surgical resection is not available as a treatment option. Commenters stated that some patients who have been diagnosed with epilepsy have seizure focus or foci area(s) in regions of the brain that should not be removed because removal would result in serious neurological defects. Therefore, commenters stated that the RNS® System represents a treatment option for patients who have been diagnosed with epilepsy for whom surgery is not an option. In addition, commenters stated that they were able to use the RNS® System to reduce the frequency of seizures in patients who had been treated with surgical resection and did not experience a reduction in seizures after surgery.

In comparison to antiepileptic medications used to treat patients who have been diagnosed with epilepsy, commenters stated that the RNS® System offers a treatment option that does not have the unpleasant side effects associated with some of these medications. The commenters stated that these side effects include problems with cognition or coordination, depression, and fatigue.

With regard to durability, one commenter provided 6 years of data from the RNS® System clinical trial. The results of the trial indicate that the median percent reduction in seizures compared to the baseline year was sustained or improved, at 60 percent 3 years after implantation and 66 percent 6 years after implantation. The median follow-up time for this group of patients based on the trial's data was 5.4 years. The commenter indicated that these results are comparable, or better, for the subset of patients enrolled in the RNS® System clinical trial who were Medicare beneficiaries. The commenter further stated that the updated data showed that the proportion of patients enrolled in the RNS® System clinical trial who experienced extended periods of seizure freedom of 3 or 6 months was slightly larger than previously shared in the November 1, 2012 new technology add-on payment application for the RNS® System.

Response: We appreciate the commenters' input. We agree with the commenters that the RNS® System offers a treatment option for a patient population that is unresponsive to currently available treatments. Specifically, we agree with the commenters that the RNS® System clinical trial data showed that the technology reduces seizure frequency in patients who have received treatment with VNS therapy or surgical resection and continued to have seizures subsequent to those treatments. We also agree with the commenters that the technology could be a treatment option for patients for whom surgical resection is not appropriate due to the location of the seizure focus or foci area(s). In addition, we agree with the commenters that use of the device improves clinical outcomes compared to currently available treatments. For example, seizure reduction over time using the RNS® System appears to be at least comparable with documented seizure reductions using VNS therapy, although no direct comparison of the two systems has been completed, and the RNS® System does not appear to have the side effects that have been associated with VNS therapy. We agree with the commenters that it is inappropriate to compare the RNS® System to a technology that is not FDA approved for the same treatment.

After consideration of the public comments we received, we believe that the RNS® System meets all of the new technology add-on payment criteria. Therefore, we are approving new technology add-on payments for the RNS® System for FY 2015. Cases involving the RNS® System that are eligible for new technology add-on payments will be identified using the following ICD-9-CM procedure codes: 01.20 (Cranial implantation or replacement of neurostimulator pulse generator) in combination with 02.93 (Implantation or replacement of intracranial neurostimulator lead(s)). According to the applicant, cases using the RNS® System would incur an anticipated cost per case of $36,950. Under § 412.88(a)(2) of the regulations, new technology add-on payments are limited to the lesser of 50 percent of the average costs of the device or 50 percent of the costs in excess of the MS-DRG payment rate for the case. As a result, the maximum add-on payment for cases involving the RNS® System is $18,475 for FY 2015.

III. Changes to the Hospital Wage Index for Acute Care Hospitals

A. Background

Section 1886(d)(3)(E) of the Act requires that, as part of the methodology for determining prospective payments to hospitals, the Secretary adjust the standardized amounts “for area differences in hospital wage levels by a factor (established by the Secretary) reflecting the relative hospital wage level in the geographic area of the hospital compared to the national average hospital wage level.” We currently define hospital labor market areas based on the delineations of statistical areas established by the Office of Management and Budget (OMB). A discussion of the FY 2015 hospital wage index based on the statistical areas appears under section III.B. of the preamble of this final rule.

Section 1886(d)(3)(E) of the Act requires the Secretary to update the wage index annually and to base the update on a survey of wages and wage-related costs of short-term, acute care hospitals. This provision also requires that any updates or adjustments to the wage index be made in a manner that ensures that aggregate payments to hospitals are not affected by the change in the wage index. The adjustment for FY 2015 is discussed in section II.B. of the Addendum to this final rule.

As discussed in section III.H. of the preamble of this final rule, we also take into account the geographic reclassification of hospitals in accordance with sections 1886(d)(8)(B) and 1886(d)(10) of the Act when calculating IPPS payment amounts. Under section 1886(d)(8)(D) of the Act, the Secretary is required to adjust the standardized amounts so as to ensure that aggregate payments under the IPPS after implementation of the provisions of sections 1886(d)(8)(B), 1886(d)(8)(C), and 1886(d)(10) of the Act are equal to the aggregate prospective payments that would have been made absent these provisions. The budget neutrality adjustment for FY 2015 is discussed in section II.A.4.b. of the Addendum to this final rule.

Section 1886(d)(3)(E) of the Act also provides for the collection of data every 3 years on the occupational mix of employees for short-term, acute care hospitals participating in the Medicare program, in order to construct an occupational mix adjustment to the wage index. A discussion of the occupational mix adjustment that we are applying to the FY 2015 wage index appears under section III.F. of the preamble of this final rule.

B. Core-Based Statistical Areas for the Hospital Wage Index

1. Background

The wage index is calculated and assigned to hospitals on the basis of the labor market area in which the hospital is located. Under section 1886(d)(3)(E) of the Act, beginning with FY 2005, we delineate hospital labor market areas based on the Core-Based Statistical Areas (CBSAs) established by the Office of Management and Budget (OMB). The statistical areas used in FY 2014 are based on OMB standards published on December 27, 2000 (65 FR 82228), Census 2000 data, and Census Bureau population estimates for 2007 and 2008 (OMB Bulletin No. 10-02). For a discussion of OMB's delineations of CBSAs and our implementation of the CBSA definitions, we refer readers to the preamble of the FY 2005 IPPS final rule (69 FR 49026 through 49032). We also discussed in the FY 2012 IPPS/LTCH PPS final rule (76 FR 51582) and the FY 2013 IPPS/LTCH PPS final rule (77 FR 53365) that, in 2013, OMB planned to announce new labor market area delineations based on new standards adopted in 2010 (75 FR 37246) and the 2010 Census of Population and Housing data. As stated in the FY 2014 IPPS/LTCH PPS proposed rule (78 FR 27552) and final rule (78 FR 50586), on February 28, 2013, OMB issued OMB Bulletin No. 13-01, which established revised delineations for Metropolitan Statistical Areas, Micropolitan Statistical Areas, and Combined Statistical Areas, and provided guidance on the use of the delineations of these statistical areas. A copy of this bulletin may be obtained at http://www.whitehouse.gov/sites/default/files/omb/bulletins/2013/b-13-01.pdf. According to OMB, “[t]his bulletin provides the delineations of all Metropolitan Statistical Areas, Metropolitan Divisions, Micropolitan Statistical Areas, Combined Statistical Areas, and New England City and Town Areas in the United States and Puerto Rico based on the standards published on June 28, 2010, in the Federal Register (75 FR 37246 through 37252) and Census Bureau data.” In this FY 2015 IPPS/LTCH PPS final rule, when referencing the new OMB geographic boundaries of statistical areas, we are using the term “delineations” rather than the term “definitions” that we have used in the past, consistent with OMB's use of the terms (75 FR 37249).

In order to implement these changes for the IPPS, it is necessary to identify the new labor market area delineation for each county and hospital in the country. While the revisions OMB published on February 28, 2013 are not as sweeping as the changes OMB announced in 2003, the February 28, 2013 bulletin does contain a number of significant changes. For example, under the new OMB delineations, there would be new CBSAs, urban counties that would become rural, rural counties that would become urban, and existing CBSAs that would be split apart. In addition, the effect of the new OMB delineations on various hospital reclassifications, the out-migration adjustment (established by section 505 of Pub. L. 108-173), and the treatment of hospitals located in certain rural counties (that is, “Lugar” hospitals) provided for under section 1886(d)(8)(B) of the Act must be considered. These are just a few of the many issues that need to be reviewed regarding the effects of the new OMB labor market area delineations prior to proposing and establishing policies.

However, because the bulletin was not issued until February 28, 2013, with supporting data not available until later, and because the changes made by the bulletin and their ramifications needed to be extensively reviewed and verified, we were unable to undertake such a lengthy process before publication of the FY 2014 IPPS/LTCH PPS proposed rule and, thus, did not implement changes to the wage index for FY 2014 based on these new OMB delineations. In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50586), we stated that we intended to propose changes to the wage index based on the new OMB delineations in the FY 2015 IPPS/LTCH PPS proposed rule. As discussed below, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28054 through 28064), we proposed to implement the new OMB delineations as described in the February 28, 2013 OMB Bulletin No. 13-01, effective for the FY 2015 IPPS wage index.

2. Implementation of New Labor Market Area Delineations

As discussed previously, CMS did not implement the new OMB labor market area delineations for FY 2014 because we needed sufficient time to assess the new changes. We believe it is important for the IPPS to use the latest labor market area delineations available as soon as is reasonably possible in order to maintain a more accurate and up-to-date payment system that reflects the reality of population shifts and labor market conditions. While CMS and other stakeholders have explored potential alternatives to the current CBSA-based labor market system (we refer readers to the CMS Web site at: www.cms.gov/Medicare/Medicare-Fee-for-Service-Payment/AcuteInpatientPPS/Wage-Index-Reform.html), no consensus has been achieved regarding how best to implement a replacement system. As discussed in the FY 2005 IPPS final rule (69 FR 49027), “While we recognize that MSAs are not designed specifically to define labor market areas, we believe they do represent a useful proxy for this purpose.” We further believe that using the most current delineations will increase the integrity of the IPPS wage index system by creating a more accurate representation of geographic variations in wage levels. We have reviewed our findings and impacts relating to the new OMB delineations, and find no compelling reason to delay implementation. Therefore, we proposed to implement the new OMB delineations as described in the February 28, 2013 OMB Bulletin No. 13-01, effective for the FY 2015 IPPS wage index. In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28055), we also proposed to use these new delineations to calculate area wage indexes in a manner that is generally consistent with the CBSA-based methodologies finalized in the FY 2005 IPPS final rule, and refined in subsequent rulemaking. We also proposed a wage index transition period applicable to all hospitals that experience negative impacts due to the proposed implementation of the new OMB delineations. This transition is discussed in more detail below.

Comment: Commenters were supportive of the proposal to adopt the new OMB delineations. One commenter, while supportive of CMS' proposal to adopt the new OMB delineations, effective for FY 2015, recommended that CMS adopt an alternative hospital wage index system in future rulemaking. Another commenter suggested that CMS implement new labor market area definitions to distinguish “core” urban areas from surrounding areas within a CBSA.

Response: We appreciate the support for our proposal to adopt the new OMB delineations. For FY 2015, we did not propose any modification to the current CBSA-based labor market area methodology, aside from proposing to adopt the new OMB labor market area delineations. However, we thank the commenters for their continued interest in examining alternative means for defining labor market areas. CMS presented an alternative wage index methodology in a Report to Congress on April 11, 2012 (http://www.cms.gov/Medicare/Medicare-Fee-for-Service-Payment/AcuteInpatientPPS/Downloads/Wage-Index-Reform-Report-to-Congress-2012.zip). As discussed in the report, implementation of such a reform would require revisions to several statutory provisions that provide various forms of wage index reclassification and redesignation. Until a consensus on wage index reform is achieved, we believe that implementing the most recent OMB delineations is critical in maintaining the efficacy and integrity of the Medicare hospital wage index system. We did not propose, nor will we finalize, any additional changes to the CBSA-based labor market area delineations, including the concept of defining core and noncore portions of a CBSA.

After consideration of the public comments we received, we are finalizing the implementation of the new OMB delineations as described in the February 28, 2013 OMB Bulletin No. 13-01, effective beginning with the FY 2015 IPPS wage index. We received public comments on our proposals with respect to the use of these new OMB delineations to calculate the area wage indexes and the transition periods, which we address in sections III.B.2.a. through d. of the preamble of this final rule. We also finalize our policies in those sections.

a. Micropolitan Statistical Areas

As discussed in the FY 2005 IPPS final rule (69 FR 49029 through 49032), CMS considered whether to use Micropolitan Statistical Areas to define the labor market areas for the purpose of the IPPS wage index. OMB defines a “Micropolitan Statistical Area” as a CBSA “associated with at least one urban cluster that has a population of at least 10,000, but less than 50,000” (75 FR 37252). We refer to these areas as Micropolitan Areas. After extensive impact analysis, CMS determined the best course of action would be to treat all hospitals located in Micropolitan Areas as “rural” and include them in the calculation of each State's rural wage index. Because Micropolitan areas tend to encompass smaller population centers and contain fewer hospitals than MSAs, we determined that if Micropolitan Areas were to be treated as separate labor market areas, the IPPS wage index would have included drastically more single-provider labor market areas. This larger number of labor market areas with fewer hospitals could create instability in year-to-year wage index values for a large number of hospitals; could reduce the averaging effect of the wage index, thus lessening some of the efficiency incentive inherent in a system based on the average hourly wages for a large number of hospitals; and could arguably create an inequitable system when so many hospitals have wage indexes based solely on their own wage data while other hospitals' wage indexes are based on an average hourly wage across many hospitals. For these reasons, we adopted a policy to include Micropolitan Areas in the State's rural wage area, and have continued this policy through the present.

Based upon the new 2010 Decennial Census data, a number of urban counties have switched status and have joined or become Micropolitan Areas, and some counties that were once part of a Micropolitan Area under the current OMB delineations have become urban. Overall, there are fewer Micropolitan Areas (541) under the new OMB delineations based on the 2010 Census than existed under the latest data from the 2000 Census (581). We believe that the best course of action would be to continue the policy established in the FY 2005 IPPS final rule and include hospitals located in Micropolitan Areas in each State's rural wage index. These areas continue to be defined as having relatively small urban cores (populations of 10,000-49,999). We do not believe it would be appropriate to calculate a separate wage index for areas that typically may include only a few hospitals for the reasons set forth in the FY 2005 IPPS final rule, as discussed above. Therefore, in conjunction with our proposal to implement the new OMB labor market area delineations beginning in FY 2015, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28055), we proposed to continue to treat Micropolitan Areas as “rural” and to include the Micropolitan Areas in the calculation of each State's rural wage index.

Comment: A number of commenters supported CMS' proposal to continue to treat Micropolitan Areas as rural for hospital wage index purposes.

Response: We appreciate the commenters' support.

After consideration of the public comments we received, in conjunction with our policy to implement the new OMB labor market area delineations beginning in FY 2015, we are continuing to treat Micropolitan Areas as “rural” and to include the Micropolitan Areas in the calculation of each State's rural wage index.

b. Urban Counties That Became Rural Under the New OMB Delineations

As previously discussed, we proposed to implement the new OMB labor market area delineations (based upon the 2010 Decennial Census data) beginning in FY 2015. In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28055 through 28056), we stated that our analysis shows that a total of 37 counties (and county equivalents) and 12 hospitals that were once considered part of an urban CBSA would be considered to be located in a rural area, beginning in FY 2015, under these new OMB delineations. In the proposed rule, we included a listing of the 37 urban counties that would be rural if we finalized our proposal to implement the new OMB delineations.

We proposed that the wage data for all hospitals currently located in the 37 urban counties listed in the proposed rule would be considered rural under the new OMB delineations when calculating their respective State's rural wage index. We stated that we recognize that rural areas typically have lower area wage index values than urban areas, and hospitals located in these counties may experience a negative impact in their IPPS payment due to the proposed adoption of the new OMB delineations. We refer readers to section III.B.2.e. of the preamble of this final rule for a discussion of the proposed and finalized wage index transition period, in particular, the discussion regarding the 3-year transition for hospitals located in these specific counties.

Comment: Commenters were supportive of the proposal to adopt the new OMB delineations, including the proposed reassignment of counties from urban areas to rural areas.

Response: We appreciate the commenters' support.

As discussed above, we are finalizing our proposal to adopt the new OMB delineations. After consideration of the public comments we received, we also are finalizing our proposed reassignment of counties from urban areas to rural areas based on these new OMB delineations. The following chart lists the 37 urban counties that are considered to be rural under this policy.

Counties That Will Lose Urban Status and Become Rural

County | State | Previous CBSA No. | CBSA
Greene County | IN | 14020 | Bloomington, IN.
Anson County | NC | 16740 | Charlotte-Gastonia-Rock Hill, NC-SC.
Franklin County | IN | 17140 | Cincinnati-Middletown, OH-KY-IN.
Stewart County | TN | 17300 | Clarksville, TN-KY.
Howard County | MO | 17860 | Columbia, MO.
Delta County | TX | 19124 | Dallas-Fort Worth-Arlington, TX.
Pittsylvania County | VA | 19260 | Danville, VA.
Danville City | VA | 19260 | Danville, VA.
Preble County | OH | 19380 | Dayton, OH.
Gibson County | IN | 21780 | Evansville, IN-KY.
Webster County | KY | 21780 | Evansville, IN-KY.
Franklin County | AR | 22900 | Fort Smith, AR-OK.
Ionia County | MI | 24340 | Grand Rapids-Wyoming, MI.
Newaygo County | MI | 24340 | Grand Rapids-Wyoming, MI.
Greene County | NC | 24780 | Greenville, NC.
Stone County | MS | 25060 | Gulfport-Biloxi, MS.
Morgan County | WV | 25180 | Hagerstown-Martinsburg, MD-WV.
San Jacinto County | TX | 26420 | Houston-Sugar Land-Baytown, TX.
Franklin County | KS | 28140 | Kansas City, MO-KS.
Tipton County | IN | 29020 | Kokomo, IN.
Nelson County | KY | 31140 | Louisville/Jefferson County, KY-IN.
Geary County | KS | 31740 | Manhattan, KS.
Washington County | OH | 37620 | Parkersburg-Marietta-Vienna, WV-OH.
Pleasants County | WV | 37620 | Parkersburg-Marietta-Vienna, WV-OH.
George County | MS | 37700 | Pascagoula, MS.
Power County | ID | 38540 | Pocatello, ID.
Cumberland County | VA | 40060 | Richmond, VA.
King and Queen County | VA | 40060 | Richmond, VA.
Louisa County | VA | 40060 | Richmond, VA.
Washington County | MO | 41180 | St. Louis, MO-IL.
Summit County | UT | 41620 | Salt Lake City, UT.
Erie County | OH | 41780 | Sandusky, OH.
Franklin County | MA | 44140 | Springfield, MA.
Ottawa County | OH | 45780 | Toledo, OH.
Greene County | AL | 46220 | Tuscaloosa, AL.
Calhoun County | TX | 47020 | Victoria, TX.
Surry County | VA | 47260 | Virginia Beach-Norfolk-Newport News, VA-NC.

c. Rural Counties That Became Urban Under the New OMB Delineations

As previously discussed, we proposed to implement the new OMB labor market area delineations (based upon the 2010 Decennial Census data) beginning in FY 2015. In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28056 through 28058), we indicated that analysis of these OMB labor market area delineations shows that a total of 105 counties (and county equivalents) and 81 hospitals that were located in rural areas would be located in urban areas under the new OMB delineations. In the proposed rule, we included a listing of the 105 rural counties that would be urban if we finalized our proposal to implement the new OMB delineations.

We proposed that when calculating the area wage index, the wage data for hospitals located in these 105 rural counties would be included in their new respective urban CBSAs. Typically, hospitals located in an urban area would receive a higher wage index value than hospitals located in their State's rural area. However, with regard to the wage index applicable to individual hospitals, we proposed to implement a transitional wage index adjustment for any hospital that would receive a lower wage index under the new OMB delineations than it would have received under the current CBSA definitions. We refer readers to section III.B.2.e. of the preamble of this final rule for further discussion of this transition.

Comment: Commenters were supportive of the proposal to adopt the new OMB delineations, including the proposed reassignments of counties from rural areas to urban areas for purposes of the wage index.

Response: We appreciate the commenters' support.

As discussed above, we are finalizing our proposal to adopt the new OMB delineations. After consideration of the public comments we received, we also are finalizing our proposed reassignment of counties from rural to urban for purposes of the wage index based on these new OMB delineations. The following chart lists the 105 rural counties that will be urban for purposes of the wage index for FY 2015 under this policy.

Counties That Will Lose Rural Status and Become Urban

County | State | New CBSA No. | CBSA
Utuado Municipio | PR | 10380 | Aguadilla-Isabela, PR.
Linn County | OR | 10540 | Albany, OR.
Oldham County | TX | 11100 | Amarillo, TX.
Morgan County | GA | 12060 | Atlanta-Sandy Springs-Roswell, GA.
Lincoln County | GA | 12260 | Augusta-Richmond County, GA-SC.
Newton County | TX | 13140 | Beaumont-Port Arthur, TX.
Fayette County | WV | 13220 | Beckley, WV.
Raleigh County | WV | 13220 | Beckley, WV.
Golden Valley County | MT | 13740 | Billings, MT.
Oliver County | ND | 13900 | Bismarck, ND.
Sioux County | ND | 13900 | Bismarck, ND.
Floyd County | VA | 13980 | Blacksburg-Christiansburg-Radford, VA.
De Witt County | IL | 14010 | Bloomington, IL.
Columbia County | PA | 14100 | Bloomsburg-Berwick, PA.
Montour County | PA | 14100 | Bloomsburg-Berwick, PA.
Allen County | KY | 14540 | Bowling Green, KY.
Butler County | KY | 14540 | Bowling Green, KY.
St. Mary's County | MD | 15680 | California-Lexington Park, MD.
Jackson County | IL | 16060 | Carbondale-Marion, IL.
Williamson County | IL | 16060 | Carbondale-Marion, IL.
Franklin County | PA | 16540 | Chambersburg-Waynesboro, PA.
Iredell County | NC | 16740 | Charlotte-Concord-Gastonia, NC-SC.
Lincoln County | NC | 16740 | Charlotte-Concord-Gastonia, NC-SC.
Rowan County | NC | 16740 | Charlotte-Concord-Gastonia, NC-SC.
Chester County | SC | 16740 | Charlotte-Concord-Gastonia, NC-SC.
Lancaster County | SC | 16740 | Charlotte-Concord-Gastonia, NC-SC.
Buckingham County | VA | 16820 | Charlottesville, VA.
Union County | IN | 17140 | Cincinnati, OH-KY-IN.
Hocking County | OH | 18140 | Columbus, OH.
Perry County | OH | 18140 | Columbus, OH.
Walton County | FL | 18880 | Crestview-Fort Walton Beach-Destin, FL.
Hood County | TX | 23104 | Dallas-Fort Worth-Arlington, TX.
Somervell County | TX | 23104 | Dallas-Fort Worth-Arlington, TX.
Baldwin County | AL | 19300 | Daphne-Fairhope-Foley, AL.
Monroe County | PA | 20700 | East Stroudsburg, PA.
Hudspeth County | TX | 21340 | El Paso, TX.
Adams County | PA | 23900 | Gettysburg, PA.
Hall County | NE | 24260 | Grand Island, NE.
Hamilton County | NE | 24260 | Grand Island, NE.
Howard County | NE | 24260 | Grand Island, NE.
Merrick County | NE | 24260 | Grand Island, NE.
Montcalm County | MI | 24340 | Grand Rapids-Wyoming, MI.
Josephine County | OR | 24420 | Grants Pass, OR.
Tangipahoa Parish | LA | 25220 | Hammond, LA.
Beaufort County | SC | 25940 | Hilton Head Island-Bluffton-Beaufort, SC.
Jasper County | SC | 25940 | Hilton Head Island-Bluffton-Beaufort, SC.
Citrus County | FL | 26140 | Homosassa Springs, FL.
Butte County | ID | 26820 | Idaho Falls, ID.
Yazoo County | MS | 27140 | Jackson, MS.
Crockett County | TN | 27180 | Jackson, TN.
Kalawao County | HI | 27980 | Kahului-Wailuku-Lahaina, HI.
Maui County | HI | 27980 | Kahului-Wailuku-Lahaina, HI.
Campbell County | TN | 28940 | Knoxville, TN.
Morgan County | TN | 28940 | Knoxville, TN.
Roane County | TN | 28940 | Knoxville, TN.
Acadia Parish | LA | 29180 | Lafayette, LA.
Iberia Parish | LA | 29180 | Lafayette, LA.
Vermilion Parish | LA | 29180 | Lafayette, LA.
Cotton County | OK | 30020 | Lawton, OK.
Scott County | IN | 31140 | Louisville/Jefferson County, KY-IN.
Lynn County | TX | 31180 | Lubbock, TX.
Green County | WI | 31540 | Madison, WI.
Benton County | MS | 32820 | Memphis, TN-MS-AR.
Midland County | MI | 33220 | Midland, MI.
Martin County | TX | 33260 | Midland, TX.
Le Sueur County | MN | 33460 | Minneapolis-St. Paul-Bloomington, MN-WI.
Mille Lacs County | MN | 33460 | Minneapolis-St. Paul-Bloomington, MN-WI.
Sibley County | MN | 33460 | Minneapolis-St. Paul-Bloomington, MN-WI.
Maury County | TN | 34980 | Nashville-Davidson—Murfreesboro—Franklin, TN.
Craven County | NC | 35100 | New Bern, NC.
Jones County | NC | 35100 | New Bern, NC.
Pamlico County | NC | 35100 | New Bern, NC.
St. James Parish | LA | 35380 | New Orleans-Metairie, LA.
Box Elder County | UT | 36260 | Ogden-Clearfield, UT.
Gulf County | FL | 37460 | Panama City, FL.
Custer County | SD | 39660 | Rapid City, SD.
Fillmore County | MN | 40340 | Rochester, MN.
Yates County | NY | 40380 | Rochester, NY.
Sussex County | DE | 41540 | Salisbury, MD-DE.
Worcester County | MD | 41540 | Salisbury, MD-DE.
Highlands County | FL | 42700 | Sebring, FL.
Webster Parish | LA | 43340 | Shreveport-Bossier City, LA.
Cochise County | AZ | 43420 | Sierra Vista-Douglas, AZ.
Plymouth County | IA | 43580 | Sioux City, IA-NE-SD.
Union County | SC | 43900 | Spartanburg, SC.
Pend Oreille County | WA | 44060 | Spokane-Spokane Valley, WA.
Stevens County | WA | 44060 | Spokane-Spokane Valley, WA.
Augusta County | VA | 44420 | Staunton-Waynesboro, VA.
Staunton City | VA | 44420 | Staunton-Waynesboro, VA.
Waynesboro City | VA | 44420 | Staunton-Waynesboro, VA.
Little River County | AR | 45500 | Texarkana, TX-AR.
Sumter County | FL | 45540 | The Villages, FL.
Pickens County | AL | 46220 | Tuscaloosa, AL.
Gates County | NC | 47260 | Virginia Beach-Norfolk-Newport News, VA-NC.
Falls County | TX | 47380 | Waco, TX.
Columbia County | WA | 47460 | Walla Walla, WA.
Walla Walla County | WA | 47460 | Walla Walla, WA.
Peach County | GA | 47580 | Warner Robins, GA.
Pulaski County | GA | 47580 | Warner Robins, GA.
Culpeper County | VA | 47894 | Washington-Arlington-Alexandria, DC-VA-MD-WV.
Rappahannock County | VA | 47894 | Washington-Arlington-Alexandria, DC-VA-MD-WV.
Jefferson County | NY | 48060 | Watertown-Fort Drum, NY.
Kingman County | KS | 48620 | Wichita, KS.
Davidson County | NC | 49180 | Winston-Salem, NC.
Windham County | CT | 49340 | Worcester, MA-CT.

d. Urban Counties That Moved to a Different Urban CBSA Under the New OMB Delineations

As we stated in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28058 through 28060), in addition to rural counties becoming urban and urban counties becoming rural, several urban counties would shift from one urban CBSA to another urban CBSA under our proposal to adopt the new OMB delineations. In certain cases, adopting the new OMB delineations would involve a change only in CBSA name or number, while the CBSA continues to encompass the same constituent counties. For example, CBSA 29140 (Lafayette, IN) would experience a change to both its number and its name, becoming CBSA 29200 (Lafayette-West Lafayette, IN), while all three of its constituent counties would remain the same. For the proposed rule, we identified 19 counties that would remain in a CBSA that experienced a change in name or number under the new delineations, but would retain the same constituent counties. In the proposed rule, we included a table listing those 19 counties.

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28059), we did not discuss the above proposed changes further in this section because they are inconsequential with respect to the IPPS wage index. However, we did discuss that, in other cases, if we adopted the new OMB delineations, counties would shift between existing and new CBSAs, changing the constituent makeup of those CBSAs.

In one type of change, an entire CBSA would be subsumed by another CBSA. For example, CBSA 37380 (Palm Coast, FL) currently is a single county (Flagler, FL) CBSA. Flagler County would become a part of CBSA 19660 (Deltona-Daytona Beach-Ormond Beach, FL) under the new OMB delineations.

In another type of change, some CBSAs have counties that would split off to become part of or to form entirely new labor market areas. For example, CBSA 37964 (Philadelphia Metropolitan Division) currently is comprised of five Pennsylvania counties (Bucks, Chester, Delaware, Montgomery, and Philadelphia). We stated that if we adopted the new OMB delineations, Montgomery, Bucks, and Chester counties would split off and form the new CBSA 33874 (Montgomery County-Bucks County-Chester County, PA Metropolitan Division), while Delaware and Philadelphia counties would remain in CBSA 37964.

Finally, in some cases, a CBSA would lose counties to another existing CBSA if we adopted the new OMB delineations. For example, Lincoln County and Putnam County, WV would move from CBSA 16620 (Charleston, WV) to CBSA 26580 (Huntington-Ashland, WV-KY-OH). CBSA 16620 still would exist in the new labor market delineations with fewer constituent counties.

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28059 through 28060), we included a listing of the urban counties that would move from one urban CBSA to another urban CBSA if we adopted the new OMB delineations. If hospitals located in these counties move from one CBSA to another under the new OMB delineations, there may be impacts, both negative and positive, upon their specific wage index values. We referred readers to section III.B.2.e. of the preamble of the proposed rule for a discussion of our proposals to moderate the impact of our proposed adoption of the new OMB delineations.

Comment: Commenters were supportive of the proposal to adopt the new OMB delineations, including the proposed reassignments of counties from one urban area to another urban area.

Response: We appreciate the commenters' support.

As discussed above, we are finalizing our proposal to adopt the new OMB delineations. After consideration of the public comments we received, we also are finalizing our proposed reassignment of counties from one urban area to another urban area for purposes of the wage index based on these new OMB delineations. The following chart identifies the 19 counties that remain in a CBSA that experienced a change in name or number under this policy, but will retain the same constituent counties for purposes of the FY 2015 wage index.

Counties That Will Remain in CBSA That Changed Number

Prior CBSA No. | New CBSA No. | County | State
14484 | 14454 | Norfolk County | MA
14484 | 14454 | Plymouth County | MA
14484 | 14454 | Suffolk County | MA
47644 | 47664 | Lapeer County | MI
47644 | 47664 | Livingston County | MI
47644 | 47664 | Macomb County | MI
47644 | 47664 | Oakland County | MI
47644 | 47664 | St. Clair County | MI
26180 | 46520 | Honolulu County | HI
29140 | 29200 | Benton County | IN
29140 | 29200 | Carroll County | IN
29140 | 29200 | Tippecanoe County | IN
42044 | 11244 | Orange County | CA
42060 | 42200 | Santa Barbara County | CA
44600 | 48260 | Jefferson County | OH
44600 | 48260 | Brooke County | WV
44600 | 48260 | Hancock County | WV
13644 | 43524 | Frederick County | MD
13644 | 43524 | Montgomery County | MD

The following chart lists the urban counties that will move from one urban CBSA to another urban CBSA under our adoption of the new OMB delineations for purposes of the FY 2015 wage index.

Counties That Will Change to Another CBSA

Prior CBSA | New CBSA | County | State
11300 | 26900 | Madison County | IN
11340 | 24860 | Anderson County | SC
14060 | 14010 | McLean County | IL
37764 | 15764 | Essex County | MA
16620 | 26580 | Lincoln County | WV
16620 | 26580 | Putnam County | WV
16974 | 20994 | DeKalb County | IL
16974 | 20994 | Kane County | IL
21940 | 41980 | Ceiba Municipio | PR
21940 | 41980 | Fajardo Municipio | PR
21940 | 41980 | Luquillo Municipio | PR
26100 | 24340 | Ottawa County | MI
31140 | 21060 | Meade County | KY
34100 | 28940 | Grainger County | TN
35644 | 35614 | Bergen County | NJ
35644 | 35614 | Hudson County | NJ
20764 | 35614 | Middlesex County | NJ
20764 | 35614 | Monmouth County | NJ
20764 | 35614 | Ocean County | NJ
35644 | 35614 | Passaic County | NJ
20764 | 35084 | Somerset County | NJ
35644 | 35614 | Bronx County | NY
35644 | 35614 | Kings County | NY
35644 | 35614 | New York County | NY
35644 | 20524 | Putnam County | NY
35644 | 35614 | Queens County | NY
35644 | 35614 | Richmond County | NY
35644 | 35614 | Rockland County | NY
35644 | 35614 | Westchester County | NY
37380 | 19660 | Flagler County | FL
37700 | 25060 | Jackson County | MS
37964 | 33874 | Bucks County | PA
37964 | 33874 | Chester County | PA
37964 | 33874 | Montgomery County | PA
39100 | 20524 | Dutchess County | NY
39100 | 35614 | Orange County | NY
41884 | 42034 | Marin County | CA
41980 | 11640 | Arecibo Municipio | PR
41980 | 11640 | Camuy Municipio | PR
41980 | 11640 | Hatillo Municipio | PR
41980 | 11640 | Quebradillas Municipio | PR
48900 | 34820 | Brunswick County | NC
49500 | 38660 | Guánica Municipio | PR
49500 | 38660 | Guayanilla Municipio | PR
49500 | 38660 | Peñuelas Municipio | PR
49500 | 38660 | Yauco Municipio | PR

e. Transition Period

(1) Background

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28060), we stated that, overall, we believe implementing the new OMB labor market area delineations would result in wage index values being more representative of the actual costs of labor in a given area. However, we recognized that some hospitals would experience decreases in wage index values as a result of the implementation of the new labor market area delineations. We also realize that some hospitals would have higher wage index values due to the implementation of the new labor market area delineations.

We explained that, in the past, we have provided for transition periods when adopting changes that have significant payment implications, particularly large negative impacts. For example, when implementing the new OMB definitions after the 2000 Census in the FY 2005 IPPS final rule (69 FR 49032 through 49034) for FY 2005, we evaluated several options to ease the transition to the new CBSA system.

As discussed in that FY 2005 IPPS final rule, we determined that the transition to the current wage index system would have the largest negative impacts upon hospitals that were originally considered urban, but would be considered rural under the new labor market area definitions. To alleviate the decreased payments associated with having a rural wage index, in calculating the area wage index, in the FY 2005 IPPS final rule, we allowed urban hospitals that became rural under new definitions to maintain their assignment to the labor market area where they were located for FY 2004. This adjustment was granted for a period of 3 fiscal years.

In the FY 2005 IPPS final rule, for all hospitals that experienced negative payment impacts due to adoption of new labor market area definitions (for example, they were moved to an urban CBSA with a lower wage index value than their previous rural or urban labor market area), we implemented a 1-year blended adjustment. We calculated wage indexes for all hospitals using both old and new labor market definitions. Hospitals received 50 percent of their wage index based on the new OMB delineations, and 50 percent of their wage index based on their current labor market area. This adjustment only applied to hospitals that would have experienced a drop in wage index values due to a change in labor market area definitions. Hospitals that benefitted from the labor market area transition received their new wage index at the time the new labor market area definitions became effective.

We continue to have the same concerns expressed in the FY 2005 IPPS final rulemaking. Therefore, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28060 through 28064), we proposed a similar transition methodology to mitigate any negative financial impacts experienced by hospitals due to our proposal to implement the new OMB labor market area delineations for FY 2015.

(2) Transition for Hospitals in Urban Areas That Would Become Rural

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28060 through 28061), for hospitals that are currently located in an urban county that would become rural under the new OMB delineations, and would have no form of wage index reclassification or redesignation in place for FY 2015 (that is, MGCRB reclassifications under section 1886(d)(10) of the Act, redesignations under section 1886(d)(8)(B) of the Act, or rural reclassifications under section 1886(d)(8)(E) of the Act), we proposed a policy to assign them the urban wage index value of the CBSA in which they are physically located for FY 2014 for a period of 3 fiscal years (with the rural and imputed floors applied and with the rural floor budget neutrality adjustment applied to the area wage index). As stated in the FY 2005 IPPS proposed rule (69 FR 28252), we have in the past provided transitions when adopting changes that have significant payment implications, particularly large negative impacts. We believe it is appropriate to apply a 3-year transition period for hospitals located in urban counties that would become rural under the new OMB delineations, given the potentially significant payment impacts for these hospitals. This is consistent with the transition policy adopted in FY 2005 (69 FR 49032 through 49034). We continue to believe, as we stated in the FY 2005 IPPS final rule (69 FR 49033), that the longer transition period is appropriate because, as a group, we expect these hospitals would experience a steeper and more abrupt reduction in their wage index due to the labor market revisions compared to other hospitals. Assigning these hospitals the urban wage index value of the CBSA in which they are physically located for FY 2014 for a period of 3 fiscal years (with the rural and imputed floors applied and with the rural floor budget neutrality adjustment applied to the area wage index) would be the most similar to the actual payment wage index that these hospitals received in FY 2014, thereby minimizing the negative impact of adopting the new OMB delineations for these hospitals. Accordingly, for FYs 2015, 2016, and 2017, assuming no other form of wage index reclassification or redesignation is granted, we proposed to assign these hospitals the area wage index value of the urban CBSA in which they were geographically located in FY 2014 (with the rural and imputed floors applied and with the rural floor budget neutrality adjustment applied to the area wage index). For example, if urban CBSA 12345 consisted of three counties in FY 2014, and, under the new OMB delineations, one of those counties, County X, would no longer be part of CBSA 12345 and would become rural for FY 2015, we proposed that hospitals in County X would be assigned the FY 2015 wage index of CBSA 12345, computed using the remaining two counties, with the rural and imputed floors applied and with the rural floor budget neutrality adjustment applied to the area wage index. We believe that assigning the wage index of the hospitals' current area is the simplest and most effective method for mitigating negative payment impacts due to the proposed adoption of the new OMB delineations. We have identified relatively few hospitals that are located in urban counties that would become rural, and fewer yet that do not have a reclassification or redesignation in effect for FY 2015. 
Because we believe that these urban-to-rural transitions would be the most likely to cause significant negative payment impacts, we believe that these hospitals should be granted a longer transition period than hospitals that may be switching between urban labor market areas, to which, as discussed later, we proposed to apply a 1-year blended wage index.

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28061), we noted that there are situations in which a hospital cannot be assigned the wage index value of the CBSA in which it was geographically located in FY 2014 because that CBSA would be split apart and no longer exist, with some or all of its constituent counties added to another urban labor market area under the new OMB delineations. In these situations, we proposed that hospitals located in counties that would become rural under the new OMB delineations would be assigned the wage index of the FY 2015 urban labor market area that contains the urban county in their FY 2014 CBSA to which they are closest (with the rural and imputed floors applied and with the rural floor budget neutrality adjustment applied) for a period of 3 fiscal years. We believe this approach would most closely approximate the hospitals' FY 2014 actual payment wage index, thereby minimizing the negative effects of the proposed change in the OMB delineations. For example, George County, MS and Jackson County, MS, together, in FY 2014, comprise the urban CBSA 37700 (Pascagoula, MS). Under the new OMB delineations, George County would be considered rural and Jackson County, MS would become part of the urban labor market area of Gulfport-Biloxi-Pascagoula, MS (CBSA 25060). In this instance, we proposed that hospitals in George County, MS would be assigned the FY 2015 wage index for CBSA 25060 (Gulfport-Biloxi-Pascagoula, MS), with the rural and imputed floors applied and with the rural floor budget neutrality adjustment applied.
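To make the assignment rule above concrete, the following is a minimal illustrative sketch in Python (hypothetical function and variable names; it is not CMS code and is not part of the wage index calculation itself) of how the transition CBSA could be selected for a hospital in an urban county that becomes rural:

    # Minimal illustrative sketch (hypothetical names; not CMS code) of the
    # 3-year transition CBSA assignment described above. Floors and the rural
    # floor budget neutrality adjustment are applied separately.
    def transition_cbsa(fy2014_cbsa, fy2015_cbsas, nearest_urban_county_cbsa):
        """Return the CBSA whose FY 2015 wage index the hospital receives
        during the 3-year transition."""
        if fy2014_cbsa in fy2015_cbsas:
            # The FY 2014 CBSA still exists (computed from its remaining
            # counties), so the hospital keeps that CBSA's wage index.
            return fy2014_cbsa
        # The FY 2014 CBSA was split apart; use the FY 2015 CBSA containing
        # the closest urban county from the hospital's FY 2014 CBSA.
        return nearest_urban_county_cbsa

    # George County, MS example: CBSA 37700 (Pascagoula, MS) no longer exists,
    # and the nearest urban county from that CBSA (Jackson County, MS) is now
    # in CBSA 25060, so hospitals in George County are assigned CBSA 25060.
    print(transition_cbsa("37700", {"25060", "19660"}, "25060"))  # -> 25060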

Furthermore, we proposed that any hospital that is currently located in an urban county that would become rural for FY 2015 under the new OMB delineations, but also has a reclassification or redesignation in effect for FY 2015 (from a pre-existing reclassification or redesignation granted prior to FY 2015), would not be eligible for the 3-year transition wage index. This is because if the hospital is reclassified or redesignated in some manner, it would instead receive a wage index that reflects its own choice to obtain its reclassified or redesignated status. Accordingly, if a hospital is currently located in an urban county that would become rural for FY 2015 under the new OMB delineations and such hospital sought and was granted reclassification or redesignation for FY 2015 or such hospital seeks and is granted any reclassification or redesignation for FY 2016 or FY 2017, we proposed that the hospital would permanently lose its 3-year transitional assigned wage index status, and would not be eligible to reinstate it. For example, if a hospital that is currently urban but would become rural under the new OMB delineations received a 3-year transition wage index in FY 2015 based on the wage index of the urban CBSA to which it was geographically located in FY 2014 and then by its own choice, reclassifies to obtain a different area wage index in FY 2016, the hospital would not be eligible to reinstate the transition wage index, even if it opts to cancel its reclassification for FY 2017. We proposed the transition adjustment to assist hospitals if they experience a negative payment impact specifically due to the proposed adoption of the new OMB delineations in FY 2015. If a hospital chooses in a future fiscal year to forego this transition adjustment by obtaining some form of reclassification or redesignation, we do not believe reinstatement of this transition adjustment would be appropriate. The purpose of the adjustment is to assist hospitals that may be negatively impacted by the new OMB delineations in transitioning to a wage index based on these delineations. By obtaining a reclassification or redesignation, we believe that the hospital has made the determination that the transition adjustment is not necessary because it has other viable options for mitigating the impact of the transition to the new OMB delineations.

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28061), with respect to the wage index computation, we proposed to follow our existing policy regarding the inclusion of a hospital's wage index data in the CBSA in which it is geographically located (we refer readers to Step 6 of the method for computing the unadjusted wage index in the FY 2012 IPPS/LTCH PPS final rule (76 FR 51592)). Accordingly, beginning with FY 2015, we proposed that the wage data of all hospitals receiving this type of 3-year transition adjustment would be included in the statewide rural area in which they are geographically located under the new OMB labor market area delineations. After the 3-year transition period, beginning in FY 2018, we proposed that these formerly urban hospitals discussed above would receive their statewide rural wage index, absent any reclassification or redesignation.

In addition, we proposed that the hospitals receiving this 3-year transition because they are in counties that were urban under the current CBSA definitions, but would be rural under the new OMB delineations, would not be considered urban hospitals. Rather, they would maintain their status as rural hospitals for other payment considerations. This is because our proposal to apply a 3-year transitional wage index for these newly rural hospitals only applies for the purpose of calculating the wage index under our proposal to adopt the new CBSA delineations. We did not propose transitions for other IPPS payment policies that may be impacted by the proposed adoption of the new CBSA delineations. However, we will continue to apply the existing regulations at § 412.102 with respect to determining DSH payments in the first year after a hospital loses urban status (we refer readers to section II.B.2.e.(7) of the preambles of the proposed rule and this final rule).

Comment: Commenters were supportive of CMS' proposals to provide a 3-year transition adjustment for hospitals that are shifting from urban to rural areas. Commenters appreciated CMS' attempt to mitigate the negative effects of the application of the new OMB labor market delineations. Some commenters questioned why hospitals that switch from urban to rural could benefit from a longer 3-year transition adjustment, while other hospitals that would also be negatively affected by the transition could only benefit from a single year of a blended transition adjustment. They suggested a similar 3-year transition adjustment for all hospitals experiencing a negative impact, including hospitals that are moving from urban to urban, or are not moving at all, but are being impacted by other hospitals moving in or out of the labor market area.

Response: We appreciate the commenters' support for our proposals. We address comments pertaining to the difference between the 3-year urban to rural transition adjustment and the 1-year 50/50 blended wage index transition adjustment, as well as the requested 3-year transition period for all hospitals experiencing a negative impact in section III.B.2.e.(4) of the preamble of this final rule.

After consideration of the public comments we received, we are finalizing our proposals without modification. We will provide hospitals that are changing from an urban to a rural labor market area a 3-year wage index adjustment. Specifically, for hospitals that are currently located in an urban county that became rural under the new OMB delineations, and have no form of wage index reclassification or redesignation in place for FY 2015 (that is, MGCRB reclassifications under section 1886(d)(10) of the Act, redesignations under section 1886(d)(8)(B) of the Act, or rural reclassifications under section 1886(d)(8)(E) of the Act), we will assign them the urban wage index value of the CBSA in which they are physically located for FY 2014 for a period of 3 fiscal years (with the rural and imputed floors applied and with the rural floor budget neutrality adjustment applied to the area wage index). If the hospital cannot be assigned the wage index value of the CBSA in which it is geographically located in FY 2014 because that CBSA is split apart and no longer exists, and some or all of its constituent counties are added to another urban labor market area under the new OMB delineations, hospitals located in such counties that became rural under the new OMB delineations will be assigned the wage index of the FY 2015 urban labor market area that contains the urban county in their FY 2014 CBSA to which they are closest (with the rural and imputed floors applied and with the rural floor budget neutrality adjustment applied) for a period of 3 fiscal years. Any hospital that is currently located in an urban county that would become rural for FY 2015 under the new OMB delineations, but also has a reclassification or redesignation in effect for FY 2015 (from a preexisting reclassification or redesignation granted prior to FY 2015), will not be eligible for the 3-year transition wage index. Accordingly, if a hospital is currently located in an urban county that would become rural for FY 2015 under the new OMB delineations and such hospital sought and was granted reclassification or redesignation for FY 2015 or such hospital seeks and is granted any reclassification or redesignation for FY 2016 or FY 2017, the hospital will permanently lose its 3-year transitional assigned wage index status, and will not be eligible to reinstate it.

With respect to the wage index computation, we will follow our existing policy regarding the inclusion of a hospital's wage index data in the CBSA in which it is geographically located (we refer readers to Step 6 of the method for computing the unadjusted wage index in the FY 2012 IPPS/LTCH PPS final rule (76 FR 51592)). Beginning with FY 2015, the wage data of all hospitals receiving this type of 3-year transition adjustment will be included in the statewide rural area in which they are geographically located under the new OMB delineations. After the 3-year transition period, beginning in FY 2018, these formerly urban hospitals discussed above will receive their statewide rural wage index, absent any reclassification or redesignation. In addition, the hospitals receiving this 3-year transition because they are in counties that are urban under the current CBSA definitions, but become rural under the new OMB delineations, will not be considered urban hospitals. Rather, they will maintain their status as rural hospitals for other payment considerations.

(3) Transition for Hospitals Deemed Urban Under Section 1886(d)(8)(B) of the Act Where the Urban Area Became Rural Under the New OMB Delineations

As discussed in section III.H.3. of the preamble of the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28061 through 28062) and this final rule, there are some hospitals that are currently geographically located in rural areas but are deemed to be urban under section 1886(d)(8)(B) of the Act. For FY 2015, some of these hospitals currently redesignated under section 1886(d)(8)(B) of the Act would no longer be eligible for deemed urban status under the new OMB delineations, as discussed in detail in section III.H.3. of the preamble of this final rule. Similar to the policy implemented in the FY 2005 IPPS final rule (69 FR 49059), and consistent with the policy we proposed for other hospitals in counties that were urban and would become rural under the new OMB delineations, we proposed to apply the 3-year transition to these hospitals currently redesignated to urban areas under section 1886(d)(8)(B) of the Act that would no longer be deemed urban under the new OMB delineations and would revert to being rural. That is, for FYs 2015, 2016, and 2017, assuming no other form of wage index reclassification or redesignation is granted, we proposed to assign these hospitals the FY 2015 area wage index value of hospitals reclassified to the urban CBSA (that is, the attaching wage index) to which they were redesignated in FY 2014 (with the rural and imputed floors applied and with the rural floor budget neutrality adjustment applied). If the hospital cannot be assigned the reclassified wage index value of the CBSA to which it was redesignated in FY 2014 because that CBSA would split apart and no longer exist, and some or all of its constituent counties would be added to another urban labor market area under the new OMB delineations, we proposed that such hospitals would be assigned the wage index of the hospitals reclassified to the FY 2015 urban labor market area that contains the urban county in their FY 2014 redesignated CBSA to which they are closest for a period of 3 fiscal years. We proposed to assign these hospitals the area wage index of hospitals reclassified to a CBSA because hospitals deemed urban under section 1886(d)(8)(B) of the Act are treated as reclassified under current policy, under which such hospitals receive an area wage index that includes wage data of all hospitals reclassified to the area.

We did not receive any specific public comment addressing these proposals. In general, commenters were supportive of CMS' proposal to implement the new OMB labor market delineations, including the policy to mitigate the negative effects of the transition to a new labor market area. We are finalizing our proposal to provide a 3-year adjustment to hospitals that were deemed urban under section 1886(d)(8)(B) of the Act under the current labor market delineations, but are considered rural under the new delineations. We will apply the 3-year transition to these hospitals currently redesignated to urban areas under section 1886(d)(8)(B) of the Act that are no longer deemed urban under the new OMB delineations and will revert to being rural. That is, for FYs 2015, 2016, and 2017, assuming no other form of wage index reclassification or redesignation is granted, we will assign these hospitals the FY 2015 area wage index value of hospitals reclassified to the urban CBSA (that is, the attaching wage index) to which they were redesignated in FY 2014 (with the rural and imputed floors applied and with the rural floor budget neutrality adjustment applied). If the hospital cannot be assigned the reclassified wage index value of the CBSA to which it was redesignated in FY 2014 because that CBSA was split apart and no longer exists, and some or all of its constituent counties were added to another urban labor market area under the new OMB delineations, such hospitals will be assigned the wage index of the hospitals reclassified to the FY 2015 urban labor market area that contains the urban county in their FY 2014 redesignated CBSA to which they are closest for a period of 3 fiscal years. We will assign these hospitals the area wage index of hospitals reclassified to a CBSA because hospitals deemed urban under section 1886(d)(8)(B) of the Act are treated as reclassified under current policy, under which such hospitals receive an area wage index that includes wage data of all hospitals reclassified to the area. Beginning in FY 2015, affected hospitals will be assigned the reclassified wage index of an urban area (as described above) for a period of up to 3 years. This wage index assignment will be forfeited if the hospital obtains any form of wage index reclassification or redesignation.

(4) Transition for Hospitals That Will Experience a Decrease in Wage Index Under the New OMB Delineations

While we believe that instituting the latest OMB labor market area delineations would create a more accurate wage index system, we also recognize that implementing the new OMB delineations may cause some short-term instability in hospital payments. Therefore, in addition to the 3-year transition adjustment for hospitals being transitioned from urban to rural status as discussed above, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28062), we proposed a 1-year blended wage index for all hospitals that would experience any decrease in their actual payment wage index (that is, a hospital's actual wage index used for payment, which accounts for all applicable effects of reclassification and redesignation) exclusively due to the proposed implementation of the new OMB delineations. Similar to the policy adopted in the FY 2005 IPPS final rule (69 FR 49033), we proposed that a post-reclassified wage index with the rural and imputed floor applied would be computed based on the hospital's FY 2014 CBSA (that is, using all of its FY 2014 constituent county/ies), and another post-reclassified wage index with the rural and imputed floor applied would be computed based on the hospital's new FY 2015 CBSA (that is, the FY 2015 constituent county/ies). We proposed to compare these two wage indexes. If the proposed FY 2015 wage index with FY 2015 CBSAs would be lower than the proposed FY 2015 wage index with FY 2014 CBSAs, we proposed that a blended wage index would be computed, consisting of 50 percent of each of the two wage indexes added together. We proposed that this blended wage index would be the hospital's wage index for FY 2015. We stated our belief that a 1-year, 50/50 blend would mitigate the short-term instability and negative payment impacts due to the proposed implementation of the new OMB delineations, providing hospitals with a transition period during which they may adjust to their new geographic CBSA or may assess any reclassification options that would be available to them starting in FY 2016. We proposed a longer 3-year transition adjustment for hospitals losing urban status because there are significantly fewer affected urban-to-rural hospitals, and we believe the negative impacts to a hospital shifting from urban to rural status would typically be greater than other types of transitions. We believe that a transition period longer than 1 year to address other impacts of the proposed adoption of new OMB delineations would reduce the accuracy of the overall labor market area wage index system because far more hospitals would be affected.

In addition, for FY 2015, for hospitals that would receive the proposed 3-year transition, it is possible that receiving the FY 2015 wage index (with the rural and imputed floors applied and with the rural floor budget neutrality adjustment applied) of the CBSA where the hospital is geographically located for FY 2014 might still be less than the FY 2015 wage index that the hospital would have received in the absence of the adoption of the new OMB delineations (particularly in States where the rural floor is historically very high). Therefore, such a hospital may additionally benefit from application of the 50/50 blended wage indexes. Accordingly, we proposed to include the assignment of the 3-year transitional wage index in our calculation of the FY 2015 portion of the 50/50 blended wage index for that hospital. After FY 2015, such a hospital may revert to the second year of the 3-year transition. For example, if Hospital X (formerly part of CBSA 12345, now rural) is assigned CBSA 12345's FY 2015 wage index value of 1.0000 as part of the 3-year transition, but that FY 2015 wage index value would have been 1.1000 under the previous OMB delineations, that hospital would receive a 50/50 blended wage index of 1.0500 for FY 2015. In FY 2016 and FY 2017, Hospital X would still be eligible to receive the remaining 2 years of the 3-year transition wage index of CBSA 12345 (that is, in FY 2016, Hospital X would receive the FY 2016 wage index of CBSA 12345 (with the rural and imputed floors applied and with the rural floor budget neutrality adjustment applied)), and in FY 2017, Hospital X would receive the FY 2017 wage index of CBSA 12345 (with the rural and imputed floors applied and with the rural floor budget neutrality adjustment applied).
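The blend computation in the Hospital X example can be expressed as a short illustrative sketch in Python (hypothetical function name; not CMS code), using the same figures as above:

    # Minimal illustrative sketch (not CMS code) of the 1-year 50/50 blend:
    # both inputs are FY 2015 post-reclassified wage indexes with the floors
    # and rural floor budget neutrality already applied, one computed under
    # the new OMB delineations and one under the FY 2014 CBSA configuration.
    def fy2015_payment_wage_index(wi_new_delineations, wi_old_delineations):
        if wi_new_delineations < wi_old_delineations:
            # The 50/50 blend applies for FY 2015 only.
            return 0.5 * wi_new_delineations + 0.5 * wi_old_delineations
        # Hospitals whose wage index does not decrease receive the new value.
        return wi_new_delineations

    # Hospital X example from the preamble: 1.0000 under the new delineations
    # versus 1.1000 under the prior delineations yields a 1.0500 blend.
    print(fy2015_payment_wage_index(1.0000, 1.1000))  # -> 1.05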

Comment: Commenters were generally supportive of CMS' efforts to mitigate the negative impacts from the transition to the new OMB delineations. A number of commenters requested that CMS extend the 1-year 50/50 blended wage index adjustment over a longer period of time. One commenter suggested the adjustment be phased in over multiple years, with a first-year adjustment equal to the hospital's wage index under the current CBSA definitions. Several of these commenters stated that because hospitals cannot obtain an MGCRB reclassification under the new OMB delineations until FY 2016, the adjustment for FY 2015 should negate any negative impacts from the transition to the new OMB delineations. These commenters explained that the MGCRB timetable would not allow them to benefit from newly available reclassification opportunities until at least 1 year following the implementation of the new OMB delineations. Other commenters questioned why hospitals that switch from urban to rural could benefit from a longer 3-year transition adjustment, while other hospitals that also would be negatively affected by the transition could only benefit from a single year of a blended transition adjustment. These commenters requested a similar 3-year transition adjustment for all hospitals experiencing a negative impact, including hospitals that are moving from one urban area to another, or are not moving at all but are being affected by other hospitals moving into or out of the labor market area.

Response: We appreciate the commenters' support. We explored multiple alternatives to the proposed 1-year 50/50 blended wage index adjustment. While we acknowledge that some providers will see negative impacts based upon the adoption of the new OMB delineations, we also point out that some providers will experience increases in their wage index values from the new OMB delineations. It is CMS' longstanding policy to provide temporary adjustments to mitigate negative impacts from the adoption of new policies or procedures. However, these adjustments must be made in a budget-neutral manner, and all wage index values would be reduced to provide for any such transition benefit.

We continue to believe that, in general, rural labor markets tend to have lower area wage index values than nearby urban areas. We proposed a longer 3-year transition adjustment for hospitals losing urban status because there are significantly fewer affected urban-to-rural hospitals, and we believe the negative impacts on a hospital shifting from urban to rural status would typically be greater than other types of transitions. We believe that a transition period longer than 1 year to address other impacts of the proposed adoption of new OMB delineations would reduce the accuracy of the overall labor market area wage index system because far more hospitals would be affected. We identified nine hospitals that could be negatively affected by their transition from urban to rural status under the new OMB delineations. Based on our experience regarding the impact of the policy established in FY 2005, we believe it is necessary to provide up to a 3-year transition adjustment for these hospitals to prevent the potential for drastic reductions in wage index values. The relatively small number of affected providers causes little concern for potential budget neutrality adjustment distortions in overall wage index values. However, significantly more providers will be negatively affected by other impacts from adopting the new labor market area delineations. Moving away from a 1-year 50/50 blend to an adjustment value that more closely approximates the hospital's previous labor market assignment, or providing for a longer transition period, would result in a significantly larger national budget neutrality adjustment. We believe the implementation of the new labor market area delineations will create more accurate representations of a hospital's labor market areas, and we do not believe it is appropriate to expand or extend the 50/50 blended wage index adjustment further than what was proposed, because doing so would only further delay what we believe are the more refined and accurate labor market areas, based on the recent 2010 Census. Because the wage index is a relative measure of the value of labor in prescribed labor market areas, we believe it is important to implement the new delineations with as minimal a transition as is reasonable.

Hospitals currently must wait more than a year for an MGCRB reclassification application to become effective. We do not believe the implementation of new OMB delineations requires any modification to this policy. We believe the 1-year 50/50 blended wage index adjustment provides an adequate safeguard against significant hospital payment reductions, and provides hospitals time to assess their reclassification options for future fiscal years.

Comment: One group of commenters suggested CMS made an error in calculating the Connecticut rural wage index value under the old FY 2014 OMB definitions. Commenters claimed that CMS incorrectly assigned a hospital as being reclassified under section 1886(d)(8)(B) of the Act (that is, a "Lugar" hospital) when calculating the wage index under the old delineations. This hospital is located in a county that became urban under the new OMB delineations. Commenters claimed that the hospital opted to waive its "Lugar" status effective for FYs 2013, 2014, and 2015 in order to receive its out-migration adjustment. However, when CMS calculated the FY 2014 rural wage index for the purpose of applying the proposed transition blend, CMS calculated the rural wage index with this hospital being reclassified. The commenters claimed that, by including this hospital as reclassified to an urban area, the wage index based on the "old" labor market area definitions, and therefore the proposed FY 2015 payment wage index, was significantly lower than it would have been if this provider had been properly identified as rural under the old definitions.

Response: In prior fiscal years, the Connecticut rural wage index was set by a single hospital. While there were multiple hospitals located in rural areas in the State, all but one obtained or was granted some form of reclassification to another area. The wage data of rural hospitals that reclassify elsewhere may only be included in their State's rural wage index if doing so would increase the wage index value (section 1886(d)(8)(C)(ii) of the Act). Because including the reclassified rural Connecticut hospitals would have lowered the State's rural area wage index value, the wage index was instead based on that single hospital's data. That hospital was designated urban under section 1886(d)(8)(B) of the Act but waived this status to receive an out-migration adjustment. As discussed in the FY 2012 IPPS/LTCH PPS final rule (76 FR 51599 through 51600), a hospital may submit a request to waive its "Lugar" status for a period of 3 years. By doing so, we would no longer consider the hospital to be reclassified and would always use that hospital's data in the calculation of the State's rural wage index. (We note that while we agree that the hospital waived its reclassification status for FY 2014 by accepting the out-migration adjustment, we disagree that the hospital in question waived its reclassified status for FY 2015. According to our records, the hospital sent a letter to CMS dated July 15, 2011, requesting to accept the out-migration adjustment and waive its Lugar redesignation for FYs 2012, 2013, and 2014.) When calculating the wage index based on the "old" labor market area definitions, CMS considered this hospital as being reclassified under section 1886(d)(8)(B) of the Act. Because all of the rural Connecticut hospitals were then considered reclassified, and the baseline rural wage index therefore did not include any hospitals, the rural wage index was based upon their combined data. The result of including all reclassified hospitals was a rural wage index value that was significantly lower than in previous years. Considering that several hospitals in Connecticut benefited from the State's rural floor, this reduction in the rural wage index affected multiple hospitals in the State.

After further consideration of the commenters' concerns, we agree with the commenters that this hospital should be treated as rural for the portion of the 1-year blended wage index under the FY 2014 delineations because this hospital had waived its Lugar status by accepting the out-migration adjustment in FY 2014. Therefore, we are revising this hospital's wage index and the wage indexes of the hospitals affected by this change for FY 2015, as reflected in Tables 2-2, 4A-2, 4B-2, 4C-2, and 4D-2.
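For readers following the Connecticut discussion above, the inclusion rule at section 1886(d)(8)(C)(ii) of the Act can be sketched as follows in Python (illustrative only, with hypothetical wages-and-hours inputs; this is not the actual wage index program):

    # Illustrative sketch (not CMS code) of the section 1886(d)(8)(C)(ii)
    # rule: wage data of rural hospitals reclassified elsewhere count toward
    # the statewide rural wage index only if including them would raise it.
    def rural_area_wage_data(non_reclassified, reclassified):
        """Each argument is a list of (wages, hours) pairs; returns the pairs
        used to compute the statewide rural average hourly wage."""
        def average_hourly_wage(pairs):
            total_hours = sum(h for _, h in pairs)
            return sum(w for w, _ in pairs) / total_hours if total_hours else 0.0

        combined = non_reclassified + reclassified
        if average_hourly_wage(combined) > average_hourly_wage(non_reclassified):
            return combined
        return non_reclassified

    # Hypothetical example: adding the reclassified hospitals would lower the
    # average hourly wage (from 40.00 to about 36.84), so they are excluded.
    print(rural_area_wage_data([(40_000_000, 1_000_000)], [(30_000_000, 900_000)]))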

After consideration of the public comments we received, we are finalizing the transition policy as proposed. We will apply a 1-year blended wage index for all hospitals that would experience any decrease in their actual payment wage index (that is, a hospital's actual wage index used for payment, which accounts for all applicable effects of reclassification and redesignation) exclusively due to the proposed implementation of the new OMB delineations. In FY 2015, a post-reclassified wage index with the rural and imputed floor applied will be computed based on the hospital's FY 2014 CBSA (that is, using all of its FY 2014 constituent county/ies), and another post-reclassified wage index with the rural and imputed floor applied will be computed based on the hospital's new FY 2015 CBSA (that is, the FY 2015 constituent county/ies). We will compare these two wage indexes. If the FY 2015 wage index with FY 2015 CBSAs is lower than the FY 2015 wage index with FY 2014 CBSAs, a blended wage index will be computed, consisting of 50 percent of each of the two wage indexes added together. This blended wage index will be the hospital's wage index for FY 2015.

For FY 2015, for hospitals that would receive the proposed 3-year transition, it is possible that receiving the FY 2015 wage index (with the rural and imputed floors applied and with the rural floor budget neutrality adjustment applied) of the CBSA where the hospital is geographically located for FY 2014 might still be less than the FY 2015 wage index that the hospital would have received in the absence of the adoption of the new OMB delineations (particularly in States where the rural floor is historically very high). In this situation, we will include the assignment of the 3-year transitional wage index in our calculation of the FY 2015 portion of the 50/50 blended wage index for that hospital. After FY 2015, such a hospital may revert to the second year of the 3-year transition.

(5) Impact of Adoption of New OMB Labor Market Area Delineations

As we did for the proposed rule (79 FR 28062 through 28063), for this final rule, to illustrate how the adoption of the new OMB labor market area delineations will impact hospitals' FY 2015 wage indexes, we compared the final FY 2015 occupational mix adjusted post-reclassified wage indexes with rural floor budget neutrality applied under the FY 2014 CBSAs and under the FY 2015 CBSAs using the new OMB delineations. (This analysis does not include the effects of the out-migration adjustment, the frontier floor, the 3-year hold harmless transition wage indexes, or the 1-year transition blended wage indexes). As a result of applying the new OMB delineations to the wage data, the wage index values for 2,409 urban hospitals (85.6 percent) and 412 (65.2 percent) rural hospitals will increase. The wage index values of 2,372 (84.3 percent) urban hospitals will increase by less than 5 percent, and the wage index values of 14 (0.5 percent) urban hospitals will increase by at least 5 percent but less than 10 percent. The wage index values of 23 (0.8 percent) urban hospitals will increase by greater than or equal to 10 percent. The wage index values of 383 (60.6 percent) rural hospitals will increase by less than 5 percent, 18 rural hospitals (2.8 percent) will increase by at least 5 percent but less than 10 percent, and 11 rural hospitals (1.7 percent) will increase by greater than or equal to 10 percent. However, the wage index values for 397 urban hospitals (14.1 percent) and 220 (34.8 percent) rural hospitals will decrease. The wage index values of 341 (12.1 percent) urban hospitals will decrease by less than 5 percent, 50 urban hospitals (1.8 percent) will decrease by at least 5 percent but less than 10 percent, and 6 urban hospitals (0.2 percent) will decrease by greater than or equal to 10 percent. The wage index values of 191 (30.2 percent) rural hospitals will decrease by less than 5 percent, 28 rural hospitals (4.4 percent) will decrease by at least 5 percent but less than 10 percent, and 1 rural hospital (0.2 percent) will decrease by greater than or equal to 10 percent. The wage index values of 8 (0.3 percent) urban hospitals and zero rural hospitals will remain unchanged by the adoption of the new OMB delineations. The largest positive impacts are for 8 hospitals in 5 States (Texas, Michigan, Minnesota, Louisiana, and Alabama) that will be moving from a rural to an urban area under the new OMB delineations (ranging from a 17.23 percent increase in Texas to a 24.02 percent increase in wage index in Alabama), and for 14 hospitals that will be moving from one urban CBSA (FY 2014 CBSA 20764, Edison-New Brunswick, NJ) to new urban CBSA 35614 (New York-Jersey City-White Plains, NY-NJ) under the new OMB delineations, representing a 15.13 percent increase in wage index. The largest negative impacts will be for 5 hospitals in 4 States (New York, Alabama, Idaho, and North Carolina) that will be moving from an urban to a rural area under the new OMB delineations (ranging from a 12.18 percent decrease in North Carolina to a 27.06 percent decrease in wage index in New York). One hospital in Delaware is moving from a rural to an urban area under the new OMB delineations and will experience an 11.38 percent decrease in wage index. Another hospital in Texas is moving from one urban area to another urban area under the new OMB delineations and will experience a 10.19 percent decrease in wage index.
These results illustrate that hospitals that move from rural CBSAs to urban CBSAs under the new OMB delineations generally will benefit significantly, while hospitals that move from urban to rural CBSAs generally will have negative impacts. For all hospitals combined, the wage index values of 2,821 hospitals (81.9 percent) overall will increase, and 617 hospitals (17.9 percent) overall will decrease, indicating that most hospitals will be positively affected by the adoption of the new OMB delineations. Furthermore, the magnitude of the changes will be relatively small overall, with only 151 hospitals (4.4 percent) experiencing either an increase or decrease of at least 5 percent.
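As a rough illustration of how such an impact analysis can be tabulated (the hospital-level figures below are hypothetical placeholders, not the actual data underlying the table that follows), percent changes in the wage index can be grouped into the same bands used in the table:

    # Illustrative sketch (hypothetical sample values; not the actual impact
    # file): grouping FY 2015 wage index percent changes into the bands used
    # in the impact table below.
    from collections import Counter

    def band(pct_change):
        if pct_change < 0:
            magnitude = -pct_change
            if magnitude >= 10.0:
                return "Decrease greater than or equal to 10.0"
            if magnitude >= 5.0:
                return "Decrease greater than or equal to 5.0 but less than 10.0"
            if magnitude >= 2.0:
                return "Decrease greater than or equal to 2.0 but less than 5.0"
            return "Decrease greater than 0.0 but less than 2.0"
        if pct_change == 0:
            return "No change"
        if pct_change < 2.0:
            return "Increase greater than 0.0 but less than 2.0"
        if pct_change < 5.0:
            return "Increase greater than or equal to 2.0 but less than 5.0"
        if pct_change < 10.0:
            return "Increase greater than or equal to 5.0 but less than 10.0"
        return "Increase greater than or equal to 10.0"

    # Hypothetical (old wage index, new wage index) pairs for three hospitals.
    pairs = [(1.1000, 1.0000), (0.9000, 0.9100), (0.8500, 0.8500)]
    changes = [100 * (new - old) / old for old, new in pairs]
    print(Counter(band(c) for c in changes))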

The following table shows the impact of the adoption of the new OMB delineations on hospitals' FY 2015 wage indexes, comparing the FY 2015 occupational mix adjusted post-reclassified wage indexes with rural floor budget neutrality applied under the FY 2014 CBSAs and the FY 2015 CBSAs using the new OMB delineations. (This analysis does not include the effects of the out-migration adjustment, the frontier floor, the 3-year hold harmless transition wage indexes, or the 1-year transition blended wage indexes.)

Percent change in FY 2015 wage index | Number of post-reclassified rural hospitals based on FY 2014 CBSA | Number of post-reclassified urban hospitals based on FY 2014 CBSA | Total number of hospitals
Decrease greater than or equal to 10.0 | 1 | 6 | 7
Decrease greater than or equal to 5.0 but less than 10.0 | 28 | 50 | 78
Decrease greater than or equal to 2.0 but less than 5.0 | 33 | 88 | 121
Decrease greater than 0.0 but less than 2.0 | 158 | 253 | 411
No change | 0 | 8 | 8
Increase greater than 0.0 but less than 2.0 | 376 | 2,331 | 2,707
Increase greater than or equal to 2.0 but less than 5.0 | 7 | 41 | 48
Increase greater than or equal to 5.0 but less than 10.0 | 18 | 14 | 32
Increase greater than or equal to 10.0 | 11 | 23 | 34
Total | 632 | 2,814 | 3,446

We did not receive any public comments on the analysis in the proposed rule showing the effects of adopting the new CBSA delineations.

(6) Budget Neutrality

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28063), for FY 2015, we proposed to apply both the 3-year transition and 50/50 blended wage index adjustments in a budget neutral manner. We proposed to make an adjustment to the standardized amount to ensure that the total payments, including the effect of the transition provisions, would equal what payments would have been if we were not providing for any transitional wage indexes under the new OMB delineations.
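As a simplified illustration of the budget neutrality principle described above (the dollar figures are hypothetical, and the Addendum's actual calculation involves additional steps), a factor applied to the standardized amount could be derived as follows:

    # Simplified sketch (hypothetical amounts; not the Addendum calculation):
    # a factor applied to the standardized amount so that aggregate payments
    # with the 3-year and 50/50 blend transition wage indexes equal aggregate
    # payments without any transitional wage indexes.
    def transition_budget_neutrality_factor(payments_without_transition,
                                            payments_with_transition):
        return payments_without_transition / payments_with_transition

    # Hypothetical aggregate payment estimates, in dollars.
    factor = transition_budget_neutrality_factor(100_000_000_000, 100_050_000_000)
    print(round(factor, 6))  # approximately 0.9995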

We did not receive any public comments specific to our proposal to implement the 3-year transition and the 50/50 blended wage index adjustments in a budget neutral manner. We are finalizing the policy as proposed. For a complete discussion on this budget neutrality adjustment for FY 2015, we refer readers to section II.A.4.b. of the Addendum to this final rule.

We note that, consistent with past practice (69 FR 49034), we are not adopting the new OMB delineations themselves in a budget neutral manner. We do not believe that the revision to the labor market areas in and of itself constitutes an “adjustment or update” to the adjustment for area wage differences, as provided under section 1886(d)(3)(E) of the Act.

(7) Determining Disproportionate Share Hospital (DSH) Payments Under the New OMB Delineations

As noted in the FY 2005 IPPS final rule (69 FR 49033), the provisions of § 412.102 of the regulations continue to apply with respect to determining DSH payments for hospitals affected by our adoption of the new OMB delineations. Specifically, in the first year after a hospital loses urban status, the hospital would receive an additional payment that equals two-thirds of the difference between the urban DSH payments applicable to the hospital before its redesignation from urban to rural and the rural DSH payments applicable to the hospital subsequent to its redesignation from urban to rural. In the second year after a hospital loses urban status, the hospital would receive an additional payment that equals one-third of the difference between the urban DSH payments applicable to the hospital before its redesignation from urban to rural and the rural DSH payments applicable to the hospital subsequent to its redesignation from urban to rural.
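The two-thirds/one-third schedule under § 412.102 described above can be expressed as a short illustrative sketch in Python (hypothetical dollar amounts; not CMS payment code):

    # Illustrative sketch (not CMS code) of the § 412.102 adjustment described
    # above: two-thirds of the urban/rural DSH difference in the first year
    # after losing urban status, one-third in the second year, nothing after.
    def dsh_transition_add_on(urban_dsh, rural_dsh, years_since_losing_urban_status):
        difference = urban_dsh - rural_dsh
        if years_since_losing_urban_status == 1:
            return difference * 2 / 3
        if years_since_losing_urban_status == 2:
            return difference / 3
        return 0.0

    # Hypothetical urban and rural DSH payment amounts, in dollars.
    print(dsh_transition_add_on(900_000, 600_000, 1))  # -> 200000.0
    print(dsh_transition_add_on(900_000, 600_000, 2))  # -> 100000.0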

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28063 through 28064), we proposed to make changes to the regulations to delete § 412.64(b)(1)(ii)(D). In this provision, we currently define a “hospital reclassified as rural” as a hospital located in a county that, in FY 2004, was urban but was redesignated as rural after September 30, 2004, as a result of the most recent census data and implementation of the new MSA definitions announced by OMB on June 6, 2003. Because the term “hospital reclassified as rural” is not used in § 412.64, but is used in § 412.102, we proposed to delete § 412.64(b)(1)(ii)(D) and revise the language at § 412.102 to address the circumstances set forth in § 412.64(b)(1)(ii)(D). The regulation at § 412.102, which addresses special treatment of hospitals located in areas that are changing from urban to rural as a result of a geographic redesignation, is the only location that currently references a “hospital reclassified as rural”, as defined at § 412.64(b)(1)(ii)(D). To avoid confusion with urban hospitals that choose to reclassify as rural under § 412.103, we proposed to revise the regulation text at § 412.102 so that it no longer refers to the defined term “hospital reclassified as rural,” and instead specifically states the circumstances in which § 412.102 applies. In addition, we proposed to modify the regulation text so that it would apply to all transitions from urban to rural status that occur as a result of any future adoption of new or revised OMB standards for delineating statistical areas adopted by CMS. Specifically, we proposed to revise the regulations at § 412.102 to state that an urban hospital that was part of an MSA, but was redesignated as rural as a result of the most recent OMB standards for delineating statistical areas adopted by CMS, may receive an adjustment to its rural Federal payment amount for operating costs for 2 successive fiscal years as provided in paragraphs (a) and (b) of the section.

We did not receive any public comments regarding either of these proposals. We are finalizing the changes to § 412.102 and § 412.64(b)(1)(ii)(D) as proposed, effective for FY 2015.

C. Worksheet S-3 Wage Data for the FY 2015 Wage Index

The FY 2015 wage index values are based on the data collected from the Medicare cost reports submitted by hospitals for cost reporting periods beginning in FY 2011 (the FY 2014 wage indexes were based on data from cost reporting periods beginning during FY 2010).

1. Included Categories of Costs

The FY 2015 wage index includes the following categories of data associated with costs paid under the IPPS (as well as outpatient costs):

  • Salaries and hours from short-term, acute care hospitals (including paid lunch hours and hours associated with military leave and jury duty);
  • Home office costs and hours;
  • Certain contract labor costs and hours (which includes direct patient care, certain top management, pharmacy, laboratory, and nonteaching physician Part A services, and certain contract indirect patient care services (as discussed in the FY 2008 final rule with comment period (72 FR 47315 through 47318))); and
  • Wage-related costs, including pension costs (based on policies adopted in the FY 2012 IPPS/LTCH PPS final rule (76 FR 51586 through 51590)) and other deferred compensation costs.

2. Excluded Categories of Costs

Consistent with the wage index methodology for FY 2014, the wage index for FY 2015 also excludes the direct and overhead salaries and hours for services not subject to IPPS payment, such as skilled nursing facility (SNF) services, home health services, costs related to GME (teaching physicians and residents) and certified registered nurse anesthetists (CRNAs), and other subprovider components that are not paid under the IPPS. The FY 2015 wage index also excludes the salaries, hours, and wage-related costs of hospital-based rural health clinics (RHCs), and Federally qualified health centers (FQHCs) because Medicare pays for these costs outside of the IPPS (68 FR 45395). In addition, salaries, hours, and wage-related costs of CAHs are excluded from the wage index, for the reasons explained in the FY 2004 IPPS final rule (68 FR 45397 through 45398).

3. Use of Wage Index Data by Suppliers and Providers Other Than Acute Care Hospitals Under the IPPS

Data collected for the IPPS wage index also are currently used to calculate wage indexes applicable to suppliers and other providers, such as SNFs, home health agencies (HHAs), ambulatory surgical centers (ASCs), and hospices. In addition, they are used for prospective payments to IRFs, IPFs, and LTCHs, and for hospital outpatient services. We note that, in the IPPS rules, we do not address comments pertaining to the wage indexes of any supplier or provider except IPPS providers and LTCHs. Such comments should be made in response to separate proposed rules for those suppliers and providers.

D. Verification of Worksheet S-3 Wage Data

The wage data for the FY 2015 wage index were obtained from Worksheet S-3, Parts II and III of the Medicare cost report for cost reporting periods beginning on or after October 1, 2010, and before October 1, 2011. For wage index purposes, we refer to cost reports during this period as the “FY 2011 cost report,” the “FY 2011 wage data,” or the “FY 2011 data.” Instructions for completing the wage index sections of Worksheet S-3 are included in the Provider Reimbursement Manual (PRM), Part 2 (Pub. No. 15-2), Chapter 40, Sections 4005.2 through 4005.4 for Form CMS-2552-10. The data file used to construct the FY 2015 wage index includes FY 2011 data submitted to us as of June 25, 2014. As in past years, we performed an extensive review of the wage data, mostly through the use of edits designed to identify aberrant data.

We asked our MACs to revise or verify data elements that result in specific edit failures. For the proposed FY 2015 wage index, we stated that we identified and excluded 50 providers with aberrant data that should not be included in the wage index, although we stated that if data elements are corrected, we intended to include data from those providers in the final FY 2015 wage index (79 FR 28064). We have since determined that we had only removed 49, not 50, providers with aberrant data from the proposed wage index. We have received corrected data from 19 providers and data from an additional provider, and therefore, we are including the data for these 20 providers in the final FY 2015 wage index. However, since issuance of the proposed rule, we have determined that the data from 4 other providers (not included in the original 49 providers) were aberrant and should not be included in the final FY 2015 wage index. Therefore, in total, we are excluding the data of 34 providers from the final FY 2015 wage index.
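
For readers reconciling the provider counts above, the arithmetic can be summarized in the brief sketch below (written in Python and provided for illustration only; it is not part of the wage index methodology itself).

# Reconciliation of providers excluded from the final FY 2015 wage index
excluded_from_proposed = 49       # providers actually removed from the proposed wage index
corrected_and_included = 19       # previously excluded providers whose corrected data are now included
additional_provider_included = 1  # provider whose data were newly received
newly_identified_aberrant = 4     # additional providers found to be aberrant after the proposed rule

providers_added_to_final = corrected_and_included + additional_provider_included  # 20
providers_excluded_from_final = (excluded_from_proposed
                                 - corrected_and_included
                                 + newly_identified_aberrant)                      # 34
print(providers_added_to_final, providers_excluded_from_final)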

In constructing the FY 2015 wage index, we included the wage data for facilities that were IPPS hospitals in FY 2011, inclusive of those facilities that have since terminated their participation in the program as hospitals, as long as those data did not fail any of our edits for reasonableness. We believe that including the wage data for these hospitals is, in general, appropriate to reflect the economic conditions in the various labor market areas during the relevant past period and to ensure that the current wage index represents the labor market area's current wages as compared to the national average of wages. However, we excluded the wage data for CAHs as discussed in the FY 2004 IPPS final rule (68 FR 45397 through 45398). For the proposed rule, we removed 6 hospitals that converted to CAH status on or after February 14, 2013 (the cut-off date for CAH exclusion from the FY 2014 wage index) and on or before February 13, 2014 (the cut-off date for CAH exclusion from the FY 2015 wage index). After removing hospitals with aberrant data and hospitals that converted to CAH status, the final FY 2015 wage index is calculated based on 3,416 hospitals.

For the final FY 2015 wage index, we allotted the wages and hours data for a multicampus hospital among the different labor market areas where its campuses are located in the same manner that we allotted such hospitals' data in the FY 2014 wage index (78 FR 50587). Table 2 containing the final FY 2015 wage index associated with this final rule (available via the Internet on the CMS Web site) includes separate wage data for the campuses of 6 multicampus hospitals.

Comment: Commenters representing hospitals located in CBSA 46140 disagreed with the removal of the wage data of one hospital in that CBSA from the FY 2015 wage index. They argued that CMS's removal of the hospital's data is arbitrary and capricious, based only on the fact that the hospital's average hourly wage is higher than those of the other hospitals in the CBSA. The commenters noted that the hospital's data were included in the wage index in previous years, and CMS has provided “no rational explanation for its inconsistent treatment now.” The commenters further stated that “if CMS were to adopt a policy of excluding the hospital with the highest wage data from each CBSA, fairness would require that CMS also exclude the hospital with the lowest wage data from each CBSA.” The commenters stated that if CMS is employing a “bright-line cut off,” CMS must publish such “bright-line tests.”

Response: Section 1886(d)(3)(E) of the Act requires the Secretary to adjust the proportion of hospitals' costs attributable to wages and wage-related costs for area differences reflecting the relative hospital wage level in the geographic area of the hospital compared to the national average hospital wage level. We also refer readers to section 1886(d)(9)(C)(iv)(I) of the Act. Since the origin of the IPPS, the wage index has been subject to its own annual review process, first by the MACs, and then by CMS. Hospitals are aware that both the MACs (via instructions issued by CMS) and CMS evaluate the accuracy and reasonableness of hospitals' wage index data. Each year, in every IPPS proposed rule, we discuss the process wherein CMS asks the MACs to “revise or verify data elements that result in specific edit failures” (most recently, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28064)). We state that, in constructing the wage index, we include the wage data for facilities that were IPPS hospitals in the relevant cost reporting year (that is, FY 2011 for the FY 2015 wage index), and that we include “those facilities that have since terminated their participation in the program as hospitals, as long as those data did not fail any of our edits for reasonableness. We believe that including the wage data for these hospitals is, in general, appropriate to reflect the economic conditions in the various labor market areas during the relevant past period and to ensure that the current wage index represents the labor market area's current wages as compared to the national average of wages” (emphasis added; 79 FR 28064). CMS has historically exercised its discretion in developing a wage index that reflects a relative measure of the value of the labor provided to a typical hospital in a particular labor market area. We applied these same procedures, as discussed below, to the hospital at issue, and we disagree with the commenters that we have arbitrarily and capriciously removed the wage data of the cited hospital from the FY 2015 wage index.

In the instance of the particular hospital to which the commenters refer, while the hospital's wage data were properly documented, it did not merely have the highest average hourly wage in the CBSA; its average hourly wage was extremely and unusually high, significantly higher than the next highest average hourly wage in that CBSA and in the surrounding areas. We do not believe that the average hourly wage of this particular hospital accurately reflects the economic conditions in its labor market area during the FY 2011 cost reporting period, and, therefore, its inclusion in the wage index would not ensure that the FY 2015 wage index represents the labor market area's current wages as compared to the national average of wages. Accordingly, we have exercised our discretion to remove this hospital's wage data from the February 20, 2014 PUF and from the May 2, 2014 PUF. Similarly, we have exercised our discretion by removing from the wage index (in FY 2015 and in prior years) the data of hospitals with average hourly wages that are unusually and uncharacteristically low for their respective CBSAs because we believe that the wage data of those hospitals also do not accurately reflect the economic conditions in their labor market area. We included the hospital's data in the wage index in previous years because, in those years, its average hourly wage was lower and more reasonable relative to its labor market area.

Questions have been raised recently regarding the reporting of contract housekeeping and dietary services on Worksheet S-3, Part II, lines 33 and 35 of the Medicare cost report. CMS finalized its proposal to begin collecting contract labor costs and hours for housekeeping, and dietary (along with management services and the overhead services of administrative and general) in the FY 2003 IPPS final rule (67 FR 50022 through 50023). At that time, we stated, “We continue to consider whether to expand our contract labor definition to include more types of contract services in the wage index. In particular, we have examined whether to include the costs for acquired dietary and housekeeping services, as many hospitals now provide these services through contracts. Costs for these services tend to be below the average wages for all hospital employees. Therefore, excluding the costs and hours for these services if they are provided under contract, while including them if the services are provided directly by the hospital, creates an incentive for hospitals to contract for these services in order to increase their average hourly wage for wage index purposes” (67 FR 50022). In the FY 2003 IPPS proposed rule, we explained that we selected the three overhead services of administrative and general, housekeeping, and dietary because they are provided at all hospitals, either directly or through contracts, and together they comprise about 60 percent of a hospital's overhead hours (67 FR 31433). In the FY 2003 IPPS final rule, we stated that we “will monitor the hospital industry for information regarding the hospitals' ability to provide the data. Further, we will work with hospitals and intermediaries [MACs] to develop acceptable methods for tracking the costs and hours. Finally, before including these additional costs in the wage index, we will provide a detailed analysis of the impact of including these additional costs in the wage index values in the Federal Register and provide for public comment. Our final decision on whether to include contract indirect patient care labor costs in our calculation of the wage index will depend on the outcome of our analyses and public comments” (67 FR 50023).

Subsequent to the issuance of the FY 2003 IPPS final rule, we revised Worksheet S-3, Part II of the Medicare cost report (CMS Form 2552-96) to add four lines for the reporting of contract labor salaries, wages, and hours. The lines added for contract housekeeping and dietary services were lines 26.01 and 27.01, respectively. (Line 9.03 for contract management and line 22.01 for contract administrative and general (A&G) services were also added at that time). These lines were effective with cost reporting periods beginning on or after October 1, 2003 (that is, FY 2004). Because the cost report data used for the wage index are on a 4-year lag, data from these new contract labor lines would first be available for the FY 2008 wage index.

In the FY 2008 rulemaking process, we provided an analysis of the effect of including in the wage index the wages and hours reported on the new contract labor lines. At that time, 56 hospitals (1.6 percent) failed edits for contract housekeeping line 26.01; and 99 hospitals (2.8 percent) failed edits for contract dietary line 27.01 (72 FR 24680 and 24782). We also noted that “many of these edit failures are for wage data that are not to be included in the wage index and will be excluded through the wage index calculation. . . . In addition, some of the aberrant data will be resolved by the final rule through the correction process” (72 FR 24680 and 24782). The small percentage of hospitals that failed edits for these contract labor lines indicates that the vast majority of hospitals completing these contract labor lines were able to obtain and report reasonable salaries, wages, and hours associated with contract housekeeping and dietary services. In the FY 2008 IPPS final rule, we stated that we believe that “the impact of this policy is generally very minor, and we do not believe the additional complexity of a transition wage index is warranted for an impact this small. Further, we continue to believe it is prudent policy to include in the wage index the costs for these contract indirect patient care services” (72 FR 47316). Therefore, we adopted the policy to include the new contract labor lines in the wage index, beginning with the FY 2008 wage index.

The questions that have recently come to our attention involve hospitals that consistently do not provide documentable salaries, wages, and hours for their contracted housekeeping and/or dietary services. (On the Medicare cost report (CMS Form 2552-10), contract housekeeping is on Worksheet S-3, Part II, line 33 and contract dietary is on line 35). When this situation occurs, CMS has instructed the MACs to use reasonable estimates, such as regional average hourly rates, as a substitute for actual wages and hours, and to report the estimates on the hospital's Worksheet S-3, Part II, line 33 or line 35, respectively. Our policy has been to use reasonable estimates for these housekeeping and dietary lines, rather than report zeroes for wages and hours, because, as discussed above and as stated in the FY 2003 IPPS final rule, “[c]osts for these services tend to be below the average wages for all hospital employees. Therefore, excluding the costs and hours for these services if they are provided under contract, while including them if the services are provided directly by the hospital, creates an incentive for hospitals to contract for these services in order to increase their average hourly wage for wage index purposes” (67 FR 50022). We understand that the reason many hospitals provide for failing to report such contract wages and hours is that their contracts do not clearly specify this information, often because they use a single vendor to provide several different contract labor services. We believe that allowing hospitals to routinely use contracts that do not clearly break out the salaries, wages, and hours associated with these services as a reason for not being able to report proper salaries, wages, and hours for these cost report lines undermines the purpose of instituting these lines in the first place. Furthermore, because every hospital must provide housekeeping and dietary services, and because the wage index is a relative measure of the value of the labor provided to a hospital in a particular labor market area, to report zeroes for salaries, wages, and hours for housekeeping and dietary services is not only unrealistic, but also misrepresents the labor costs in that area and undermines our policy. Consequently, CMS has instructed the MACs not to zero out these line items when a hospital cannot document the housekeeping or dietary salaries, wages, and hours, but instead to use a reasonable estimation of these wages and hours.

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28065 through 28066), we reiterated our requirement that all hospitals must document salaries, wages, and hours for the purpose of reporting this information on Worksheet S-3, Part II, lines 32, 33, 34, and/or 35 (directly employed housekeeping and dietary employees on lines 32 and 34, respectively, and contract housekeeping and dietary services on lines 33 and 35, respectively). It is not acceptable for a hospital to request that the MACs zero out these line items if the hospital's contract does not specifically break out the actual wages and hours. As indicated above, and stated in the FY 2008 IPPS proposed rule (72 FR 24680 and 24782), a small percentage of hospitals failed edits associated with the contract housekeeping and dietary lines, showing that the vast majority of hospitals reporting data on these lines were able to obtain and report reasonable salaries, wages, and hours associated with contract housekeeping and dietary services. We encourage hospitals to ensure that their contracts clearly specify the salaries, wages, and hours related to all of their contract labor. Because these line items have been included in the cost report since FY 2004, we believe that hospitals have had adequate notice and time to structure their contracts so that the wages and hours of contract employees can be determined and included in the cost reports. We expect hospitals to provide accurate data on their cost reports.

We understand that there may be rare situations where a hospital would not have documentable salaries, wages, and hours for contract housekeeping and dietary services. In these situations, we believe that it is appropriate and necessary to use reasonable estimates for these numbers in order to determine the best, most realistic wage index that we can. As discussed previously, housekeeping and dietary services are unique in that the costs for housekeeping and dietary services tend to be below the average wages for all hospital employees. Thus, an incentive is created for hospitals to avoid reporting these contract labor salaries, wages, and hours on the cost report in order to increase their average hourly wage for wage index purposes. To deter hospitals from not reporting this information and to ensure that the wage index more accurately reflects the labor costs in an area, we believe that it is both necessary and appropriate for the MACs to estimate such salaries, wages, and hours in the rare instance where a hospital cannot provide such information. Therefore, in the absence of documentable wages and hours for contract housekeeping and dietary services, MACs would continue to use reasonable estimates for these services. Examples of reasonable estimates are regional average hourly rates, including an average of the wages and hours for dietary and housekeeping services of other hospitals in the same CBSA as the hospital in question. Hospitals also may conduct time studies to determine hours worked. If, for whatever reason, regional averages or time studies cannot be used, MACs may use data from the Bureau of Labor Statistics to obtain average wages and hours for housekeeping and dietary services. Commenters may also suggest alternatives for imputing reasonable estimates for possible consideration by CMS. In all cases, MACs must determine that the data used are reasonable.
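
For illustration only, the following Python sketch shows one way a reasonable estimate of the kind described above might be computed; the hospital values are hypothetical, and the sketch is not a prescribed MAC procedure.

def regional_average_hourly_rate(peer_hospitals):
    # peer_hospitals: list of (wages, hours) pairs for the contract dietary or
    # housekeeping services of other hospitals in the same CBSA (hypothetical data).
    total_wages = sum(wages for wages, hours in peer_hospitals)
    total_hours = sum(hours for wages, hours in peer_hospitals)
    return total_wages / total_hours

# Hypothetical CBSA peers: (contract dietary wages, contract dietary hours)
peers = [(1_200_000, 60_000), (900_000, 47_000), (1_500_000, 73_000)]
average_rate = regional_average_hourly_rate(peers)

# Hypothetical hospital with hours documented through a time study but no documented wages
documented_hours = 52_000
estimated_wages = documented_hours * average_rate
print(round(average_rate, 2), round(estimated_wages))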

Comment: One commenter encouraged CMS to instruct the MACs to be consistent across their entire jurisdiction in how the MACs estimate wages and hours for contract dietary and housekeeping services, in the instances where there is a lack of documentable wages and hours for these services. Another commenter noted that CMS stated that commenters may suggest alternatives for imputing reasonable estimates for possible consideration by CMS. This commenter asked that CMS consider eliminating entirely all wages and hours associated with dietary and housekeeping services, both for hospital employees and contract labor, based on the belief that these services represent an “immaterial” 3.27 percent of total Worksheet S-3, Part II, line 1 wages, and their removal from the wage index would remove a time-consuming burden for both providers and MACs. The commenter asserted that if all wages and hours associated with dietary and housekeeping services were eliminated from the wage index, the “comparison among hospitals would remain meaningful and would remove any disparity among hospitals related to the issue.”

Response: We agree with the first commenter that it is important for CMS' policies and instructions to be implemented uniformly by the MACs across all jurisdictions. We provide updated and uniform instructions to the MACs each year prior to the start of the annual wage index desk review process, and also communicate with the MACs through various media throughout each annual wage index cycle, including instructions on how to estimate wages and hours for contract dietary and housekeeping services in the absence of documentable wages and hours for these categories. We do not agree with the second commenter's request that CMS eliminate entirely all wages and hours associated with dietary and housekeeping services, both for hospital employees and contract labor. The IPPS wage index is a relative measure of the value of all types of labor provided to a typical hospital in a particular labor market area, not just the labor with high average hourly wages. We believe it would be inappropriate to agree to selectively include, or exclude, certain categories of labor from the wage index because doing so would result in a less accurate measure of labor costs and would undermine the relativity of the wage index as a whole. We believe that hospitals have had adequate notice and time to structure their contracts so that the wages and hours of contract employees can be determined and included in the cost reports. We expect hospitals to provide accurate data on their cost reports, and the accuracy of the wages and hours of contract labor will continue to be reviewed by the MACs as part of the annual desk review process. As we stated in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28065 through 28066), to deter hospitals from not reporting this information and to ensure that the wage index more accurately reflects the labor costs in an area, we believe that it is both necessary and appropriate for MACs to estimate such salaries, wages, and hours in the rare instance where a hospital cannot provide such information for its dietary and housekeeping services under contract. We will continue to instruct the MACs to use reasonable estimates for these services, in the absence of documentable wages and hours for contract housekeeping and dietary services.

E. Method for Computing the FY 2015 Unadjusted Wage Index

The method used to compute the FY 2015 wage index without an occupational mix adjustment follows the same methodology that we used to compute the FY 2012, FY 2013, and FY 2014 final wage indexes without an occupational mix adjustment (76 FR 51591 through 51593, 77 FR 53366 through 53367, and 78 FR 50587 through 50588, respectively).

As discussed in the FY 2012 final rule, in “Step 5,” for each hospital, we adjust the total salaries plus wage-related costs to a common period to determine total adjusted salaries plus wage-related costs. To make the wage adjustment, we estimate the percentage change in the employment cost index (ECI) for compensation for each 30-day increment from October 14, 2010, through April 15, 2012, for private industry hospital workers from the BLS' Compensation and Working Conditions. We have consistently used the ECI as the data source for our wages and salaries and other price proxies in the IPPS market basket, and we did not propose any changes to the usage for FY 2015 (79 FR 28066). The factors used to adjust the hospital's data were based on the midpoint of the cost reporting period, as indicated in the following table.

Midpoint of Cost Reporting Period

After           Before          Adjustment factor
10/14/2010      11/15/2010      1.02230
11/14/2010      12/15/2010      1.02078
12/14/2010      01/15/2011      1.01929
01/14/2011      02/15/2011      1.01782
02/14/2011      03/15/2011      1.01637
03/14/2011      04/15/2011      1.01494
04/14/2011      05/15/2011      1.01355
05/14/2011      06/15/2011      1.01219
06/14/2011      07/15/2011      1.01084
07/14/2011      08/15/2011      1.00948
08/14/2011      09/15/2011      1.00811
09/14/2011      10/15/2011      1.00674
10/14/2011      11/15/2011      1.00538
11/14/2011      12/15/2011      1.00403
12/14/2011      01/15/2012      1.00269
01/14/2012      02/15/2012      1.00134
02/14/2012      03/15/2012      1.00000
03/14/2012      04/15/2012      0.99866

For example, the midpoint of a cost reporting period beginning January 1, 2011, and ending December 31, 2011, is June 30, 2011. An adjustment factor of 1.01084 would be applied to the wages of a hospital with such a cost reporting period.
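
For illustration only, the adjustment described above can be expressed as a simple table lookup keyed on the midpoint of the cost reporting period; the following Python sketch reproduces the example in the preceding paragraph and enters only a few rows of the table.

import datetime

# (after, before, adjustment factor) rows taken from the table above; the remaining
# rows would be entered the same way.
ECI_FACTORS = [
    (datetime.date(2010, 10, 14), datetime.date(2010, 11, 15), 1.02230),
    (datetime.date(2011, 6, 14),  datetime.date(2011, 7, 15),  1.01084),
    (datetime.date(2012, 3, 14),  datetime.date(2012, 4, 15),  0.99866),
]

def adjustment_factor(period_start, period_end):
    # The factor is selected based on the midpoint of the cost reporting period.
    midpoint = period_start + (period_end - period_start) // 2
    for after, before, factor in ECI_FACTORS:
        if after < midpoint < before:
            return factor
    raise ValueError("midpoint falls outside the rows entered above")

# Cost reporting period of January 1, 2011 through December 31, 2011: the midpoint
# falls in the 06/14/2011 to 07/15/2011 row, so the factor is 1.01084.
print(adjustment_factor(datetime.date(2011, 1, 1), datetime.date(2011, 12, 31)))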

Using the data as described above and in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50587 through 50588), the FY 2015 national average hourly wage (unadjusted for occupational mix) is $39.2971. The FY 2015 Puerto Rico overall average hourly wage (unadjusted for occupational mix) is $16.9893.

F. Occupational Mix Adjustment to the FY 2015 Wage Index

As stated earlier, section 1886(d)(3)(E) of the Act provides for the collection of data every 3 years on the occupational mix of employees for each short-term, acute care hospital participating in the Medicare program, in order to construct an occupational mix adjustment to the wage index, for application beginning October 1, 2004 (the FY 2005 wage index). The purpose of the occupational mix adjustment is to control for the effect of hospitals' employment choices on the wage index. For example, hospitals may choose to employ different combinations of registered nurses, licensed practical nurses, nursing aides, and medical assistants for the purpose of providing nursing care to their patients. The varying labor costs associated with these choices reflect hospital management decisions rather than geographic differences in the costs of labor.

1. Development of Data for the FY 2015 Occupational Mix Adjustment Based on the 2010 Occupational Mix Survey

As provided for under section 1886(d)(3)(E) of the Act, we collect data every 3 years on the occupational mix of employees for each short-term, acute care hospital participating in the Medicare program.

As discussed in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50588), the occupational mix adjustment to the FY 2014 wage index was based on data collected on the 2010 Medicare Wage Index Occupational Mix Survey (Form CMS-10079 (2010)). For the FY 2015 wage index, we proposed to use the occupational mix data collected on the 2010 survey to compute the occupational mix adjustment for FY 2015. We did not receive any public comments on this proposal; therefore, we are finalizing our policy to use the occupational mix data collected on the 2010 survey to compute the occupational mix adjustment for FY 2015. We are including data for 3,183 hospitals that also have wage data included in the FY 2015 wage index.

2. New 2013 Occupational Mix Survey for the FY 2016 Wage Index

As stated earlier, section 304(c) of Public Law 106-554 amended section 1886(d)(3)(E) of the Act to require CMS to collect data every 3 years on the occupational mix of employees for each short-term, acute care hospital participating in the Medicare program. We used the occupational mix data collected on the 2010 survey to compute the occupational mix adjustments for the FY 2013 and FY 2014 wage indexes and for the FY 2015 wage index associated with this final rule. Therefore, a new measurement of occupational mix is required for FY 2016.

On December 7, 2012, we published in the Federal Register a notice soliciting comments on the proposed 2013 Medicare Wage Index Occupational Mix Survey (77 FR 73032 through 73033). The new 2013 survey, which will be applied to the FY 2016 wage index, includes the same data elements and definitions as the 2010 survey and provides for the collection of hospital-specific wages and hours data for nursing employees for calendar year 2013 (that is, payroll periods ending between January 1, 2013 and December 31, 2013). The comment period for the notice ended on February 5, 2013. After considering the public comments that we received on the December 2012 notice, we made a few minor editorial changes and published the 2013 survey in the Federal Register on February 28, 2013 (78 FR 13679). This survey was approved by OMB on May 14, 2013, and is available on the CMS Web site at: http://www.cms.gov/Medicare/Medicare-Fee-for-Service-Payment/AcuteInpatientPPS/Downloads/WAGE-INDEX-OCCUPATIONAL-MIX-SURVEY2013.pdf.

The 2013 Occupational Mix Survey Hospital Reporting Form CMS-10079 for the Wage Index Beginning FY 2016 (in Excel format) is available on the CMS Web site at: http://www.cms.gov/Medicare/Medicare-Fee-for-Service-Payment/AcuteInpatientPPS/Wage-Index-Files-Items/Medicare-Wage-Index-Occupational-Mix-Survey2013.html?DLPage=1&DLSort=1&DLSortDir=descending. Hospitals were required to submit their completed 2013 surveys to their MACs by July 1, 2014. The preliminary, unaudited 2013 survey data were posted on the CMS Web site on July 11, 2014. The FY 2012 Worksheet S-3 wage data for the FY 2016 wage index review and correction process were posted on the CMS Web site in May 2014. Both the preliminary FY 2016 wage data and occupational mix survey data can be found on the CMS Web site at: http://www.cms.gov/Medicare/Medicare-Fee-for-Service-Payment/AcuteInpatientPPS/Wage-Index-Files-Items/FY-2016-Wage-Index-Home-Page.html?DLPage=1&DLSort=1&DLSortDir=descending.

3. Calculation of the Occupational Mix Adjustment for FY 2015

For FY 2015, we proposed to calculate the occupational mix adjustment factor using the same methodology that we used for the FY 2012, FY 2013, and FY 2014 wage indexes (76 FR 51582 through 51586, 77 FR 53367 through 53368, and 78 FR 50588 through 50589, respectively). As a result of applying this methodology, the proposed FY 2015 occupational mix adjusted national average hourly wage (based on the proposed new OMB labor market area delineations) was $39.1177. The proposed FY 2015 occupational mix adjusted Puerto Rico-specific average hourly wage (based on the proposed new OMB labor market area delineations) was $17.0526.

Because the occupational mix adjustment is required by statute, all hospitals that are subject to payments under the IPPS, or any hospital that would be subject to the IPPS if not granted a waiver, must complete the occupational mix survey, unless the hospital has no associated cost report wage data that are included in the FY 2015 wage index. For the proposed FY 2015 wage index, we used the Worksheet S-3, Parts II and III wage data of 3,400 hospitals and the occupational mix surveys of 3,165 hospitals for which we also have Worksheet S-3 wage data, which represents a “response” rate of 93.1 percent (3,165/3,400). In the proposed FY 2015 wage index established in the FY 2015 IPPS/LTCH PPS proposed rule, we applied proxy data for noncompliant hospitals, new hospitals, or hospitals that submitted erroneous or aberrant data in the same manner that we applied proxy data for such hospitals in the FY 2012 wage index occupational mix adjustment (76 FR 51586).

In the FY 2011 IPPS/LTCH PPS proposed rule and final rule (75 FR 23943 and 75 FR 50167, respectively), we stated that, in order to gain a better understanding of why some hospitals are not submitting the occupational mix data, we would require hospitals that do not submit occupational mix data to provide an explanation for not complying. This requirement was effective beginning with the 2010 occupational mix survey. We instructed MACs to continue gathering this information as part of the FY 2014 and FY 2015 wage index desk review process. We stated that we would review these data for future analysis and consideration of potential penalties for noncompliant hospitals.

Comment: One commenter stated that all hospitals should be obligated to submit the occupational mix survey because failure to complete the survey jeopardizes the accuracy of the wage index. The commenter added that a penalty should be instituted for nonsubmitters. The commenter also stated that pending CMS' analysis of the Commuting Based Wage Index and the Institute of Medicine's study on geographic variation in hospital wage costs, CMS should eliminate the occupational mix survey and the significant reporting burden it creates.

Response: We appreciate the commenter's concern for the accuracy of the wage index, and we have continually exhorted all hospitals to complete and submit the occupational mix surveys. We did not propose a particular penalty for hospitals that do not submit the CY 2013 occupational mix survey, but we are continuing to consider for future rulemaking various options for ensuring full compliance. Examples include applying a hospital's occupational mix survey data from a previous survey period to the current wage index of a given fiscal year; including the occupational mix survey as part of the cost report, such that the MAC would reject the cost report if the survey is not completed; or applying a State-specific minimum or reduced occupational mix adjustment. Regarding the commenter's request that CMS should eliminate the survey due to the burden it creates, section 1886(d)(3)(E) of the Act requires us to measure the earnings and paid hours of employment by occupational category. As long as the requirement to apply an occupational mix adjustment to the wage index remains in place in the statute, there may be some amount of administrative burden involved in reporting those data.

After consideration of the public comments we received, for FY 2015, we are finalizing our proposal to calculate the occupational mix adjustment factor using the same methodology that we used for the FY 2012, FY 2013, and FY 2014 wage indexes (76 FR 51582 through 51586, 77 FR 53367 through 53368, and 78 FR 50588 through 50589, respectively). As a result of applying this methodology, the FY 2015 occupational mix adjusted national average hourly wage (based on the new OMB labor market area delineations) is $39.2591. The FY 2015 occupational mix adjusted Puerto Rico-specific average hourly wage (based on the new OMB labor market area delineations) is $17.0410. For the FY 2015 wage index, we used the Worksheet S-3, Parts II and III wage data of 3,416 hospitals and the occupational mix surveys of 3,183 hospitals for which we also have Worksheet S-3 wage data, which represents a “response” rate of 93.2 percent (3,183/3,416).

G. Analysis and Implementation of the Occupational Mix Adjustment and the FY 2015 Occupational Mix Adjusted Wage Index

1. Analysis of the Occupational Mix Adjustment and the Occupational Mix Adjusted Wage Index

As discussed in section III.F. of the preamble of this final rule, for FY 2015, we apply the occupational mix adjustment to 100 percent of the FY 2015 wage index. We calculated the occupational mix adjustment using data from the 2010 occupational mix survey and the methodology described in the FY 2012 IPPS/LTCH PPS final rule (76 FR 51582 through 51586).

Using the occupational mix survey data and applying the occupational mix adjustment to 100 percent of the FY 2015 wage index results in a national average hourly wage (based on the new OMB labor market area delineations) of $39.2591 and a Puerto Rico-specific average hourly wage of $17.0410. After excluding the data of hospitals that either submitted aberrant data that failed critical edits or do not have FY 2011 Worksheet S-3, Parts II and III, cost report data for use in calculating the FY 2015 wage index, we calculated the FY 2015 wage index using the occupational mix survey data from 3,183 hospitals. For the FY 2015 wage index, we used the Worksheet S-3, Parts II and III wage data of 3,416 hospitals and the occupational mix survey data of 3,183 hospitals for which we also have Worksheet S-3 wage data, which represents a “response” rate of 93.2 percent (3,183/3,416). The FY 2015 national average hourly wages for each occupational mix nursing subcategory as calculated in Step 2 of the occupational mix calculation are as follows:

Occupational mix nursing subcategory            Average hourly wage
National RN                                     37.420970136
National LPN and Surgical Technician            21.78229118
National Nurse Aide, Orderly, and Attendant     15.31107725
National Medical Assistant                      17.251053917
National Nurse Category                         31.769556957

The national average hourly wage for the entire nurse category as computed in Step 5 of the occupational mix calculation is $31.769556957. Hospitals with a nurse category average hourly wage (as calculated in Step 4) of greater than the national nurse category average hourly wage receive an occupational mix adjustment factor (as calculated in Step 6) of less than 1.0. Hospitals with a nurse category average hourly wage (as calculated in Step 4) of less than the national nurse category average hourly wage receive an occupational mix adjustment factor (as calculated in Step 6) of greater than 1.0.
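
For illustration only, the relationship described above can be sketched in Python as follows; the sketch assumes, as the text implies, that the Step 6 factor is the ratio of the national nurse category average hourly wage to the hospital's own nurse category average hourly wage, and the hospital values shown are hypothetical.

NATIONAL_NURSE_CATEGORY_AHW = 31.769556957  # Step 5 national nurse category average hourly wage

def occupational_mix_adjustment_factor(hospital_nurse_category_ahw):
    # A hospital paying above the national nurse category average receives a factor
    # below 1.0; a hospital paying below it receives a factor above 1.0.
    return NATIONAL_NURSE_CATEGORY_AHW / hospital_nurse_category_ahw

print(round(occupational_mix_adjustment_factor(35.00), 4))  # hypothetical higher-wage hospital: less than 1.0
print(round(occupational_mix_adjustment_factor(28.00), 4))  # hypothetical lower-wage hospital: greater than 1.0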

Based on the 2010 occupational mix survey data, we determined (in Step 7 of the occupational mix calculation) that the national percentage of hospital employees in the nurse category is 43.46 percent, and the national percentage of hospital employees in the all other occupations category is 56.54 percent. At the CBSA level, using the new OMB delineations for FY 2015, the percentage of hospital employees in the nurse category ranged from a low of 21.88 percent in one CBSA to a high of 62.04 percent in another CBSA.

We compared the FY 2015 occupational mix adjusted wage indexes for each CBSA to the unadjusted wage indexes for each CBSA. We used the FY 2015 new OMB delineations for this analysis. As a result of applying the occupational mix adjustment to the wage data, the wage index values for 219 (53.8 percent) urban areas and 29 (61.7 percent) rural areas increased. One hundred and nineteen (29.2 percent) urban areas will increase by 1 percent but less than 5 percent, and 4 (1.0 percent) urban areas will increase by 5 percent or more. Fourteen (29.8 percent) rural areas will increase by 1 percent but less than 5 percent, and no rural areas will increase by 5 percent or more. However, the wage index values for 186 (45.7 percent) urban areas and 18 (38.3 percent) rural areas decreased. Seventy-nine (19.4 percent) urban areas will decrease by 1 percent but less than 5 percent, and 1 (0.2 percent) urban area will decrease by 5 percent or more. Seven (14.9 percent) rural areas will decrease by 1 percent but less than 5 percent, and no rural areas will decrease by 5 percent or more. The largest positive impacts will be 6.58 percent for an urban area and 3.36 percent for a rural area. The largest negative impacts will be 5.32 percent for an urban area and 1.73 percent for a rural area. Two urban areas' wage indexes, but no rural area wage indexes, will remain unchanged by application of the occupational mix adjustment. These results indicate that a larger percentage of rural areas (61.7 percent) will benefit from the occupational mix adjustment than will urban areas (53.8 percent). However, more than one-third (38.3 percent) of rural CBSAs will still experience a decrease in their wage indexes as a result of the occupational mix adjustment.
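
For illustration only, the comparison summarized above amounts to bucketing the percentage change between each area's occupational mix adjusted and unadjusted wage index values; a minimal Python sketch of that tabulation, using hypothetical values, follows.

def tabulate_impacts(unadjusted, adjusted):
    # Both arguments map a CBSA code to its wage index value (hypothetical data).
    counts = {"increase 5% or more": 0, "increase 1% to <5%": 0, "increase <1%": 0,
              "unchanged": 0,
              "decrease <1%": 0, "decrease 1% to <5%": 0, "decrease 5% or more": 0}
    for cbsa, before in unadjusted.items():
        change = (adjusted[cbsa] - before) / before * 100
        if change == 0:
            counts["unchanged"] += 1
        elif change >= 5:
            counts["increase 5% or more"] += 1
        elif change >= 1:
            counts["increase 1% to <5%"] += 1
        elif change > 0:
            counts["increase <1%"] += 1
        elif change <= -5:
            counts["decrease 5% or more"] += 1
        elif change <= -1:
            counts["decrease 1% to <5%"] += 1
        else:
            counts["decrease <1%"] += 1
    return counts

# Hypothetical example with three CBSAs
print(tabulate_impacts({"10420": 0.9000, "16980": 1.1000, "35620": 1.3000},
                       {"10420": 0.9200, "16980": 1.0800, "35620": 1.3000}))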

2. Application of the Rural, Imputed, and Frontier Floors

a. Rural Floor

Section 4410(a) of Public Law 105-33 provides that, for discharges on or after October 1, 1997, the area wage index applicable to any hospital that is located in an urban area of a State may not be less than the area wage index applicable to hospitals located in rural areas in that State. This provision is referred to as the “rural floor.” Section 3141 of Public Law 111-148 also requires that a national budget neutrality adjustment be applied in implementing the rural floor. In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28068), based on the proposed FY 2015 wage index associated with the proposed rule and based on the proposed implementation of the new OMB delineations discussed in section III.B. of the preamble of the proposed rule, we estimated that 441 hospitals would receive an increase in their FY 2015 proposed wage index due to the application of the rural floor.

Based on the final FY 2015 wage index associated with this final rule and available on the CMS Web site and based on the implementation of the new OMB delineations, 422 hospitals are receiving an increase in their FY 2015 wage index due to application of the rural floor.

We received some public comments concerning the application of the rural floor. We respond to these public comments in Appendix A of this final rule.

b. Imputed Floor for FY 2015

In the FY 2005 IPPS final rule (69 FR 49109 through 49111), we adopted the “imputed floor” policy as a temporary 3-year regulatory measure to address concerns from hospitals in all-urban States that have argued that they are disadvantaged by the absence of rural hospitals to set a wage index floor for those States. Since its initial implementation, we have extended the imputed floor policy four times, the last of which was adopted in the FY 2014 IPPS/LTCH PPS final rule and is set to expire on September 30, 2014. (We refer readers to further discussion of the imputed floor in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50589 through 50590) and to our regulations at 42 CFR 412.64(h)(4).) Currently, there are two all-urban States, New Jersey and Rhode Island, that have a range of wage indexes assigned to hospitals in these States, including through reclassification or redesignation (we refer readers to discussions of geographic reclassifications and redesignations in section III.H. of the preamble of the proposed rule and this final rule). However, as we explain below, the method as of FY 2012 for computing the imputed floor (the original methodology) benefitted only New Jersey, and not Rhode Island.

In computing the imputed floor for an all-urban State under the original methodology, we calculated the ratio of the lowest-to-highest CBSA wage index for each all-urban State as well as the average of the ratios of lowest-to-highest CBSA wage indexes of those all-urban States. We then compared the State's own ratio to the average ratio for all-urban States, and whichever was higher was multiplied by the highest CBSA wage index value in the State; the product established the imputed floor for the State. Under the current OMB labor market area delineations that we used for the FY 2014 wage index, Rhode Island has only one CBSA (Providence-New Bedford-Fall River, RI-MA) and New Jersey has 10 CBSAs. Therefore, under the original methodology, Rhode Island's own ratio equaled 1.0, and its imputed floor was equal to its original CBSA wage index value. However, because the average ratio of New Jersey and Rhode Island was higher than New Jersey's own ratio, this methodology provided a benefit for New Jersey, but not for Rhode Island.
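
For illustration only, the original imputed floor computation described above can be sketched in Python as follows; the wage index values are hypothetical.

def original_imputed_floor(state, all_urban_state_wage_indexes):
    # all_urban_state_wage_indexes: dict mapping each all-urban State to the list of
    # CBSA wage index values assigned to hospitals in that State (hypothetical data).
    ratios = {s: min(values) / max(values) for s, values in all_urban_state_wage_indexes.items()}
    average_ratio = sum(ratios.values()) / len(ratios)
    own_values = all_urban_state_wage_indexes[state]
    return max(ratios[state], average_ratio) * max(own_values)

# Hypothetical all-urban States: one with several CBSAs and one with a single CBSA.
all_urban = {"State A": [0.95, 1.05, 1.20], "State B": [1.08]}
print(round(original_imputed_floor("State A", all_urban), 4))  # benefits from the average ratio
print(round(original_imputed_floor("State B", all_urban), 4))  # single-CBSA State: floor equals its own wage index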

In the FY 2013 IPPS/LTCH PPS final rule (77 FR 53368 through 53369), we retained the imputed floor calculated under the original methodology as discussed above, and established an alternative methodology for computing the imputed floor wage index to address the concern that the original imputed floor methodology guaranteed a benefit for one all-urban State with multiple wage indexes (New Jersey) but could not benefit the other all-urban State (Rhode Island). The alternative methodology for calculating the imputed floor was established using data from the application of the rural floor policy for FY 2013. Under the alternative methodology, we first determined the average percentage difference between the post-reclassified, pre-floor area wage index and the post-reclassified, rural floor wage index (without rural floor budget neutrality applied) for all CBSAs receiving the rural floor. (Table 4D associated with the FY 2013 IPPS/LTCH PPS final rule (which is available on the CMS Web site) included the CBSAs receiving a State's rural floor wage index.) The lowest post-reclassified wage index assigned to a hospital in an all-urban State having a range of such values then is increased by this factor, the result of which establishes the State's alternative imputed floor. We amended § 412.64(h)(4) of the regulations to add new paragraphs to incorporate the finalized alternative methodology, and to make reference and date changes.
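
For illustration only, the alternative methodology described above can be sketched in Python as follows; the wage index values are hypothetical, and budget neutrality adjustments are not shown.

def alternative_imputed_floor(rural_floor_cbsas, lowest_state_wage_index):
    # rural_floor_cbsas: list of (pre_floor_wage_index, rural_floor_wage_index) pairs,
    # one per CBSA that receives the rural floor (post-reclassified, without rural
    # floor budget neutrality applied); values here are hypothetical.
    percentage_differences = [(floor - pre) / pre for pre, floor in rural_floor_cbsas]
    average_increase = sum(percentage_differences) / len(percentage_differences)
    # The lowest post-reclassified wage index in the all-urban State is increased by
    # the average percentage difference to establish the alternative imputed floor.
    return lowest_state_wage_index * (1 + average_increase)

# Hypothetical inputs
print(round(alternative_imputed_floor([(0.9000, 0.9300), (0.8500, 0.8800), (1.0000, 1.0200)], 1.0500), 4))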

In summary, for the FY 2013 wage index, we did not make any changes to the original imputed floor methodology at § 412.64(h)(4) and, therefore, made no changes to the New Jersey imputed floor computation for FY 2013. Instead, for FY 2013, we adopted a second, alternative methodology for use in cases where an all-urban State has a range of wage indexes assigned to its hospitals, but the State cannot benefit from the methodology in existing § 412.64(h)(4).

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50589 through 50590), we extended the imputed floor policy (both the original methodology and the alternative methodology) for 1 additional year, through September 30, 2014, while we continued to explore potential wage index reforms.

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28068 through 28069), for FY 2015, we proposed to continue the extension of the imputed floor policy (both the original methodology and alternative methodology) for another year, through September 30, 2015, as we continue to explore potential wage index reforms. As discussed in section III.B. of the preamble of the proposed rule, we proposed to adopt the new OMB labor market area delineations beginning in FY 2015. Under OMB's new labor market area delineations based on Census 2010 data, Delaware would become an all-urban State, along with New Jersey and Rhode Island. Under the new OMB delineations, Delaware would have three CBSAs, New Jersey would have seven CBSAs, and Rhode Island would continue to have only one CBSA (Providence-Warwick, RI-MA). We referred readers to a detailed discussion of our proposal to adopt the new OMB labor market area delineations in section III.B. of the preamble of the proposed rule. We proposed to revise the regulations at § 412.64(h)(4) and (h)(4)(vi) to reflect the proposed 1-year extension of the imputed floor. We invited public comments on our proposal regarding the 1-year extension of the imputed floor.

Comment: Several commenters supported the CMS proposal to extend the imputed floor for 1 year, stating that it establishes an approach to remedy the competitive disadvantage suffered by all-urban States in the absence of an imputed wage index floor; and that the imputed wage index floor policy creates a climate of symmetry, equity and consistency in the Medicare reimbursement process. One commenter suggested that the industry have an opportunity to provide input to CMS prior to finalizing any decisions regarding the imputed floor policy. The commenter also suggested that if CMS decides to finalize a policy that would result in the expiration of the imputed floor, CMS afford hospitals a multiyear phase-out in order to offset their lost revenue.

One commenter stated that CMS should reconsider the extension of the imputed floor policy, and questioned what statutory authority CMS has to extend the imputed floor policy and declare new States eligible. Another commenter objected to the proposal and stated that it does not support the policy behind the imputed floor. The commenter recommended that CMS not finalize the proposal to extend the imputed floor, and stated it agreed with the rationale that CMS previously provided in the FY 2012 IPPS/LTCH PPS proposed rule (76 FR 25878 through 25879) for not proposing to extend the imputed floor policy, and urged CMS to let the policy expire.

Response: We appreciate the commenters' support for our proposal to extend the imputed floor for 1 year and are finalizing this proposal. In response to the commenters who objected to the proposed policy and made other recommendations, we will give further consideration to those comments as we continue to explore potential wage index reforms. As we have done every year since the proposal of the imputed floor, we provide and will continue to provide the industry with the opportunity to provide input on our proposals prior to finalizing any decisions regarding the imputed floor policy. We will take the commenters' recommendation to afford hospitals a multiyear phase-out into consideration should we propose not to extend the imputed floor policy in future years.

In response to the commenter who questioned what statutory authority CMS has to extend the imputed floor policy and declare new States eligible, as we stated in the FY 2005 IPPS final rule (69 FR 49110), we note that the Secretary has broad authority under section 1886(d)(3)(E) of the Act to “adjust the proportion (as estimated by the Secretary from time to time) of hospitals' costs which are attributable to wages and wage-related costs of the DRG prospective payment rates . . . for area differences in hospital wage levels by a factor (established by the Secretary) . . .” Therefore, we believe that we do have the discretion to adopt a policy that would adjust area wage indexes in the stated manner. We adopted the imputed floor policy and subsequently extended it through notice-and-comment rulemaking to address concerns from hospitals in all-urban states. Under the new OMB delineations discussed in section III.B. of the preamble of this final rule, Delaware becomes an all-urban State and, therefore, is subject to an imputed floor as well.

After consideration of the public comments we received, we are finalizing our proposal without modification to extend the imputed floor policy under both the original methodology and the alternative methodology for an additional year, through September 30, 2015, while we continue to explore potential wage index reform. We also are adopting as final the proposed revisions to § 412.64(h)(4) and (h)(4)(vi) to reflect the 1-year extension of the imputed floor.

The wage index and impact tables associated with this FY 2015 IPPS/LTCH PPS final rule that are available on the CMS Web site reflect the continued application of the imputed floor policy at § 412.64(h)(4) and a national budget neutrality adjustment for the imputed floor for FY 2015. Fifteen providers in New Jersey, and no providers in Delaware, will receive an increase in their FY 2015 wage index due to the continued application of the imputed floor policy under the original methodology. The wage index and impact tables for this FY 2015 final rule also reflect the application of the alternative methodology for computing the imputed floor, which will benefit four hospitals in Rhode Island.

c. State Frontier Floor

Section 10324 of Public Law 111-148 provides that hospitals in frontier States cannot be assigned a wage index of less than 1.0000 (we refer readers to regulations at 42 CFR 412.64(m) and to a discussion of the implementation of this provision in the FY 2011 IPPS/LTCH PPS final rule (75 FR 50160 through 50161)). Based on the implementation of the new OMB delineations discussed in section III.B. of the preamble of this final rule, 46 hospitals will receive the frontier floor value of 1.0000 for their FY 2015 wage index in this final rule. These hospitals are located in Montana, North Dakota, South Dakota, and Wyoming. Although Nevada also is defined as a frontier State, its FY 2015 rural floor value of 1.1373 is greater than 1.0000, and therefore, no Nevada hospitals will receive a frontier floor value for their FY 2015 wage index.
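
For illustration only, the floor provisions discussed in this section operate as lower bounds on an area's pre-floor wage index; the following Python sketch shows that relationship with hypothetical values and omits the associated budget neutrality adjustments.

def apply_floors(pre_floor_wage_index, rural_floor=None, imputed_floor=None, frontier_state=False):
    # Collect whichever floors apply to the hospital's area (hypothetical inputs).
    floors = [f for f in (rural_floor, imputed_floor) if f is not None]
    if frontier_state:
        floors.append(1.0000)  # frontier State floor
    return max([pre_floor_wage_index] + floors)

print(apply_floors(0.8500, frontier_state=True))                       # raised to the frontier floor of 1.0000
print(apply_floors(0.9200, rural_floor=0.9500))                        # raised to the State rural floor
print(apply_floors(1.2000, rural_floor=1.1373, frontier_state=True))   # unchanged; already above all floors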

We did not propose any changes to the frontier floor policy for FY 2015, and we did not receive any public comments on the issue.

The areas affected by the rural, imputed, and frontier floor policies for the FY 2015 wage index are identified in Table 4D associated with this final rule, which is available via the Internet on the CMS Web site.

3. FY 2015 Wage Index Tables

The wage index values for FY 2015 (except those for hospitals receiving wage index adjustments under section 1886(d)(13) of the Act), included in Tables 4A, 4B, 4C, and 4F, available on the CMS Web site, include the occupational mix adjustment, geographic reclassification or redesignation as discussed in section III.H. of the preamble of this final rule, and the application of the rural, imputed, and frontier State floors as discussed in section III.G.2. of the preamble of this final rule. We note that because we are adopting the new OMB labor market area delineations for FY 2015, these tables have additional tabulations to account for wage index calculations computed under the previous and the new OMB delineations.

Tables 3A and 3B, available on the CMS Web site, list the 3-year average hourly wage for each labor market area before the redesignation or reclassification of hospitals based on FYs 2009, 2010, and 2011 cost reporting periods. Table 3A lists these data for urban areas, and Table 3B lists these data for rural areas. In addition, Table 2, which is available on the CMS Web site, includes the adjusted average hourly wage for each hospital from the FY 2009 and FY 2010 cost reporting periods, as well as the FY 2011 period used to calculate the FY 2015 wage index. The 3-year averages are calculated by dividing the sum of the dollars (adjusted to a common reporting period using the method described in Step 5 in section III.G. of the preamble of this final rule) across all 3 years, by the sum of the hours. If a hospital is missing data for any of the previous years, its average hourly wage for the 3-year period is calculated based on the data available during that period. The average hourly wages in Tables 2, 3A, and 3B, which are available on the CMS Web site, include the occupational mix adjustment. The wage index values in Tables 4A, 4B, 4C, and 4D also include the national rural floor budget neutrality adjustment (which includes the imputed floor). The wage index values in Table 2 also include the out-migration adjustment for eligible hospitals. As stated above, because we are adopting the new OMB labor market area delineations for FY 2015, these tables have additional tabulations to account for wage index calculations computed under the current labor market area definitions and the new OMB labor market area delineations. In addition, for certain applicable hospitals, the wage index values included in Table 2 are computed to reflect the transitional wage index or the 50/50 blended wage index discussed in detail in section III.B.2.e. of the preamble of this final rule.
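
For illustration only, the 3-year average hourly wage calculation described above can be sketched in Python as follows; the per-year totals are hypothetical, and years with missing data are simply omitted from both sums.

def three_year_average_hourly_wage(yearly_data):
    # yearly_data: list of (adjusted salaries plus wage-related costs, hours) pairs for
    # up to 3 cost reporting periods; None represents a year with no data (hypothetical values).
    available = [entry for entry in yearly_data if entry is not None]
    total_dollars = sum(dollars for dollars, hours in available)
    total_hours = sum(hours for dollars, hours in available)
    return total_dollars / total_hours

# Hypothetical hospital with no data for the middle year
print(round(three_year_average_hourly_wage([(40_000_000, 1_000_000), None, (43_000_000, 1_020_000)]), 4))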

H. Revisions to the Wage Index Based on Hospital Redesignations and Reclassifications

1. General Policies and Effects of Reclassification and Redesignation

Under section 1886(d)(10) of the Act, the MGCRB considers applications by hospitals for geographic reclassification for purposes of payment under the IPPS. Hospitals must apply to the MGCRB to reclassify not later than 13 months prior to the start of the fiscal year for which reclassification is sought (generally by September 1). Generally, hospitals must be proximate to the labor market area to which they are seeking reclassification and must demonstrate characteristics similar to hospitals located in that area. The MGCRB issues its decisions by the end of February for reclassifications that become effective for the following fiscal year (beginning October 1). The regulations applicable to reclassifications by the MGCRB are located in 42 CFR 412.230 through 412.280. (We refer readers to a discussion in the FY 2002 IPPS final rule (66 FR 39874 and 39875) regarding how the MGCRB defines mileage for purposes of the proximity requirements.) The general policies for reclassifications and redesignations that we proposed for FY 2015, and the policies for the effects of hospitals' reclassifications and redesignations on the wage index, are the same as those discussed in the FY 2012 IPPS/LTCH PPS final rule for the FY 2012 final wage index (76 FR 51595 and 51596). Also, in the FY 2012 IPPS/LTCH PPS final rule, we discussed the effects on the wage index of urban hospitals reclassifying to rural areas under 42 CFR 412.103. Hospitals that are geographically located in States without any rural areas are ineligible to apply for rural reclassification in accordance with the provisions of 42 CFR 412.103. While our general policies on geographic reclassification, redesignations under section 1886(d)(8)(B) of the Act, and urban hospitals reclassifying to rural under 42 CFR 412.103 will remain unchanged for FY 2015, we note that, due to our adoption of the new OMB labor market area delineations for FY 2015, there are numerous unique classification considerations for FY 2015 that are discussed in more detail in section III.H. of the preamble of this final rule. For a discussion of the new CBSA changes based on the new OMB labor market area delineations and our implementation of those changes, we refer readers to sections III.B. and VI.C. of the preamble of this final rule.

Comment: One commenter stated that because the new OMB labor market area delineations will be effective October 1, 2014, for FY 2015, hospitals should have been given an opportunity to apply for reclassification to these new labor market areas a year ago. The commenter suggested that CMS provide a one-time expedited MGCRB application and approval process to be effective October 1, 2014.

Similarly, another commenter stated that a hospital would not have had an adequate opportunity to assess reclassification options for FY 2015 because CMS did not publish 3-year average hourly wage data based on the new OMB labor market area delineations with the FY 2014 IPPS/LTCH PPS final rule. The commenter therefore suggested that either the effective date of the implementation of the new OMB labor market areas delineations be postponed until FY 2016, or a new period be opened to allow hospitals to reclassify for FY 2015.

Response: We do not agree with these comments. We did not propose to adopt the new OMB labor market area delineations in the FY 2014 IPPS/LTCH PPS proposed rule and, therefore, did not finalize the new OMB delineations in the FY 2014 IPPS/LTCH PPS final rule. Instead, in the FY 2014 IPPS/LTCH PPS proposed and final rules (78 FR 27552 through 27553; 78 FR 50586), we notified hospitals of our intention to propose changes to the wage index based on the new OMB delineations in FY 2015 rulemaking. Therefore, hospitals could not apply for reclassification on the basis of the new OMB labor market area delineations a year ago because they had not yet been implemented. Because we had not implemented the new OMB delineations, we were unable to release data, including average hourly wage data, based on these new delineations last year.

Section 1886(d)(10)(C) of the Act mandates that hospitals apply to the MGCRB to reclassify not later than 13 months prior to the start of the fiscal year for which reclassification is sought (generally by September 1), and that the MGCRB issue its decision within 180 days after the first day of the 13-month period preceding the fiscal year for which a hospital has filed its application. Therefore, we believe we have balanced our obligation to implement the reclassification decisions of the MGCRB with our responsibility to implement the most accurate labor market areas through the new OMB delineations in as uniform a manner as possible.

However, we recognized that the new OMB delineations could affect reclassification decisions. Therefore, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28073), we stated that hospitals that wished to be reassigned to an alternate CBSA (other than the CBSA to which their reclassification would be reassigned in the proposed rule) for which they meet the applicable proximity criteria could request reassignment within 45 days from the publication of the proposed rule. We also recognized that, for whatever reason, a hospital could still find itself assigned to a labor market area with a FY 2015 wage index lower than the wage index the hospital would have received under the FY 2014 CBSA delineations. For that reason, we proposed a 50/50 blended wage index adjustment in FY 2015 for all hospitals that would experience a decrease in their FY 2015 wage index value due to the implementation of the new OMB delineations, and we are finalizing this transition adjustment in this rule. This transitional adjustment will mitigate negative payment impacts for FY 2015, while providing hospitals additional time to fully assess any additional reclassification options available to them under the new OMB delineations for FY 2016. Therefore, we do not believe it is necessary to implement a one-time expedited MGCRB application and approval process, postpone the effective date of the implementation of the new OMB delineations until FY 2016, or open a new period to allow hospitals to reclassify for FY 2015.

Comment: A few commenters stated that, in cases where a countywide (group) reclassification had been previously approved by the MGCRB, a new hospital is not able to obtain the same reclassified wage index until the first year that individual hospital's wage index data match one of the 3 years' data used by the MGCRB and a new 3-year countywide reclassification is requested by the county's hospitals (which can be a 4-year delay). The commenters stated that the hospital will have a wage index lower than the hospitals with which it competes for skilled labor. The commenters suggested that CMS change its policy to allow for a timelier competitive wage index for new hospitals.

Two commenters suggested that the proximity rule for countywide reclassifications for hospitals in an urban county be modified to permit adjacent county reclassifications, regardless of whether they are in the same CSA or CBSA, or at a minimum, create an exception that would allow this in the event that half of the hospitals in the county are seeking to reclassify.

Another commenter suggested that a county be permitted to apply for designation as a “core county” if its 3-year average hourly wage is at least 108 percent of the 3-year average hourly wage of its CBSA, excluding the core county. The commenter also suggested that other counties within the same CBSA that are either adjacent to or within the same city as the core county, and whose 3-year average hourly wage is at least 85 percent of the core county's average hourly wage, be permitted to join the core county to form a “core area” if the resulting wage index is beneficial to all hospitals in the core area.

Response: We thank the commenters for their comments. We already have established criteria and processes for MGCRB reclassification, which are specified in 42 CFR 412.230 et seq., and we did not propose any changes to these provisions for FY 2015. Consequently, we are not making any changes to address the commenters' concerns at this time. We refer the commenters to these regulations for complete details on wage index reclassifications.

2. FY 2015 MGCRB Reclassifications

a. FY 2015 Reclassification Requirements and Approvals

Under section 1886(d)(10) of the Act, the MGCRB considers applications by hospitals for geographic reclassification for purposes of payment under the IPPS. The specific procedures and rules that apply to the geographic reclassification process are outlined in regulations under 42 CFR 412.230 through 412.280.

At the time this final rule was constructed, the MGCRB had completed its review of FY 2015 reclassification requests. Based on such reviews, there were 309 hospitals approved for wage index reclassifications by the MGCRB starting in FY 2015 that did not withdraw or terminate their reclassifications within 45 days of the publication of the proposed rule. Because MGCRB wage index reclassifications are effective for 3 years, for FY 2015, hospitals reclassified beginning during FY 2013 or FY 2014 are eligible to continue to be reclassified to a particular labor market area based on such prior reclassifications for the remainder of their 3-year period. There were 155 hospitals approved for wage index reclassifications in FY 2013 that continue for FY 2015, and 270 hospitals approved for wage index reclassifications in FY 2014 that continue for FY 2015. Of all the hospitals approved for reclassification for FY 2013, FY 2014, and FY 2015, based upon the review at the time of this final rule, 734 hospitals are in a reclassification status for FY 2015.

Under the regulations at 42 CFR 412.273, hospitals that have been reclassified by the MGCRB are permitted to withdraw their applications within 45 days of the publication of a proposed rule. For information about withdrawing, terminating, or canceling a previous withdrawal or termination of a 3-year reclassification for wage index purposes, we refer readers to 42 CFR 412.273, as well as the FY 2002 IPPS final rule (66 FR 39887 through 39888) and the FY 2003 IPPS final rule (67 FR 50065 through 50066). Additional discussion on withdrawals and terminations, and clarifications regarding reinstating reclassifications and “fallback” reclassifications, were included in the FY 2008 IPPS final rule (72 FR 47333).

Changes to the wage index that result from withdrawals of requests for reclassification, terminations, wage index corrections, appeals, and the Administrator's review process for FY 2015 are incorporated into the wage index values published in this FY 2015 IPPS/LTCH PPS final rule. These changes affect not only the wage index value for specific geographic areas, but also the wage index value redesignated/reclassified hospitals receive; that is, whether they receive the wage index that includes the data for both the hospitals already in the area and the redesignated/reclassified hospitals. Further, the wage index value for the area from which the hospitals are redesignated/reclassified may be affected.

Comment: One commenter stated that CMS' policy that hospitals must request to withdraw or terminate MGCRB reclassifications within 45 days of the proposed rule is problematic because a hospital could terminate a reclassification based on information in the proposed rule, and with the publication of the final rule, discover that its original reclassified status was more desirable. The commenter stated that hospitals cannot make informed decisions concerning their reclassification status based on values in a proposed rule that are likely to change and, therefore, recommended that CMS revise its existing policy to permit hospitals to withdraw or terminate their reclassification status within 45 days of the publication of the final rule. Similarly, another commenter stated that the requirement for withdrawal of an existing reclassification is unnecessary and unfair because it requires that a hospital give up the certain benefit of the existing reclassification for the uncertain benefit of a proposal. The commenter stated that it is possible that CMS could modify the reclassification rules, and suggested that hospitals be allowed 30 days after the publication of the final rule to withdraw their reclassification requests or to reverse a withdrawal that was made based on the proposed rule in situations where data corrections could result in the hospital no longer benefiting by the alternative they selected.

Response: We did not make any proposals to change any of the reclassification processes or criteria for FY 2015. Any changes to the reclassification processes or criteria would first need to be proposed in a separate rulemaking. Consequently, we are not making any changes to address the commenters' concerns at this time. We maintain that information provided in the proposed rule constitutes the best available data to assist hospitals in making reclassification decisions. The values published in the final rule represent the final wage index values reflective of reclassification decisions.

b. Effects of Implementation of New OMB Labor Market Area Delineations on Reclassified Hospitals

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28070 through 28074), we indicated that because hospitals that have been reclassified beginning in FY 2013, 2014, or 2015 were reclassified based on the current labor market delineations, if we adopted the new OMB labor market area delineations beginning in FY 2015, the areas to which they have been reclassified, or the areas where they are located, may change. Under the new OMB delineations, we stated that many existing CBSAs would be reconfigured. We encouraged hospitals with current reclassifications to verify area wage indexes on Tables 4A-2 and 4B-2 associated with the proposed rule (which are available via the Internet on the CMS Web site), and confirm that the areas to which they have been reclassified for FY 2015 would continue to provide a higher wage index than their geographic area wage index. We stated that hospitals may withdraw their FY 2015 reclassifications by contacting the MGCRB within 45 days from the publication of the proposed rule.

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28070), we stated that, in some cases, adopting the new OMB delineations would result in counties splitting apart from CBSAs to form new CBSAs, or counties shifting from one CBSA designation to another CBSA. Reclassifications granted under section 1886(d)(10) of the Act are effective for 3 fiscal years so that a hospital or county group of hospitals would be assigned a wage index based upon the wage data of hospitals in a nearby labor market area for a 3-year period. If CBSAs are split apart, or if counties shift from one CBSA to another under the new OMB delineations, it raises the question of how to continue a hospital's reclassification for the remainder of its 3-year reclassification period, if that area to which the hospital reclassified no longer exists, in whole or in part. We dealt with this question in FY 2005 as well when CMS adopted the current OMB labor market area definitions. In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28071), we indicated that, consistent with the policy CMS implemented in the FY 2005 IPPS final rule (69 FR 49054 through 49056), if a CBSA would be reconfigured due to the new OMB delineations and it would not be possible for the reclassification to continue seamlessly to the reconfigured CBSA, we believe it is appropriate for us to determine the best alternative location to reassign current reclassifications for the remaining 3 years. Therefore, to maintain the integrity of a hospital's 3-year reclassification period, we proposed a policy to assure that current geographic reclassifications (applications approved for FY 2013, FY 2014, or FY 2015) that would be affected by CBSAs that are split apart or counties that shift to another CBSA under the new OMB delineations, would ultimately be assigned to a CBSA under the new OMB delineations that contains at least one county from the reclassified CBSA under the current FY 2014 OMB definitions, and would be generally consistent with rules that govern geographic reclassification. That is, consistent with the policy finalized in FY 2005 (69 FR 49054 and 49055), we proposed a general policy that affected reclassified hospitals would be assigned to a CBSA that (1) would contain the most proximate county that is located outside of the hospital's proposed FY 2015 geographic labor market area, and (2) is part of the original FY 2014 CBSA to which the hospital is reclassified. We stated our belief that assigning reclassifications to the CBSA that contains the nearest eligible county (as described above) satisfies the statutory requirement at section 1886(d)(10)(v) of the Act by maintaining reclassification status for a period of 3 fiscal years, while generally respecting the longstanding principle of geographic proximity in the labor market reclassification process. The hospitals that we proposed to reassign to a different CBSA based on our proposed policy above were listed in a special Table 9A-2 for the proposed rule, which is available via the Internet on the CMS Web site. In addition, we proposed to allow a hospital, or county group of hospitals, to request reassignment to another CBSA that would contain a county that is part of the current FY 2014 CBSA to which they are reclassified, if the hospital or county group of hospitals can demonstrate compliance with applicable reclassification proximity rules, as described later in this section.

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28071), we recognized that the proposed reassignment described above for hospitals reclassified to CBSAs that would split apart, or to CBSAs containing counties that would shift to another CBSA, under the new OMB delineations may result in the reassignment of the hospital for the remainder of its 3-year reclassification period to a CBSA having a lower wage index than the wage index that would have been assigned for the reclassified hospital in the absence of the proposed adoption of the new OMB delineations. Therefore, as discussed in section III.B.2.e.(4) of the preamble of the proposed rule, we proposed (and are finalizing in this final rule) that all hospitals that would experience a decrease in their FY 2015 wage index value due to the proposed implementation of the new OMB delineations would receive a 50/50 blended wage index adjustment in FY 2015. For FY 2015, we proposed to calculate a wage index value based on the current FY 2014 OMB definitions, and a wage index value based upon the proposed new OMB delineations (including reclassification assignments discussed in this section). If the wage index under the proposed new OMB delineations would be lower than the wage index calculated with the current (FY 2014) OMB definitions, we proposed that the hospital would be assigned a blended wage index (50 percent of the current; 50 percent of the proposed). We stated our belief that this proposed transitional adjustment would mitigate negative payment impacts for FY 2015, and would afford hospitals additional time to fully assess any additional reclassification options available to them under the new OMB delineations.
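For illustration only, the 50/50 blend described above reduces to the following simple rule; the wage index values shown are hypothetical and are assumed to have been computed already under each set of delineations.

def fy2015_wage_index_with_blend(index_fy2014_delineations, index_new_delineations):
    # A hospital whose wage index under the new OMB delineations is lower than
    # its wage index under the current FY 2014 definitions receives a 50/50
    # blend for FY 2015; otherwise, it receives the new-delineation value.
    if index_new_delineations < index_fy2014_delineations:
        return 0.5 * index_fy2014_delineations + 0.5 * index_new_delineations
    return index_new_delineations

print(fy2015_wage_index_with_blend(1.0200, 0.9800))  # approximately 1.0000 (blended)
print(fy2015_wage_index_with_blend(0.9800, 1.0200))  # 1.0200 (no blend applied)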

We are including the following descriptions of specific situations where we have determined that reassignment of reclassification areas is appropriate.

(1) Reclassifications to CBSAs That Are Subsumed by Other CBSAs

For the proposed rule (79 FR 28070), we identified 66 counties that are currently located in CBSAs that would be subsumed by another CBSA under the new OMB labor market area delineations. As a result, hospitals reclassifying to those CBSAs would now find that their reclassifications are to a CBSA that no longer exists. For these hospitals, we proposed to reassign reclassifications to the newly configured CBSA to which all of the original constituent counties in the FY 2014 CBSA are transferred. For example, CBSA 11300 (Anderson, IN) would no longer exist under the proposed FY 2015 delineations. The only constituent county in CBSA 11300, Madison County, IN, would be moving to CBSA 26900 (Indianapolis-Carmel-Anderson, IN). Because the original Anderson, IN labor market area no longer exists, we proposed to reassign reclassifications from the original Anderson, IN labor market area to a newly configured CBSA where the original constituent county or counties are transferred, which is Indianapolis-Carmel-Anderson, IN. For hospitals reclassified to a CBSA that would be subsumed by another CBSA, we included a table in the proposed rule that reflected the hospitals' current reclassified CBSA, and the CBSA to which we proposed to assign them for FY 2015 (79 FR 28071).

We did not receive any public comments regarding this proposal to reassign hospitals reclassified to CBSAs that were subsumed by another CBSA. Therefore, we are finalizing this provision as proposed. For any hospital that is reclassified to a CBSA that no longer exists, and all of the CBSA's constituent counties moved to another CBSA under the new OMB delineations, we assigned that hospital's reclassification to the subsuming CBSA to which all of the original constituent counties in the FY 2014 CBSA are transferred.

The following table lists 63 hospitals that are currently reclassified to CBSAs that will be subsumed by another CBSA under the new OMB labor market area delineations and reflects the hospitals' current reclassified CBSA and the CBSA to which we are assigning them for FY 2015. We note that three hospitals have terminated their reclassification since publication of the proposed rule and have been omitted.

Hospital Reclassification Reassignments for Hospitals Reclassified to a CBSA That Is Subsumed by Another CBSA

CMS Certification Number (CCN) | Current reclassified CBSA | New CBSA
050022 | 42044 | 11244
050054 | 42044 | 11244
050102 | 42044 | 11244
050243 | 42044 | 11244
050292 | 42044 | 11244
050329 | 42044 | 11244
050390 | 42044 | 11244
050423 | 42044 | 11244
050534 | 42044 | 11244
050573 | 42044 | 11244
050684 | 42044 | 11244
050686 | 42044 | 11244
050701 | 42044 | 11244
050765 | 42044 | 11244
050770 | 42044 | 11244
140067 | 14060 | 14010
150089 | 11300 | 26900
220001 | 14484 | 14454
220002 | 14484 | 14454
220008 | 14484 | 14454
220011 | 14484 | 14454
220019 | 14484 | 14454
220020 | 14484 | 14454
220049 | 14484 | 14454
220058 | 14484 | 14454
220062 | 14484 | 14454
220063 | 14484 | 14454
220070 | 14484 | 14454
220073 | 14484 | 14454
220074 | 14484 | 14454
220082 | 14484 | 14454
220084 | 14484 | 14454
220090 | 14484 | 14454
220095 | 14484 | 14454
220098 | 14484 | 14454
220101 | 14484 | 14454
220105 | 14484 | 14454
220163 | 14484 | 14454
220171 | 14484 | 14454
220175 | 14484 | 14454
220176 | 14484 | 14454
230002 | 47644 | 47664
230020 | 47644 | 47664
230024 | 47644 | 47664
230053 | 47644 | 47664
230089 | 47644 | 47664
230104 | 47644 | 47664
230142 | 47644 | 47664
230146 | 47644 | 47664
230165 | 47644 | 47664
230176 | 47644 | 47664
230244 | 47644 | 47664
230270 | 47644 | 47664
230273 | 47644 | 47664
230297 | 47644 | 47664
390151 | 13644 | 43524
410001 | 14484 | 14454
410004 | 14484 | 14454
410005 | 14484 | 14454
410007 | 14484 | 14454
410010 | 14484 | 14454
410011 | 14484 | 14454
410012 | 14484 | 14454

(2) Reclassification to CBSAs Where the CBSA Number or Name Changed or to CBSAs Containing Counties That Moved to Another CBSA

For the proposed rule (79 FR 28072), we identified six CBSAs with current reclassifications that would maintain the same constituent counties, but the CBSA number or name would change if we adopted the new OMB delineations. For example, CBSA 29140 (Lafayette, IN) currently contains three counties (Benton, Carroll, and Tippecanoe Counties). The CBSA name and number for these counties would change to CBSA 29200 (Lafayette-West Lafayette, IN) under the new OMB delineations. Because the constituent counties in these CBSAs would not change under the new delineations, we would consider these CBSAs to be unchanged, and we did not propose any reassignment for hospitals reclassified to those labor market areas.

In the proposed rule, we identified eight CBSAs with current reclassifications that have one or more counties that would split off and move to a new CBSA or to a different existing CBSA under the new OMB delineations. These CBSAs are shown in the following table.

Current FY 2014 CBSA | Current FY 2014 CBSA name
16620 | Charleston, WV
16974 | Chicago-Joliet-Naperville, IL
20764 | Edison-New Brunswick, NJ
31140 | Louisville/Jefferson County, KY-IN
35644 | New York-White Plains-Wayne, NY-NJ
37964 | Philadelphia, PA
39100 | Poughkeepsie-Newburgh-Middletown, NY
48900 | Wilmington, NC

In the proposed rule, we determined that 69 hospitals had current reclassifications to one of these CBSAs. Similar to the methodology finalized in the FY 2005 IPPS final rule (69 FR 49054 through 49055), we proposed to follow the general policy discussed in section III.H.2.b. of the preamble of the proposed rule. Specifically, we proposed that affected reclassified hospitals would be assigned to a CBSA (under the new OMB delineations) that would contain the most proximate county that is (1) located outside of the hospital's proposed FY 2015 geographic labor market area; and (2) is included in the current CBSA to which they are reclassified. For each of the 69 hospitals, we conducted a mapping analysis and determined driving distances from their geographic location to the borders of each county that is in the reclassified CBSA under the FY 2014 delineations and is also included in a CBSA under the new OMB delineations, excluding any counties that would be located in the hospital's proposed FY 2015 geographic labor market area. Following the general reassignment principle that we proposed, we proposed to reassign those reclassified hospitals to the CBSA which contains the geographically closest county. For example, there are hospitals that currently are reclassified to CBSA 39100 (Poughkeepsie-Newburgh-Middletown, NY) under the FY 2014 delineations, which is comprised of Dutchess County and Orange County, NY. Under the new OMB delineations, Dutchess County would become part of new CBSA 20524 (Dutchess County-Putnam County, NY), while Orange County would join CBSA 35614 (New York-Jersey City-White Plains, NY-NJ Metropolitan Division). Therefore, we mapped the distances from one reclassified hospital to the border of Dutchess County and Orange County, NY (the two counties that were part of CBSA 39100 under the FY 2014 delineations). Our analysis showed that the hospital is 2.2 miles from Dutchess County, and 25.9 miles from Orange County. Therefore, we proposed to reassign this hospital's reclassification from the FY 2014 CBSA 39100 to the new CBSA 20524.
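As a rough sketch of the reassignment rule described above (illustrative only; the distance data and names are hypothetical stand-ins for the mapping analysis of driving distances to county borders):

def reassign_individual_reclassification(hospital, fy2014_reclass_counties,
                                         county_to_new_cbsa,
                                         geographic_area_counties,
                                         driving_distance_miles):
    # Keep only counties from the FY 2014 reclassified CBSA that lie outside
    # the hospital's own FY 2015 geographic labor market area.
    eligible = [county for county in fy2014_reclass_counties
                if county not in geographic_area_counties]
    # Reassign to the new CBSA containing the eligible county whose border is
    # the shortest driving distance from the hospital.
    closest = min(eligible,
                  key=lambda county: driving_distance_miles[(hospital, county)])
    return county_to_new_cbsa[closest]

# Worked example mirroring the Poughkeepsie-Newburgh-Middletown, NY case above.
distances = {("hospital_A", "Dutchess County, NY"): 2.2,
             ("hospital_A", "Orange County, NY"): 25.9}
new_cbsas = {"Dutchess County, NY": "20524", "Orange County, NY": "35614"}
print(reassign_individual_reclassification(
    "hospital_A", ["Dutchess County, NY", "Orange County, NY"],
    new_cbsas, [], distances))  # 20524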

For the proposed rule, we also identified affected county group reclassifications. For these reclassifications, we proposed to follow the policy discussed above, except that we proposed to reassign hospitals in a county group reclassification to the CBSA under the new OMB delineations to which the majority of hospitals in the group are geographically closest. Because hospitals in a county group applied as a group, we believe the reassignment should also be applied to the whole group. For example, the hospitals of Fairfield County, CT are reclassified as a group to CBSA 35644 under the FY 2014 delineations. Under the new OMB delineations, CBSA 35644 would no longer exist and would be split into the following two new CBSAs: 20524 (Dutchess County-Putnam County, NY) and 35614 (New York-Jersey City-White Plains, NY-NJ). Of the six hospitals in the group reclassification, all but one would be closer to an eligible county (Westchester, NY) in CBSA 35614 than to an eligible county (Putnam, NY) in CBSA 20524. Therefore, we proposed to assign the hospitals in this group reclassification to CBSA 35614, the reconfigured CBSA to which the majority of the hospitals in the group reclassification are geographically closest.
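The county group rule can likewise be sketched as a simple majority vote over the hospitals in the group (again illustrative; hospital names and counts are hypothetical):

from collections import Counter

def reassign_group_reclassification(hospital_to_closest_new_cbsa):
    # hospital_to_closest_new_cbsa maps each hospital in the county group to
    # the new CBSA containing the eligible county to which that hospital is
    # geographically closest (determined as in the individual-hospital sketch
    # above). The whole group is assigned to the CBSA chosen by the majority.
    votes = Counter(hospital_to_closest_new_cbsa.values())
    return votes.most_common(1)[0][0]

# Example mirroring the Fairfield County, CT group: five of the six hospitals
# are closest to an eligible county in CBSA 35614; one is closest to CBSA 20524.
group = {f"hospital_{i}": "35614" for i in range(5)}
group["hospital_5"] = "20524"
print(reassign_group_reclassification(group))  # 35614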

To summarize, of the 69 hospitals identified in the proposed rule as reclassified to 1 of the 8 CBSAs in the preceding table that have counties that would split off and move to a new CBSA or a different existing CBSA under the new OMB delineations, there are 27 hospitals that would maintain the same reclassified CBSA number under our proposals. Another 28 hospitals would be reassigned to a reconfigured CBSA that would contain a similar number of counties from their current reclassified CBSA. For the remaining 14 reclassified hospitals, we proposed to assign them to a CBSA (under the new OMB delineations) that would have a different CBSA number from the labor market area to which they are currently reclassified (under the current FY 2014 delineations). This is because if the original CBSA to which the hospitals are reclassified is losing counties to another urban CBSA, it may be that the original reclassification determination would not be reflective of the new delineations. In addition, because proximity to a CBSA is a requirement of reclassifications approved under section 1886(d)(10) of the Act, we stated our belief that it is appropriate to propose to reassign reclassification status to an urban CBSA that contains the county (from the hospital's current CBSA reclassification) that is closest to the hospital. We stated our belief that this would more accurately reflect the geographic labor market area of the reclassified hospital.

Consistent with refinements implemented in the FY 2005 IPPS final rule (69 FR 49055), we proposed to allow hospitals that reclassified under section 1886(d)(10) of the Act to one of the eight CBSAs that split (that is, current FY 2014 CBSAs 16620, 16974, 20764, 31140, 35644, 37964, 39100, 48900) to be reclassified to any CBSA containing a county from their original reclassification labor market area, provided that the hospital demonstrates that it meets the applicable proximity requirements under 42 CFR 412.230(b) and (c) (for individual hospitals), 42 CFR 412.232(a)(1) (for a rural group), and 42 CFR 412.234(a)(2) and (a)(3) (for an urban group) to that CBSA. We stated that hospitals that wished to be reassigned to an alternate CBSA (other than the CBSA to which their reclassification would be reassigned in this proposed rule) for which they meet the applicable proximity criteria could request reassignment within 45 days from the publication of the proposed rule. Hospitals had to send a request to WageIndex@cms.hhs.gov and provide documentation certifying that they meet the requisite proximity criteria for reassignment to an alternate CBSA, as described above. We stated our belief that this option of allowing hospitals to submit a request to CMS would provide hospitals with greater flexibility with respect to their reclassification reassignment, while ensuring that the proximity requirements are met. We believe that where the proximity requirements are met, the reclassified wage index would be consistent with the labor market area to which the hospitals were originally approved for reclassification. Under this proposed policy, a hospital could request to be assigned a reclassification to any CBSA that contains any county from the CBSA to which it is currently reclassified. However, to be reassigned to an area that is not the most proximate to the hospital (or the majority of hospitals in a county group), we believe it is necessary that the hospital demonstrates that it complies with the applicable proximity criteria. If a hospital cannot demonstrate proximity to an alternate CBSA, the hospital would not be considered for reclassification to that labor market area, and reassignment would remain with the closest eligible (new) CBSA.

In the proposed rule (79 FR 28073), we included a table showing proposed hospital reclassification assignments for hospitals reclassified to CBSAs from which counties would be split off and moved to a different CBSA under the new OMB delineations. The table showed the current reclassified CBSA and the CBSA to which CMS proposed reassignment.

We proposed that hospitals that disagreed with our determination of the most proximate county had to provide an alternative method for determining proximity to CMS within 45 days from the publication of the proposed rule. We stated that changes to a hospital's CBSA assignment on the basis of a hospital's disagreement with our determination of closest county, or on the basis of being granted a reassignment due to meeting applicable proximity criteria to an eligible CBSA would be announced in this FY 2015 IPPS/LTCH PPS final rule.

Comment: Commenters were generally supportive of our proposal to adopt the new OMB delineations. Commenters did not specifically address the proposed assignment of reclassification status for hospitals that are reclassified to labor market areas where the CBSA number or name changed or to CBSAs containing counties that moved to another CBSA.

Response: We thank the commenters for their support of our proposal to implement the new OMB delineations for the hospital wage index.

After consideration of the public comments we received, we are finalizing the reassignment methodology as proposed. Hospitals that were reclassified to a CBSA that had one or more counties that split off and moved to another CBSA under the new OMB delineations are reclassified to a CBSA that will contain the most proximate county that (1) is located outside of the hospital's FY 2015 geographic labor market area; and (2) is included in the current CBSA to which they are reclassified. Group reclassifications are assigned to the CBSA under the new OMB delineations that contains the county to which the majority of hospitals in that group reclassification are geographically closest, where that county (1) is located outside of the hospitals' FY 2015 geographic labor market area; and (2) is included in the current CBSA to which they are reclassified.

We also allowed hospitals that reclassified under section 1886(d)(10) of the Act to one of the eight CBSAs that split (that is, current FY 2014 CBSAs 16620, 16974, 20764, 31140, 35644, 37964, 39100, 48900) to be reclassified to any CBSA containing a county from their original reclassification labor market area, provided that the hospital demonstrates that it meets the applicable proximity requirements under 42 CFR 412.230(b) and (c) (for individual hospitals), 42 CFR 412.232(a)(1) (for a rural group), and 42 CFR 412.234(a)(2) and (a)(3) (for an urban group) to that CBSA. Hospitals that wished to be reassigned to an alternate CBSA (other than the CBSA to which their reclassification would be reassigned in the proposed rule) for which they meet the applicable proximity criteria needed to request reassignment within 45 days from the publication of the proposed rule. We received one request in the WageIndex@cms.hhs.gov mailbox for reassignment to another eligible labor market area. A rural hospital in North Carolina was originally reclassified to CBSA 48900 (Wilmington, NC). This CBSA had more than one county that was split off and moved to another CBSA under the new OMB delineations. Thus, under our proposed policy (which we are finalizing in this final rule), we reclassified this hospital to a CBSA that contained the most proximate county that is located outside of the hospital's FY 2015 geographic labor market area and is included in the current CBSA to which it is reclassified. Of all the former constituent counties of CBSA 48900, the hospital is geographically closest to Brunswick County, NC, which is outside of the hospital's FY 2015 geographic labor market area and is included in the current CBSA to which the hospital is reclassified. However, under the new OMB delineations, Brunswick County is moved from CBSA 48900 to CBSA 34820 (Myrtle Beach-Conway-North Myrtle Beach, SC-NC). Therefore, we assigned this hospital's reclassification to CBSA 34820 in the proposed rule. The hospital provided adequate evidence to demonstrate that it is located within 35 miles of Pender County, NC, which remains part of CBSA 48900. Because the proximity criteria limit for MGCRB reclassification of an individual rural hospital is 35 miles (§ 412.230(b)(1)), we are approving the hospital's request for reassignment back to CBSA 48900. The change is reflected in the following table.

The following table shows hospital reclassification assignments for hospitals reclassified to CBSAs from which counties were split off and moved to a different CBSA under the new OMB delineations, including the current reclassified CBSA and the CBSA to which CMS is making reassignments. We note that 23 hospitals terminated their reclassification status since the proposed rule was published and have been omitted.

Hospital Reclassification Reassignments for Hospitals That Are Reclassified to CBSAs From Which Counties Are Split Off and Moved to a Different CBSA

CMS Certification Number (CCN) | Current reclassified CBSA | FY 2015 reassigned CBSA
140012 | 16974 | 20994
140110 | 16974 | 16974
140155 | 16974 | 16974
140161 | 16974 | 16974
140186 | 16974 | 16974
150002 | 16974 | 16974
150004 | 16974 | 16974
150008 | 16974 | 16974
150034 | 16974 | 16974
150090 | 16974 | 16974
150125 | 16974 | 16974
150126 | 16974 | 16974
150165 | 16974 | 16974
150166 | 16974 | 16974
180012 | 31140 | 31140
180048 | 31140 | 31140
310002 | 35644 | 35614
310009 | 35644 | 35614
310014 | 37964 | 37964
310015 | 35644 | 35614
310017 | 35644 | 35614
310031 | 20764 | 35614
310050 | 35644 | 35614
310054 | 35644 | 35614
310076 | 35644 | 35614
310083 | 35644 | 35614
310096 | 35644 | 35614
310119 | 35644 | 35614
330027 | 35644 | 35614
330106 | 35644 | 35614
330167 | 35644 | 35614
330181 | 35644 | 35614
330182 | 35644 | 35614
330198 | 35644 | 35614
330224 | 39100 | 20524
330225 | 35644 | 35614
330259 | 35644 | 35614
330331 | 35644 | 35614
330332 | 35644 | 35614
330372 | 35644 | 35614
340042 | 48900 | 48900
340068 | 48900 | 48900
390044 | 37964 | 33874
390096 | 37964 | 33874
390316 | 37964 | 33874
420085 | 48900 | 48900

Table 9A-2 for this final rule (which is available via the Internet on the CMS Web site) reflects all reassignments of hospital reclassifications for FY 2015.

(3) Reclassifications to CBSAs That Contain Hospital's Geographic County

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28074), we identified 14 reclassified hospitals that would be geographically located in their reclassified labor market area under the new OMB delineations. For example, hospital 34-0015 is located in Rowan County, NC. Rowan County is currently a Micropolitan Statistical Area in NC and is treated as rural. The hospital is reclassified to CBSA 16740 (Charlotte-Concord-Rock Hill, NC-SC). Under the new OMB delineations, CBSA 16740 (Charlotte-Concord-Gastonia, NC-SC) would include Rowan County. Therefore, the current reclassification would become redundant. CBSA 16740 did not lose any counties to another labor market area; therefore, assignment to another alternate CBSA would not be an option under our proposed methodology. Because, by definition, a hospital would not be “reclassified” to its own geographic labor market area, and maintaining that “reclassified” status to its own geographic labor market area would serve no beneficial purpose for a hospital, we expected that all such affected hospitals would wish to terminate their reclassification status. Therefore, we assumed, for purposes of the proposed rule, that the affected hospitals would be terminating their reclassification status for the remaining years of their 3-year reclassification period, and for FY 2015, we proposed to assign them the wage index of the CBSA in which they are geographically located. We stated that affected hospitals should inform CMS if they wish to retain their current reclassification by sending notice to CMS within 45 days from the publication of the proposed rule. If an affected hospital did not inform us that it wished to retain its current reclassification, we assumed that the hospital had elected to terminate the reclassification. For purposes of the proposed rule, we presented tables under the presumption that all 14 hospitals would opt to cancel their reclassification status. We proposed to assign these hospitals the wage index value of their home area from Table 4A-2 for the proposed rule (which is available via the Internet on the CMS Web site), and not include them as reclassified hospitals in Table 9A-2 for the proposed rule (which is available via the Internet on the CMS Web site).

We did not receive any public comments on this proposal, nor did any hospital contact CMS through the WageIndex@cms.hhs.gov mailbox. Therefore, we are finalizing the proposal without any modifications. The following hospitals' reclassifications are terminated, and they are assigned the wage index of the CBSA in which they are geographically located under the new OMB delineations.

Hospitals Reclassified to Home Labor Market Area

CMS Certification Number (CCN) | Current geographic CBSA | Reclassified geographic CBSA
340015 | 34 | 16740
340129 | 34 | 16740
340144 | 34 | 16740
420036 | 42 | 16740
450596 | 45 | 23104
420027 | 11340 | 24860
150088 | 11300 | 26900
150113 | 11300 | 26900
190003 | 19 | 29180
440073 | 44 | 34980
460017 | 46 | 36260
460039 | 46 | 36260
190144 | 19 | 43340
490019 | 49 | 47894

c. Applications for Reclassifications for FY 2016

Applications for FY 2016 reclassifications are due to the MGCRB by September 2, 2014 (the first working day of September 2014). We note that this is also the deadline for canceling a previous wage index reclassification withdrawal or termination under 42 CFR 412.273(d). As discussed in section III.B. of the preamble of this final rule, we are adopting the new OMB labor market area delineations announced on February 28, 2013. Therefore, hospitals should apply for reclassifications based on the new OMB delineations we are using for FY 2015. Applications and other information about MGCRB reclassifications may be obtained via the Internet on the CMS Web site at: http://www.cms.gov/Regulations-and-Guidance/Review-Boards/MGCRB/index.html, or by calling the MGCRB at (410) 786-1174. The mailing address of the MGCRB is: 2520 Lord Baltimore Drive, Suite L, Baltimore, MD 21244-2670.

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28074, 28075, and 28304), we proposed changes to the regulations at § 412.232(b)(2) and § 412.234(a)(3)(iv) to include reference to the most recent OMB standards for delineating statistical areas (using the most recent Census Bureau data and estimates) that were adopted by CMS. For rural groups, the group of hospitals must demonstrate that the county in which the hospitals are located meets the standards for redesignation to an MSA as an “outlying county.” For urban groups, hospitals located in counties that are in the same combined statistical area or CBSA as the urban area to which they seek redesignation qualify as meeting the proximity requirements for reclassification to that urban area. We did not propose any changes to the reclassification policy, but included language in the regulations to reflect use of the most recent OMB standards for delineating statistical areas (using the most recent Census Bureau data and estimates) that are adopted by CMS in consideration of group reclassification applications submitted for review in FY 2015 (that is, submitted by September 2, 2014 (this date was erroneously stated in the proposed rule as September 30, 2014), reviewed by the MGCRB in FY 2015, to be effective in FY 2016) and future years.

We did not receive any public comments on our proposed changes to the regulations at § 412.232(b)(2) and § 412.234(a)(3)(iv) to include a reference to the most recent OMB standards for delineating statistical areas (using the most recent Census Bureau data and estimates) that are adopted by CMS. Therefore, we are adopting as final the proposed changes to § 412.232(b)(2) and § 412.234(a)(3)(iv).

3. Redesignation of Hospitals Under Section 1886(d)(8)(B) of the Act

Section 1886(d)(8)(B)(i) of the Act requires the Secretary to “treat a hospital located in a rural county adjacent to one or more urban areas as being located in the urban metropolitan statistical area to which the greatest number of workers in the county commute” if certain adjacency and commuting criteria are met. The criteria utilize standards for designating Metropolitan Statistical Areas published in the Federal Register by the Director of the Office of Management and Budget (OMB) based on the most recently available decennial population data. Effective beginning FY 2005, we used OMB's CBSA standards based on the 2000 Census and the 2000 Census data to identify counties in which hospitals qualify under section 1886(d)(8)(B) of the Act to receive the wage index of the urban area. Hospitals located in these counties have been known as “Lugar” hospitals and the counties themselves are often referred to as “Lugar” counties.

As discussed in section III.B. of the preamble to the proposed rule, we proposed to implement OMB's revised labor market area delineations based on the Census 2010 data for purposes of determining applicable wage indexes for acute care hospitals beginning in FY 2015. As we have done in the past, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28075 through 28078), we also proposed to use the new OMB delineations to identify rural counties that would qualify as “Lugar” under section 1886(d)(8)(B) of the Act and, therefore, would be redesignated to urban areas for FY 2015. We proposed to revise the regulations at § 412.64(b)(3)(i) to reflect the most recent OMB standards for delineating statistical areas adopted by CMS. In the FY 2015 IPPS/LTCH PPS proposed rule, we stated that, by applying the new OMB delineations, the number of qualifying counties would increase from 98 in FY 2014 to 127 in FY 2015, as reflected in a chart published in the proposed rule. Since publication of the proposed rule, we discovered that we inadvertently did not account for Davidson County, NC (which was a Lugar county in FY 2014 but no longer qualifies to be Lugar under the new OMB delineations, as discussed in section III.H.3.c. of the preamble of this final rule). Therefore, the number of qualifying counties increases from 99 in FY 2014 to 127 in FY 2015, and we are correcting this oversight in the preamble of this final rule. After evaluating and analyzing the 2010 Census commuting data, we proposed that, effective for discharges on or after October 1, 2014, in accordance with section 1886(d)(8)(B) of the Act, hospitals located in the rural counties listed in the first column of the table in the proposed rule would be designated as part of the urban area listed in the second column based on the criteria discussed above.

Comment: One commenter suggested that Lugar hospitals be considered rural for all Medicare IPPS purposes other than receiving the urban wage index.

Response: Lugar status is a deemed status, and there are only two provisions under the Medicare statute that would allow a Lugar hospital to be treated as a rural provider: (1) if the hospital is eligible for an out-migration adjustment under section 1886(d)(13) of the Act; or (2) if the hospital applies for an urban to rural reclassification under section 1886(d)(8)(E) of the Act. In either case, the hospital would be treated as rural for all IPPS purposes, which includes the wage index.

We did not receive any other specific comments with regard to our proposal to use the new OMB delineations to identify rural counties that would qualify as “Lugar” under section 1886(d)(8)(B) of the Act. Therefore, we are finalizing the policy as proposed. We also are finalizing our proposed revision of the regulations at § 412.64(b)(3)(i) to reflect the most recent OMB standards for delineating statistical areas adopted by CMS.

In addition, since publication of the proposed rule, we discovered that, in the FY 2015 IPPS/LTCH proposed rule, for five of the Lugar counties, we had erroneously printed the names and codes of the entire Metropolitan Statistical Areas rather than the Metropolitan Division names and codes. Because we recognize Metropolitan Divisions as CBSAs, we should have printed the division names and codes for the following counties: Starke County, IN; Fannin County, TX; Hill County, TX; Van Zandt County, TX; and Island County, WA. The table below contains the corrected listing of the rural counties designated as urban under section 1886(d)(8)(B) of the Act. We note that this error was made only in the chart; that is, the wage index tables and data associated with the FY 2015 IPPS/LTCH PPS proposed rule (available via the Internet on the CMS Web site) properly captured the Metropolitan Divisions for hospitals in these five counties. We are finalizing that, effective for discharges on or after October 1, 2014, in accordance with section 1886(d)(8)(B) of the Act, hospitals located in the rural counties listed in the first column of the chart below will be designated as part of the urban area listed in the second column based on the finalized criteria discussed above.

We note that rural counties that no longer meet the qualifying criteria to be Lugar are discussed in section III.H.3.c. of the preamble of this final rule.

Rural Counties Containing Hospitals Redesignated as Urban Under Section 1886(d)(8)(B) of the Act

[Based on new OMB delineations and census 2010 data]

Rural county name | State | Lugar designated CBSA | Lugar designated CBSA name | New
Chambers County | AL | 12220 | Auburn-Opelika, AL | New
Cherokee County | AL | 40660 | Rome, GA
Cleburne County | AL | 11500 | Anniston-Oxford-Jacksonville, AL | New
Macon County | AL | 12220 | Auburn-Opelika, AL
Talladega County | AL | 11500 | Anniston-Oxford-Jacksonville, AL
Denali Borough | AK | 21820 | Fairbanks, AK | New
Hot Spring County | AR | 26300 | Hot Springs, AR
Litchfield County | CT | 35300 | New Haven-Milford, CT
Bradford County | FL | 27260 | Jacksonville, FL
Levy County | FL | 23540 | Gainesville, FL
Washington County | FL | 37460 | Panama City, FL | New
Chattooga County | GA | 40660 | Rome, GA
Jackson County | GA | 12060 | Atlanta-Sandy Springs-Roswell, GA
Lumpkin County | GA | 12060 | Atlanta-Sandy Springs-Roswell, GA
Polk County | GA | 40660 | Rome, GA
Talbot County | GA | 17980 | Columbus, GA-AL
Oneida County | ID | 36260 | Ogden-Clearfield, UT | New
Christian County | IL | 44100 | Springfield, IL
Iroquois County | IL | 28100 | Kankakee, IL
Logan County | IL | 44100 | Springfield, IL
Mason County | IL | 37900 | Peoria, IL
Ogle County | IL | 40420 | Rockford, IL
Union County | IL | 16060 | Carbondale-Marion, IL
Clinton County | IN | 29200 | Lafayette-West Lafayette, IN
Greene County | IN | 14020 | Bloomington, IN | New
Henry County | IN | 26900 | Indianapolis-Carmel-Anderson, IN
Marshall County | IN | 43780 | South Bend-Mishawaka, IN-MI | New
Parke County | IN | 45460 | Terre Haute, IN | New
Spencer County | IN | 21780 | Evansville, IN-KY
Starke County | IN | 23844 | Gary, IN
Tipton County | IN | 26900 | Indianapolis-Carmel-Anderson, IN | New
Warren County | IN | 29200 | Lafayette-West Lafayette, IN
Boone County | IA | 11180 | Ames, IA
Buchanan County | IA | 47940 | Waterloo-Cedar Falls, IA
Cedar County | IA | 26980 | Iowa City, IA
Delaware County | IA | 20220 | Dubuque, IA | New
Iowa County | IA | 26980 | Iowa City, IA | New
Jasper County | IA | 19780 | Des Moines-West Des Moines, IA | New
Franklin County | KS | 28140 | Kansas City, MO-KS | New
Nelson County | KY | 31140 | Louisville/Jefferson County, KY-IN | New
Assumption Parish | LA | 12940 | Baton Rouge, LA
Jefferson Davis Parish | LA | 29340 | Lake Charles, LA | New
St. Landry Parish | LA | 29180 | Lafayette, LA | New
Oxford County | ME | 30340 | Lewiston-Auburn, ME | New
Caroline County | MD | 12580 | Baltimore-Columbia-Towson, MD | New
Franklin County | MA | 44140 | Springfield, MA | New
Allegan County | MI | 24340 | Grand Rapids-Wyoming, MI
Ionia County | MI | 24340 | Grand Rapids-Wyoming, MI | New
Lenawee County | MI | 11460 | Ann Arbor, MI | New
Newaygo County | MI | 24340 | Grand Rapids-Wyoming, MI | New
Shiawassee County | MI | 29620 | Lansing-East Lansing, MI
Tuscola County | MI | 40980 | Saginaw, MI
Goodhue County | MN | 33460 | Minneapolis-St. Paul-Bloomington, MN-WI | New
Meeker County | MN | 33460 | Minneapolis-St. Paul-Bloomington, MN-WI | New
Rice County | MN | 33460 | Minneapolis-St. Paul-Bloomington, MN-WI | New
Pearl River County | MS | 25060 | Gulfport-Biloxi-Pascagoula, MS
Stone County | MS | 25060 | Gulfport-Biloxi-Pascagoula, MS | New
Dade County | MO | 44180 | Springfield, MO
Otoe County | NE | 30700 | Lincoln, NE | New
Douglas County | NV | 16180 | Carson City, NV | New
Lyon County | NV | 16180 | Carson City, NV
Los Alamos County | NM | 42140 | Santa Fe, NM
Cayuga County | NY | 45060 | Syracuse, NY
Cortland County | NY | 27060 | Ithaca, NY | New
Genesee County | NY | 40380 | Rochester, NY
Greene County | NY | 10580 | Albany-Schenectady-Troy, NY
Lewis County | NY | 48060 | Watertown-Fort Drum, NY | New
Montgomery County | NY | 10580 | Albany-Schenectady-Troy, NY | New
Schuyler County | NY | 27060 | Ithaca, NY
Seneca County | NY | 40380 | Rochester, NY | New
Camden County | NC | 47260 | Virginia Beach-Norfolk-Newport News, VA-NC | New
Caswell County | NC | 15500 | Burlington, NC
Granville County | NC | 20500 | Durham-Chapel Hill, NC
Greene County | NC | 24780 | Greenville, NC | New
Harnett County | NC | 39580 | Raleigh, NC
Polk County | NC | 43900 | Spartanburg, SC
Wilson County | NC | 40580 | Rocky Mount, NC | New
Traill County | ND | 24220 | Grand Forks, ND-MN | New
Ashtabula County | OH | 17460 | Cleveland-Elyria, OH
Champaign County | OH | 44220 | Springfield, OH
Columbiana County | OH | 49660 | Youngstown-Warren-Boardman, OH-PA
Harrison County | OH | 48260 | Weirton-Steubenville, WV-OH | New
Preble County | OH | 19380 | Dayton, OH | New
Clinton County | PA | 48700 | Williamsport, PA
Fulton County | PA | 25180 | Hagerstown-Martinsburg, MD-WV | New
Greene County | PA | 38300 | Pittsburgh, PA
Lawrence County | PA | 38300 | Pittsburgh, PA | New
Schuylkill County | PA | 39740 | Reading, PA
Susquehanna County | PA | 13780 | Binghamton, NY
Adjuntas Municipio | PR | 38660 | Ponce, PR | New
Coamo Municipio | PR | 41980 | San Juan-Carolina-Caguas, PR | New
Las Marías Municipio | PR | 32420 | Mayagüez, PR | New
Maricao Municipio | PR | 32420 | Mayagüez, PR | New
Salinas Municipio | PR | 25020 | Guayama, PR | New
Clarendon County | SC | 44940 | Sumter, SC
Colleton County | SC | 16700 | Charleston-North Charleston, SC | New
Lee County | SC | 44940 | Sumter, SC
Marion County | SC | 22500 | Florence, SC | New
Newberry County | SC | 17900 | Columbia, SC | New
Meigs County | TN | 17420 | Cleveland, TN
Blanco County | TX | 12420 | Austin-Round Rock, TX | New
Bosque County | TX | 47380 | Waco, TX
Calhoun County | TX | 47020 | Victoria, TX | New
Fannin County | TX | 19124 | Dallas-Plano-Irving, TX
Grimes County | TX | 17780 | College Station-Bryan, TX
Harrison County | TX | 30980 | Longview, TX
Henderson County | TX | 46340 | Tyler, TX
Hill County | TX | 23104 | Fort Worth-Arlington, TX | New
Milam County | TX | 12420 | Austin-Round Rock, TX
Van Zandt County | TX | 19124 | Dallas-Plano-Irving, TX
Willacy County | TX | 15180 | Brownsville-Harlingen, TX
King and Queen County | VA | 40060 | Richmond, VA | New
Louisa County | VA | 40060 | Richmond, VA | New
Madison County | VA | 16820 | Charlottesville, VA | New
Orange County | VA | 47900 | Washington-Arlington-Alexandria, DC-VA-MD-WV | New
Page County | VA | 25500 | Harrisonburg, VA
Shenandoah County | VA | 49020 | Winchester, VA-WV
Southampton County | VA | 47260 | Virginia Beach-Norfolk-Newport News, VA-NC | New
Surry County | VA | 47260 | Virginia Beach-Norfolk-Newport News, VA-NC | New
Island County | WA | 42644 | Seattle-Bellevue-Everett, WA
Mason County | WA | 36500 | Olympia-Tumwater, WA
Jackson County | WV | 16620 | Charleston, WV
Morgan County | WV | 25180 | Hagerstown-Martinsburg, MD-WV | New
Roane County | WV | 16620 | Charleston, WV
Green Lake County | WI | 22540 | Fond du Lac, WI
Jefferson County | WI | 33340 | Milwaukee-Waukesha-West Allis, WI
Walworth County | WI | 33340 | Milwaukee-Waukesha-West Allis, WI

a. New Lugar Areas for FY 2015

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28077), we stated that of the 127 qualifying counties identified as Lugar counties based on the new OMB delineations, 58 counties would be newly designated as Lugar for FY 2015 if we finalize our proposed adoption of the new OMB delineations. Hospitals in these counties, with at least 25 percent of their workers commuting to a higher wage area, effective October 1, 2014, would be deemed to be located in the CBSA to which the highest number of their workers commute (which is identified in the column titled “Lugar Designated CBSA” in the table above). Hospitals in these counties would receive the reclassified urban wage index of the corresponding Lugar Designated CBSA, unless they choose to waive their Lugar status, as discussed later in this section.
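For illustration only, the Lugar qualification and assignment logic described above can be sketched as follows; collapsing the statutory adjacency and commuting criteria into a single cumulative 25 percent test, and the data shown, are simplifying assumptions.

def lugar_designated_cbsa(commuters_by_adjacent_urban_cbsa, total_county_workers):
    # commuters_by_adjacent_urban_cbsa: hypothetical mapping of adjacent urban
    # CBSA code to the number of the rural county's workers commuting there,
    # drawn from the 2010 Census commuting data.
    out_commuters = sum(commuters_by_adjacent_urban_cbsa.values())
    if total_county_workers == 0 or out_commuters / total_county_workers < 0.25:
        return None  # the county does not meet the cumulative 25 percent threshold
    # Otherwise, the county is deemed located in the urban CBSA to which the
    # greatest number of its workers commute.
    return max(commuters_by_adjacent_urban_cbsa,
               key=commuters_by_adjacent_urban_cbsa.get)

print(lugar_designated_cbsa({"12220": 2_600, "11500": 900}, 10_000))  # 12220
print(lugar_designated_cbsa({"12220": 1_200}, 10_000))                # None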

In the proposed rule (79 FR 28077), we stated that some areas that are currently urban counties would be geographically rural if we adopted the new OMB delineations and would meet the requirements for redesignation as Lugar areas. As described in section III.B.2.e.(2) of the preamble of the proposed rule, we proposed a 3-year hold harmless transitional wage index adjustment for hospitals located in urban counties that become rural under the new OMB delineations. Because Lugar status is a form of redesignation, hospitals that currently are located in urban counties that would become rural under the new OMB delineations and are also considered Lugar areas under the new OMB delineations would not be eligible for the 3-year transition wage index adjustment unless they chose to waive Lugar status for FY 2015 (as discussed later in this section) and sought no other form of wage index reclassification.

As discussed above, we did not receive any public comments with regard to our proposal to use the new OMB delineations to identify rural counties that would qualify as “Lugar” under section 1886(d)(8)(B) of the Act, and we are finalizing the policy as proposed. We refer readers to the summary of public comments and our responses regarding the proposed transition policies for the wage index as a result of adoption of the OMB delineations for FY 2015 in section III.B.2.e. of the preamble of this final rule.

b. Hospitals Redesignated Under Section 1886(d)(8)(B) of the Act Seeking Reclassification by the MGCRB

As in the past, hospitals redesignated under section 1886(d)(8)(B) of the Act are also eligible to be reclassified to a different area by the MGCRB. In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28077), we stated that by using Table 4C associated with the proposed rule (which is available via the Internet on the CMS Web site), affected hospitals could compare the reclassified wage index for the labor market area into which they would be reclassified by the MGCRB to the reclassified wage index for the area to which they are redesignated under section 1886(d)(8)(B) of the Act. We stated that hospitals may withdraw from an MGCRB reclassification within 45 days of the publication of the FY 2015 proposed rule. (We refer readers to the FY 2012 IPPS/LTCH PPS final rule (76 FR 51598 through 51599) for the procedural rules and requirements for a hospital that is redesignated under section 1886(d)(8)(B) of the Act and seeking reclassification under the MGCRB, as well as our policy of measuring the urban area, exclusive of the Lugar County, for purposes of meeting proximity requirements.)

We treat New England deemed counties in a manner consistent with how we treat Lugar counties. (We refer readers to the FY 2008 IPPS final rule with comment period (72 FR 47337 through 47338) for a discussion of this policy.)

Since publication of the proposed rule, we discovered that there are four hospitals in rural counties that are newly deemed Lugar areas for FY 2015 that also have MGCRB reclassifications to the same CBSAs to which they are redesignated as Lugar. Lugar hospitals are treated like reclassified hospitals for purposes of determining their applicable wage index and receive the reclassified wage index for the urban area to which they have been redesignated. Because the Lugar redesignated CBSA is now the same as the MGCRB reclassified CBSA, the MGCRB reclassification becomes redundant. We note that hospitals with Lugar redesignations and hospitals with MGCRB reclassifications receive the wage index for hospitals that are reclassified as provided in Table 4C-2 associated with this final rule (which is available via the Internet on the CMS Web site). Table 9A-2 associated with this final rule (which is available via the Internet on the CMS Web site) reflects the reclassified and redesignated hospitals. Hospitals that are redesignated as Lugar are indicated as such when the “Lugar” column is populated. Although we did indicate in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28077) that hospitals redesignated as Lugar that also had an MGCRB reclassification may compare the reclassified wage index for the labor market area into which they would be reclassified by the MGCRB to the reclassified wage index for the area to which they are redesignated under section 1886(d)(8)(B) of the Act, and terminate or withdraw from an MGCRB reclassification within 45 days of the publication of the proposed rule, we acknowledge that we did not highlight these four Lugar hospitals whose MGCRB reclassifications would become redundant. We also note that these hospitals did not send requests to the MGCRB to terminate their reclassifications. Because the new Lugar status would deem these hospitals redesignated to the same area to which they have an approved MGCRB reclassification, the reclassified wage index would be the same for these four hospitals in either scenario. We realize that, for this reason, the hospitals may not have seen a need to withdraw the MGCRB reclassification. Because we did not state in the proposed rule that we would expect that these affected hospitals would be terminating the remaining years of their 3-year reclassification period, for FY 2015 we are not updating the Lugar column on Table 9A-2 for this final rule. However, we have indicated in a footnote that, under the new OMB delineations, these providers are now redesignated as Lugar to the same area to which they have an existing MGCRB reclassification that they did not terminate. We emphasize that the effect on the wage index of these four hospitals is immaterial because hospitals redesignated as Lugar as well as hospitals with approved MGCRB reclassifications both receive the reclassified wage index for the urban area to which they have been redesignated or reclassified.

Hospitals Redesignated as Lugar to an Area Where They Have an Approved MGCRB Reclassification for FY 2015

CMS Certification No. (CCN) | Rural county name | Lugar CBSA | MGCRB reclassification CBSA
150076 | Marshall County, IN | 43780 | 43780
190017 | St. Landry Parish, LA | 29180 | 29180
390016 | Lawrence County, PA | 38300 | 38300
420030 | Colleton County, SC | 16700 | 16700

c. Rural Counties No Longer Meeting the Criteria To Be Redesignated as Lugar

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28077 through 28078), we discussed that if we adopted the new OMB delineations, 29 rural counties would no longer meet the qualifying criteria to be redesignated as Lugar effective October 1, 2014, either because they would be geographically located in an urban area or because they would fail to meet the 25-percent cumulative out-migration threshold with application of the new 2010 Census commuting data. Since the publication of the proposed rule, we have discovered that we inadvertently did not account for Davidson County, NC. Therefore, the number of rural counties that will no longer meet the qualifying criteria to be redesignated as Lugar effective October 1, 2014, is 30 rather than 29. We are correcting this oversight in the preamble of this final rule.

Counties that were deemed urban under section 1886(d)(8)(B) of the Act in FY 2014, but that would be geographically located in an urban area under the new OMB delineations for FY 2015, are:

Windham County, CT

Flagler County, FL

Walton County, FL

Morgan County, GA

Peach County, GA

De Witt County, IL

Allen County, KY

St. James Parish, LA

Montcalm County, MI

Fillmore County, MN

Davidson County, NC

Lincoln County, NC

Cotton County, OK

Linn County, OR

Adams County, PA

Monroe County, PA

Falls County, TX

Buckingham County, VA

Floyd County, VA

Green County, WI

Counties that would fail to meet the 25-percent threshold in FY 2015 are:

Banks County, GA

Hendry County, FL

Bingham County, ID

Oceana County, MI

Columbia County, NY

Sullivan County, NY

Wyoming County, NY

Oconee County, SC

Middlesex County, VA

Wahkiakum County, WA

In section III.B.2.e.(2) of the preamble of the proposed rule, to help ease dramatic negative impacts in payment for hospitals designated as urban under the current FY 2014 OMB delineations, but would be classified as rural under the new OMB delineations, for FYs 2015, 2016, and 2017, assuming no other form of wage index reclassification or redesignation is granted, we proposed to assign these hospitals the FY 2015 area wage index value of the urban CBSA to which they geographically belonged in FY 2014 (with the rural and imputed floors applied and with the rural floor budget neutrality adjustment applied to the area wage index). (For purposes of the wage index computation, the wage data of these hospitals would remain assigned to the statewide rural area in which they are located.) Similarly, we proposed that the same 3-year transition apply to hospitals located in those counties that would lose their deemed urban designation under section 1886(d)(8)(B) of the Act and would become rural if we adopt the new OMB delineations. Because these hospitals would, in fact, lose their designated urban status, we proposed to extend the 3-year hold harmless transitional wage index adjustment to these hospitals located in counties formerly designated as urban under section 1886(d)(8)(B) of the Act. That is, for FYs 2015, 2016, and 2017, assuming no other form of wage index reclassification or redesignation is granted, we proposed to assign these hospitals the FY 2015 area wage index value of the urban CBSA to which they were designated as urban in FY 2014 (with the rural and imputed floors applied and with the rural floor budget neutrality adjustment applied). We proposed to use the wage data from these hospitals as part of computing the rural wage index. In addition, during this 3-year transition period, these hospitals would be eligible to apply for reclassification by the MGCRB. As discussed in section III.B.2.e.(3) of the preamble of the proposed rule, we proposed that if a hospital is currently located in an urban county that would become rural for FY 2015 under the new OMB delineations, and such hospital seeks and is granted any reclassification or redesignation during FYs 2015, 2016, or 2017, the hospital would permanently lose its 3-year transitional assigned wage index, and would not be able to reinstate it. Similarly, we proposed that this policy also apply to hospitals located in those counties that would lose their deemed urban designation under section 1886(d)(8)(B) of the Act and would become rural if we adopt the new OMB delineations. In FY 2018, we proposed that these hospitals would receive their statewide rural wage index.

As indicated earlier, we did not receive any public comments with regard to our proposal to use the new OMB delineations to identify rural counties that would qualify as “Lugar” under section 1886(d)(8)(B) of the Act. Therefore, we are finalizing the policy and designations as proposed. As discussed previously, for FYs 2015, 2016, and 2017, assuming no other form of wage index reclassification or redesignation is granted, we are assigning hospitals that are in urban counties that will become rural under the new OMB delineations to the FY 2015 area wage index value of the urban CBSA to which they geographically belonged in FY 2014 (with the rural and imputed floors applied and with the rural floor budget neutrality adjustment applied to the area wage index). (For purposes of the wage index computation, the wage data of these hospitals will remain assigned to the statewide rural area in which they are located.) Similarly, the same 3-year transition will apply to hospitals located in those counties that will lose their deemed urban designation under section 1886(d)(8)(B) of the Act and will become rural under the new OMB delineations. We will use the wage data from these hospitals as part of computing the rural wage index. In FY 2018, these hospitals will receive their statewide rural wage index. Furthermore, if any such hospital seeks and is granted any reclassification or redesignation during FYs 2015, 2016, or 2017, the hospital will permanently lose its 3-year transitional assigned wage index and will not be able to reinstate it. We refer readers to summaries of public comments and our responses regarding proposed transition policies for the wage index in section III.B.2.e. of the preamble of this final rule.
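
The transition assignment described above can be summarized in a brief sketch (in Python; the function name and the wage index values in the example are hypothetical, and the sketch ignores the other adjustments that would apply in practice):

```python
# Illustrative sketch only -- hypothetical values; not actual FY 2015 figures.

def transitional_wage_index(fiscal_year, fy2015_urban_cbsa_wage_index,
                            statewide_rural_wage_index, reclassification_granted):
    """Wage index assignment for a hospital that loses its urban (or deemed
    urban) status under the new OMB delineations, per the transition above."""
    if reclassification_granted:
        # Any reclassification or redesignation granted during FYs 2015-2017
        # permanently ends the transitional assignment; the hospital would
        # instead receive the wage index of its reclassified area (not modeled here).
        return None
    if fiscal_year in (2015, 2016, 2017):
        # The FY 2015 wage index value of the urban CBSA to which the hospital
        # belonged (or was deemed to belong) in FY 2014, with the floors and
        # budget neutrality adjustment already reflected in this value.
        return fy2015_urban_cbsa_wage_index
    # FY 2018 and later: the statewide rural wage index.
    return statewide_rural_wage_index


# Hypothetical example: 0.9700 is the FY 2015 value of the former urban CBSA,
# and 0.9100 is the statewide rural wage index.
print(transitional_wage_index(2016, 0.9700, 0.9100, False))  # 0.97
print(transitional_wage_index(2018, 0.9700, 0.9100, False))  # 0.91
```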

4. Waiving Lugar Redesignation for the Out-Migration Adjustment

In the FY 2012 IPPS/LTCH PPS final rule (76 FR 51599 through 51600), we adopted the policy that, beginning with FY 2012, an eligible hospital that waives its Lugar status in order to receive the out-migration adjustment has effectively waived its deemed urban status and, thus, is rural for all purposes under the IPPS, including being considered rural for the DSH payment adjustment, effective for the fiscal year in which the hospital receives the out-migration adjustment. (We refer readers to a discussion of DSH payment adjustment under section IV.F. of the preamble of this final rule.)

In addition, we adopted a minor procedural change in that rule that would allow a Lugar hospital that qualifies for and accepts the out-migration adjustment (through written notification to CMS within 45 days from the publication of the proposed rule) to waive its urban status for the full 3-year period for which its out-migration adjustment is effective. By doing so, such a Lugar hospital would no longer be required during the second and third years of eligibility for the out-migration adjustment to advise us annually that it prefers to continue being treated as rural and receive the out-migration adjustment. Therefore, under the procedural change, a Lugar hospital that requests to waive its urban status in order to receive the rural wage index in addition to the out-migration adjustment would be deemed to have accepted the out-migration adjustment and agrees to be treated as rural for the duration of its 3-year eligibility period, unless, prior to its second or third year of eligibility, the hospital explicitly notifies CMS in writing, within the required period (generally 45 days from the publication of the proposed rule), that it instead elects to return to its deemed urban status and no longer wishes to accept the out-migration adjustment. If the hospital does notify CMS that it is electing to return to its deemed urban status, it would again be treated as urban for all IPPS payment purposes.

We refer readers to the FY 2012 IPPS/LTCH PPS final rule (76 FR 51599 through 51600) for a detailed discussion of the policy and process for waiving Lugar status for the out-migration adjustment.

Comment: One commenter sought clarification about whether a hospital can waive Lugar status in other instances, such as to retain a special rural status such as CAH, SCH, or MDH, and not just when a hospital is eligible for the out-migration adjustment.

Response: As stated in the FY 2012 IPPS/LTCH PPS final rule (76 FR 51599 through 51600), the statute provides two methods for a Lugar hospital to be treated as rural for Medicare payment purposes: (1) If the hospital is eligible for an out-migration adjustment under section 1886(d)(13) of the Act; or (2) if the hospital applies for an urban to rural reclassification under section 1886(d)(8)(E) of the Act. There are no other provisions under the Medicare statute that would allow a Lugar hospital to be treated as a rural provider.

5. Update of Application of Urban to Rural Reclassification Criteria

Section 401(a) of the Medicare, Medicaid, and SCHIP Balanced Budget Refinement Act of 1999 (Pub. L. 106-113), which amended section 1886(d)(8) of the Act by adding a new paragraph (E), directed the Secretary to treat any subsection (d) hospital located in an urban area as being located in the rural area of the State in which the hospital is located, provided that the hospital applies for reclassification in a manner determined by the Secretary and meets certain criteria. As discussed in the FY 2001 interim final rule (65 FR 47029 through 47031), we codified in regulation at § 412.103 the application process and the qualifying criteria for any hospital seeking rural reclassification.

In order to be approved for a rural reclassification, a hospital that is located in an urban area must meet one of the following four criteria under section 1886(d)(8)(E)(ii) of the Act (codified at § 412.103): (1) The hospital is located in a rural census tract of an MSA, as determined under the most recent version of the Goldsmith Modification, the Rural-Urban Commuting Area (RUCA) codes; (2) the hospital is located in an area designated by any law or regulation of such State as a rural area or is designated by such State as a rural hospital; (3) the hospital would qualify as an RRC or SCH if the hospital were located in an urban area; and (4) the hospital meets such other criteria as the Secretary may specify.

On February 28, 2013, OMB issued OMB Bulletin No. 13-01, which established revised delineations for Metropolitan Statistical Areas, Micropolitan Statistical Areas, and Combined Statistical Areas, and provided guidance on the use of the delineations of these statistical areas. These delineations are based on 2010 decennial Census data. Several modifications of RUCA codes were necessary to take into account updated commuting data and revised OMB delineations. We refer readers to the U.S. Department of Agriculture's Economic Research Service Web site for a detailed listing of updated RUCA codes found at: http://www.ers.usda.gov/data-products/rural-urban-commuting-area-codes.aspx. The updated RUCA code definitions were introduced in late 2013.

As provided at § 412.103(f), an approved rural reclassification remains in effect, without need for reapproval, unless there is a change in the circumstances under which the classification was approved. If a hospital located in an urban area was approved for a rural reclassification under § 412.103(a)(1), that reclassification would no longer be valid if the hospital is no longer located within a rural census tract of an MSA, as determined under the RUCA codes. Therefore, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28078), we encouraged all hospitals with active rural reclassifications under section 1886(d)(8)(E) of the Act to review their original reclassification applications and determine whether their reclassification status would still apply. As discussed in section VI.C.2. of the preamble of the proposed rule, we proposed a 2-year grace period allowing affected CAHs additional time to seek a new rural reclassification without the threat of losing their CAH status; we did not propose a grace period for other types of hospitals to seek a new rural reclassification. We noted that rural reclassification status under § 412.103 is effective as of the filing date of the application. Therefore, if the change in RUCA codes invalidates any hospital's rural reclassification status, we believe hospitals will have adequate time to apply for a new reclassification using an alternative qualification criterion specified at either § 412.103(a)(2) or § 412.103(a)(3). A rural referral center (RRC) or a sole community hospital (SCH) that continues to meet the appropriate qualification criteria would, on that basis, qualify for a rural reclassification. If a complete application is received before October 1, 2014, and is approved by the CMS Regional Office, the hospital would experience no interruption in its rural status.

Comment: Several commenters requested that additional provider types (SCHs and MDHs) be afforded the 2-year transition period of deemed rural status that was granted to CAHs. Commenters cited the critical role these hospitals serve in their communities and the administrative burden that would be required to obtain rural status in order to maintain their provider type. Commenters asserted that hospitals that obtain an urban to rural reclassification are not entitled to receive an out-migration adjustment and would require additional time to assess their appropriate options.

Response: We thank commenters for sharing their concerns. However, we do not believe that extending a 2-year transition period of deemed rural status to additional provider types is necessary. While it is true that there are potential payment consequences for a CAH, SCH, or MDH currently located in a rural area that becomes urban under the new OMB delineations, the payment consequences for CAHs are generally greater because, unlike SCHs and MDHs, CAHs are entirely excluded from the IPPS and would face an end to payments based on 101 percent of their reasonable costs. In addition, because the Conditions of Participation (CoPs) for CAHs differ from those for hospitals, and it would generally be more difficult for a CAH to meet the hospital CoPs instead of the CAH CoPs, only a CAH faces the potential loss of its ability to continue to participate in the Medicare and Medicaid programs. Specifically, to avoid termination not only of its CAH status (and associated cost-based reimbursement), but of its Medicare agreement in its entirety, the CAH would have to convert back to a hospital, including demonstrating via a survey that it complies with the hospital CoPs, which are generally more stringent than those for CAHs. We believe that the generally greater payment consequences for CAHs relative to other provider types, combined with the unique consequences for CAHs with respect to the CoPs, make it appropriate to afford CAHs a 2-year transition period in which to reclassify that is not afforded to other provider types.

SCHs and MDHs that were located in rural areas that became urban under the new OMB delineations could have known of the upcoming change since February 2013 (when OMB published the new delineations); thus, these hospitals have had adequate time to assess options. SCHs and MDHs still can seek approval for rural reclassification for FY 2015 under § 412.103 if they meet the requirements of this section, provided that they apply before the beginning of FY 2015. This approval of rural status would be effective as of the date of the application. If any hospital's wage index is negatively affected due to the adoption of the new OMB delineations, the hospital will receive a 50/50 blended wage index for FY 2015 (as discussed previously).

With respect to the out-migration adjustment, commenters correctly noted that hospitals reclassified as rural under § 412.103 are not eligible to receive an out-migration adjustment. Section 1886(d)(13)(G) of the Act specifies that a hospital is not eligible to receive an out-migration adjustment if it is granted any form of wage index reclassification, including urban to rural reclassification. We believe that a hospital that chooses to reclassify to a particular labor market area should not also receive an additional payment benefit to reflect commuting patterns within its home area.

After consideration of the public comments we received, we are not implementing any additional changes to grant other provider types a transition period during which to reclassify as rural similar to that being adopted for CAHs. We refer readers to section VI.C.2. of the preamble of this final rule for a discussion of the CAH transition period policy.

I. FY 2015 Wage Index Adjustment Based on Commuting Patterns of Hospital Employees

In accordance with section 1886(d)(13) of the Act, as added by section 505 of Public Law 108-173, beginning with FY 2005, we established a process to make adjustments to the hospital wage index based on commuting patterns of hospital employees (the “out-migration” adjustment). The process, outlined in the FY 2005 IPPS final rule (69 FR 49061), provides for an increase in the wage index for hospitals located in certain counties that have a relatively high percentage of hospital employees who reside in the county but work in a different county (or counties) with a higher wage index.

When this provision was implemented for the FY 2005 wage index, we analyzed commuting data compiled by the U.S. Census Bureau that were derived from a special tabulation of the 2000 Census journey-to-work data for all industries (CMS extracted the data applicable to hospitals). These data were compiled from responses to the “long-form” survey, which the Census Bureau used at the time and which contained questions on where residents in each county worked (69 FR 49062). However, the 2010 Census was “short form” only; therefore, this information was not collected as part of the 2010 Census. The Census Bureau is working with CMS to provide an alternative dataset based on the latest available data that is expected to meet our needs for developing a new out-migration adjustment. We believe we will have the necessary time to obtain, review, and analyze the data in order to propose new out-migration adjustments based on new commuting patterns developed from the 2010 Census data beginning with FY 2016. Section 1886(d)(13)(B) of the Act requires the Secretary to use data the Secretary determines to be appropriate to establish the qualifying counties. The data used for the FY 2014 out-migration adjustment are the most recent data that have been analyzed, and we believe that these data are appropriate to establish the qualifying counties. Therefore, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28079 through 28080), we proposed that the FY 2015 out-migration adjustments continue to be based on the 2000 Census data. We also proposed that the FY 2015 out-migration adjustments continue to be based on the policies, procedures, and computation that were used for the FY 2014 out-migration adjustment.

We did not receive any public comments with regard to the out-migration adjustment for FY 2015. Therefore, for FY 2015, we are finalizing our proposal that the FY 2015 out-migration adjustment continue to be based on the 2000 Census data used for the FY 2014 out-migration adjustment. We also are finalizing our proposal that the out-migration adjustment be based on the policies, procedures, and computation that were used for the FY 2014 out-migration adjustment. (We refer readers to a full discussion of the adjustment, including rules on deeming hospitals reclassified under section 1886(d)(8) or section 1886(d)(10) of the Act to have waived the out-migration adjustment, in the FY 2012 IPPS/LTCH PPS final rule (76 FR 51601 through 51602)). Table 4J, which is available via the Internet on the CMS Web site, lists the out-migration adjustments for the FY 2015 wage index.

Section 1886(d)(13)(F) of the Act states that “[a] wage index increase under this paragraph shall be effective for a period of 3 fiscal years, except that the Secretary shall establish procedures under which a subsection (d) hospital may elect to waive the application of such wage index increase.” Therefore, for FY 2015, because we are continuing to use the out-migration adjustment data used for FY 2014, consistent with the statute, we also proposed to allow hospitals that qualified in FY 2013 or FY 2014 to receive the out-migration adjustment based on the commuting data and the CBSA delineations used for FY 2014 to continue to receive the same out-migration adjustment for the remainder of their 3-year qualification period. Similarly, if a hospital qualifies for and opts to receive the out-migration adjustment for the first time in FY 2015, we also proposed to allow that hospital to receive the out-migration adjustment based on the data used for FY 2014 for FYs 2015, 2016, and 2017. Accordingly, even if we propose to adopt new out-migration adjustment data for FY 2016, as we believe we will be able to do, hospitals that are already receiving an out-migration adjustment beginning with a fiscal year prior to FY 2016 would still receive their out-migration adjustment based on the data used for FY 2014 for the years that remain of their 3-year qualification period in FY 2016 and after.

We did not receive any public comments with regard to our proposals. Therefore, we are finalizing our proposal that hospitals that qualified in FY 2013 or FY 2014 to receive the out-migration adjustment based on the commuting data and the CBSA delineations used for FY 2014 will continue to receive the same out-migration adjustment for the remainder of their 3-year qualification period. If a hospital qualifies for and opts to receive the out-migration adjustment for the first time in FY 2015, we will allow that hospital to receive the out-migration adjustment based on the data used for FY 2014 for FYs 2015, 2016, and 2017.

We intend to address application of the FY 2016 out-migration adjustment in greater detail in the FY 2016 proposed rule. However, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28079), we solicited public comments on how to implement the new out-migration adjustment data for FY 2016, given the statutory requirement at section 1886(d)(13)(F) of the Act that an out-migration adjustment be effective for 3 fiscal years. We did not receive any public comments on how to implement the new out-migration adjustment data for FY 2016.

As discussed in section III.B. of the preamble of this final rule, we are using OMB's new labor market area delineations based on the 2010 Census data to identify counties qualifying as Lugar counties for FY 2015. In section III.H.3 of the preamble of this final rule, we discuss hospitals located in rural counties that are deemed to be urban under section 1886(d)(8)(B) of the Act. These rural counties are known as “Lugar” counties. Under the new OMB delineations, there are counties newly qualifying as Lugar as well as counties that were previously Lugar counties that will no longer meet the criteria to be redesignated as Lugar. As discussed in section III.H.4. of the preamble of this final rule, if a Lugar hospital qualifies for and accepts the out-migration adjustment, it must waive its deemed urban status and can do so for the 3-year period for which the out-migration adjustment is effective. Therefore, hospitals located in counties newly designated as Lugar due to the new OMB delineations will have the choice to either maintain their Lugar status or waive it in order to receive the out-migration adjustment in FY 2015 based on the out-migration adjustment data used for FY 2014.

On the other hand, there are hospitals in counties deemed to be Lugar under the current OMB delineations that waived their Lugar status for the out-migration adjustment, but are not Lugar under the new OMB delineations. These hospitals will continue to receive the out-migration adjustment for the 3-year eligibility period through FY 2015 or FY 2016. However, these hospitals that are located in urban counties under the new OMB delineations, and wish to continue to maintain their rural status effective October 1, 2014, must do so by reclassifying from urban to rural under § 412.103. Section 1886(d)(13)(G) of the Act states that a hospital cannot simultaneously receive the out-migration adjustment and be subject to a reclassification under section 1886(d)(8) or 1886(d)(10) of the Act. Therefore, if such hospital is not located in a geographically rural area under the new OMB delineations, and reclassifies under § 412.103 of the regulations in order to be treated as rural for IPPS purposes, the hospital is ineligible to receive an out-migration adjustment, even if the 3-year eligibility period has not expired.

As discussed in section III.B.5. of the preamble of this final rule, we are finalizing our proposal to apply a 1-year blended wage index for any provider that experiences a decrease in wage index value due to the implementation of the new OMB labor market area delineations. This policy creates a wage index that is 50 percent of the wage index derived using the current FY 2014 OMB delineations, and 50 percent of the wage index based on the new OMB delineations. As discussed in section III.B.2.e.(4) of the preamble of this final rule, as we proposed, we are applying this blended wage index value to any affected hospital in a budget neutral manner. However, we proposed that hospitals receiving the out-migration adjustment would have it added to the result of the 50/50 blended wage index, after budget neutrality is applied. We established the blended wage index transition adjustment specifically to address any negative impact that may be caused by the adoption of the new OMB delineations in FY 2015. To specifically identify and address any such negative payment impact, we proposed to apply the out-migration adjustment independent of the blended wage index and other wage index adjustments (for example, the rural floor) and related budget neutrality adjustments. This is consistent with our current policy to apply the out-migration adjustment after all other wage index adjustments and related budget neutrality adjustments have been applied. Therefore, we believe the out-migration adjustment would be properly applied as a supplemental addition to a hospital's final wage index value, similar to our treatment of hospitals receiving the frontier State floor value of 1.00, as described under 42 CFR 412.64(m), that also qualify for an out-migration adjustment and would receive that adjustment.
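
The ordering described above can be illustrated with a brief sketch (in Python; the wage index values, budget neutrality factor, and out-migration adjustment shown are hypothetical and are not figures from this rule):

```python
# Illustrative sketch only -- hypothetical values; not actual FY 2015 figures.

def final_wage_index(wi_current_delineations, wi_new_delineations,
                     budget_neutrality_factor, out_migration_adjustment):
    """Assemble a final wage index in the order described above."""
    # Step 1: 50/50 blend of the wage index under the current (FY 2014) OMB
    # delineations and the wage index under the new OMB delineations.
    blended = 0.5 * wi_current_delineations + 0.5 * wi_new_delineations

    # Step 2: apply budget neutrality (and, in practice, any other applicable
    # wage index adjustments, such as the rural floor) to the blended value.
    adjusted = blended * budget_neutrality_factor

    # Step 3: add the out-migration adjustment last, as a supplemental
    # addition to the hospital's final wage index value.
    return adjusted + out_migration_adjustment


# Hypothetical example: the wage index falls from 0.9500 to 0.9200 under the
# new delineations, with an assumed budget neutrality factor of 0.9980 and an
# out-migration adjustment of 0.0100.
print(final_wage_index(0.9500, 0.9200, 0.9980, 0.0100))
```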

One group of commenters suggested CMS made an error in calculating the rural wage index for Connecticut under the old OMB delineations (as discussed in section III.B.2.e.(4) of the preamble of this final rule) for the purpose of applying the proposed transition blend. We respond to this comment in section III.B.2.e.(4) of the preamble of this final rule, and we refer readers to this section for further discussion.

After consideration of the public comments we received, we are finalizing our proposal without modification that we will add the out-migration adjustment for hospitals receiving such adjustment to the result of the 50/50 blended wage index, after budget neutrality is applied. Therefore, we will apply the out-migration adjustment independent of the blended wage index and other wage index adjustments (for example, the rural floor) and related budget neutrality adjustments.

J. Process for Requests for Wage Index Data Corrections

The preliminary, unaudited Worksheet S-3 wage data and occupational mix survey data files for the proposed FY 2015 wage index were made available on September 13, 2013, through the Internet on the CMS Web site at: http://www.cms.gov/Medicare/Medicare-Fee-for-Service-Payment/AcuteInpatientPPS/Wage-Index-Files-Items/FY-2015-Wage-Index-Home-Page.html.

In the interest of meeting the data needs of the public, beginning with the proposed FY 2009 wage index, we post an additional public use file on our Web site that reflects the actual data that are used in computing the proposed wage index. The release of this file does not alter the current wage index process or schedule. We notify the hospital community of the availability of these data as we do with the current public use wage data files through our Hospital Open Door forum. We encourage hospitals to sign up for automatic notifications of information about hospital issues and the scheduling of the Hospital Open Door forums at the CMS Web site at: http://www.cms.gov/Outreach-and-Education/Outreach/OpenDoorForums/index.html.

In a memorandum dated September 16, 2013, we instructed all MACs to inform the IPPS hospitals they service of the availability of the wage index data files and the process and timeframe for requesting revisions (including the specific deadlines listed below). We also instructed the MACs to advise hospitals that these data were also made available directly through their representative hospital organizations.

If a hospital wished to request a change to its data as shown in the September 13, 2013 wage and occupational mix data files, the hospital was to submit corrections along with complete, detailed supporting documentation to its MAC by November 21, 2013. Hospitals were notified of this deadline and of all other deadlines and requirements, including the requirement to review and verify their data as posted in the preliminary wage index data files on the Internet, through the September 16, 2013 memorandum referenced above.

In the September 16, 2013 memorandum, we also specified that a hospital requesting revisions to its occupational mix survey data was to copy its record(s) from the CY 2010 occupational mix preliminary files posted to the CMS Web site in September, highlight the revised cells on its spreadsheet, and submit its spreadsheet(s) and complete documentation to its MAC no later than November 21, 2013.

The MACs notified the hospitals by early-February 2014 of any changes to the wage index data as a result of the desk reviews and the resolution of the hospitals' late-November revision requests. The MACs also submitted the revised data to CMS by late January 2014. CMS published the proposed wage index public use files that included hospitals' revised wage index data on February 20, 2014. Hospitals had until March 3, 2014, to submit requests to the MACs for reconsideration of adjustments made by the MACs as a result of the desk review, and to correct errors due to CMS' or the MAC's mishandling of the wage index data. Hospitals also were required to submit sufficient documentation to support their requests.

After reviewing requested changes submitted by hospitals, MACs were required to transmit to CMS any additional revisions resulting from the hospitals' reconsideration requests by April 9, 2014. The deadline for a hospital to request CMS intervention in cases where the hospital disagreed with the MAC's policy interpretations was April 16, 2014. We note that, beginning with the FY 2015 wage index, in accordance with the FY 2015 wage index timeline posted on the CMS Web site at http://www.cms.gov/Medicare/Medicare-Fee-for-Service-Payment/AcuteInpatientPPS/Downloads/FY2015-WI-Timeline.pdf, the April appeals had to be sent via mail and email. We refer readers to the wage index timeline for complete details.

Hospitals were given the opportunity to examine Table 2, which was listed in section VI. of the Addendum to the proposed rule and available via the Internet on the CMS Web site at: http://www.cms.gov/Medicare/Medicare-Fee-for-Service-Payment/AcuteInpatientPPS/Wage-Index-Files-Items/FY-2015-Wage-Index-Home-Page.html. Table 2 contained each hospital's proposed adjusted average hourly wage used to construct the wage index values for the past 3 years, including the FY 2011 data used to construct the proposed FY 2015 wage index. We noted that the proposed hospital average hourly wages shown in Table 2 only reflected changes made to a hospital's data that were transmitted to CMS by February 26, 2014.

The final wage index data public use files were posted on May 2, 2014 on the Internet at: http://www.cms.gov/Medicare/Medicare-Fee-for-Service-Payment/AcuteInpatientPPS/Wage-Index-Files-Items/FY-2015-Wage-Index-Home-Page.html. The May 2014 public use files are made available solely for the limited purpose of identifying any potential errors made by CMS or the MAC in the entry of the final wage index data that resulted from the correction process described above (revisions submitted to CMS by the MACs by April 9, 2014).

After the release of the May 2014 wage index data files, changes to the wage and occupational mix data could only be made in those very limited situations involving an error by the MAC or CMS that the hospital could not have known about before its review of the final wage index data files. Specifically, neither the MAC nor CMS will approve the following types of requests:

  • Requests for wage index data corrections that were submitted too late to be included in the data transmitted to CMS by the MACs on or before April 9, 2014.
  • Requests for correction of errors that were not, but could have been, identified during the hospital's review of the February 20, 2014 wage index public use files.
  • Requests to revisit factual determinations or policy interpretations made by the MAC or CMS during the wage index data correction process.

If, after reviewing the May 2014 final public use files, a hospital believed that its wage or occupational mix data were incorrect due to a MAC or CMS error in the entry or tabulation of the final data, the hospital was given the opportunity to notify both its MAC and CMS regarding why the hospital believed an error existed and to provide all supporting information, including relevant dates (for example, when it first became aware of the error). The hospital was required to send its request to CMS and to the MAC no later than June 2, 2014. Similar to the April appeals, beginning with the FY 2015 wage index, in accordance with the FY 2015 wage index timeline posted on the CMS Web site at http://www.cms.gov/Medicare/Medicare-Fee-for-Service-Payment/AcuteInpatientPPS/Downloads/FY2015-WI-Timeline.pdf, the June appeals were required to be sent via mail and email to CMS and the MACs. We refer readers to the wage index timeline for complete details. (We refer readers to section III.K. of the preamble to this final rule, where we are making revisions to the wage index timetable.)

Verified corrections to the wage index data received timely by CMS and the MACs (that is, by June 2, 2014) were incorporated into the final wage index in this FY 2015 IPPS/LTCH PPS final rule, which will be effective October 1, 2014.

We created the processes described above to resolve all substantive wage index data correction disputes before we finalize the wage and occupational mix data for the FY 2015 payment rates. Accordingly, hospitals that did not meet the procedural deadlines set forth above will not be afforded a later opportunity to submit wage index data corrections or to dispute the MAC's decision with respect to requested changes. Specifically, our policy is that hospitals that do not meet the procedural deadlines set forth above will not be permitted to challenge later, before the PRRB, the failure of CMS to make a requested data revision. We refer readers also to the FY 2000 IPPS final rule (64 FR 41513) for a discussion of the parameters for appeals to the PRRB for wage index data corrections.

Again, we believe the wage index data correction process described above provides hospitals with sufficient opportunity to bring errors in their wage and occupational mix data to the MAC's attention. Moreover, because hospitals had access to the final wage index data by early May 2014, they had the opportunity to detect any data entry or tabulation errors made by the MAC or CMS before the development and publication of the final FY 2015 wage index by August 2014, and the implementation of the FY 2015 wage index on October 1, 2014. Given these processes, the wage index implemented on October 1 should be accurate. Nevertheless, in the event that errors are identified by hospitals and brought to our attention after June 2, 2014, we retain the right to make midyear changes to the wage index under very limited circumstances.

Specifically, in accordance with 42 CFR 412.64(k)(1) of our existing regulations, we make midyear corrections to the wage index for an area only if a hospital can show that: (1) the MAC or CMS made an error in tabulating its data; and (2) the requesting hospital could not have known about the error or did not have an opportunity to correct the error, before the beginning of the fiscal year. For purposes of this provision, “before the beginning of the fiscal year” means by the June deadline for making corrections to the wage data for the following fiscal year's wage index (for example, June 2, 2014, for the FY 2015 wage index). This provision is not available to a hospital seeking to revise another hospital's data that may be affecting the requesting hospital's wage index for the labor market area. As indicated earlier, because CMS makes the wage index data available to hospitals on the CMS Web site prior to publishing both the proposed and final IPPS rules, and the MACs notify hospitals directly of any wage index data changes after completing their desk reviews, we do not expect that midyear corrections will be necessary. However, under our current policy, if the correction of a data error changes the wage index value for an area, the revised wage index value will be effective prospectively from the date the correction is made.

In the FY 2006 IPPS final rule (70 FR 47385 through 47387 and 47485), we revised 42 CFR 412.64(k)(2) to specify that, effective on October 1, 2005, that is, beginning with the FY 2006 wage index, a change to the wage index can be made retroactive to the beginning of the Federal fiscal year only when CMS determines all of the following: (1) the MAC or CMS made an error in tabulating data used for the wage index calculation; (2) the hospital knew about the error and requested that the MAC and CMS correct the error using the established process and within the established schedule for requesting corrections to the wage index data, before the beginning of the fiscal year for the applicable IPPS update (that is, by the June 2, 2014 deadline for the FY 2015 wage index); and (3) CMS agreed before October 1 that the MAC or CMS made an error in tabulating the hospital's wage index data and the wage index should be corrected.
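
The three conditions of 42 CFR 412.64(k)(2) operate as a simple conjunction, as the following brief sketch illustrates (in Python; the function and its inputs are hypothetical restatements of the conditions above, not any CMS system logic):

```python
# Illustrative sketch only -- a restatement of the three conditions above,
# not an implementation of any CMS system.

def retroactive_correction_permitted(tabulation_error_by_mac_or_cms,
                                     correction_requested_timely,
                                     cms_agreed_before_october_1):
    """A wage index change may be made retroactive to the beginning of the
    Federal fiscal year only if all three conditions are met; otherwise any
    correction is prospective only."""
    return (tabulation_error_by_mac_or_cms
            and correction_requested_timely
            and cms_agreed_before_october_1)


# Example: an error requested timely but not agreed to by CMS before October 1
# would be corrected prospectively only.
print(retroactive_correction_permitted(True, True, False))  # False
```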

In those circumstances where a hospital requested a correction to its wage index data before CMS calculated the final wage index (that is, by the June 2, 2014 deadline for the FY 2015 wage index), and CMS acknowledges that the error in the hospital's wage index data was caused by CMS' or the MAC's mishandling of the data, we believe that the hospital should not be penalized by our delay in publishing or implementing the correction. As with our current policy, we indicated that the provision is not available to a hospital seeking to revise another hospital's data. In addition, the provision cannot be used to correct prior years' wage index data; and it can only be used for the current Federal fiscal year. In situations where our policies would allow midyear corrections other than those specified in 42 CFR 412.64(k)(2)(ii), we continue to believe that it is appropriate to make prospective-only corrections to the wage index.

We note that, as with prospective changes to the wage index, the final retroactive correction will be made irrespective of whether the change increases or decreases a hospital's payment rate. In addition, we note that the policy of retroactive adjustment will still apply in those instances where a final judicial decision reverses a CMS denial of a hospital's wage index data revision request.

K. Notice of Change to Wage Index Development Timetable

As explained in section III.J. of the preamble of this final rule, the preliminary, unaudited Worksheet S-3 wage data and occupational mix survey data files for the proposed FY 2015 wage index were made available on September 13, 2013, through the Internet on the CMS Web site. The posting of these preliminary files initiates what is virtually a year-long cycle for developing the wage index associated with the following IPPS fiscal year. This lengthy, almost year-long cycle is unique to the development of the IPPS wage index, and occurs independently from the development of the IPPS proposed and final rules, which typically are published in the spring and summer each year. In addition, the wage index, which is based on hospitals' wage data reported on Worksheets S-3, Parts II and III of Form CMS-2552-10 of the Medicare cost report and occupational mix data, is the only portion of the IPPS that historically has been subject to its own annual review process, first by the MACs, and then by CMS, followed by distinct opportunities for hospitals to appeal decisions made by the MACs or CMS. This process is separate and independent from the standard cost report settlement and appeals processes established under the regulations at 42 CFR 405.1800 through 405.1889.

Although this unique wage index development timetable has been in place since the early days of the IPPS, the current timetable is rooted in changes adopted in the FY 1998 IPPS final rule with comment period (62 FR 45990 through 45993). However, with numerous legislative and regulatory changes made to the IPPS since FY 1998, the demands on hospitals, MACs, and CMS have increased substantially. As a result, it has become increasingly challenging for wage index stakeholders to manage the wage index timetable alongside competing priorities. For the FY 2015 wage index, CMS made slight changes to the wage index development timetable by posting the preliminary public use file (PUF) in September 2013 rather than in October 2013, which, in turn, moved the deadline for hospitals to request revisions to the data displayed in that preliminary PUF to November 2013, instead of December 2013. In addition, the date for the MACs to complete desk reviews on those data was similarly moved to a slightly earlier deadline in early CY 2014. The FY 2015 Wage Index Development Timetable, which is posted on the CMS Web site at http://www.cms.gov/Medicare/Medicare-Fee-for-Service-Payment/AcuteInpatientPPS/Downloads/FY2015-WI-Timeline.pdf, shows that hospitals had a little more than 2 months to request revisions to their data displayed in the September 13, 2013 preliminary PUF, until the commencement of the desk review process by the MACs on November 21, 2013. The MACs also had a little more than 2 months to complete the desk reviews and submit revised cost report data to CMS by January 29, 2014. Less than a month later, on February 20, 2014, the revised FY 2015 wage index and occupational mix PUFs were posted on the CMS Web site. Ensuring the accuracy of the February PUF is extremely important and beneficial to hospitals because, as the timetable shows, it is the basis for hospitals to appeal data that are incorrect, with March 3, 2014 being the last date that hospitals could request revisions to errors in the February 20, 2014 PUF.

Therefore, we want to take steps to improve the accuracy of the February PUF, most importantly by proposing changes to the wage index timetables for future IPPS fiscal years that are much more significant and fundamental than the slight revisions to the timetable implemented for FY 2015. In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28082), we stated that we believe that the changes we proposed in that proposed rule would not only improve the accuracy of the February PUF, but also would reduce the number of hospital appeals based on the February PUF. For example, as specified below, instead of the current timetable which only provides CMS with less than a month to review the MACs' desk reviews and prepare the February PUF, we proposed approximately 3 months between the date that the MACs' desk reviews would end and the date that CMS would post the subsequent PUF. To allow hospitals and MACs adequate time to prepare for the changes to the wage index development timetable, we proposed to make significant changes beginning with the FY 2017 wage index cycle. We listed the proposed changes for FY 2017 in a table in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28082) shown below side by side with the existing timetable so that commenters could read the proposed changes in the context of the existing timetable. Under the proposed changes for FY 2017, although we did not provide exact dates for the FY 2017 wage index timetable, we noted that, with every change listed, we intend to provide hospitals and MACs with the same or somewhat more time than under the current timetable to complete reviews and request revisions. We stated that the proposed revisions would not reduce the amount of time that either hospitals or MACs have to review wage data. Therefore, the proposed changes would not result in additional work on the part of the hospitals or MACs; in fact, in shifting the various dates, we expect that more time would be provided to hospitals, MACs, and CMS to ensure an even more accurate wage index.

Deadlines | FY 2015 timetable | Proposed FY 2017 timetable
Posting of Preliminary PUF on CMS Web site | September 13, 2013 | Mid-May 2015.
Deadline for Hospitals to Request Revisions to Preliminary PUF | November 21, 2013 | Early August 2015.
Deadline for MACs to Complete Desk Reviews | January 29, 2014 | Mid-October 2015.
Posting of February PUF on CMS Web site | February 20, 2014 | Late January 2016.
Deadline Following Posting of February PUF for Hospitals to Request Revisions | March 3, 2014 | Mid-February 2016.
Completion of Appeals by MACs and Transmission of Final Wage Data to CMS | April 9, 2014 | Mid- to Late March 2016.
Deadline for Hospitals to Appeal in April | April 16, 2014 | Early April 2016.
Posting of Final Rule PUF | May 2, 2014 | Late April 2016.
Deadline for Hospitals to Appeal in June | June 2, 2014 | Late May 2016.
Expected Issuance of IPPS final rule | August 1, 2014 | August 1, 2016.

With regard to the FY 2016 wage index cycle, we believe it can serve as a transition to the more significant changes we proposed for the FY 2017 wage index cycle. We believe that there are steps we can take to improve the accuracy of the February 2016 PUF by building in more time to the FY 2016 wage index review process as well. Specifically, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28083), we stated that we were notifying hospitals of changes to the deadlines only in the beginning of the FY 2016 wage index timetable, as a transition to the more significant proposed changes for the entire FY 2017 wage index timetable. That is, for FY 2016, we were only changing the following four dates: The posting of the preliminary wage index PUF; the posting of the CY 2013 occupational mix survey data preliminary PUF; the deadline for hospitals to request revisions to the wage data and occupational mix data preliminary PUFs; and the deadline for MACs to complete the desk reviews. We stated that we were not changing the remainder of the FY 2016 timetable at this time. We stated that we expect that making these changes for the FY 2016 timetable will improve the accuracy of the February 2016 PUF, and also mitigate the number of hospital appeals based on the February 2016 PUF. In addition, we believe these changes will help hospitals, MACs, and CMS adjust to the more significant timeline changes proposed for FY 2017. We listed only the changes for FY 2016 in the table shown below side by side with the existing FY 2015 timetable so that commenters could read the FY 2016 changes in the context of the existing timetable. We stated that we were not listing dates that would remain unchanged for FY 2016.

Deadlines | FY 2015 timetable | Adjusted FY 2016 timetable
Posting of Preliminary Wage Data PUF on CMS Web site | September 13, 2013 | Late May 2014
Posting of Preliminary CY 2013 Occupational Mix Data PUF on CMS Web site | September 13, 2013 | Early to Mid-July 2014
Deadline for Hospitals to Request Revisions to Preliminary PUF | November 21, 2013 | Early October 2014
Deadline for MACs to Complete Desk Reviews | January 29, 2014 | Mid-December 2014

Typically, the preliminary PUF initiating the start of an IPPS wage index fiscal year contains one spreadsheet with the Worksheet S-3 wage data for the applicable fiscal year on one tab, and another tab with the preliminary occupational mix data for that fiscal year. For the FY 2016 wage index, new occupational mix survey data will be available for use, based on the CY 2013 occupational mix survey. Hospitals were required to submit their CY 2013 occupational mix surveys to their MACs no later than July 1, 2014. Therefore, we did not have the preliminary CY 2013 occupational mix survey data in time to post it simultaneously in late May 2014 with the preliminary FY 2016 wage data. Accordingly, as the table above indicates, we posted the preliminary FY 2016 wage data by itself first in late May 2014, followed by a separate posting of the preliminary CY 2013 occupational mix survey data when the data became available, in mid-July 2014.

We invited public comments on our proposals set forth above to make revisions to the wage index timetables for FY 2017.

Comment: Numerous commenters supported the general concept of changing the wage index timeline and agreed that the overall accuracy of the wage index could be improved by altering the timing of the process. Commenters generally agreed with CMS' adjusted FY 2016 timetable, which specified that the preliminary PUF would be posted in May 2014 and that hospitals would request revisions to the preliminary PUF by early October 2014. Commenters believed the extra time between the posting of the preliminary PUF and the desk review program would allow hospitals more time to “scrub” their data. However, commenters also asked that CMS work with its MACs to ensure that the MACs also are meeting their respective deadlines, as some hospitals have noticed that their MACs missed deadlines to submit revisions to CMS.

With respect to the adjustments to the FY 2017 timetable, the commenters believed that an early August 2015 deadline for hospitals to request revisions to the May 2015 preliminary PUFs was too ambitious because it would not provide sufficient time for hospitals to review their data, particularly when key personnel may be on vacation during the summer months. The commenters added that an August deadline would leave less time to compare the preliminary wage index information to the prior year's wage index data, given that the prior year's data are not even finalized and available to the public before August 1. Some commenters recommended an early October deadline, while others stated that an early September, mid-September, or a late September deadline would be feasible. One commenter believed that a December deadline would be best for hospitals with June 30 fiscal year ends, while another commenter stated that a late September or early October deadline would be acceptable for such hospitals. One commenter stated that the proposed FY 2017 deadline does not provide enough time for hospitals to incorporate their pension data into the desk review process because the Internal Revenue Service (IRS) Form 5500 (used as the basis for reporting pension contributions for defined benefit plans) is due 7 months after the end of the plan year (July 31), with possible extensions through mid-September. The commenter recommended that CMS either move the proposed deadline to October, or allow hospitals to submit their revisions for pension data during the MAC desk review process.

Response: We appreciate the commenters' general support for our proposed revisions to the wage index timetable. We listed general timeframes in the FY 2016 timetable but will communicate the exact dates for the FY 2016 timetable to hospitals through their MACs after issuance of this FY 2015 final rule. Regarding the FY 2017 Wage Index Timetable, we understand the commenters' concerns that an August deadline for hospitals to submit revisions to their preliminary wage data may be too challenging to meet. However, while almost all of the commenters believed that an August deadline was too ambitious, there was no consensus from the commenters regarding when the deadline should be, with recommendations ranging from early September to December. We also partially agree with the commenter who raised the point that hospitals may not be able to provide their pension data until October, as further discussed below. In addition, we note that commenters requested that CMS work with the MACs to ensure that the MACs are meeting their respective deadlines. We understand that the MACs have also faced pressure to accurately complete desk reviews and submit to CMS the appropriate revisions on behalf of hospitals in a timely fashion. The longer the time that hospitals have to submit revisions to their preliminary wage data, the less time the MACs have to conduct their desk reviews. Therefore, we believe that it is important to accommodate both the hospitals' and the MACs' need for adequate time to review the wage and occupational mix data. Because the earliest deadline that commenters stated would be feasible is early September, we are finalizing a date within the first week of September 2015 (rather than early August) as the deadline for hospitals to request revisions to their FY 2017 preliminary wage and occupational mix data. A deadline in early September would be manageable for hospitals, yet would also provide the MACs with as much time as possible to complete their desk reviews. In addition to the general deadline of early September, we are providing a limited exception for the submission of certain hospitals' pension data. Specifically, we are providing an extension, to mid-October, only for hospitals that have a fiscal year begin date on or after August 15 of a year to submit their pension data, because hospitals with fiscal year begin dates prior to August 15 would have already made their 3-year pension contributions by the end of September. We believe that the majority of hospitals, which have fiscal year begin dates prior to August 15 of a year, would be able to submit their pension data, along with the remainder of their wage index documentation, to their MACs by the beginning of September each year. In this final rule, we are changing our wage index timetable for FY 2016 and after so that hospitals with fiscal years that begin on or after August 15 may submit their pension data to their MACs by mid-October. However, in future rulemaking, we may consider revisions to the 3-year average pension policy, which would allow all hospitals to submit their pension data at the same time. For FY 2017, the MACs will work on the desk reviews until mid-November 2015 (instead of mid-October, as proposed). Following are the revised FY 2016 and FY 2017 Wage Index Timetables that we are finalizing:

FY 2016 Wage Index Timetable

Deadlines | FY 2015 timetable | Adjusted FY 2016 timetable
Posting of Preliminary Wage Data PUF on CMS Web site | September 13, 2013 | May 23, 2014.
Posting of Preliminary CY 2013 Occupational Mix Data PUF on CMS Web site | September 13, 2013 | July 11, 2014.
Deadline for Hospitals to Request Revisions to Preliminary PUF | November 21, 2013 | Early October 2014.
Deadline for Hospitals with FYBs on or after August 15 to Submit Pension Data to MACs | November 21, 2013 | Mid-October 2014.
Deadline for MACs to Complete Desk Reviews | January 29, 2014 | Mid-December 2014.

FY 2017 Wage Index Timetable

Deadlines | FY 2015 timetable | FY 2017 timetable
Posting of Preliminary PUF on CMS Web site | September 13, 2013 | Mid-May 2015.
Deadline for Hospitals to Request Revisions to Preliminary PUF | November 21, 2013 | First week of September 2015.
Deadline for Hospitals with FYBs on or after August 15 to Submit Pension Data to MACs | November 21, 2013 | Mid-October 2015.
Deadline for MACs to Complete Desk Reviews | January 29, 2014 | Mid-November 2015.
Posting of February PUF on CMS Web site | February 20, 2014 | Late January 2016.
Deadline Following Posting of February PUF for Hospitals to Request Revisions | March 3, 2014 | Mid-February 2016.
Completion of Appeals by MACs and Transmission of Final Wage Data to CMS | April 9, 2014 | Mid- to Late March 2016.
Deadline for Hospitals to Appeal in April | April 16, 2014 | Early April 2016.
Posting of Final Rule PUF | May 2, 2014 | Late April 2016.
Deadline for Hospitals to Appeal in June | June 2, 2014 | Late May 2016.
Expected Issuance of IPPS final rule | August 1, 2014 | August 1, 2016.

Comment: Commenters asked that CMS instruct MACs to notify State hospital associations of aberrant data, in addition to the current practice of notifying State hospital associations about hospitals that do not respond to requests for data. In addition, commenters recommended that CMS provide more instructions to MACs and hospitals regarding how to correct errors and the timeframe for correcting errors. They believed that this action is necessary because the notification to hospital associations would be after the deadline for hospitals to request data adjustments. Another commenter suggested that accuracy and consistency in wage index verification would be improved if CMS would assign a single MAC to review all wage index data.

Response: We will take these comments into consideration as we develop the details of the Wage Index Timetables and the desk review instructions that we provide to the MACs.

L. Labor-Related Share for the FY 2015 Wage Index

Section 1886(d)(3)(E) of the Act directs the Secretary to adjust the proportion of the national prospective payment system base payment rates that are attributable to wages and wage-related costs by a factor that reflects the relative differences in labor costs among geographic areas. It also directs the Secretary to estimate from time to time the proportion of hospital costs that are labor-related: “The Secretary shall adjust the proportion (as estimated by the Secretary from time to time) of hospitals' costs which are attributable to wages and wage-related costs of the DRG prospective payment rates. . . .” We refer to the portion of hospital costs attributable to wages and wage-related costs as the labor-related share. The labor-related share of the prospective payment rate is adjusted by an index of relative labor costs, which is referred to as the wage index.

Section 403 of Public Law 108-173 amended section 1886(d)(3)(E) of the Act to provide that the Secretary must employ 62 percent as the labor-related share unless this “would result in lower payments to a hospital than would otherwise be made.” However, this provision of Public Law 108-173 did not change the legal requirement that the Secretary estimate “from time to time” the proportion of hospitals' costs that are “attributable to wages and wage-related costs.” Thus, hospitals receive payment based on either a 62-percent labor-related share, or the labor-related share estimated from time to time by the Secretary, depending on which labor-related share resulted in a higher payment.

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50596 through 50607), we rebased and revised the hospital market basket. We established a FY 2010-based IPPS hospital market basket to replace the FY 2006-based IPPS hospital market basket, effective October 1, 2013. In that final rule, we presented our analysis and conclusions regarding the frequency and methodology for updating the labor-related share for FY 2014. Using the FY 2010-based IPPS market basket, we finalized a labor-related share for FY 2014 of 69.6 percent. In addition, we implemented this revised and rebased labor-related share in a budget neutral manner. However, consistent with section 1886(d)(3)(E) of the Act, we did not take into account the additional payments that would be made as a result of hospitals with a wage index less than or equal to 1.0000 being paid using a labor-related share lower than the labor-related share of hospitals with a wage index greater than 1.0000.

The labor-related share is used to determine the proportion of the national IPPS base payment rate to which the area wage index is applied. In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28083), for FY 2015, we did not propose to make any further changes to the national average proportion of operating costs that are attributable to wages and salaries, employee benefits, contract labor, the labor-related portion of professional fees, administrative and facilities support services, and all other labor-related services. Therefore, for FY 2015, we proposed to continue to use a labor-related share of 69.6 percent for discharges occurring on or after October 1, 2014.

Tables 1A and 1B, which were published in section VI. of the Addendum to the FY 2015 IPPS/LTCH PPS proposed rule and available via the Internet on the CMS Web site, reflected this proposed labor-related share. For FY 2015, for all IPPS hospitals whose wage indexes are less than or equal to 1.0000, we proposed to apply the wage index to a labor-related share of 62 percent of the national standardized amount. For all IPPS hospitals whose wage indexes are greater than 1.0000, for FY 2015, we proposed to apply the wage index to a proposed labor-related share of 69.6 percent of the national standardized amount. We note that, for Puerto Rico hospitals, the national labor-related share is 62 percent because the national wage index for all Puerto Rico hospitals is less than 1.0000.

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50601 through 50603), we also rebased and revised the labor-related share for the Puerto Rico-specific standardized amounts using FY 2010 as a base year. We finalized a labor-related share for the Puerto Rico-specific standardized amounts for FY 2014 of 63.2 percent. In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28084), for FY 2015, we did not propose to make any further changes to the Puerto Rico-specific average proportion of operating costs that are attributable to wages and salaries, employee benefits, contract labor, the labor-related portion of professional fees, administrative and facilities support services, and all other labor-related services. For FY 2015, we proposed to continue to use a labor-related share for the Puerto Rico-specific standardized amounts of 63.2 percent for discharges occurring on or after October 1, 2014. Puerto Rico hospitals are paid based on 75 percent of the national standardized amounts and 25 percent of the Puerto Rico-specific standardized amounts. For FY 2015, we proposed that the labor-related share of a hospital's Puerto Rico-specific rate would be either the Puerto Rico-specific labor-related share of 63.2 percent or 62 percent, depending on which results in higher payments to the hospital. If the hospital has a Puerto Rico-specific wage index greater than 1.000 for FY 2015, we proposed to set the hospital's rates using a labor-related share of 63.2 percent for the 25-percent portion of the hospital's payment determined by the Puerto Rico standardized amounts because this amount would result in higher payments. Conversely, a hospital with a Puerto Rico-specific wage index of less than or equal to 1.000 for FY 2015 would be paid using a labor-related share of 62 percent of the Puerto Rico-specific rates because the lower labor-related share would result in higher payments. The proposed Puerto Rico labor-related share of 63.2 percent for FY 2015 is reflected in Table 1C, which was published in section VI. of the Addendum to the FY 2015 IPPS/LTCH PPS proposed rule and available via the Internet on the CMS Web site.

Comment: One commenter believed that CMS has provided incentives for hospitals to reduce costs through a declining wage index. The commenter stated that CMS has not kept pace by adjusting the labor-related share of 62 percent for hospitals with a wage index below 1.0000. The commenter noted that current law requires a labor-related share of 62 percent for hospitals with a wage index less than or equal to 1.0000. However, the commenter requested that, despite current law, in consideration of its comments, CMS lower the labor-related share from 62 percent to 42 percent for hospitals with a wage index below 1.0000.

One commenter recommended that CMS compute an alternative labor and nonlabor-related share percentage under the national standardized amount for hospitals in Puerto Rico. The commenter explained that the current labor-related share percentage of 62 percent under the national standardized amounts meets the statutory definition in section 1886(d)(3)(E) of the Act, resulting in lower payments for providers in Puerto Rico. Therefore, the commenter believed that CMS should calculate an alternative national labor-related share percentage for hospitals in Puerto Rico that is lower than 62 percent.

Response: As mentioned by the commenter, current law requires that the labor-related share be set at 62 percent for hospitals with a wage index less than or equal to 1.0000. Specifically, as discussed above, section 403 of Public Law 108-173 amended section 1886(d)(3)(E) of the Act to provide that the Secretary must employ 62 percent as the labor-related share unless this “would result in lower payments to a hospital than would otherwise be made.” Therefore, we are unable to change the labor-related share of 62 percent. In addition, the commenters did not provide any empirical data to demonstrate why a lower labor-related share percentage is justified. Therefore, we are unable to verify the commenters' statement.

After consideration of public comments received, we are finalizing our proposals without modification. For FY 2015, we are continuing to use a labor-related share of 69.6 percent for discharges occurring on or after October 1, 2014. Tables 1A and 1B, which are published in section VI. of the Addendum to this final rule and available via the Internet on the CMS Web site, reflect this labor-related share. For FY 2015, for all IPPS hospitals whose wage indexes are less than or equal to 1.0000, we are applying the wage index to a labor-related share of 62 percent of the national standardized amount. For all IPPS hospitals whose wage indexes are greater than 1.0000, for FY 2015, we are applying the wage index to a labor-related share of 69.6 percent of the national standardized amount. For Puerto Rico hospitals, the national labor-related share is 62 percent because the national wage index for all Puerto Rico hospitals is less than 1.0000. For FY 2015, we also are continuing to use a labor-related share for the Puerto Rico-specific standardized amounts of 63.2 percent for discharges occurring on or after October 1, 2014. Puerto Rico hospitals are paid based on 75 percent of the national standardized amounts and 25 percent of the Puerto Rico-specific standardized amounts. For FY 2015, the labor-related share of a hospital's Puerto Rico-specific rate will be either the Puerto Rico-specific labor-related share of 63.2 percent or 62 percent, depending on which results in higher payments to the hospital. If the hospital has a Puerto Rico-specific wage index greater than 1.000 for FY 2015, we will set the hospital's rates using a labor-related share of 63.2 percent for the 25-percent portion of the hospital's payment determined by the Puerto Rico standardized amounts because this amount will result in higher payments. The Puerto Rico labor-related share of 63.2 percent for FY 2015 is reflected in Table 1C, which is published in section VI. of the Addendum to this final rule and available via the Internet on the CMS Web site.
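For illustration only, the following minimal Python sketch shows the labor-related share selection logic described above. The function name, argument names, and example wage index values are hypothetical and are not part of the rule; actual payment calculations involve many other factors not modeled here.

```python
def labor_related_share(wage_index, estimated_share=0.696, statutory_share=0.62):
    """Return the labor-related share that yields the higher payment.

    Under section 1886(d)(3)(E) of the Act, the 62-percent share applies
    unless the share estimated by the Secretary (69.6 percent for FY 2015)
    would result in higher payments. Because the labor-related share is the
    portion of the rate multiplied by the wage index, the larger estimated
    share benefits a hospital only when its wage index exceeds 1.0000.
    """
    return estimated_share if wage_index > 1.0000 else statutory_share


# Illustrative wage index values only.
for wage_index in (1.1000, 1.0000, 0.9000):
    print(wage_index, labor_related_share(wage_index))
# Prints 0.696 for the first value and 0.62 for the other two.
```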

IV. Other Decisions and Changes to the IPPS for Operating Costs and Graduate Medical Education (GME) Costs

A. Changes to MS-DRGs Subject to the Postacute Care Transfer Policy (§ 412.4)

1. Background

Existing regulations at § 412.4(a) define discharges under the IPPS as situations in which a patient is formally released from an acute care hospital or dies in the hospital. Section 412.4(b) defines acute care transfers, and § 412.4(c) defines postacute care transfers. Our policy, set forth in § 412.4(f), provides that when a patient is transferred and his or her length of stay is less than the geometric mean length of stay for the MS-DRG to which the case is assigned, the transferring hospital is generally paid based on a graduated per diem rate for each day of stay, not to exceed the full MS-DRG payment that would have been made if the patient had been discharged without being transferred.

The per diem rate paid to a transferring hospital is calculated by dividing the full DRG payment by the geometric mean length of stay for the MS-DRG. Based on an analysis that showed that the first day of hospitalization is the most expensive (60 FR 45804), our policy generally provides for payment that is twice the per diem amount for the first day, with each subsequent day paid at the per diem amount up to the full MS-DRG payment (§ 412.4(f)(1)). Transfer cases are also eligible for outlier payments. In general, the outlier threshold for transfer cases, as described in § 412.80(b), is equal to the fixed-loss outlier threshold for nontransfer cases (adjusted for geographic variations in costs), divided by the geometric mean length of stay for the MS-DRG, and multiplied by the length of stay for the case, plus one day.
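To make the graduated per diem and transfer-case outlier threshold calculations described above concrete, here is a minimal Python sketch. The function names and inputs are illustrative only; actual claims processing involves additional adjustments that are not modeled here.

```python
def transfer_per_diem_payment(full_drg_payment, geometric_mean_los, length_of_stay):
    """Graduated per diem payment for a postacute care transfer case: the per
    diem is the full MS-DRG payment divided by the geometric mean length of
    stay; the first day is paid at twice the per diem and each subsequent day
    at the per diem, capped at the full MS-DRG payment."""
    per_diem = full_drg_payment / geometric_mean_los
    payment = 2 * per_diem + per_diem * max(length_of_stay - 1, 0)
    return min(payment, full_drg_payment)


def transfer_outlier_threshold(adjusted_fixed_loss_threshold, geometric_mean_los, length_of_stay):
    """Outlier threshold for a transfer case as described in § 412.80(b): the
    geographically adjusted fixed-loss threshold divided by the geometric mean
    length of stay, multiplied by the length of stay plus one day."""
    return adjusted_fixed_loss_threshold / geometric_mean_los * (length_of_stay + 1)


# Illustrative values only: a $10,000 full MS-DRG payment, a geometric mean
# length of stay of 5 days, and a 2-day transfer case pay 3 per diems ($6,000).
print(transfer_per_diem_payment(10_000, 5.0, 2))
```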

We established the criteria set forth in § 412.4(d) for determining which DRGs qualify for postacute care transfer payments in the FY 2006 IPPS final rule (70 FR 47419 through 47420). The determination of whether a DRG is subject to the postacute care transfer policy was initially based on the Medicare Version 23.0 GROUPER (FY 2006) and data from the FY 2004 MedPAR file. However, if a DRG did not exist in Version 23.0, or if a DRG included in Version 23.0 is revised, we use the current version of the Medicare GROUPER and the most recent complete year of MedPAR data to determine whether the DRG is subject to the postacute care transfer policy. Specifically, if the MS-DRG's total number of discharges to postacute care equals or exceeds the 55th percentile for all MS-DRGs and the proportion of short-stay discharges to postacute care to total discharges in the MS-DRG exceeds the 55th percentile for all MS-DRGs, CMS will apply the postacute care transfer policy to that MS-DRG and to any other MS-DRG that shares the same base MS-DRG. In the preamble to the FY 2006 IPPS final rule (70 FR 47419), we stated that “we will not revise the list of DRGs subject to the postacute care transfer policy annually unless we are making a change to a specific DRG.”
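As a schematic of the two § 412.4(d) screens just described, the following Python sketch checks whether a single MS-DRG meets the general postacute care transfer criteria. The function and variable names are hypothetical, and the shared-base-MS-DRG rule is noted only in a comment.

```python
def meets_postacute_transfer_criteria(postacute_transfers,
                                      short_stay_share_pct,
                                      transfers_55th_percentile,
                                      short_stay_share_55th_percentile_pct):
    """An MS-DRG is subject to the policy when its total discharges to
    postacute care equal or exceed the 55th percentile for all MS-DRGs and
    its proportion of short-stay discharges to postacute care (as a percent
    of all discharges) exceeds the 55th percentile for all MS-DRGs. Any
    other MS-DRG sharing the same base MS-DRG also becomes subject."""
    return (postacute_transfers >= transfers_55th_percentile
            and short_stay_share_pct > short_stay_share_55th_percentile_pct)


# Illustrative check using the FY 2013 figures shown in the table later in
# this section for MS-DRG 483 (17,289 postacute transfers; 5.49 percent
# short-stay share) against the 55th percentile values (1,471; 7.9060 percent).
print(meets_postacute_transfer_criteria(17_289, 5.49, 1_471, 7.9060))  # False
```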

To account for MS-DRGs subject to the postacute care transfer policy that exhibit exceptionally high shares of costs very early in the hospital stay, § 412.4(f) also includes a special payment methodology. For these MS-DRGs, hospitals receive 50 percent of the full MS-DRG payment, plus the single per diem payment, for the first day of the stay, as well as a per diem payment for subsequent days, up to the full MS-DRG payment (§ 412.4(f)(6)). For an MS-DRG to qualify for the special payment methodology, the geometric mean length of stay must be greater than 4 days, and the average charges of 1-day discharge cases in the MS-DRG must be at least 50 percent of the average charges for all cases within the MS-DRG. MS-DRGs that are part of an MS-DRG group will qualify under the DRG special payment policy if any one of the MS-DRGs that share that same base MS-DRG qualifies (§ 412.4(f)(6)).
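A comparable Python sketch of the special payment methodology and its qualifying test follows. As before, the names are illustrative, and the shared-base qualification rule is only noted in a comment rather than modeled.

```python
def special_transfer_payment(full_drg_payment, geometric_mean_los, length_of_stay):
    """Special payment methodology under § 412.4(f)(6): 50 percent of the full
    MS-DRG payment plus one per diem for the first day, a per diem for each
    subsequent day, capped at the full MS-DRG payment."""
    per_diem = full_drg_payment / geometric_mean_los
    payment = 0.5 * full_drg_payment + per_diem + per_diem * max(length_of_stay - 1, 0)
    return min(payment, full_drg_payment)


def meets_special_methodology_criteria(geometric_mean_los, avg_charges_one_day, avg_charges_all):
    """The MS-DRG's geometric mean length of stay must exceed 4 days, and its
    average 1-day-discharge charges must be at least 50 percent of the average
    charges for all cases in the MS-DRG (an MS-DRG may also qualify if another
    MS-DRG sharing its base MS-DRG qualifies, which is not modeled here)."""
    return geometric_mean_los > 4 and avg_charges_one_day >= 0.5 * avg_charges_all
```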

2. Changes to the Postacute Care Transfer MS-DRGs

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28084 through 28086), we discussed that, based on our annual review of MS-DRGs, we had identified a number of MS-DRGs that should be included on the list of MS-DRGs subject to the postacute care transfer policy. In response to public comments and based on our analysis of FY 2013 MedPAR claims data, we proposed to make several changes to MS-DRGs to better capture certain severity of illness levels, to be effective for FY 2015. Specifically, we proposed to modify the assignment of endovascular cardiac valve replacements currently assigned to MS-DRGs 216 (Cardiac Valve & Other Major Cardiothoracic Procedures with Cardiac Catheterization with MCC), 217 (Cardiac Valve & Other Major Cardiothoracic Procedures with Cardiac Catheterization with CC), 218 (Cardiac Valve & Other Major Cardiothoracic Procedures with Cardiac Catheterization without CC/MCC), 219 (Cardiac Valve & Other Major Cardiothoracic Procedures without Cardiac Catheterization with MCC), 220 (Cardiac Valve & Other Major Cardiothoracic Procedures without Cardiac Catheterization with CC), and 221 (Cardiac Valve & Other Major Cardiothoracic Procedures without Cardiac Catheterization without CC/MCC) to MS-DRGs 266 and 267 (Endovascular Cardiac Valve Replacement with and without MCC, respectively) to better reflect the differences in patients receiving endovascular cardiac valve replacements from patients who undergo an open chest cardiac valve replacement. We also proposed to further refine back and neck procedures currently assigned to MS-DRGs 490 and 491 (Back & Neck Procedure Except Spinal Fusion with CC/MCC or Disc Device/Neurostimulator and without CC/MCC or Disc Device/Neurostimulator, respectively) into additional severity levels, now identified as MS-DRGs 518, 519, and 520 (Back & Neck Procedure Except Spinal Fusion with MCC or Disc Device/Neurostimulator, with CC, and without MCC/CC, respectively). Finally, we proposed to remove the severity levels for reverse shoulder replacements, merging MS-DRGs 483 and 484 (Major Joint & Limb Reattachment Procedure of Upper Extremity with CC/MCC and without CC/MCC, respectively) into MS-DRG 483 (Major Joint/Limb Reattachment Procedure of Upper Extremities). A discussion of these proposed changes can be found in section II.G.4.c., II.G.5.c. and II.G.5.a., respectively, of the preamble of the proposed rule.

In light of these proposed changes to the MS-DRGs, in accordance with the regulations under § 412.4(c), we evaluated these proposed FY 2015 MS-DRGs against the general postacute care transfer policy criteria using the FY 2013 MedPAR data. If an MS-DRG qualified for the postacute care transfer policy, we also evaluated that MS-DRG under the special payment methodology criteria according to the regulations at § 412.4(f)(6). We continue to believe it is appropriate to reassess MS-DRGs when proposing reassignment of diagnostic codes that would result in material changes to an MS-DRG. As a result of our review, we found that MS-DRGs 216 through 221 would require no revisions in postacute care transfer or special payment policy status. However, we proposed to update the list of MS-DRGs that are subject to the postacute care transfer policy to include the proposed new MS-DRGs 266, 267, 518, 519, and 520. (These MS-DRGs are reflected in Table 5, which is listed in section VI. of the Addendum to this final rule and available via the Internet on the CMS Web site, and also are listed in the charts at the end of this section.)

In addition, based on our evaluation of the proposed FY 2015 MS-DRGs using the FY 2013 MedPAR data, we determined that proposed revised MS-DRG 483 would no longer meet the postacute care transfer criteria. Therefore, we proposed that it be removed from the list of MS-DRGs subject to the postacute care transfer policy, effective FY 2015. We refer readers to the asterisked (*) entries in the following table, which indicate the criterion that was not met in our analysis for each MS-DRG removed from the postacute care transfer policy list.

List of MS-DRGs That Would Change Postacute Care Transfer Policy Status In FY 2015

MS-DRG | MS-DRG title | Total cases | Postacute care transfers (55th percentile: 1,471) | Short-stay postacute care transfers | Percent of short-stay postacute care transfers to all cases (55th percentile: 7.9060%) | Postacute transfer policy status
266 | Endovascular Cardiac Valve Replacement with MCC | 4,086 | 2,851 | 1,030 | 25.21 | YES
267 | Endovascular Cardiac Valve Replacement w/o MCC | 4,476 | 2,800 | 835 | 18.66 | YES
483 | Major Joint/Limb Reattachment Procedure of Upper Extremities | 41,372 | 17,289 | 2,271 | * 5.49 | NO
518 | Back & Neck Procedure Except Spinal Fusion with MCC or Disc Device/Neurostimulator | 3,844 | 2,136 | 412 | 10.72 | YES
519 | Back & Neck Procedure Except Spinal Fusion with CC | 15,238 | 7,405 | 1,126 | * 7.39 | YES **
520 | Back & Neck Procedure Except Spinal Fusion without CC/MCC | 31,792 | 7,859 | 0 | * 0.00 | YES **
* Indicates a current postacute care transfer policy criterion that the MS-DRG did not meet.
** As described in the policy at 42 CFR 412.4(d)(3)(ii)(D), MS-DRGs that share the same base MS-DRG will all qualify under the postacute care transfer policy if any one of the MS-DRGs that share that same base MS-DRG qualifies.

Finally, we determined that MS-DRGs 266, 267, 518, 519, and 520 also would meet the criteria for the special payment methodology. Therefore, we proposed that they would be subject to the MS-DRG special payment methodology, effective FY 2015.

List of MS-DRGs That Changed DRG Special Payment Policy Status in FY 2015

MS-DRG | MS-DRG title | Geometric mean length of stay | Average charges of 1-day discharges | 50% of average charges for all cases within MS-DRG | Special pay policy status
266 | Endovascular Cardiac Valve Replacement with MCC | 8.3643 | $42,081 | $126,326 | YES *
267 | Endovascular Cardiac Valve Replacement without MCC | 5.0271 | $128,013 | $95,141 | YES
518 | Back & Neck Procedure Except Spinal Fusion with MCC or Disc Device/Neurostimulator | 4.2882 | $68,515 | $43,514 | YES
519 | Back & Neck Procedure Except Spinal Fusion with CC | 3.0507 | $0 | $0 | YES *
520 | Back & Neck Procedure Except Spinal Fusion without CC/MCC | 1.7315 | $0 | $0 | YES *
*As described in the policy at 42 CFR 412.4(d)(6)(iv), MS-DRGs that share the same base MS-DRG will all qualify under the DRG special payment policy if any one of the MS-DRGs that share that same base MS-DRG qualifies.

We did not receive any public comments regarding our proposals to change the postacute care transfer and the special payment policy status for the identified MS-DRGs. Therefore, we are adopting the proposed changes as final for FY 2015.

B. Changes in the Inpatient Hospital Update for FY 2015 (§ 412.64(d))

1. FY 2015 Inpatient Hospital Update

In accordance with section 1886(b)(3)(B)(i) of the Act, each year we update the national standardized amount for inpatient operating costs by a factor called the “applicable percentage increase.” In FY 2014, consistent with section 1886(b)(3)(B) of the Act, as amended by sections 3401(a) and 10319(a) of the Affordable Care Act, we set the applicable percentage increase under the IPPS by applying the following adjustments in the following sequence. Specifically, the applicable percentage increase under the IPPS is equal to the rate-of-increase in the hospital market basket for IPPS hospitals in all areas, subject to a reduction of 2.0 percentage points if the hospital fails to submit quality information under rules established by the Secretary in accordance with section 1886(b)(3)(B)(viii) of the Act, and then subject to an adjustment based on changes in economy-wide productivity (the multifactor productivity (MFP) adjustment), and an additional reduction of 0.3 percentage point as required by section 1886(b)(3)(B)(xii) of the Act. Sections 1886(b)(3)(B)(xi) and (b)(3)(B)(xii) of the Act, as added by section 3401(a) of the Affordable Care Act, state that application of the MFP adjustment and the additional FY 2014 adjustment of 0.3 percentage point may result in the applicable percentage increase being less than zero.

For FY 2015, there are three statutory changes to the applicable percentage increase compared to FY 2014. First, under section 1886(b)(3)(B)(viii) of the Act, beginning with FY 2015, the reduction in the applicable percentage increase for hospitals that fail to submit quality information under rules established by the Secretary is one-quarter of the applicable percentage increase (prior to the application of statutory adjustments under sections 1886(b)(3)(B)(ix), 1886(b)(3)(B)(xi), and 1886(b)(3)(B)(xii) of the Act), or one-quarter of the applicable market basket update. For FY 2014, the reduction to the applicable percentage increase for hospitals that failed to submit quality information under rules established by the Secretary was 2.0 percentage points. Second, beginning with FY 2015, section 1886(b)(3)(B)(ix) of the Act requires that any hospital that is not a meaningful electronic health record (EHR) user (as defined in section 1886(n)(3) of the Act and not subject to an exception under section 1886(b)(3)(B)(ix) of the Act) will have “three-quarters” of the applicable percentage increase (prior to the application of statutory adjustments under sections 1886(b)(3)(B)(viii), 1886(b)(3)(B)(xi), and 1886(b)(3)(B)(xii) of the Act), or three-quarters of the applicable market basket update, reduced by 33 1/3 percent. The reduction to three-quarters of the applicable percentage increase for those hospitals that are not meaningful EHR users increases to 66 2/3 percent for FY 2016, and, for FY 2017 and subsequent fiscal years, to 100 percent. Third, for FY 2015, section 1886(b)(3)(B)(xii) of the Act applies an additional reduction of 0.2 percentage point compared to 0.3 percentage point for FY 2014.

To summarize, for FY 2015, consistent with section 1886(b)(3)(B) of the Act, as amended by sections 3401(a) and 10319(a) of the Affordable Care Act, we are setting the applicable percentage increase by applying the following adjustments in the following sequence. Specifically, the applicable percentage increase under the IPPS is equal to the rate-of-increase in the hospital market basket for IPPS hospitals in all areas, subject to a reduction of one-quarter of the applicable percentage increase (prior to the application of other statutory adjustments; also referred to as the market basket update or rate-of-increase (with no adjustments)) for hospitals that fail to submit quality information under rules established by the Secretary in accordance with section 1886(b)(3)(B)(viii) of the Act and a 33 1/3 percent reduction to three-fourths of the applicable percentage increase (prior to the application of other statutory adjustments; also referred to as the market basket update or rate-of-increase (with no adjustments)) for hospitals not considered to be meaningful EHR users in accordance with section 1886(b)(3)(B)(ix) of the Act, and then subject to an adjustment based on changes in economy-wide productivity (the multifactor productivity (MFP) adjustment), and an additional reduction of 0.2 percentage point as required by section 1886(b)(3)(B)(xii) of the Act. As noted previously, sections 1886(b)(3)(B)(xi) and (b)(3)(B)(xii) of the Act, as added by section 3401(a) of the Affordable Care Act, state that application of the MFP adjustment and the additional FY 2015 adjustment of 0.2 percentage point may result in the applicable percentage increase being less than zero.

We note that, in compliance with section 404 of the MMA, in the FY 2014 IPPS/LTCH PPS final rule, we replaced the FY 2006-based IPPS operating and capital market baskets with the revised and rebased FY 2010-based IPPS operating and capital market baskets for FY 2014. In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28086), we proposed to continue to use the FY 2010-based IPPS operating and capital market baskets for FY 2015. We also proposed to continue to use a labor-related share that is reflective of the FY 2010 base year. For FY 2015, we proposed to continue using the labor-related share of 69.6 percent, which is based on the FY 2010-based IPPS market basket. We did not receive any public comments on this proposal and, therefore, for FY 2015, we will continue to use the FY 2010-based IPPS operating and capital market baskets and the labor-related share of 69.6 percent.

Based on the most recent data available for the FY 2015 proposed rule, in accordance with section 1886(b)(3)(B) of the Act, we proposed in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28087) to base the proposed FY 2015 market basket update used to determine the applicable percentage increase for the IPPS on IHS Global Insight, Inc.'s (IGI's) first quarter 2014 forecast of the FY 2010-based IPPS market basket rate-of-increase with historical data through fourth quarter 2013, which was estimated to be 2.7 percent. We proposed that if more recent data became subsequently available (for example, a more recent estimate of the market basket and the MFP adjustment), we would use such data, if appropriate, to determine the FY 2015 market basket update and MFP adjustment in the final rule.

Based on updated data for this FY 2015 IPPS/LTCH PPS final rule, that is, the IGI's second quarter 2014 forecast of the FY 2010-based IPPS market basket rate-of-increase with historical data through first quarter 2014, we estimate that the FY 2015 market basket update used to determine the applicable percentage increase for the IPPS is 2.9 percent.

In the FY 2012 IPPS/LTCH PPS final rule (76 FR 51689 through 51692), we finalized our methodology for calculating and applying the MFP adjustment. For FY 2015, we did not propose to make any change in our methodology for calculating and applying the MFP adjustment. For FY 2015, we proposed an MFP adjustment of −0.4 percentage point. Similar to the market basket adjustment, for the proposed rule, we used the most recent data available to compute the MFP adjustment.

Based on updated data for this final rule, we computed an MFP adjustment of 0.5 percentage point for FY 2015.

Comment: One commenter stated that the FY 2015 update factor is understated, as the productivity adjustment should be 0.4 (as projected in the proposed rule), not 0.5. The commenter stated that, as a result, instead of a 1.2 percent update factor, the projection should use a 1.3 percent update factor.

Response: As stated in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28087), the proposed productivity adjustment for FY 2015 was 0.4 percent. Furthermore, we proposed to make a 1.3 percent update to the national standardized amount (79 FR 28355), which reflects a proposed 2.7 percent market basket update, the proposed reduction of 0.4 percentage point for the multifactor productivity adjustment, the 0.2 percentage point reduction in accordance with the Affordable Care Act and the proposed FY 2015 documentation and coding recoupment adjustment of −0.8 percent on the national standardized amount as part of the recoupment required by section 631 of the ATRA.

As stated in the proposed rule, we proposed to use more recently available data to determine the final market basket and multifactor productivity adjustment. We did not receive any public comments on this proposal. Therefore, for this final rule, we are finalizing a market basket update of 2.9 percent and an MFP adjustment of 0.5 percent based on more recently available data.

For FY 2015, depending on whether a hospital submits quality data under the rules established in accordance with section 1886(b)(3)(B)(viii) of the Act (hereafter referred to as a hospital that submits quality data) and is a meaningful EHR user under section 1886(b)(3)(B)(ix) of the Act (hereafter referred to as a hospital that is a meaningful EHR user), we discussed in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28087) that there are four possible applicable percentage increases that can be applied to the standardized amount. As noted above, we proposed that if more recent data became subsequently available (for example, a more recent estimate of the market basket and the MFP adjustment), we would use such data, if appropriate, to determine the FY 2015 market basket update and MFP adjustment in the final rule.

We did not receive any public comments on the four applicable percentage increases and are finalizing our proposal. Based on the more recent data described earlier, we have determined the final applicable percentage increases to the standardized amount for FY 2015, as specified below.

  • For a hospital that submits quality data and is a meaningful EHR user, we are finalizing an applicable percentage increase to the FY 2015 operating standardized amount of 2.2 percent (that is, the FY 2015 estimate of the market basket rate-of-increase of 2.9 percent less an adjustment of 0.5 percentage point for economy-wide productivity (that is, the MFP adjustment) and less 0.2 percentage point).
  • For a hospital that submits quality data and is not a meaningful EHR user, we are finalizing an applicable percentage increase to the operating standardized amount of 1.475 percent (that is, the FY 2015 estimate of the market basket rate-of-increase of 2.9 percent, less an adjustment of 0.725 percentage point ((the market basket rate-of-increase of 2.9 percent × 0.75)/3) for failure to be a meaningful EHR user, less an adjustment of 0.5 percentage point for the MFP adjustment, and less an additional adjustment of 0.2 percentage point).
  • For a hospital that does not submit quality data and is a meaningful EHR user, we are finalizing an applicable percentage increase to the operating standardized amount of 1.475 percent (that is, the FY 2015 estimate of the market basket rate-of-increase of 2.9 percent, less an adjustment of 0.725 percentage point (the market basket rate-of-increase of 2.9 percent/4) for failure to submit quality data, less an adjustment of 0.5 percentage point for the MFP adjustment, and less an additional adjustment of 0.2 percentage point).
  • For a hospital that does not submit quality data and is not a meaningful EHR user, we are finalizing an applicable percentage increase to the operating standardized amount of 0.75 percent (that is, the FY 2015 estimate of the market basket rate-of-increase of 2.9 percent, less an adjustment of 0.725 percentage point (the market basket rate-of-increase of 2.9 percent/4) for failure to submit quality data, less an adjustment of 0.725 percentage point ((the market basket rate-of-increase of 2.9 percent × 0.75)/3) for failure to be a meaningful EHR user, less an adjustment of 0.5 percentage point for the MFP adjustment, and less an additional adjustment of 0.2 percentage point). Below we provide a table summarizing the four final applicable percentage increases, followed by an illustrative sketch of the arithmetic.

Final FY 2015 Applicable Percentage Increases for the IPPS

FY 2015 | Hospital submitted quality data and is a meaningful EHR user | Hospital submitted quality data and is NOT a meaningful EHR user | Hospital did NOT submit quality data and is a meaningful EHR user | Hospital did NOT submit quality data and is NOT a meaningful EHR user
Market Basket Rate-of-Increase | 2.9 | 2.9 | 2.9 | 2.9
Adjustment for Failure to Submit Quality Data under Section 1886(b)(3)(B)(viii) of the Act | 0.0 | 0.0 | −0.725 | −0.725
Adjustment for Failure to be a Meaningful EHR User under Section 1886(b)(3)(B)(ix) of the Act | 0.0 | −0.725 | 0.0 | −0.725
MFP Adjustment under Section 1886(b)(3)(B)(xi) of the Act | −0.5 | −0.5 | −0.5 | −0.5
Statutory Adjustment under Section 1886(b)(3)(B)(xii) of the Act | −0.2 | −0.2 | −0.2 | −0.2
Final Applicable Percentage Increase Applied to Standardized Amount | 2.2 | 1.475 | 1.475 | 0.75
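As an illustration of the arithmetic summarized in the table above, the following minimal Python sketch recomputes the four final applicable percentage increases. It is a simplified check of the percentages only, not a payment calculation, and the function and argument names are hypothetical.

```python
def fy2015_applicable_percentage_increase(submitted_quality_data, meaningful_ehr_user,
                                          market_basket=2.9, mfp=0.5, statutory=0.2):
    """All values are in percentage points. One-quarter of the market basket
    update is deducted for failure to submit quality data; one-third of
    three-quarters of the update is deducted for failure to be a meaningful
    EHR user; the MFP and statutory adjustments are then subtracted."""
    update = market_basket
    if not submitted_quality_data:
        update -= market_basket / 4            # 2.9 / 4 = 0.725
    if not meaningful_ehr_user:
        update -= (market_basket * 0.75) / 3   # (2.9 x 0.75) / 3 = 0.725
    return update - mfp - statutory


for quality, ehr in [(True, True), (True, False), (False, True), (False, False)]:
    print(quality, ehr, round(fy2015_applicable_percentage_increase(quality, ehr), 3))
# Prints 2.2, 1.475, 1.475, and 0.75, matching the final row of the table above.
```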

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28087), we proposed to revise the existing regulations at 42 CFR 412.64(d) to reflect the current law for the FY 2015 update. Specifically, in accordance with section 1886(b)(3)(B) of the Act, we proposed to add a new paragraph (vi) to § 412.64(d)(1) to reflect the applicable percentage increase to the FY 2015 operating standardized amount as the percentage increase in the market basket index, subject to a reduction of one-fourth of the applicable percentage increase (prior to the application of other statutory adjustments) if the hospital fails to submit quality information (under rules established by the Secretary in accordance with section 1886(b)(3)(B)(viii) of the Act) and a 33 1/3 percent reduction to three-fourths of the applicable percentage increase (prior to the application of other statutory adjustments) for a hospital that is not a meaningful EHR user in accordance with section 1886(b)(3)(B)(ix) of the Act, less an MFP adjustment and less an additional reduction of 0.2 percentage point.

In addition, we proposed to make technical changes to §§ 412.64(d)(1), (d)(1)(i) through (d)(1)(v), (d)(2)(i), (d)(2)(ii), and (d)(3) introductory text to reflect the order in which CMS applies the statutory adjustments to the applicable percentage increase under section 1886(b)(3)(B) of the Act. As mentioned above, consistent with section 1886(b)(3)(B) of the Act, CMS sets the applicable percentage increase under the IPPS by applying the following adjustments in the following sequence. Specifically, we set the applicable percentage increase under the IPPS equal to the rate-of-increase in the hospital market basket for IPPS hospitals in all areas subject to a reduction for hospitals that fail to submit quality information under rules established by the Secretary in accordance with section 1886(b)(3)(B)(viii) of the Act and, beginning in FY 2015, a reduction for hospitals not considered to be meaningful EHR users in accordance with section 1886(b)(3)(B)(ix) of the Act; and then subject to an adjustment based on changes in economy-wide productivity (the MFP adjustment), and an additional reduction as required by section 1886(b)(3)(B)(xii) of the Act.

The existing regulation text at § 412.64(d)(2) and (d)(3) describes the reductions for hospitals that fail to submit quality information under rules established by the Secretary in accordance with section 1886(b)(3)(B)(viii) of the Act and hospitals not considered to be meaningful EHR users in accordance with section 1886(b)(3)(B)(ix) of the Act as reductions to “the applicable percentage change specified in paragraph (d)(1) of this section.” Section 412.64(d)(1) describes the applicable percentage change for the applicable fiscal year as the percentage increase in the market basket index less the MFP adjustment and less the additional reduction required by section 1886(b)(3)(B)(xii) of the Act. This text suggests that CMS applies the reduction for hospitals that fail to submit quality information and, beginning in FY 2015, the reduction for hospitals not considered to be meaningful EHR users, after it applies the MFP adjustment and the additional reduction under section 1886(b)(3)(B)(xii) of the Act. Therefore, we proposed to revise the regulations in § 412.64(d) to reflect the order in which CMS applies the adjustments to the applicable percentage increase under section 1886(b)(3)(B) of the Act. We note that we also proposed clarifying amendments to the regulatory text for prior fiscal years under §§ 412.64(d)(1)(i) through (d)(1)(v) to reflect the determination of the applicable percentage change for those prior years as well as other technical changes for readability.

We did not receive any public comments on our proposed changes to the regulations at §§ 412.64(d)(1), (d)(1)(i) through (d)(1)(v), (d)(2)(i), (d)(2)(ii), and (d)(3) introductory text and therefore are finalizing these proposed changes without modification.

Section 1886(b)(3)(B)(iv) of the Act provides that the applicable percentage increase to the hospital-specific rates for SCHs and MDHs equals the applicable percentage increase set forth in section 1886(b)(3)(B)(i) of the Act (that is, the same update factor as for all other hospitals subject to the IPPS). Therefore, the update to the hospital-specific rates for SCHs and MDHs is also subject to section 1886(b)(3)(B)(i) of the Act, as amended by sections 3401(a) and 10319(a) of the Affordable Care Act. Accordingly, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28088), for FY 2015, we proposed the following updates to the hospital-specific rates applicable to SCHs and MDHs: An update of 2.1 percent for a hospital that submits quality data and is a meaningful EHR user; an update of 1.425 percent for a hospital that fails to submit quality data and is a meaningful EHR user; an update of 1.425 percent for a hospital that submits quality data and is not a meaningful EHR user; an update of 0.75 percent for a hospital that fails to submit quality data and is not a meaningful EHR user. (As noted below, under current law, the MDH program is effective for discharges occurring on or before March 31, 2015.) For FY 2015, the existing regulations in §§ 412.73(c)(16), 412.75(d), 412.77(e), 412.78(e), and 412.79(d) contain provisions that set the update factor for SCHs and MDHs equal to the update factor applied to the national standardized amount for all IPPS hospitals. Therefore, we did not propose to make any further changes to these five regulatory provisions to reflect the FY 2015 update factor for the hospital-specific rates of SCHs and MDHs. As mentioned above, for the proposed rule, we used IGI's first quarter 2014 forecast of the FY 2010-based IPPS market basket update with historical data through fourth quarter 2013. Similarly, we used IGI's first quarter 2014 forecast of the MFP adjustment. For the final rule, we proposed to use the most recent data available. We did not receive any public comments on these proposals and therefore are finalizing them as proposed, setting the update for SCHs and MDHs in this final rule using the most recent data available.

As discussed above, based on the more recent data for IGI's second quarter 2014 forecast of the FY 2010-based IPPS market basket update with historical data through first quarter 2014, we estimate that the FY 2015 market basket update used to determine the update factor for this final rule for the hospital-specific rates of SCHs and MDHs is 2.9 percent. Similarly, for this final rule, we used IGI's second quarter 2014 forecast of the MFP adjustment, which is estimated at 0.5 percentage point for FY 2015. Accordingly, we are finalizing the following updates to the hospital-specific rates applicable to SCHs and MDHs: An update of 2.2 percent for a hospital that submits quality data and is a meaningful EHR user; an update of 1.475 percent for a hospital that fails to submit quality data and is a meaningful EHR user; an update of 1.475 percent for a hospital that submits quality data and is not a meaningful EHR user; an update of 0.75 percent for a hospital that fails to submit quality data and is not a meaningful EHR user.

We note that, as discussed in section IV.G. of the preamble of this final rule, section 1106 of the Pathway for SGR Reform Act of 2013 (Pub. L. 113-67), enacted on December 26, 2013, extended the MDH program from the end of FY 2013 through the first half of FY 2014 (that is, for discharges occurring before April 1, 2014). Subsequently, section 106 of the Protecting Access to Medicare Act of 2014, Public Law 113-93, enacted on April 1, 2014, further extended the MDH program through the first half of FY 2015 (that is, for discharges occurring before April 1, 2015). Prior to the enactment of Public Law 113-67, the MDH program was to be in effect through the end of FY 2013 only. The MDH program expires for discharges beginning on April 1, 2015 under current law. Accordingly, the update of the hospital-specific rates for FY 2015 for MDHs will apply in determining payments for FY 2015 discharges occurring before April 1, 2015.

2. FY 2015 Puerto Rico Hospital Update

Puerto Rico hospitals are paid a blended rate for their inpatient operating costs based on 75 percent of the national standardized amount and 25 percent of the Puerto Rico-specific standardized amount. Section 1886(d)(9)(C)(i) of the Act is the basis for determining the applicable percentage increase applied to the Puerto Rico-specific standardized amount. Section 401(c) of Public Law 108-173 amended section 1886(d)(9)(C)(i) of the Act, which states that, for discharges occurring in a fiscal year (beginning with FY 2004), the Secretary shall compute an average standardized amount for hospitals located in any area of Puerto Rico that is equal to the average standardized amount computed under subclause (I) for fiscal year 2003 for hospitals in a large urban area (or, beginning with FY 2005, for all hospitals in the previous fiscal year) increased by the applicable percentage increase under subsection (b)(3)(B) for the fiscal year involved. Therefore, the update to the Puerto Rico-specific operating standardized amount equals the applicable percentage increase set forth in section 1886(b)(3)(B)(i) of the Act, as amended by sections 3401(a) and 10319(a) of the Affordable Care Act (that is, the same update factor as for all other hospitals subject to the IPPS). Accordingly, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28088), we proposed an applicable percentage increase to the Puerto Rico-specific operating standardized amount of 2.1 percent for FY 2015. We also proposed, for the final rule, to use the most recent data available to determine the FY 2015 applicable percentage increase. We note that the provisions of section 1886(b)(3)(B)(viii) of the Act, which specify the adjustments to the applicable percentage increase for “subsection (d)” hospitals that do not submit quality data under the rules established by the Secretary, and the provisions of section 1886(b)(3)(B)(ix) of the Act, which specify the adjustments to the applicable percentage increase for “subsection (d)” hospitals that are not meaningful EHR users, are not applicable to hospitals located in Puerto Rico.

We did not receive any public comments concerning our proposal. Therefore, using the most recent data available, we are finalizing an applicable percentage increase to the Puerto Rico-specific operating amount of 2.2 percent for FY 2015. As we noted above, for the proposed rule, we used the first quarter 2014 forecast of the FY 2010-based IPPS market basket update with historical data through fourth quarter 2013. For this final rule, we used the most recent data available, which is IGI's second quarter 2014 forecast of the FY 2010-based IPPS market basket update with historical data through first quarter 2014. Similarly, for the proposed rule, we used IGI's first quarter 2014 forecast of the MFP adjustment. For this final rule, we used the most recent data available, which was IGI's second quarter 2014 forecast of the MFP adjustment.

For FY 2015, the existing regulations in § 412.211(c) set the update factor for the Puerto Rico-specific standardized amount equal to the update factor applied to the national standardized amount for all IPPS hospitals. Therefore, we are not making any further changes to this regulatory provision to reflect the FY 2015 update factor for the Puerto Rico-specific standardized amount.

Comment: One commenter indicated that the nonlabor costs in Puerto Rico are close to or equal to those in the United States. It is unclear what the commenter was requesting. Based on our interpretation of the comment, it appears that the commenter may be requesting that CMS set the nonlabor payment amount of the Puerto Rico-specific standardized amount equal to the nonlabor payment amount of the national standardized amount.

Response: The commenter did not provide any empirical data to demonstrate that the nonlabor costs in Puerto Rico are equal to those in the United States. Therefore, we are unable to verify the commenter's statement. In addition, we did not propose to make any updates to the national or Puerto Rico-specific standardized amounts aside from applying the statutory updates as discussed earlier. We will continue to work with Puerto Rico and other stakeholders to ensure we are using appropriate data for ratesetting.

C. Rural Referral Centers (RRCs): Annual Updates to Case-Mix Index and Discharge Criteria (§ 412.96)

Under the authority of section 1886(d)(5)(C)(i) of the Act, the regulations at § 412.96 set forth the criteria that a hospital must meet in order to qualify under the IPPS as a rural referral center (RRC). RRCs receive some special treatment under both the DSH payment adjustment and the criteria for geographic reclassification.

Section 402 of Public Law 108-173 raised the DSH payment adjustment for RRCs such that they are not subject to the 12-percent cap on DSH payments that is applicable to other rural hospitals. RRCs are also not subject to the proximity criteria when applying for geographic reclassification. In addition, they do not have to meet the requirement that a hospital's average hourly wage must exceed, by a certain percentage, the average hourly wage of the labor market area where the hospital is located.

Section 4202(b) of Public Law 105-33 states, in part, “[a]ny hospital classified as an RRC by the Secretary . . . for fiscal year 1991 shall be classified as such an RRC for fiscal year 1998 and each subsequent year.” In the August 29, 1997 IPPS final rule with comment period (62 FR 45999), CMS reinstated RRC status for all hospitals that lost the status due to triennial review or MGCRB reclassification. However, CMS did not reinstate the status of hospitals that lost RRC status because they had become urban for all purposes as a result of the OMB designation of their geographic area as urban. Subsequently, in the August 1, 2000 IPPS final rule (65 FR 47089), we indicated that we were revisiting that decision. Specifically, we stated that we would permit hospitals that previously qualified as an RRC and lost their status due to OMB redesignation of the county in which they are located from rural to urban to be reinstated as an RRC. Otherwise, a hospital seeking RRC status must satisfy all of the other applicable criteria. We use the definitions of “urban” and “rural” specified in Subpart D of 42 CFR Part 412.

One of the criteria under which a hospital may qualify as an RRC is to have 275 or more beds available for use (§ 412.96(b)(1)(ii)). A rural hospital that does not meet the bed size requirement can qualify as an RRC if the hospital meets two mandatory prerequisites (a minimum CMI and a minimum number of discharges), and at least one of three optional criteria (relating to specialty composition of medical staff, source of inpatients, or referral volume). (We refer readers to § 412.96(c)(1) through (c)(5) and the September 30, 1988 Federal Register (53 FR 38513).) With respect to the two mandatory prerequisites (also illustrated in the sketch following the list below), a hospital may be classified as an RRC if—

  • The hospital's CMI is at least equal to the lower of the median CMI for urban hospitals in its census region, excluding hospitals with approved teaching programs, or the median CMI for all urban hospitals nationally; and
  • The hospital's number of discharges is at least 5,000 per year, or, if fewer, the median number of discharges for urban hospitals in the census region in which the hospital is located. (The number of discharges criterion for an osteopathic hospital is at least 3,000 discharges per year, as specified in section 1886(d)(5)(C)(i) of the Act.)
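As referenced above, the following minimal Python sketch checks only these two mandatory prerequisites for a rural hospital with fewer than 275 beds. The optional criteria and the other regulatory requirements are not modeled, and the function, argument names, and example inputs are illustrative only.

```python
def meets_rrc_mandatory_prerequisites(hospital_cmi, hospital_discharges,
                                      regional_median_cmi, national_median_cmi,
                                      regional_median_discharges, osteopathic=False):
    """The hospital's CMI must be at least the lower of the regional or
    national median CMI for urban hospitals (excluding teaching hospitals),
    and its discharges must be at least 5,000 per year or, if lower, the
    regional median for urban hospitals (3,000 for an osteopathic hospital)."""
    cmi_ok = hospital_cmi >= min(regional_median_cmi, national_median_cmi)
    discharge_floor = 3_000 if osteopathic else min(5_000, regional_median_discharges)
    return cmi_ok and hospital_discharges >= discharge_floor


# Illustrative inputs only (a hospital with a CMI of 1.60 and 5,200 discharges,
# with hypothetical regional medians); actual determinations use the final
# FY 2015 values published later in this section.
print(meets_rrc_mandatory_prerequisites(1.60, 5_200, 1.6039, 1.5723, 4_960))  # True
```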

1. Case-Mix Index (CMI)

Section 412.96(c)(1) provides that CMS establish updated national and regional CMI values in each year's annual notice of prospective payment rates for purposes of determining RRC status. The methodology we used to determine the national and regional CMI values is set forth in the regulations at § 412.96(c)(1)(ii). The national median CMI value for FY 2015 is based on the CMI values of all urban hospitals nationwide, and the regional median CMI values for FY 2015 are based on the CMI values of all urban hospitals within each census region, excluding those hospitals with approved teaching programs (that is, those hospitals that train residents in an approved GME program as provided in § 413.75). These values are based on discharges occurring during FY 2013 (October 1, 2012 through September 30, 2013), and include bills posted to CMS' records through March 2014.

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28089), we proposed that, in addition to meeting other criteria, if rural hospitals with fewer than 275 beds are to qualify for initial RRC status for cost reporting periods beginning on or after October 1, 2014, they must have a CMI value for FY 2013 that is at least—

  • 1.5730; or
  • The median CMI value (not transfer-adjusted) for urban hospitals (excluding hospitals with approved teaching programs as identified in § 413.75) calculated by CMS for the census region in which the hospital is located. (We refer readers to the table set forth in the FY 2015 IPPS/LTCH PPS proposed rule at 79 FR 28089.)

The final CMI values for FY 2015 are based on the latest available data (FY 2013 bills received through March 2014). In addition to meeting other criteria, if rural hospitals with fewer than 275 beds are to qualify for initial RRC status for cost reporting periods beginning on or after October 1, 2014, they must have a CMI value for FY 2013 that is at least—

  • 1.5723; or
  • The median CMI value (not transfer-adjusted) for urban hospitals (excluding hospitals with approved teaching programs as identified in § 413.75) calculated by CMS for the census region in which the hospital is located.

The final CMI values by region are set forth in the following table:

Region | Case-mix index value
1. New England (CT, ME, MA, NH, RI, VT) | 1.3587
2. Middle Atlantic (PA, NJ, NY) | 1.4318
3. South Atlantic (DE, DC, FL, GA, MD, NC, SC, VA, WV) | 1.4807
4. East North Central (IL, IN, MI, OH, WI) | 1.4938
5. East South Central (AL, KY, MS, TN) | 1.4107
6. West North Central (IA, KS, MN, MO, NE, ND, SD) | 1.5459
7. West South Central (AR, LA, OK, TX) | 1.6039
8. Mountain (AZ, CO, ID, MT, NV, NM, UT, WY) | 1.6586
9. Pacific (AK, CA, HI, OR, WA) | 1.5658

A hospital seeking to qualify as an RRC should obtain its hospital-specific CMI value (not transfer-adjusted) from its fiscal intermediary or MAC. Data are available on the Provider Statistical and Reimbursement (PS&R) System. In keeping with our policy on discharges, the CMI values are computed based on all Medicare patient discharges subject to the IPPS MS-DRG-based payment.

2. Discharges

Section 412.96(c)(2)(i) provides that CMS set forth the national and regional numbers of discharges in each year's annual notice of prospective payment rates for purposes of determining RRC status. As specified in section 1886(d)(5)(C)(ii) of the Act, the national standard is set at 5,000 discharges. In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28090), we proposed to update the regional standards based on discharges for urban hospitals' cost reporting periods that began during FY 2012 (that is, October 1, 2011 through September 30, 2012), which were the latest cost report data available at the time the proposed rule was developed.

We proposed that, in addition to meeting other criteria, a hospital, if it is to qualify for initial RRC status for cost reporting periods beginning on or after October 1, 2014, must have, as the number of discharges for its cost reporting period that began during FY 2012, at least—

  • 5,000 (3,000 for an osteopathic hospital); or
  • The median number of discharges for urban hospitals in the census region in which the hospital is located. (We refer readers to the table set forth in the FY 2015 IPPS/LTCH PPS proposed rule at 79 FR 28090.)

Based on the latest discharge data available at this time (that is, based on FY 2012 cost report data), the final median number of discharges for urban hospitals by census region are set forth in the following table:

Region | Number of discharges
1. New England (CT, ME, MA, NH, RI, VT) | 7,635
2. Middle Atlantic (PA, NJ, NY) | 10,841
3. South Atlantic (DE, DC, FL, GA, MD, NC, SC, VA, WV) | 10,642
4. East North Central (IL, IN, MI, OH, WI) | 8,530
5. East South Central (AL, KY, MS, TN) | 7,975
6. West North Central (IA, KS, MN, MO, NE, ND, SD) | 7,925
7. West South Central (AR, LA, OK, TX) | 4,960
8. Mountain (AZ, CO, ID, MT, NV, NM, UT, WY) | 8,525
9. Pacific (AK, CA, HI, OR, WA) | 8,504

We reiterate that, if an osteopathic hospital is to qualify for RRC status for cost reporting periods beginning on or after October 1, 2014, the hospital would be required to have at least 3,000 discharges for its cost reporting period that began during FY 2012.

D. Payment Adjustment for Low-Volume Hospitals (§ 412.101)

1. Background

Section 1886(d)(12) of the Act provides for an additional payment to each qualifying low-volume hospital that is paid under IPPS beginning in FY 2005. Sections 3125 and 10314 of the Affordable Care Act provided for a temporary change in the low-volume hospital payment policy for FYs 2011 and 2012. Section 605 of the American Taxpayer Relief Act of 2012 (ATRA) extended, for FY 2013, the temporary changes in the low-volume hospital payment policy provided for in FYs 2011 and 2012 by the Affordable Care Act. Prior to the enactment of the Pathway for SGR Reform Act of 2013 (Pub. L. 113-67) on December 26, 2013, and section 105 of the Protecting Access to Medicare Act of 2014 (Pub. L. 113-93) on April 1, 2014, for FY 2014 (and subsequent years), the low-volume hospital qualifying criteria and payment adjustment returned to the statutory requirements under section 1886(d)(12) of the Act that were in effect prior to the amendments made by the Affordable Care Act and the ATRA. (For additional information on the expiration of the temporary changes in the low-volume hospital payment policy for FYs 2011 through 2013 provided for by the Affordable Care Act and the ATRA, we refer readers to the FY 2014 IPPS/LTCH PPS final rule (78 FR 50610 through 50613).)

Section 1105 of the Pathway for SGR Reform Act extended, for the first 6 months of FY 2014 (that is, through March 31, 2014), the temporary changes in the low-volume hospital payment policy provided for in FYs 2011 and 2012 by the Affordable Care Act and extended through FY 2013 by the ATRA. We addressed the extension of the temporary changes to the low-volume hospital payment policy through March 31, 2014 under the Pathway for SGR Reform Act in an interim final rule with comment period that appeared in the Federal Register on March 18, 2014 (79 FR 15022 through 15025) (hereafter referred to as the “March 2014 IFC”). In that March 2014 IFC, we also amended the regulations at 42 CFR 412.101 to reflect the extension of the temporary changes to the qualifying criteria and the payment adjustment for low-volume hospitals through March 31, 2014. (In section IV.P. of the preamble of this final rule, we are responding to the public comments we received on the March 2014 IFC and are stating our finalized policy for the extension of the temporary changes to the low-volume hospital payment policy through March 31, 2014, under the Pathway for SGR Reform Act.)

2. Provisions of the Protecting Access to Medicare Act of 2014

Section 105 of the Protecting Access to Medicare Act of 2014 (PAMA) (Pub. L. 113-93) extends, for an additional year (that is, through March 31, 2015), the temporary changes in the low-volume hospital payment policy provided for in FYs 2011 and 2012 by the Affordable Care Act and extended through FY 2013 by the ATRA and the first half of FY 2014 by the Pathway for SGR Reform Act. We addressed the extension of the temporary changes to the low-volume hospital payment policy for the second half of FY 2014 (that is, from April 1, 2014 through September 30, 2014) under the PAMA in a notice that appeared in the Federal Register on June 17, 2014 (79 FR 34444). However, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28090), we proposed to make conforming changes to the existing regulations text at § 412.101 to reflect the extension of the changes to the qualifying criteria and the payment adjustment methodology for low-volume hospitals through the first half of FY 2015 (that is, through March 31, 2015) in accordance with section 105 of the PAMA. Specifically, we proposed to revise paragraphs (b)(2)(i), (b)(2)(ii), (c)(1), (c)(2), and (d) of § 412.101. Under these proposed changes to § 412.101, beginning with FY 2015 discharges occurring on or after April 1, 2015, consistent with section 1886(d)(12) of the Act, as amended, the low-volume hospital qualifying criteria and payment adjustment methodology would revert to that which was in effect prior to the amendments made by the Affordable Care Act and subsequent legislation (that is, the low-volume hospital payment adjustment policy in effect for FYs 2005 through 2010).

We did not receive any public comments on our proposed conforming changes to the existing regulations text at § 412.101 to reflect the extension of the changes to the qualifying criteria and the payment adjustment methodology for low-volume hospitals through the first half of FY 2015 (that is, through March 31, 2015) in accordance with section 105 of the PAMA. Therefore, in this final rule, we are adopting our proposed revisions to paragraphs (b)(2)(i), (b)(2)(ii), (c)(1), (c)(2), and (d) of § 412.101 as final without modification. We note that these revisions supersede the conforming changes to these same regulatory provisions made in the March 2014 IFC to reflect the extension of the changes to the qualifying criteria and the payment adjustment methodology for low-volume hospitals through March 31, 2014, under the Pathway for SGR Reform Act, as discussed in section IV.P. of the preamble of this final rule. The public comments we received on our proposals related to the low-volume hospital payment policy for FY 2015 and our responses are presented in section IV.D.3. of the preamble of this final rule.

3. Low-Volume Hospital Definition and Payment Adjustment for FY 2015

As discussed above, under section 1886(d)(12) of the Act, as amended, the temporary changes in the low-volume hospital payment policy originally provided by the Affordable Care Act and extended through subsequent legislation, are effective for FY 2015 discharges occurring before April 1, 2015. To implement the extension of the temporary change in the low-volume hospital payment policy through the first half of FY 2015 (that is, for discharges occurring through March 31, 2015) provided for by the PAMA, in accordance with proposed § 412.101(b)(2)(ii) and consistent with our historical approach, we proposed to update the discharge data source used to identify qualifying low-volume hospitals and calculate the payment adjustment (percentage increase) for FY 2015 discharges occurring before April 1, 2015. Under existing § 412.101(b)(2)(ii), for the applicable fiscal years, a hospital's Medicare discharges from the most recently available MedPAR data, as determined by CMS, are used to determine if the hospital meets the discharge criteria to receive the low-volume payment adjustment in the current year. The applicable low-volume percentage increase, as originally provided for by the Affordable Care Act, is determined using a continuous linear sliding scale equation that results in a low-volume hospital payment adjustment ranging from an additional 25 percent for hospitals with 200 or fewer Medicare discharges to a zero percent additional payment adjustment for hospitals with 1,600 or more Medicare discharges. For FY 2015 discharges occurring before April 1, 2015, consistent with our historical policy, we proposed that qualifying low-volume hospitals and their payment adjustment would be determined using the most recently available Medicare discharge data from the FY 2013 MedPAR file, as these data are the most recent data available. Table 14 listed in the Addendum of the proposed rule (which is available only through the Internet on the CMS Web site at http://www.cms.hhs.gov/AcuteInpatientPPS/01_overview.asp) lists the “subsection (d)” hospitals with fewer than 1,600 Medicare discharges based on the December 2013 update of the FY 2013 MedPAR file and their proposed low-volume payment adjustment for FY 2015 discharges occurring before April 1, 2015 (if eligible). We noted that the list of hospitals with fewer than 1,600 Medicare discharges in Table 14 did not reflect whether or not the hospital meets the mileage criterion. Eligibility for the low-volume hospital payment adjustment for the first 6 months of FY 2015 would also be dependent upon meeting the mileage criterion specified at proposed § 412.101(b)(2)(ii); that is, the hospital is located more than 15 road miles from any other IPPS hospital. In addition, we indicated that if more recent Medicare discharge data become available, we intended to use updated data to determine the list of “subsection (d)” hospitals with fewer than 1,600 Medicare discharges based on the March 2014 update of the FY 2013 MedPAR file and their potential low-volume payment adjustment for FY 2015 discharges occurring before April 1, 2015 (if eligible) in Table 14 of the final rule.
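As an illustration of the continuous linear sliding scale described above, the following sketch (in Python, for illustration only; the function and variable names are not part of the rule) interpolates between the two stated endpoints: an additional 25 percent at 200 or fewer Medicare discharges, phasing down to zero at 1,600 or more Medicare discharges.

```python
def low_volume_adjustment(medicare_discharges: int) -> float:
    """Additional payment percentage (expressed as a fraction) under the
    temporary low-volume hospital policy in effect for FY 2015 discharges
    occurring before April 1, 2015: +25 percent at 200 or fewer Medicare
    discharges, declining linearly to 0 percent at 1,600 or more."""
    if medicare_discharges <= 200:
        return 0.25
    if medicare_discharges >= 1600:
        return 0.0
    # Linear interpolation between (200, 0.25) and (1600, 0.0).
    return 0.25 * (1600 - medicare_discharges) / 1400


# Example: a hospital with 900 Medicare discharges receives a 12.5 percent
# add-on, assuming it also meets the mileage criterion (which this sketch
# does not test).
print(low_volume_adjustment(900))  # 0.125
```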

We did not receive any public comments on our proposal that qualifying low-volume hospitals and their payment adjustment for FY 2015 discharges occurring before April 1, 2015 would be determined using the most recently available Medicare discharge data from the FY 2013 MedPAR file, as these data are the most recent data available. Therefore, in this final rule, as we proposed, we are establishing that qualifying low-volume hospitals (that is, the list of “subsection (d)” hospitals with fewer than 1,600 Medicare discharges) and their potential low-volume payment adjustment for FY 2015 discharges occurring before April 1, 2015 (if eligible) will be based on Medicare discharge data from the March 2014 update of the FY 2013 MedPAR file. Table 14 listed in the Addendum of this final rule (which is available only through the Internet on the CMS Web site at http://www.cms.hhs.gov/AcuteInpatientPPS/01_overview.asp) lists the “subsection (d)” hospitals with fewer than 1,600 Medicare discharges based on the March 2014 update of the FY 2013 MedPAR file and their low-volume payment adjustment for FY 2015 discharges occurring before April 1, 2015 (if eligible). We note that the list of hospitals with fewer than 1,600 Medicare discharges in Table 14 does not reflect whether or not the hospital meets the mileage criterion. Eligibility for the low-volume hospital payment adjustment for FY 2015 discharges occurring before April 1, 2015, is also dependent upon meeting (in the case of a hospital that did not qualify for the low-volume hospital payment adjustment in FY 2014) or continuing to meet (in the case of a hospital that did qualify for the low-volume hospital payment adjustment in FY 2014) the mileage criterion specified at revised § 412.101(b)(2)(ii) (that is, the hospital is located more than 15 road miles from any other subsection (d) hospital).

In accordance with section 1886(d)(12) of the Act, as amended, beginning with FY 2015 discharges occurring on or after April 1, 2015, the low-volume hospital definition and payment adjustment methodology will revert back to the statutory requirements that were in effect prior to the amendments made by the Affordable Care Act and subsequent legislation (including the PAMA). Therefore, as we stated in the proposed rule, consistent with section 1886(d)(12) of the Act, as amended, effective for FY 2015 discharges occurring on or after April 1, 2015 and subsequent years, in order to qualify as a low-volume hospital, a subsection (d) hospital must be more than 25 road miles from another subsection (d) hospital and have less than 200 discharges (that is, less than 200 discharges total, including both Medicare and non-Medicare discharges) during the fiscal year. Consistent with our existing policy for FYs 2005 through 2010, we stated that, effective for FY 2015 discharges occurring on or after April 1, 2015 and subsequent years, qualifying hospitals would receive the low-volume hospital payment adjustment of an additional 25 percent for discharges occurring during the fiscal year (or portion of the fiscal year). Also consistent with our existing policy for FYs 2005 through 2010, for FY 2015 discharges occurring on or after April 1, 2015 (and subsequent years), we stated that the discharge determination for the low-volume hospital payment adjustment would be made based on the hospital's number of total discharges, that is, Medicare and non-Medicare discharges based on the hospital's most recently submitted cost report. We use cost report data to determine if a hospital meets the discharge criterion because these data are the best available data source that includes information on both Medicare and non-Medicare discharges. In addition to a discharge criterion, eligibility for the low-volume hospital payment adjustment also depends on the hospital meeting a mileage criterion. As specified at § 412.101(b)(2)(i), to meet the mileage criterion to qualify for the low-volume hospital payment adjustment for FY 2015 discharges occurring on or after April 1, 2015 (and subsequent years), a hospital must be located more than 25 road miles from the nearest subsection (d) hospital.

Comment: A few commenters expressed concern about the financial impact of the expiration of the temporary changes in the low-volume hospital adjustment originally provided for by the Affordable Care Act. Some of these commenters requested that CMS permanently adopt the temporary changes in the low-volume hospital adjustment, while other commenters urged CMS to support legislative efforts to permanently extend these provisions beyond the current March 31, 2015 statutory expiration date. (These comments are similar to comments we received previously, prior to the statutory extensions of the temporary changes in the low-volume hospital adjustment for FYs 2013 and 2014 provided by subsequent legislation.)

Response: While we appreciate the commenters' concerns about the change to the low-volume hospital policy that will occur for discharges occurring on or after April 1, 2015 under current law, we are unable to extend the temporary changes to the low-volume hospital adjustment originally provided for by the Affordable Care Act beyond the current March 31, 2015 statutory expiration date. As discussed in response to similar comments in both the FY 2013 IPPS/LTCH PPS final rule (77 FR 53408 through 53409) and the FY 2014 IPPS/LTCH PPS final rule (78 FR 50612 through 50613), to implement the original low-volume hospital payment adjustment provision, and as mandated by statute, we developed an empirically justified adjustment based on the relationship between costs and total discharges of hospitals. (For more information on this analysis, we refer readers to the FY 2005 IPPS final rule (69 FR 49101 through 49102).) Under current law, the low-volume hospital definition and payment adjustment methodology will revert back to the policy established under statutory requirements that were in effect prior to the amendments made by the Affordable Care Act and subsequent legislation (including the PAMA) beginning with discharges occurring on or after April 1, 2015.

Therefore, consistent with section 1886(d)(12) of the Act, as amended, under the conforming changes to § 412.101(b)(2), effective for FY 2015 discharges occurring on or after April 1, 2015, and subsequent years, in order to qualify as a low-volume hospital, a subsection (d) hospital must be more than 25 road miles from another subsection (d) hospital and have less than 200 discharges (that is, less than 200 discharges total, including both Medicare and non-Medicare discharges) during the fiscal year. Consistent with our existing policy for FYs 2005 through 2010, effective for FY 2015 discharges occurring on or after April 1, 2015, and subsequent years, qualifying hospitals will receive the low-volume hospital payment adjustment of an additional 25 percent for discharges occurring during the fiscal year (or portion of the fiscal year). The discharge determination for the low-volume hospital payment adjustment will be made based on the hospital's number of total discharges, that is, Medicare and non-Medicare discharges, as specified at § 412.101(b)(2)(i). The hospital's most recently submitted cost report is used to determine if the hospital meets the discharge criterion to receive the low-volume hospital payment adjustment in the current fiscal year. We use cost report data to determine if a hospital meets the discharge criterion because these data are the best available data source that includes information on both Medicare and non-Medicare discharges. In addition to a discharge criterion, eligibility for the low-volume hospital payment adjustment also depends on the hospital meeting a mileage criterion. As specified at § 412.101(b)(2)(i), to meet the mileage criterion to qualify for the low-volume hospital payment adjustment for FY 2015 discharges occurring on or after April 1, 2015 (and subsequent years), a hospital must be located more than 25 road miles from the nearest subsection (d) hospital.

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28091 through 28092), for FY 2015, we proposed a process for requesting and obtaining the low-volume hospital payment adjustment that was consistent with our previously established procedure. We proposed that in order to receive a low-volume hospital payment adjustment under § 412.101, a hospital must notify and provide documentation to its MAC that it meets the discharge and distance requirements under proposed § 412.101(b)(2)(ii) for FY 2015 discharges occurring before April 1, 2015, and under proposed § 412.101(b)(2)(i) for FY 2015 discharges occurring on or after April 1, 2015, if also applicable. Specifically, for FY 2015, we proposed that a hospital must make a written request for low-volume hospital status that is received by its MAC no later than September 1, 2014, in order for the applicable low-volume hospital payment adjustment to be applied to payments for its discharges occurring on or after October 1, 2014, and through March 31, 2015, or through September 30, 2015, for hospitals that also meet the low-volume hospital payment adjustment qualifying criteria for discharges occurring during the second half of FY 2015. Under this proposal, a hospital that qualified for the low-volume payment adjustment in FY 2014 may continue to receive a low-volume payment adjustment for FY 2015 discharges occurring before April 1, 2015, without reapplying if it continues to meet the Medicare discharge criterion established for FY 2015 and the distance criterion. However, the hospital must send written verification that is received by its MAC no later than September 1, 2014, stating that it continues to be more than 15 miles from any other “subsection (d)” hospital. We also proposed that if a hospital's written request for low-volume hospital status for FY 2015 is received after September 1, 2014, and if the MAC determines that the hospital meets the criteria to qualify as a low-volume hospital, the MAC would apply the applicable low-volume hospital payment adjustment to determine the payment for the hospital's FY 2015 discharges, effective prospectively within 30 days of the date of its low-volume hospital status determination.

Comment: One commenter requested that CMS not impose a notification requirement for hospitals that qualified for the low-volume hospital payment adjustment in FY 2014. The commenter stated that eliminating this verification would reduce the administrative burden for those hospitals and their MACs.

Response: We appreciate the commenter's suggestion to reduce the administrative burden for hospitals and MACs by not having a notification requirement under the FY 2015 low-volume hospital policy for hospitals that qualified for the low-volume hospital payment adjustment in FY 2014. However, as we explained in the proposed rule, under our proposal a hospital that qualified for the low-volume payment adjustment in FY 2014 does not need to reapply for FY 2015 if it continues to meet the applicable discharge and distance criteria (that is, such a hospital would not have to resubmit a low-volume hospital request with supporting documentation to demonstrate that it meets the mileage criterion). Rather, such a hospital would only be required to send written verification that it continues to meet the distance criterion that is received by the MAC by the proposed notification deadline. This written verification could be a brief letter to the MAC stating that the hospital continues to meet the low-volume hospital distance criterion as documented in a prior low-volume hospital status request. We proposed this abridged notification requirement for hospitals that qualified for the low-volume payment adjustment in FY 2014 because we believe compliance with the statutory low-volume hospital criteria should be monitored while recognizing that it is not necessary to have such hospitals resubmit a low-volume hospital request with the necessary documentation. In addition, if we were to consider no longer requiring verification for hospitals that qualified for the low-volume hospital payment adjustment in the prior year, we may also want to develop alternative policies for monitoring compliance with the statutory low-volume hospital qualifying criteria. Therefore, we are not adopting the commenter's suggestion regarding hospitals that qualified for the low-volume hospital payment adjustment in FY 2014. However, should the temporary changes to the low-volume hospital adjustment be extended beyond March 31, 2015, by subsequent legislation, we may consider modifying the verification process in conjunction with developing an alternative compliance policy.

In this final rule, we are adopting our policy as proposed without modification. Therefore, in order to receive a low-volume hospital payment adjustment under § 412.101, a hospital must notify and provide documentation to its MAC that it meets the discharge and distance requirements under revised § 412.101(b)(2)(ii) for FY 2015 discharges occurring before April 1, 2015, and under revised § 412.101(b)(2)(i) for FY 2015 discharges occurring on or after April 1, 2015, if also applicable. The MAC will determine, based on the most recent data available, if the hospital qualifies as a low-volume hospital, so that the hospital would know in advance whether or not it will receive a payment adjustment. The MAC and CMS may review available data, in addition to the data the hospital submits with its request for low-volume hospital status, in order to determine whether or not the hospital meets the qualifying criteria. Consistent with our previously established procedure, for FY 2015, a hospital must make a written request for low-volume hospital status that is received by its MAC no later than September 1, 2014, in order for the applicable low-volume hospital payment adjustment to be applied to payments for its discharges occurring on or after October 1, 2014, and through March 31, 2015, under revised § 412.101(b)(2)(ii) or through September 30, 2015, for hospitals that also qualify under revised § 412.101(b)(2)(i). A hospital that qualified for the low-volume payment adjustment in FY 2014 may continue to receive a low-volume payment adjustment for FY 2015 discharges occurring before April 1, 2015, without reapplying if it continues to meet the Medicare discharge criterion established for FY 2015 (shown in Table 14 of this final rule, which is available via the Internet on the CMS Web site) and the distance criterion. However, the hospital must send written verification that is received by its MAC no later than September 1, 2014, stating that it continues to be more than 15 miles from any other “subsection (d)” hospital. This written verification could be a brief letter to the MAC stating that the hospital continues to meet the low-volume hospital distance criterion as documented in a prior low-volume hospital status request.

If a hospital's written request for low-volume hospital status for FY 2015 is received after September 1, 2014, and if the MAC determines that the hospital meets the criteria to qualify as a low-volume hospital under revised § 412.101(b)(2)(ii), the MAC will apply the applicable low-volume hospital payment adjustment to determine the payment for the hospital's FY 2015 discharges, effective prospectively within 30 days of the date of its low-volume hospital status determination through discharges occurring on or before March 31, 2015. If the hospital also qualifies under revised § 412.101(b)(2)(i), the MAC will apply the 25-percent low-volume hospital payment adjustment to determine the payment for the hospital's FY 2015 discharges occurring on or after April 1, 2015. If a hospital's written request for low-volume hospital status for FY 2015 is received on a later date such that the prospective effective date would be on or after April 1, 2015, and the hospital qualifies under revised § 412.101(b)(2)(i), the MAC will apply the 25-percent low-volume hospital payment adjustment to determine the payment for the hospital's FY 2015 discharges occurring from the prospective effective date through September 30, 2015. (For additional details on our established process for the low-volume hospital payment adjustment, we refer readers to the FY 2013 IPPS/LTCH PPS final rule (77 FR 53408).)

E. Indirect Medical Education (IME) Payment Adjustment (§ 412.105)

1. IME Adjustment Factor for FY 2015

Under the IPPS, an additional payment amount is made to hospitals with residents in an approved graduate medical education (GME) program in order to reflect the higher indirect patient care costs of teaching hospitals relative to nonteaching hospitals. The payment amount is determined by use of a statutorily specified adjustment factor. The regulations regarding the calculation of this additional payment, known as the IME adjustment, are located at § 412.105. We refer readers to the FY 2012 IPPS/LTCH PPS final rule (76 FR 51680) for a full discussion of the IME adjustment and IME adjustment factor. Section 1886(d)(5)(B) of the Act states that, for discharges occurring during FY 2008 and fiscal years thereafter, the IME formula multiplier is 1.35. Accordingly, for discharges occurring during FY 2015, the formula multiplier is 1.35. We estimate that application of this formula multiplier for the FY 2015 IME adjustment will result in an increase in IPPS payment of 5.5 percent for every approximately 10 percent increase in the hospital's resident-to-bed ratio.
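This final rule cross-references the FY 2012 IPPS/LTCH PPS final rule for the full IME calculation. For illustration only, the sketch below assumes the familiar form of the IME adjustment factor, c × ((1 + r)^0.405 − 1), where c is the formula multiplier (1.35 for FY 2015) and r is the hospital's resident-to-bed ratio; the function name and the assumption that this form applies are illustrative and are not drawn from this rule.

```python
def ime_adjustment_factor(resident_to_bed_ratio: float,
                          formula_multiplier: float = 1.35) -> float:
    """IME adjustment factor, assuming the form c * ((1 + r)**0.405 - 1),
    where c is the statutory formula multiplier (1.35 for FY 2015) and
    r is the hospital's resident-to-bed ratio."""
    return formula_multiplier * ((1.0 + resident_to_bed_ratio) ** 0.405 - 1.0)


# Under this assumed form, a resident-to-bed ratio of 0.10 yields an add-on
# of roughly 5.3 percent of the DRG operating payment.
print(round(ime_adjustment_factor(0.10), 4))  # approximately 0.0531
```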

Comment: One commenter stated it has a longstanding commitment to graduate medical education, the practice of academic medicine, and successful training of surgical residents. The commenter expressed appreciation of Federal support of IME payments. The commenter stated these payments are an important part of ensuring a strong general surgery workforce, which is currently experiencing a growing shortage.

Response: We acknowledge the commenter's support. We note that the IME formula multiplier is set by Congress. We are specifying in this final rule that the IME formula multiplier for FY 2015 is set at 1.35.

2. IME Medicare Part C Add-On Payments to Sole Community Hospitals (SCHs) That Are Paid According to Their Hospital-Specific Rates and Change in Methodology in Determining Payment to SCHs

Section 1886(d)(11) of the Act provides for an additional payment amount to a subsection (d) teaching hospital that has an approved medical residency training program for each applicable discharge of any individual who is enrolled under Medicare Managed Care under Part C. The amount of such payment is specified in section 1886(d)(11)(C) of the Act and “shall be equal to the applicable percentage (as defined in subsection (h)(3)(D)(ii)) of the estimated average per discharge amount that would otherwise have been paid under paragraph (5)(B) if the individuals had not been enrolled as described in subparagraph (B).”

Under section 1886(d)(5)(D) of the Act, sole community hospitals (SCHs) are paid based on their hospital-specific rate from specified base years or the IPPS Federal rate, whichever yields the greatest aggregate payment for the hospital's cost reporting period. Payments based on the Federal rate are based on the IPPS standardized amount and include all applicable IPPS add-on payments, such as outliers, DSH, and IME, while payments based on the hospital-specific rate include no add-on payments. Under CMS' current payment system, both the IME add-on payment for Medicare Part A patient discharges under section 1886(d)(5)(B) of the Act and the IME add-on payment for Medicare Part C patient discharges under section 1886(d)(11) of the Act are included as part of the Federal rate payment, whereas neither of these add-on payments are included as part of the hospital-specific rate payment. We note that SCHs that are paid based on their hospital-specific rate do not receive a separate IME add-on payment for Medicare Part A patient discharges because, generally, the hospital-specific rate already reflects the additional costs that a teaching hospital incurs for its Medicare Part A patients. In the case of Medicare Part C patients, there is no component of the hospital-specific rate that already accounts for the additional costs that SCHs incur for their Medicare Part C patients, and there is currently no payment mechanism for SCHs paid based on their hospital-specific rate to receive the IME add-on payment for Medicare Part C patients.

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28092), for the reasons specified below, effective for discharges occurring in cost reporting periods beginning on or after October 1, 2014, we proposed: (1) To provide all SCHs that are subsection (d) teaching hospitals IME add-on payments for applicable discharges of Medicare Part C patients in accordance with section 1886(d)(11) of the Act, regardless of whether the SCH is paid based on the Federal rate or its hospital-specific rate; and (2) that, for purposes of the comparison of payments based on the Federal rate (hereinafter also referred to as the “Federal rate payment”) and payments based on the hospital-specific rate (hereinafter also referred to as the “hospital-specific rate payment”) under section 1886(d)(5)(D) of the Act, IME payments under section 1886(d)(11) of the Act for Medicare Part C patients will no longer be included as part of the Federal rate payment. After the higher of the Federal rate payment amount or the hospital-specific rate payment amount is determined, any IME add-on payments under section 1886(d)(11) of the Act for Medicare Part C patient discharges would be added to that payment for purposes of determining the hospital's total payment amount.

As noted above, under section 1886(d)(5)(D) of the Act, SCHs are paid based on their hospital-specific rate or the Federal rate, whichever yields the higher payment for the hospital's cost reporting period. For each cost reporting period, the MAC determines which of the payment options will yield the higher aggregate payment. Interim payments are automatically made on a claim-by-claim basis at the higher rate using the best data available at the time the MAC makes the payment determination for each discharge. However, it may not be possible for the MAC to determine in advance precisely which of the rates will yield the higher aggregate payment by year's end. In many cases, it is not possible to forecast outlier payments or the final amount of the DSH payment adjustment or the IME adjustment until cost report settlement. As noted above, these adjustment amounts are included only as part of the payments based on the Federal rate but not payments based on the hospital-specific rate. The MAC makes a final adjustment at cost report settlement after it determines precisely which of the two payment rates would yield the higher aggregate payment to the hospital for its cost reporting period. This payment methodology makes SCHs unique because SCH payments can change on a yearly basis from payments based on the hospital-specific rate to payments based on the Federal rate, or vice versa.

As we stated earlier, section 1886(d)(11) of the Act provides for an additional payment for each applicable discharge of any subsection (d) teaching hospital for treating Medicare Part C patients. Section 1886(d)(11)(C) of the Act specifies that the amount of the payment “shall be equal to the applicable percentage (as defined in subsection (h)(3)(D)(ii)) of the estimated average per discharge amount that would otherwise have been paid under paragraph (5)(B) if the individuals had not been enrolled as described in subparagraph (B)” (emphasis added). Because an SCH that is paid based on its hospital-specific rate does not receive any IME add-on payment for Medicare Part A patients as provided under section 1886(d)(5)(B) of the Act, CMS has interpreted section 1886(d)(11)(C) of the Act to mean that an SCH that is paid based on its hospital-specific rate also is not entitled to receive payment for discharges of Medicare Part C patients under section 1886(d)(11) of the Act.

After further consideration of the language at section 1886(d)(11) of the Act, we believe that the statute would allow an SCH that is paid based on its hospital-specific rate to receive IME add-on payments for its Medicare Part C patient discharges. Section 1886(d)(11)(A) of the Act provides for an additional payment amount for each applicable discharge of a Medicare Part C patient of a subsection (d) hospital that has an approved medical residency training program. Section 1886(d)(11)(C) of the Act sets forth the amount of this additional payment, by reference to the amount that would otherwise have been paid under section 1886(d)(5)(B) of the Act. We believe that section 1886(d)(11)(C) of the Act can be interpreted as simply establishing the methodology for calculating the amount of the add-on payment, without limiting the applicability of the add-on payment to those SCHs that are paid based on the Federal rate.

As noted earlier, currently, in making the comparison of SCH payments under the Federal rate and the hospital-specific rate under section 1886(d)(5)(D) of the Act, the aggregate Federal rate payments are based on the IPPS standardized amount and include IME add-on payments for both Medicare Part A and Medicare Part C patient discharges. Payments based on the hospital-specific rate do not include the Medicare Part A IME add-on payment under section 1886(d)(5)(B) of the Act, under the rationale that, generally, the hospital-specific rate already reflects the additional costs that a teaching hospital incurs for its Medicare Part A patients. Payments based on the hospital-specific rate do not include the IME add-on payment for Medicare Part C patient discharges under section 1886(d)(11) of the Act. As a result, under the current methodology, if an SCH that is a teaching hospital is paid based on its hospital-specific rate, it receives no IPPS payment that reflects or accounts for the additional costs that a teaching hospital incurs for its Medicare Part C patients.

In conjunction with our proposal to provide IME add-on payments under section 1886(d)(11) of the Act to SCHs, regardless of whether the SCH is paid based on the Federal rate or its hospital-specific rate, we also believe that, for purposes of the comparison of payments based on the Federal rate and the hospital-specific rate, it would be appropriate for IME add-on payments under section 1886(d)(11) of the Act to no longer be included as part of the Federal rate payment. Therefore, we proposed to no longer include these payments in the comparison in order to more accurately reflect comparable payments for Medicare Part A patient discharges. In addition, because the IME add-on payment for Medicare Part C patient discharges for a given provider would be the same, regardless of whether it is paid based on the Federal rate or its hospital-specific rate, there would be no need to include the IME add-on payment for Medicare Part C patient discharges in the comparison. This is because the Part C IME adjustment is always multiplied by the Federal rate that is used under section 1886(d)(5)(B) of the Act, regardless of whether the hospital-specific rate is higher, in accordance with section 1886(d)(11) of the Act, which states that the IME Part C add-on amount “shall be equal to the applicable percentage . . . of the estimated average per discharge amount that would otherwise have been paid under paragraph (5)(B).”

We invited public comments on both of these proposals and any alternatives that we should consider.

Comment: Several commenters supported CMS' proposal to make IME add-on payments for Medicare Part C discharges to SCHs paid based on the hospital-specific rate. Some of these commenters also supported the proposal to change the methodology in determining whether an SCH is paid based on the Federal rate or the hospital-specific rate by excluding the IME add-on amount for Medicare Part C discharges from the comparison.

Although commenters supported the proposal to make IME add-on payments for Medicare Part C discharges to SCHs that are paid based on the hospital-specific rate, several commenters objected to the proposal to make a corresponding change to the methodology for determining whether an SCH is paid based on the Federal rate or the hospital-specific rate by excluding the IME add-on amount for Medicare Part C discharges from the comparison. The commenters claimed that this change would have the unintended consequence of precluding hospitals from receiving DSH and uncompensated care payments, which would disadvantage a subset of SCHs that receive payment based on the hospital-specific rate. They recommended making no changes to the comparison.

Response: We appreciate the commenters' support of our proposal to make IME add-on payments for Medicare Part C discharges to SCHs that are paid based on the hospital-specific rate. We agree that a provider that receives payment based on the hospital-specific rate would not be eligible for DSH or uncompensated care payments; however, we do not agree that excluding the IME add-on payment for Medicare Part C discharges from the comparison of the Federal rate payment to the hospital-specific rate payment would disadvantage a given hospital. Our proposal does not preclude a provider from receiving payment based on the Federal rate (which includes DSH and uncompensated care payments, as applicable) if the Federal rate payment is higher than the hospital-specific rate payment.

As we stated in the proposed rule, we believe that the proposed methodology more accurately reflects the comparable payments for Medicare Part A discharges for SCHs. Generally the hospital-specific rate payment already reflects the additional costs that a teaching hospital incurs for its Medicare Part A patients. However, because the costs associated with Medicare Part C patient discharges are not reflected in the hospital-specific rate, we believe that excluding these amounts from the Federal rate payment provides for a more accurate comparison of payments for Medicare Part A discharges. The commenters did not provide any explanation in support of maintaining our current methodology of comparing the Federal rate payment with the IME add-on amount for Medicare Part C discharges to the hospital-specific rate payment. Moreover, these commenters did not include any explanation of how our proposal to exclude the IME add-on payments for Medicare Part C discharges from both sides of the comparison would specifically disadvantage a given provider by precluding it from receiving DSH and uncompensated care payments. For these reasons, we are not adopting the commenters' suggestion to maintain the current comparison methodology.

Comment: One commenter urged CMS to extend the same IME add-on payment for Part C patients to MDHs because they also are paid the higher of the Federal rate payment or “the blended rate incorporating a hospital-specific rate.”

Response: Unlike SCHs, an MDH receives the higher of the Federal rate payment or the Federal rate payment plus 75 percent of the amount by which the Federal rate payment is exceeded by its hospital-specific rate payment (that is, payment based on the highest of its hospital-specific rates based on costs in one of its base years). Because payment to an MDH, whether in whole or in part, is always based on the Federal rate, an MDH that is a teaching hospital receives IME add-on payments for Medicare Part A patient discharges under section 1886(d)(5)(B) of the Act, and, therefore, under our historical interpretation of section 1886(d)(11)(C) of the Act, is entitled to receive IME add-on payments for Medicare Part C patient discharges. Consequently, there is no need to “extend” this payment add-on to MDHs that are teaching hospitals because they are already receiving IME add-on payments for Medicare Part C discharges. We also note that, as explained elsewhere, the Federal rate payment used in the MDH payment methodology is the same Federal rate payment that is used in the SCH payment methodology (79 FR 28096). This means that, under the proposed change to the comparison methodology to exclude IME add-on payments for Medicare Part C discharges from the Federal rate payment, the Federal rate payment used for the purpose of the MDH payment methodology, that is, to calculate the 75 percent of the amount by which the Federal rate payment is exceeded by the highest of its hospital-specific rate payments based on costs in one of the MDH's base years, would likewise exclude the IME add-on payment for Medicare Part C discharges. After determining the higher of the Federal rate payment or the Federal rate payment plus 75 percent of the amount by which the Federal rate payment is exceeded by the hospital-specific rate payment, any add-on payments under section 1886(d)(11) of the Act for Medicare Part C patient discharges will be added to that payment for purposes of determining the hospital's total payment amount.
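A minimal sketch of the MDH blend described in this response (illustrative only; the inputs are aggregate amounts for a cost reporting period, and the names are not from the rule):

```python
def mdh_operating_payment(federal_rate_payment: float,
                          hospital_specific_rate_payment: float,
                          part_c_ime_addon: float = 0.0) -> float:
    """MDH payment: the higher of the Federal rate payment or the Federal
    rate payment plus 75 percent of the amount by which the hospital-specific
    rate payment exceeds it. Under the policy described above, the Federal
    rate payment used in the blend excludes the Part C IME add-on, and any
    Part C IME add-on is added after the blend is determined."""
    shortfall = max(0.0, hospital_specific_rate_payment - federal_rate_payment)
    return federal_rate_payment + 0.75 * shortfall + part_c_ime_addon
```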

Comment: One commenter addressed the general payment methodology for SCHs and the limited number of specified years upon which the hospital-specific rate is based. The commenter stated that the proposal to make additional IME Part C add-on payments to SCHs does not cover IME costs for SCHs that did not have a teaching program during or prior to FY 2006. The commenter suggested allowing rural hospitals to rebase their hospital-specific rate in the fiscal year following the start of a new residency program.

Response: We consider this comment to be outside of the scope of the proposals described above. We also note that the fiscal years upon which the hospital-specific rates are based are specified in the statute. CMS does not have authority to authorize a rebasing of hospital-specific rates absent additional legislation.

After consideration of the public comments we received, we are adopting our proposals without modification. In summary, effective with discharges occurring in cost reporting periods beginning on or after October 1, 2014, our final policies are: (1) To provide all SCHs that are subsection (d) teaching hospitals IME add-on payments for Medicare Part C patient discharges in accordance with section 1886(d)(11) of the Act; and (2) for purposes of the comparison of payments based on the Federal rate and the hospital-specific rate for SCHs under section 1886(d)(5)(D) of the Act, IME add-on payments under section 1886(d)(11) of the Act for Medicare Part C patient discharges will no longer be included in the aggregate payment based on the Federal rate. After the higher of the Federal rate payment or the hospital-specific rate payment under section 1886(d)(5)(D) of the Act is determined, the Part C IME adjustment factor is multiplied by the Federal rate to determine the add-on payment amount under section 1886(d)(11) of the Act, and then any IME add-on payments under section 1886(d)(11) of the Act are added to the payment amount under section 1886(d)(5)(D) of the Act for purposes of determining the hospital's total payment amount.
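A minimal sketch of the final SCH comparison summarized above (illustrative only; the inputs are aggregate amounts for the cost reporting period, and the names are not from the rule):

```python
def sch_total_payment(federal_rate_payment_excluding_part_c_ime: float,
                      hospital_specific_rate_payment: float,
                      part_c_ime_addon: float) -> float:
    """SCH payment under the final FY 2015 policy: compare the Federal rate
    payment (with the Part C IME add-on excluded) to the hospital-specific
    rate payment, pay the higher of the two, and then add the Part C IME
    add-on, which is always computed against the Federal rate."""
    base_payment = max(federal_rate_payment_excluding_part_c_ime,
                       hospital_specific_rate_payment)
    return base_payment + part_c_ime_addon
```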

3. Other Policy Changes Affecting IME

In section IV.K. of the preamble of this final rule, we present other policy changes relating to GME payments, which may also apply to IME payments. We refer readers to that section for a discussion of these policies.

F. Payment Adjustment for Medicare Disproportionate Share Hospitals (DSHs) (§ 412.106)

1. Background

Section 1886(d)(5)(F) of the Act provides for additional Medicare payments to subsection (d) hospitals that serve a significantly disproportionate number of low-income patients. The Act specifies two methods by which a hospital may qualify for the Medicare disproportionate share hospital (DSH) adjustment. Under the first method, hospitals that are located in an urban area and have 100 or more beds may receive a Medicare DSH payment adjustment if the hospital can demonstrate that, during its cost reporting period, more than 30 percent of its net inpatient care revenues are derived from State and local government payments for care furnished to needy patients with low incomes. This method is commonly referred to as the “Pickle method.” The second method for qualifying for the DSH payment adjustment, which is the most common, is based on a complex statutory formula under which the DSH payment adjustment is based on the hospital's geographic designation, the number of beds in the hospital, and the level of the hospital's disproportionate patient percentage (DPP). A hospital's DPP is the sum of two fractions: The “Medicare fraction” and the “Medicaid fraction.” The Medicare fraction (also known as the “SSI fraction” or “SSI ratio”) is computed by dividing the number of the hospital's inpatient days that are furnished to patients who were entitled to both Medicare Part A and Supplemental Security Income (SSI) benefits by the hospital's total number of patient days furnished to patients entitled to benefits under Medicare Part A. The Medicaid fraction is computed by dividing the hospital's number of inpatient days furnished to patients who, for such days, were eligible for Medicaid, but were not entitled to benefits under Medicare Part A, by the hospital's total number of inpatient days in the same period.
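A minimal sketch of the DPP calculation described above (illustrative only; the day counts are aggregates for the hospital's cost reporting period, and the names are not from the regulations):

```python
def disproportionate_patient_percentage(medicare_ssi_days: int,
                                        medicare_part_a_days: int,
                                        medicaid_non_medicare_days: int,
                                        total_patient_days: int) -> float:
    """DPP = Medicare (SSI) fraction + Medicaid fraction.

    The Medicare fraction divides days furnished to patients entitled to both
    Medicare Part A and SSI by all Medicare Part A patient days; the Medicaid
    fraction divides days furnished to Medicaid-eligible patients not entitled
    to Medicare Part A by total patient days for the same period."""
    medicare_fraction = medicare_ssi_days / medicare_part_a_days
    medicaid_fraction = medicaid_non_medicare_days / total_patient_days
    return medicare_fraction + medicaid_fraction
```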

Because the DSH payment adjustment is part of the IPPS, the DSH statutory references (under section 1886(d)(5)(F) of the Act) to “days” apply only to hospital acute care inpatient days. Regulations located at § 412.106 govern the Medicare DSH payment adjustment and specify how the DPP is calculated as well as how beds and patient days are counted in determining the Medicare DSH payment adjustment. Under § 412.106(a)(1)(i), the number of beds for the Medicare DSH payment adjustment is determined in accordance with bed counting rules for the IME adjustment under § 412.105(b).

2. Impact on Medicare DSH Payment Adjustment of Implementation of New OMB Labor Market Delineations

As discussed in section III.B. of the preamble of this final rule, in the FY 2015 IPPS/LTCH PPS proposed rule, we proposed to implement the new OMB labor market area delineations (which are based on 2010 Decennial Census data) for the FY 2015 wage index. We stated that this proposal also would have an impact on the calculation of Medicare DSH payments to certain hospitals. Hospitals that are designated as rural with less than 500 beds and that are not rural referral centers (RRCs) are subject to a maximum DSH payment adjustment of 12 percent. Accordingly, hospitals with less than 500 beds that are currently in urban counties that would become rural if we adopt the new OMB delineations, and that do not become RRCs, would be subject to a maximum DSH payment adjustment of 12 percent. (We note that urban hospitals are only subject to a maximum DSH payment adjustment of 12 percent if they have less than 100 beds.)

Under existing regulations at 42 CFR 412.102, a hospital located in an area that is reclassified from urban to rural, as defined in the regulations, may receive an adjustment to its rural Federal payment amount for operating costs for two successive fiscal years. Specifically, the regulations state that, in the first year after a hospital loses urban status, the hospital will receive an additional payment that equals two-thirds of the difference between the urban standardized amount and disproportionate share payments as applicable to the hospital before its redesignation from urban to rural and the rural standardized amount and disproportionate share payments otherwise applicable to the hospital subsequent to its redesignation from urban to rural. In the second year after a hospital loses urban status, the hospital will receive an additional payment that equals one-third of the difference between the urban standardized amount and disproportionate share payments applicable to the hospital before its redesignation from urban to rural and the rural standardized amount and disproportionate share payments otherwise applicable to the hospital subsequent to its redesignation from urban to rural.

We note that we no longer make a distinction between the urban standardized amount and the rural standardized amount. Rather, hospitals receive the same standardized amount regardless of their geographic designation. Accordingly, we proposed to revise the regulation at § 412.102 to remove references to the urban and rural standardized amounts.

We did not receive any public comments on this proposal and we are adopting the revisions to the regulation at § 412.102 to remove references to the urban and rural standardized amounts.

The provisions of § 412.102 will continue to apply with respect to the calculation of the DSH payments to hospitals that are currently located in urban counties that will become rural under our adoption of the new OMB delineations as described in section III.B.2. of the preamble to this final rule. Specifically, the regulations state that, in the first year after a hospital loses urban status, the hospital will receive an additional payment that equals two-thirds of the difference between disproportionate share payments as applicable to the hospital before its redesignation from urban to rural and the disproportionate share payments otherwise applicable to the hospital subsequent to its redesignation from urban to rural. In the second year after a hospital loses urban status, the hospital will receive an additional payment that equals one-third of the difference between the disproportionate share payments applicable to the hospital before its redesignation from urban to rural and the disproportionate share payments otherwise applicable to the hospital subsequent to its redesignation from urban to rural.

For the purposes of ratesetting, calculating budget neutrality, and modeling payment impacts for this final rule, any hospital that was previously urban but will change to rural status in FY 2015 as a result of the adoption of the new OMB labor market area delineations will have its DSH payments modeled such that the payment equals the amount of the rural disproportionate share payments plus two-thirds of the difference between the urban disproportionate share payments and the rural disproportionate share payments.
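A minimal sketch of the two-year DSH transition described above (illustrative only; the urban and rural amounts are the disproportionate share payments that would apply under the hospital's pre- and post-redesignation status, and the names are not from the regulations):

```python
def transition_dsh_payment(urban_dsh_payment: float,
                           rural_dsh_payment: float,
                           year_after_losing_urban_status: int) -> float:
    """DSH amount during the transition under § 412.102 after a hospital
    loses urban status: the rural amount plus two-thirds of the urban-rural
    difference in the first year, one-third in the second year, and the
    rural amount alone thereafter."""
    difference = urban_dsh_payment - rural_dsh_payment
    if year_after_losing_urban_status == 1:
        return rural_dsh_payment + (2.0 / 3.0) * difference
    if year_after_losing_urban_status == 2:
        return rural_dsh_payment + (1.0 / 3.0) * difference
    return rural_dsh_payment
```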

3. Payment Adjustment Methodology for Medicare Disproportionate Share Hospitals (DSHs) Under Section 3133 of the Affordable Care Act (§ 412.106)

a. General Discussion

Section 3133 of the Patient Protection and Affordable Care Act, as amended by section 10316 of the same act and section 1104 of the Health Care and Education Reconciliation Act (Pub. L. 111-152), added a new section 1886(r) to the Act that modifies the methodology for computing the Medicare DSH payment adjustment beginning in FY 2014. For purposes of this final rule, we refer to these provisions collectively as section 3133 of the Affordable Care Act.

Medicare DSH adjustment payments are calculated under a statutory formula that considers the hospital's Medicare utilization attributable to beneficiaries who also receive Supplemental Security Income (SSI) benefits and the hospital's Medicaid utilization. Beginning with discharges in FY 2014, hospitals that qualify for Medicare DSH payments under section 1886(d)(5)(F) of the Act receive 25 percent of the amount they previously would have received under the statutory formula for Medicare DSH payments. This provision applies equally to hospitals that qualify for DSH payments under section 1886(d)(5)(F)(i)(I) of the Act and those hospitals that qualify under the Pickle method under section 1886(d)(5)(F)(i)(II) of the Act.

The remaining amount, equal to an estimate of 75 percent of what otherwise would have been paid as Medicare DSH payments, reduced to reflect changes in the percentage of individuals under age 65 who are uninsured, is available to make additional payments to each hospital that qualifies for Medicare DSH payments and that has uncompensated care. The payments to each hospital for a fiscal year are based on the hospital's amount of uncompensated care for a given time period relative to the total amount of uncompensated care for that same time period reported by all hospitals that receive Medicare DSH payments for that fiscal year.

As provided by section 3133 of the Affordable Care Act, section 1886(r) of the Act requires that, for FY 2014 and each subsequent fiscal year, a “subsection (d) hospital” that would otherwise receive a “disproportionate share hospital payment . . . made under subsection (d)(5)(F)” receives two separately calculated payments. Specifically, section 1886(r)(1) of the Act provides that the Secretary shall pay to such a subsection (d) hospital (including a Pickle hospital) 25 percent of the amount the hospital would have received under section 1886(d)(5)(F) of the Act for disproportionate share hospital payments, which represents “the empirically justified amount for such payment, as determined by the Medicare Payment Advisory Commission in its March 2007 Report to the Congress.” We refer to this payment as the “empirically justified Medicare DSH payment.”

In addition to this payment, section 1886(r)(2) of the Act provides that, for FY 2014 and each subsequent fiscal year, the Secretary shall pay to “such subsection (d) hospital an additional amount equal to the product of” three factors. The first factor is the difference between “the aggregate amount of payments that would be made to subsection (d) hospitals under subsection (d)(5)(F) if this subsection did not apply” and “the aggregate amount of payments that are made to subsection (d) hospitals under paragraph (1)” for each fiscal year. Therefore, this factor amounts to 75 percent of the payments that would otherwise be made under section 1886(d)(5)(F) of the Act.

The second factor is, for FYs 2014 through 2017, 1 minus the percent change in the percent of individuals under the age of 65 who are uninsured, determined by comparing the percent of such individuals who are uninsured in 2013, the last year before coverage expansion under the Affordable Care Act (as calculated by the Secretary based on the most recent estimates available from the Director of the Congressional Budget Office before a vote in either House on the Health Care and Education Reconciliation Act of 2010 that, if determined in the affirmative, would clear such Act for enrollment), minus 0.1 percentage points for FY 2014, and minus 0.2 percentage points for FYs 2015 through 2017. For FYs 2014 through 2017, the baseline for the estimate of the change in uninsurance is fixed by the most recent estimate of the Congressional Budget Office before the final vote on the Health Care and Education Reconciliation Act of 2010, which is contained in a March 20, 2010 letter from the Director of the Congressional Budget Office to the Speaker of the House. (A link to this letter is included in section IV.F.3.d.(2) of the preamble of this final rule.)

For FY 2018 and subsequent years, the second factor is 1 minus the percent change in the percent of individuals who are uninsured, as determined by comparing the percent of individuals “who are uninsured in 2013 (as estimated by the Secretary, based on data from the Census Bureau or other sources the Secretary determines appropriate, and certified by the Chief Actuary” of CMS, and the percent of individuals “who are uninsured in the most recent period for which data is available (as so estimated and certified), minus 0.2 percentage points for FYs 2018 and 2019.” Therefore, for FY 2018 and subsequent years, the statute provides some greater flexibility in the choice of the data sources to be used for the estimate of the change in the percent of uninsured individuals.

The third factor is a percent that, for each subsection (d) hospital, “represents the quotient of . . . the amount of uncompensated care for such hospital for a period selected by the Secretary (as estimated by the Secretary, based on appropriate data . . .),” including the use of alternative data “where the Secretary determines that alternative data is available which is a better proxy for the costs of subsection (d) hospitals for . . . treating the uninsured,” and “the aggregate amount of uncompensated care for all subsection (d) hospitals that receive a payment under this subsection.” Therefore, this third factor represents a hospital's uncompensated care amount for a given time period relative to the uncompensated care amount for that same time period for all hospitals that receive Medicare DSH payments in that fiscal year, expressed as a percent. For each hospital, the product of these three factors represents its additional payment for uncompensated care for the applicable fiscal year. We refer to the additional payment determined by these factors as the “uncompensated care payment.”
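A minimal sketch of the three-factor uncompensated care payment described above (illustrative only; the aggregate estimates and the FY 2015 reading of Factor 2 shown here are assumptions for the example, and the names are not from the statute):

```python
def factor_2_fy2015(uninsured_rate_2013: float,
                    uninsured_rate_current: float) -> float:
    """One reading of Factor 2 for FY 2015: 1 minus the percent change in
    the rate of uninsurance, minus 0.2 percentage points."""
    percent_change = (uninsured_rate_2013 - uninsured_rate_current) / uninsured_rate_2013
    return (1.0 - percent_change) - 0.002


def uncompensated_care_payment(aggregate_dsh_without_1886r: float,
                               factor_2: float,
                               hospital_uncompensated_care: float,
                               all_dsh_hospitals_uncompensated_care: float) -> float:
    """Hospital uncompensated care payment = Factor 1 x Factor 2 x Factor 3,
    where Factor 1 is 75 percent of the aggregate payments that would
    otherwise be made under section 1886(d)(5)(F), and Factor 3 is the
    hospital's share of uncompensated care among all hospitals receiving
    Medicare DSH payments for the fiscal year."""
    factor_1 = 0.75 * aggregate_dsh_without_1886r
    factor_3 = hospital_uncompensated_care / all_dsh_hospitals_uncompensated_care
    return factor_1 * factor_2 * factor_3
```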

Section 1886(r) of the Act applies to FY 2014 and each subsequent fiscal year. In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50620 through 50647) and the FY 2014 IPPS interim final rule with comment period (78 FR 61191 through 61197), we set forth our policies for implementing the required changes to the DSH payment methodology made by section 3133 of the Affordable Care Act for FY 2014. In those rules, we noted that, because section 1886(r) of the Act modifies the payment required under section 1886(d)(5)(F) of the Act, it affects only the DSH payment under the operating IPPS. It does not revise or replace the capital IPPS DSH payment provided under the regulations at 42 CFR Part 412, Subpart M, which were established through the exercise of the Secretary's discretion in implementing the capital IPPS under section 1886(g)(1)(A) of the Act.

Finally, section 1886(r)(3) of the Act provides that there shall be “no administrative or judicial review under section 1869, section 1878, or otherwise” of “any estimate of the Secretary for purposes of determining the factors described in paragraph (2),” or of “any period selected by the Secretary” for the purpose of determining those factors. Therefore, there is no administrative or judicial review of the estimates developed for purposes of applying the three factors used to determine uncompensated care payments, or the periods selected in order to develop such estimates.

b. Eligibility for Empirically Justified Medicare DSH Payments and Uncompensated Care Payments

As indicated earlier, the payment methodology under section 3133 of the Affordable Care Act applies to “subsection (d) hospitals” that would otherwise receive a “disproportionate share hospital payment . . . made under subsection (d)(5)(F).” Therefore, eligibility for empirically justified Medicare DSH payments is unchanged under section 3133 of the Affordable Care Act. Consistent with the law, hospitals must receive empirically justified Medicare DSH payments in a fiscal year to receive an additional Medicare uncompensated care payment for that year. Specifically, section 1886(r)(2) of the Act states that “[i]n addition to the payment made to a subsection (d) hospital under paragraph (1) . . . the Secretary shall pay to such subsection (d) hospital an additional amount . . .” (emphasis supplied). Because paragraph (1) refers to empirically justified Medicare DSH payments, the additional payment under section 1886(r)(2) of the Act, therefore, is limited to hospitals that receive empirically justified Medicare DSH payments in accordance with section 1886(r)(1) of the Act for the applicable fiscal year.

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50622) and the FY 2014 IPPS interim final rule with comment period (78 FR 61193), we provided that hospitals that are not eligible to receive empirically justified Medicare DSH payments in a fiscal year will not receive uncompensated care payments for that year. We also specified that we would make a determination concerning eligibility for interim uncompensated care payments based on each hospital's estimated DSH status for the applicable fiscal year (using the most recent data that are available). We indicated that our final determination on the hospital's eligibility for uncompensated care payments would be based on the hospital's actual DSH status on the cost report for that payment year.

In the FY 2014 IPPS/LTCH PPS final rule, we also considered whether several specific classes of hospitals are included within the scope of section 1886(r) of the Act. As we specified in that final rule (78 FR 50623), subsection (d) Puerto Rico hospitals that are eligible for DSH payments also are eligible to receive empirically justified Medicare DSH payments and uncompensated care payments under the new payment methodology.

Comment: Several commenters representing the hospital community of Puerto Rico stated that the DSH payment methodology has historically disadvantaged hospitals in Puerto Rico because U.S. citizens residing in Puerto Rico are not entitled to SSI benefits. Because the formula prior to the enactment of section 3133 of the Affordable Care Act relied so heavily on SSI and because SSI is statutorily excluded for citizens residing in Puerto Rico, these commenters asserted that DSH payments to Puerto Rico hospitals were disproportionately depressed in comparison to payments to hospitals in the 50 States. The commenters acknowledged that the new DSH payment formula implemented in FY 2014 represents an improvement because it significantly reduces the value of SSI enrollment in calculating DSH payments. However, the commenters also contended that the continued reliance under the new formula upon SSI enrollment means that payments remain unintentionally and unfairly lowered for hospitals in Puerto Rico. In particular, the commenters noted that one of the three factors in determining the uncompensated care payment is intended to account for a hospital's specific portion of uncompensated care as a percent of uncompensated care by all hospitals. They stated that although CMS has adopted a policy of measuring uncompensated care as the sum of insured low-income Medicaid patient days and SSI days, the use of SSI days in determining uncompensated care is not required by statute. Rather, they noted that the statute (section 1886(r)(2)(C) of the Act) states only that the Secretary determine uncompensated care “as estimated by the Secretary, based on appropriate data.” Therefore, the commenters pointed out that CMS has the discretion to consider other data in place of SSI days to determine uncompensated care. The commenters maintained that the Secretary is obligated to identify a substitute data source for Puerto Rico because section 1886(d)(9)(D) of the Act requires the Secretary to ensure that Medicare DSH payments made to Puerto Rico hospitals are made “in the same manner and to the extent as they apply” to PPS hospitals in the United States. The commenters believed that the revised DSH formula fails to make payments to Puerto Rico hospitals “in the same manner” because it factors in and is based upon an indicator that is not even available in Puerto Rico. Therefore, the commenters believed that DSH payments are applied in a disproportionately reduced manner to Puerto Rico hospitals based upon the inclusion of SSI data. The commenters believed that this outcome is illogical because the main purpose of the DSH payment is to compensate hospitals for the higher costs of treating low-income Medicare patients.

Response: As we discussed in the proposed rule, we believe that SSI data combined with Medicaid data are the best data currently available for estimating hospitals' uncompensated care burdens. Accordingly, we proposed to use both SSI and Medicaid data in our estimates of uncompensated care for all hospitals. We employ the same payment methodology for hospitals in Puerto Rico and the 50 States, and therefore, consistent with section 1886(d)(9)(D) of the Act, Medicare DSH payments are made to subsection (d) Puerto Rico hospitals “in the same manner and to the extent as they apply” elsewhere. Accordingly, we do not agree with the commenters that the statute requires us to develop an alternative methodology for making uncompensated care payments to hospitals in Puerto Rico. Nevertheless, we will consider the issues posed by the commenters for future rulemaking. We would also point out that hospitals in Puerto Rico experienced a significant increase in Medicare DSH payments under the new uncompensated care provision. For example, the impact statement in the FY 2014 IPPS/LTCH PPS final rule (78 FR 51009) showed that Puerto Rico hospitals were expected to experience a 41.3 percent increase in payments from the implementation of the new Medicare DSH payment methodology under section 3133 of the Affordable Care Act.

In addition, in the FY 2014 IPPS/LTCH PPS final rule, we considered whether Maryland hospitals that were paid under section 1814(b)(3) of the Act would be eligible to receive uncompensated care payments. We explained that, under section 1814(b) of the Act, hospitals in the State of Maryland were subject to a waiver from the Medicare payment methodologies under which they would otherwise be paid. Because Maryland waiver hospitals were not paid under the IPPS (section 1886(d) of the Act), in the FY 2014 IPPS/LTCH PPS final rule, we determined that Maryland hospitals that operated under a waiver under section 1814(b)(3) of the Act were not eligible to receive empirically justified Medicare DSH payments and uncompensated care payments under the payment methodology of section 1886(r) of the Act (78 FR 50623). As stated in section IV.H. of the preamble of this final rule, effective January 1, 2014, the State of Maryland elected to no longer have Medicare pay Maryland hospitals in accordance with section 1814(b)(3) of the Act and entered into an agreement with CMS that Maryland hospitals will be paid under the Maryland All-Payer Model. However, under the Maryland All-Payer Model, Maryland hospitals still are not paid under the IPPS. Therefore, they remain ineligible to receive empirically justified Medicare DSH payments or the uncompensated care payments under section 1886(r) of the Act.

SCHs are paid based on their hospital-specific rate from certain specified base years or the IPPS Federal rate, whichever yields the greater aggregate payment for the hospital's cost reporting period. If an SCH is paid under its hospital-specific rate, it is not eligible for Medicare DSH payments. In order to implement the provisions of section 1886(r) of the Act, in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50624), we specified that we will continue to determine interim payments for SCHs based on what we estimate and project their DSH status to be prior to the beginning of the Federal fiscal year (based on the best available data at that time), subject to settlement through the cost report. We also specified that SCHs that receive interim empirically justified Medicare DSH payments in a fiscal year would receive interim uncompensated care payments for that fiscal year on a per discharge basis, subject as well to settlement through the cost report. Final eligibility determinations will be made at the end of the cost reporting period at settlement, and both interim empirically justified Medicare DSH payments and uncompensated care payments will be adjusted accordingly. Therefore, we follow the same processes of interim and final payments for SCHs that we follow for eligible IPPS DSH hospitals generally.

Comment: One commenter stated that the uncompensated care payment amount should be excluded from the payment under the Federal rate when being compared to payments under the hospital-specific rate in order to determine which payment rate an SCH receives. The commenter stated that the hospital-specific rate does not include the cost of care for indigent patients and, therefore, the uncompensated care payment amount should not be part of the comparison of the Federal payment and the hospital-specific payment. The commenter also stated that the uncompensated care payment should be given to a qualifying SCH, regardless of whether the SCH is paid under the hospital-specific rate or the Federal rate.

Response: We addressed a similar comment in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50624), where we stated that we did not agree that an SCH that is paid under the hospital-specific rate should also receive an uncompensated care payment. We found that section 1886(r)(2) of the Act specifies that the uncompensated care payment amount is made in addition to the empirically justified Medicare DSH payment under section 1886(r)(1) of the Act. Therefore, in order to receive an uncompensated care payment, a hospital must receive an empirically justified Medicare DSH payment, and if an SCH is paid under the hospital-specific rate, it does not receive an empirically justified Medicare DSH payment. Furthermore, for the reasons that we discussed in the FY 2014 IPPS/LTCH PPS final rule, we believe it is appropriate to include the uncompensated care payment amount in the payment under the Federal rate for purposes of making the comparison to the hospital-specific payment rate.
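For readers who find an example helpful, the comparison described above can be sketched as follows. This is a simplified, hypothetical rendering of the comparison logic only; it omits the many other components of an actual IPPS payment.

```python
# Simplified, hypothetical sketch of the SCH rate comparison discussed above. The
# Federal rate payment passed in is assumed to already include the uncompensated
# care payment amount, consistent with the policy described in the preamble.
def sch_payment(federal_rate_payment_including_ucp, hospital_specific_rate_payment):
    if federal_rate_payment_including_ucp >= hospital_specific_rate_payment:
        # Paid under the Federal rate; empirically justified DSH and uncompensated
        # care payments are payable.
        return federal_rate_payment_including_ucp
    # Paid under the hospital-specific rate; no DSH or uncompensated care payment.
    return hospital_specific_rate_payment
```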

MDHs are paid based on the IPPS Federal rate or, if higher, the IPPS Federal rate plus 75 percent of the amount by which the Federal rate is exceeded by the updated hospital-specific rate from certain specified base years (76 FR 51684). The IPPS Federal rate used in the MDH payment methodology is the same IPPS Federal rate that is used in the SCH payment methodology. Uncompensated care payments to MDHs were not explicitly addressed in the FY 2014 IPPS/LTCH PPS final rule because, at the time of the publication of the final rule, the MDH program was set to expire at the end of FY 2013. Since the publication of the FY 2014 IPPS/LTCH PPS final rule, the MDH program was extended from October 1, 2013, to March 31, 2014, under the Pathway for SGR Reform Act (Pub. L. 113-67) and was further extended an additional year from April 1, 2014, to March 31, 2015, by the Protecting Access to Medicare Act of 2014 (Pub. L. 113-93). Because MDHs are paid under the IPPS Federal rate and, therefore, are eligible to receive Medicare DSH payments if their disproportionate patient percentage is at least 15 percent, we apply the same process to determine eligibility for Medicare DSH and the uncompensated care payment as we do for all other IPPS hospitals. That is, we make a determination concerning eligibility for interim uncompensated care payments based on each hospital's estimated DSH status for the applicable fiscal year (using the most recent data that are available) and our final determination on the hospital's eligibility for uncompensated care payments would be based on the hospital's actual DSH status on the cost report for that payment year. In addition, as we do for all IPPS hospitals, we would calculate a numerator for Factor 3 for all MDHs, regardless of whether they are projected to be eligible for DSH during the fiscal year, but the denominator for Factor 3 would be based on the uncompensated care data from the hospitals that we have projected to be eligible for DSH during the fiscal year.

Furthermore, in the FY 2014 IPPS interim final rule with comment period (79 FR 15027), which addressed MDH payments for the first 6 months of FY 2014, we established a policy of including a pro rata share of the uncompensated care payment amount for that period as part of the Federal rate payment in the comparison of payments under the hospital-specific rate and the Federal rate. Consistent with that policy, for MDH payments for the first 6 months of FY 2015, a pro rata share of the uncompensated care payment amount for that period will be included as part of the Federal rate payment in the comparison of payments under the hospital-specific rate and the Federal rate. That is, in making this comparison at cost report settlement, we will include the pro rata share of the uncompensated care payment amount that reflects the period of time the hospital was paid under the MDH program for its discharges occurring on or after October 1, 2014, and before April 1, 2015. Consistent with the policy for hospitals with Medicare cost reporting periods that span more than 1 Federal fiscal year, this pro rata share will be determined based on the proportion of the applicable Federal fiscal year that is included in that cost reporting period (78 FR 61192 through 61194). As noted previously, section 106 of Public Law 113-93 provides for an extension of the MDH program through March 31, 2015, only. Therefore, beginning April 1, 2015, all hospitals that previously qualified for MDH status will no longer have MDH status under current law.
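The pro rata calculation described above might be sketched as follows. The simple day-count convention and the variable names are assumptions for illustration only; the actual calculation is performed at cost report settlement.

```python
from datetime import date

# Hypothetical sketch of the pro rata share described above: the portion of the FY 2015
# uncompensated care payment amount attributable to the part of the cost reporting
# period during which the hospital was paid as an MDH (October 1, 2014 through
# March 31, 2015). The day-count proration shown here is an illustrative assumption.
def mdh_pro_rata_share(fy2015_ucp_amount, cost_report_start, cost_report_end):
    fy_start, fy_end = date(2014, 10, 1), date(2015, 9, 30)
    mdh_window_end = date(2015, 3, 31)
    overlap_start = max(cost_report_start, fy_start)
    overlap_end = min(cost_report_end, fy_end, mdh_window_end)
    overlap_days = max((overlap_end - overlap_start).days + 1, 0)
    fy_days = (fy_end - fy_start).days + 1
    return fy2015_ucp_amount * overlap_days / fy_days
```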

IPPS hospitals that have elected to participate in the Bundled Payments for Care Improvement initiative receive a payment that links multiple services furnished to a patient during an episode of care. We have stated in previous rulemaking that those hospitals continue to be paid under the IPPS (77 FR 53342). Hospitals that elect to participate in the initiative can still receive DSH payments while participating in the initiative, if they otherwise meet the requirements for receiving such payments. In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50625), we specified that we will apply the new DSH payment methodology to the hospitals participating in this initiative, so that eligible hospitals will receive empirically justified Medicare DSH payments and uncompensated care payments.

Section 410A of the Medicare Modernization Act established the Rural Community Hospital Demonstration Program. After the initial 5-year period, the demonstration was extended for an additional 5-year period by sections 3123 and 10313 of the Affordable Care Act. There are 23 hospitals currently participating in the demonstration. Under the payment methodology provided in section 410A, participating hospitals receive payment for Medicare inpatient services on the basis of a cost methodology. Specifically, for discharges occurring in the hospitals' first cost reporting period of the initial 5-year demonstration or the first cost reporting period of the 5-year extension, the hospitals participating in the demonstration receive payments for the reasonable cost of providing such services. For discharges occurring in subsequent cost reporting periods during the applicable 5-year period, hospitals receive the lesser of the current year's reasonable cost-based amount, or the previous year's amount updated by the percentage increase in the IPPS market basket (the target amount). The instructions (Change Request 5020 (April 14, 2006) and Change Request 7505 (July 22, 2011)) for the demonstration require that the MAC not pay Medicare DSH payments in addition to the amount received under the reasonable cost-based payment methodology. Because hospitals participating in the demonstration do not receive DSH payments, we determined in the FY 2014 IPPS/LTCH PPS final rule that these hospitals also are excluded from receiving empirically justified Medicare DSH payments and uncompensated care payments under the new payment methodology (78 FR 50625).

c. Empirically Justified Medicare DSH Payments

As we have discussed earlier, section 1886(r)(1) of the Act requires the Secretary to pay 25 percent of the amount of the DSH payment that would otherwise be made under subsection (d)(5)(F) to a subsection (d) hospital. Because section 1886(r)(1) of the Act merely requires the program to pay a designated percentage of these payments, without revising the criteria governing eligibility for DSH payments or the underlying payment methodology, we stated in the FY 2014 IPPS/LTCH PPS final rule that we did not believe that it is necessary to develop any new operational mechanisms for making such payments. Therefore, in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50626), we implemented this provision simply by revising the claims payment methodologies to adjust the interim claim payments to the requisite 25 percent of what would have otherwise been paid. We also made corresponding changes to the hospital cost report so that these empirically justified Medicare DSH payments can be settled at the appropriate level at the time of cost report settlement. We provided more detailed operational instructions and cost report instructions following issuance of the final rule that can be found on the CMS Web site at: http://www.cms.gov/Regulations-and-Guidance/Guidance/Transmittals/2014-Transmittals-Items/R5P240.html.

d. Uncompensated Care Payments

As we have discussed earlier, section 1886(r)(2) of the Act provides that, for each eligible hospital in FY 2014 and subsequent years, the new uncompensated care payment is the product of three factors. These three factors represent our estimate of 75 percent of the amount of Medicare DSH payments that would otherwise have been paid, an adjustment to this amount for the percent change in the national rate of uninsurance compared to the rate of uninsurance in 2013, and each eligible hospital's estimated uncompensated care amount relative to the estimated uncompensated care amount for all eligible hospitals. Below we review the data sources and methodologies for computing each of these factors, our final policies for FY 2014, and our proposed and final policies for FY 2015.

(1) Calculation of Factor 1 for FY 2015

Section 1886(r)(2)(A) of the Act establishes Factor 1 in the calculation of the uncompensated care payment. Section 1886(r)(2)(A) of the Act states that it is a factor “equal to the difference between (i) the aggregate amount of payments that would be made to subsection (d) hospitals under subsection (d)(5)(F) if this subsection did not apply for such fiscal year (as estimated by the Secretary); and (ii) the aggregate amount of payments that are made to subsection (d) hospitals under paragraph (1) for such fiscal year (as so estimated).” Therefore, section 1886(r)(2)(A)(i) of the Act represents the estimated Medicare DSH payment that would have been made under section 1886(d)(5)(F) if section 1886(r) of the Act did not apply for such fiscal year. Under a prospective payment system, we would not know the precise aggregate Medicare DSH payment amount that would be paid for a Federal fiscal year until cost report settlement for all IPPS hospitals is completed, which occurs several years after the end of the Federal fiscal year. Therefore, section 1886(r)(2)(A)(i) of the Act provides authority to estimate this amount, by specifying that, for each fiscal year to which the provision applies, such amount is to be “estimated by the Secretary.” Similarly, section 1886(r)(2)(A)(ii) of the Act represents the estimated empirically justified Medicare DSH payments to be made in a fiscal year, as prescribed under section 1886(r)(1) of the Act. Again, section 1886(r)(2)(A)(ii) of the Act provides authority to estimate this amount.

Therefore, Factor 1 is the difference between our estimates of: (1) The amount that would have been paid in Medicare DSH payments for the fiscal year, in the absence of the new payment provision; and (2) the amount of empirically justified Medicare DSH payments that are made for the fiscal year, which takes into account the requirement to pay 25 percent of what would have otherwise been paid under section 1886(d)(5)(F) of the Act. In other words, this factor represents our estimate of 75 percent (100 percent minus 25 percent) of our estimate of Medicare DSH payments that would otherwise be made, in the absence of section 1886(r) of the Act, for the fiscal year.
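Stated as a simple formula (for illustration only), Factor 1 reduces to 75 percent of the Office of the Actuary's estimate; the names below are ours, not CMS's.

```python
# Illustrative restatement of Factor 1: the difference between the two Office of the
# Actuary estimates equals 75 percent of estimated DSH payments in the absence of
# section 1886(r) of the Act.
def factor_1(estimated_dsh_without_section_1886r):
    empirically_justified = 0.25 * estimated_dsh_without_section_1886r  # section 1886(r)(1)
    return estimated_dsh_without_section_1886r - empirically_justified  # = 0.75 * estimate
```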

In order to determine Factor 1 in the uncompensated care payment formula, in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50628 through 50630) and in the FY 2014 IPPS interim final rule with comment period (78 FR 61194), we adopted a policy under which we develop final estimates of both the aggregate amount of Medicare DSH payments that would be made in the absence of section 1886(r)(1) of the Act and the aggregate amount of empirically justified Medicare DSH payments to hospitals under section 1886(r)(1) of the Act prior to each fiscal year to which the new provision applies. These estimates are not revised or updated after we know the final Medicare DSH payments for the fiscal year. Specifically, in order to determine the two elements of Factor 1 (Medicare DSH payments prior to the application of section 1886(r)(1) of the Act, and empirically justified Medicare DSH payments after application of section 1886(r)(1) of the Act), we use the most recently available projections of Medicare DSH payments for the fiscal year, as calculated by CMS' Office of the Actuary. The Office of the Actuary projects Medicare DSH payments on a biannual basis, typically in February of each year (based on data from December of the previous year) as part of the President's Budget, and in July (based on data from June) as part of the Midsession Review. The estimates are based on the most recently filed Medicare hospital cost report with Medicare DSH payment information, cost report data provided by Indian Health Service (IHS) hospitals to CMS, and the most recent Medicare DSH patient percentages and Medicare DSH payment adjustments provided in the IPPS Impact File.

Therefore, for the Office of the Actuary's February 2014 estimate, the data were based on the December 2013 update of the Medicare Hospital Cost Report Information System (HCRIS), cost report data provided by IHS hospitals to CMS as of December 2013, and the FY 2014 IPPS/LTCH PPS final rule IPPS Impact File, published in conjunction with the publication of the FY 2014 IPPS/LTCH PPS final rule. For the July 2014 estimate, the data are based on the March 2014 update of the HCRIS data, cost report data provided by IHS hospitals to CMS as of March 2014, and the FY 2015 IPPS Proposed Rule Impact File, published in conjunction with the FY 2015 IPPS/LTCH PPS proposed rule (and which is available via the Internet on the CMS Web site). For purposes of the proposed rule, we used the February 2014 Medicare DSH estimates to calculate Factor 1 and to model the proposed impact of this provision. For this final rule, we use the July 2014 Medicare DSH estimates to determine Factor 1 and to model the impact of this provision. In addition, because SCHs paid under their hospital-specific payment rate are excluded from the application of section 1886(r) of the Act, we also exclude SCHs that are projected to be paid under their hospital-specific rate from our Medicare DSH estimates. Similarly, because Maryland hospitals participating in the Maryland All-Payer Model and hospitals participating in the Rural Community Hospital Demonstration do not receive DSH payments, we also exclude these hospitals from our Medicare DSH estimates.

Using the data sources discussed above, the Office of the Actuary uses the most recently submitted Medicare cost report data to identify current Medicare DSH payments, cost report data provided by IHS hospitals to CMS, and the most recent DSH payment adjustments provided in the IPPS Impact File, and applies inflation updates and assumptions for future changes in utilization and case-mix to estimate Medicare DSH payments for the upcoming fiscal year. The February 2014 Office of the Actuary estimate for Medicare DSH payments for FY 2015, without regard to the application of section 1886(r)(1) of the Act, was $14.205 billion. This estimate excludes Maryland hospitals participating in the Maryland All-Payer Model, SCHs paid under their hospital-specific payment rate, and hospitals participating in the Rural Community Hospital Demonstration as discussed above. Therefore, based on this estimate, the estimate for empirically justified Medicare DSH payments for FY 2015, with the application of section 1886(r)(1) of the Act, was $3.551 billion (25 percent of the total amount estimated). Under § 412.106(g)(1)(i) of the regulations, Factor 1 is the difference between these two estimates of the Office of the Actuary. Therefore, for the purpose of modeling Factor 1, we proposed that Factor 1 for FY 2015 would be $10.654 billion ($14.205 billion minus $3.551 billion). We invited public comment on our proposed calculation of Factor 1 for FY 2015.

Comment: A number of commenters supported CMS' methodology for determining Factor 1 and/or the proposed Factor 1 for FY 2015. However, other commenters complained that CMS did not provide enough information in the proposed rule regarding the methodologies, calculations, and data sources used to develop this and other estimates to provide a sufficient basis for comment. With regard to the estimate of Factor 1 in particular, these commenters contended the following:

  • The estimated DSH payments do not account for the impact of Allina v. Sebelius, which would require excluding Medicare Advantage days from the SSI ratio and including dual-eligible Medicare Advantage days in the Medicaid fraction, thus understating the Factor 1 DSH estimate.
  • The 2012 estimated DSH payment figure of $11.720 billion is understated because the 2012 “update” factor (provided in the FY 2015 IPPS Proposed Rule DSH Supplemental Data File, which displays the Office of the Actuary's assumptions in determining the Medicare DSH estimate) is understated. Specifically, a 1.1 percent increase reflecting the Cape Cod litigation result was not applied. As a result, instead of a −0.1 percent update factor, the projection should use a +1.0 percent update factor. Therefore, the 2012 estimated DSH amount should be $11.732 billion.
  • The estimate of DSH payments for FY 2015 of $14.205 billion is understated because the 2015 update factor is understated. Specifically, the productivity adjustment should be 0.4 percent (as projected in the FY 2015 IPPS/LTCH PPS proposed rule), not 0.5 percent. As a result, instead of a 1.2 percent update factor, the projection should use a 1.3 percent update factor. Therefore, including the 2012 correction and the cumulative impact, the 2015 estimated DSH amount should be $14.234 billion.
  • The summary analysis of the DSH estimate includes an adjustment factor for discharges. However, CMS has not provided the detail supporting the discharge factor used. In addition, the footnote to the discharge column states that all inpatient hospitals were used, not just IPPS hospitals. Because the purpose of the projection is to estimate the amount of DSH that will go to a subset of all inpatient hospitals, it would seem appropriate that factors that drive the estimate likewise would include only the hospitals projected to share in the payments.
  • The DSH estimate is subject to 100 percent of any documentation and coding adjustments due to MS-DRGs. The FY 2015 IPPS/LTCH PPS proposed rule refers to a recoupment adjustment of “$11 billion over a 4-year period of FYs 2014, 2015, 2016, and 2017.” CMS should model the impact of such adjustments to the DSH and uncompensated care payments before subjecting the DSH estimate to dramatic adjustments.
  • The “Other” column from the Factor 1 source file is supposed to contain the DSH payment impact factor; it includes the impact of only IPPS discharges and the impact of DSH payments increasing or decreasing at a different rate than other IPPS payments. This single input should at least reflect the changes in DSH payments, which will be significantly impacted by the effects of Medicaid/CHIP expansion. According to the February 2014 CBO report, an additional 12 million people are projected to enroll in Medicaid/CHIP during 2014 and 2015. That represents a 35-percent increase in the Medicaid/CHIP population. Yet, the latest FY 2014 and 2015 “Other” factors only applied a 3.28 percent and a 2.92 percent increase, respectively. Even the pre-Affordable Care Act FY 2012 and 2013 “Other” factors reflected 4.45 percent and 1.56 percent increases, and that was prior to widespread Medicaid expansion.

In light of these and other concerns about data sources and methods, the commenters insisted that CMS adopt a process of reconciling the initial estimates of Factor 1 with actual data for the payment year in conjunction with the final settlement of hospital cost reports.

Response: Below we present the Office of the Actuary's updated estimate of Factor 1. In order to satisfy the commenters' request for additional information, we also provide additional information regarding the data sources, assumptions, and methods employed by the actuaries. We acknowledge that commenters have requested that we establish a reconciliation procedure for Factor 1. However, we continue to believe that applying our best estimates prospectively would be most conducive to administrative efficiency, finality, and predictability in payments (78 FR 50628). As we noted in the FY 2014 IPPS/LTCH PPS final rule, we do not know the aggregate Medicare DSH payment amount that would be paid for each Federal fiscal year until the time of cost report settlements, which occur several years after the end of the fiscal year. Furthermore, because the statute provides that Factor 1 shall be determined based on estimates of the aggregate amount of DSH payments that would be made in the absence of section 1886(r) of the Act and the aggregate amount of empirically justified DSH payments that are made under section 1886(r)(1) of the Act, we do not agree with commenters that we should establish such a reconciliation process at this time. However, we note the following about the Office of the Actuary's estimates. Factor 1 is an estimate of the expected DSH payments under the previous DSH payment methodology under section 1886(d)(5)(F) of the Act. We believe it is reasonable that an estimate should represent a 50-percent chance of being too high and a 50-percent chance of being too low in comparison to actual experience. In reviewing the Office of the Actuary's prior estimates of DSH payments compared to actual experience from FY 2005 to FY 2015, the original estimates have been higher than actual experience for 7 of the 11 years and lower than actual experience in only 4 years. This result is reasonably consistent with the expectation that an estimate has a 50-percent chance of being too high and a 50-percent chance of being too low.

As indicated above, the Office of the Actuary uses the most recently submitted Medicare cost report data to identify current Medicare DSH payments, cost report data provided by IHS hospitals to CMS, and the most recent DSH payment adjustments provided in the IPPS Impact File, and applies inflation updates and assumptions for future changes in utilization and case-mix to estimate Medicare DSH payments for the upcoming fiscal year. The July 2014 Medicare DSH estimate for FY 2015, without regard to the application of section 1886(r)(1) of the Act, is $13,383,462,195.71. This estimate excludes Maryland hospitals participating in the Maryland All-Payer Model, SCHs paid under their hospital-specific payment rate, and hospitals participating in the Rural Community Hospital Demonstration as discussed above. Therefore, based on this estimate, the estimate for empirically justified Medicare DSH payments for FY 2015, with the application of section 1886(r)(1) of the Act, is $3,345,865,548.93 (25 percent of the total amount estimated). Under § 412.106(g)(1)(i) of the regulations, Factor 1 is the difference between these two estimates of the Office of the Actuary. Therefore, in this final rule, we are providing that Factor 1 for FY 2015 is $10,037,596,646.78 ($13,383,462,195.71 minus $3,345,865,548.93). Below we provide additional detail regarding the development of this estimate in response to the commenters.
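As an arithmetic cross-check of the figures just cited (illustrative only; the variable names are ours):

```python
# Cross-check of the final FY 2015 Factor 1 figures cited above (illustrative only).
estimated_dsh_fy2015 = 13_383_462_195.71            # July 2014 Office of the Actuary estimate
empirically_justified_fy2015 = round(0.25 * estimated_dsh_fy2015, 2)
factor_1_fy2015 = round(estimated_dsh_fy2015 - empirically_justified_fy2015, 2)
# empirically_justified_fy2015 -> 3,345,865,548.93; factor_1_fy2015 -> 10,037,596,646.78
```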

The Office of the Actuary's estimate begins with a baseline of $11.499 billion in Medicare DSH expenditures for FY 2011. The following table shows the factors applied to update this baseline through the current estimate for FY 2015:

Increases From 2011

FY | Update | Discharge | Case-mix | Other | Total | DSH ($ in millions)
2012 | 0.999 | 0.9701 | 1.007 | 1.0447 | 1.019537 | 11,724
2013 | 1.028 | 0.9799 | 1.014 | 1.0132 | 1.034923 | 12,133
2014 | 1.009 | 0.9855 | 1.005 | 1.0355 | 1.034818 | 12,556
2015 | 1.014 | 1.0116 | 1.005 | 1.034 | 1.065942 | 13,383

In this table, the discharge column shows the increase in the number of Medicare inpatient hospital discharges. The figures for FYs 2012 and 2013 are based on Medicare claims data which have been adjusted by a completion factor. The discharge figure for FY 2014 is based on preliminary data for 2014. The discharge figure for FY 2015 is an assumption based on recent trends recovering back to the long-term trend and on assumptions related to how many beneficiaries will be enrolled in Medicare Advantage (MA) plans. The case-mix column shows the increase in case-mix for IPPS hospitals. The case-mix figures for FYs 2012 and 2013 are based on actual data adjusted by a completion factor. The FY 2014 and FY 2015 increases are based on the recommendation of the 2010-2011 Medicare Technical Review Panel. The “other” column shows the increase in other factors which contribute to the Medicare DSH estimates. These factors include the difference between total inpatient hospital discharges and IPPS discharges, and various adjustments to the payment rates which have been included over the years but are not reflected in the other columns (such as the increase in rates for the Cape Cod litigation and the reduction in rates for the 2-midnight policy). In addition, this column includes a factor for the Medicaid expansion due to the Affordable Care Act. However, the increase due to the Medicaid expansion is not as large as commenters contended due to the actuarial assumption that the new enrollees are healthier than the average Medicaid recipient and, therefore, use fewer hospital services. We have included the impact of the Medicaid expansion in the FY 2015 DSH estimate and note that it was also included in the FY 2014 DSH estimate. Our estimates are as follows:

 | FY 2014 | FY 2015
Medicaid enrollment pre-ACA (in millions) | 56.0 | 55.9
Medicaid enrollment post-ACA (in millions) | 64.7 | 69.8
Under 65 pre-ACA enrollment (in millions) | 50.6 | 50.4
Under 65 post-ACA enrollment (in millions) | 59.3 | 64.3
Increase in Medicare DSH | 4.9% | 3.4%

As can be seen in the table above, there is assumed to be a 4.9 percent increase in Medicare DSH due to the Medicaid expansion in FY 2014, and an additional 3.4 percent increase in Medicare DSH in FY 2015. This results in approximately an 8.5 percent increase due to the Medicaid expansion by FY 2015. This estimate is lower than the commenters may have expected due to the assumption that the expansion population is healthier than the rest of the Medicaid population and will utilize fewer hospital services. This factor in the estimate is included in the “other” column of the breakdown.

The next table below shows the factors that are included in the “update” column of the above table:

FY | Market basket | Affordable Care Act payment reductions | Productivity | Documentation and coding | Total
2012 | 3.0 | 0.1 | 1.0 | −2.0 | −0.1
2013 | 2.6 | 0.1 | 0.7 | 1.0 | 2.8
2014 | 2.5 | 0.3 | 0.5 | −0.8 | 0.9
2015 | 2.9 | 0.2 | 0.5 | −0.8 | 1.4

In this table, all figures are percentage updates and are based on the Midsession Review of the FY 2015 Budget projections.
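As a rough arithmetic check (illustrative only), the figures in the two tables above chain together: the “total” update in the second table corresponds to the “update” factor in the first table, and the product of the four factors carries the FY 2011 baseline forward.

```python
# Illustrative check of the tables above. The update factors (0.999, 1.028, 1.009, 1.014)
# correspond to the "total" column of the update breakdown (-0.1, 2.8, 0.9, 1.4 percent).
factors = {  # FY: (update, discharge, case_mix, other)
    2012: (0.999, 0.9701, 1.007, 1.0447),
    2013: (1.028, 0.9799, 1.014, 1.0132),
    2014: (1.009, 0.9855, 1.005, 1.0355),
    2015: (1.014, 1.0116, 1.005, 1.034),
}
dsh_millions = 11_499.0  # FY 2011 baseline, in millions of dollars
for fy, (update, discharge, case_mix, other) in factors.items():
    total = update * discharge * case_mix * other
    dsh_millions *= total
    print(fy, round(total, 6), round(dsh_millions))  # FY 2015 -> ~1.065942 and ~13,383
```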

With regard to the assumed update factor for FY 2012, the commenters are correct that the update to the Federal standardized amount due to the Cape Cod litigation should be reflected in our DSH estimate. However, we have included it in the DSH estimate, and the 1.1 percent increase is reflected in the “other” column. We do not consider it to be part of the update, which is consistent with our treatment of the 0.2 percent reduction to the rate in FY 2014 for the 2-midnight policy finalized in the FY 2014 IPPS/LTCH PPS final rule, which is also included in the “other” column.

We agree with the commenters that the update factor used in our FY 2015 Medicare DSH estimates should reflect the productivity adjustment finalized for FY 2015. Accordingly, we have revised our FY 2015 Medicare DSH estimates to reflect this final productivity adjustment. We also agree with the commenters that the DSH estimates are only affected by IPPS discharges. However, the discharge figures reflect all inpatient hospitals, and we adjust the Medicare DSH estimates to take into account the difference between the increase in discharges for all inpatient hospitals and the IPPS hospital discharge increase in the “other” column. If the “discharge” column were limited to IPPS hospitals, the “discharge” column would be lower and the “other” column would be higher, and the increase reflected in the “total” column would be the same.

The commenters also are correct that the documentation and coding adjustments for future years could be more than a 0.8 percent reduction in order to comply with the $11 billion requirement, but those figures have not yet been determined. A larger reduction is possible because the number of discharges has decreased significantly.

Lastly, we do not believe that the decision in Allina v. Sebelius is relevant to our estimate of Factor 1 for FY 2015. The decision in Allina did not address the FY 2014 IPPS/LTCH PPS final rule (78 FR 50614 through 50620) in which we readopted the policy of counting Medicare Advantage days in the SSI ratio for FY 2014 and all subsequent fiscal years. Accordingly, consistent with that policy, our estimate of Factor 1 for FY 2015 appropriately accounts for Medicare Advantage days by including them in the SSI ratio.

(2) Calculation of Factor 2 for FY 2015

Section 1886(r)(2)(B) of the Act establishes Factor 2 in the calculation of the uncompensated care payment. Specifically, section 1886(r)(2)(B)(i) of the Act provides: “For each of fiscal years 2014, 2015, 2016, and 2017, a factor equal to 1 minus the percent change in the percent of individuals under the age of 65 who are uninsured, as determined by comparing the percent of such individuals (I) who are uninsured in 2013, the last year before coverage expansion under the Patient Protection and Affordable Care Act (as calculated by the Secretary based on the most recent estimates available from the Director of the Congressional Budget Office before a vote in either House on the Health Care and Education Reconciliation Act of 2010 that, if determined in the affirmative, would clear such Act for enrollment); and (II) who are uninsured in the most recent period for which data is available (as so calculated), minus 0.1 percentage points for fiscal year 2014 and minus 0.2 percentage points for each of fiscal years 2015, 2016, and 2017.”

Section 1886(r)(2)(B)(i)(I) of the Act further indicates that the percent of individuals under 65 without insurance in 2013 must be the percent of such individuals “who are uninsured in 2013, the last year before coverage expansion under the Patient Protection and Affordable Care Act (as calculated by the Secretary based on the most recent estimates available from the Director of the Congressional Budget Office before a vote in either House on the Health Care and Education Reconciliation Act of 2010 that, if determined in the affirmative, would clear such Act for enrollment).” The Health Care and Education Reconciliation Act (Pub. L. 111-152) was enacted on March 30, 2010. It was passed in the House of Representatives on March 21, 2010, and by the Senate on March 25, 2010. Because the House of Representatives was the first House to vote on the Health Care and Education Reconciliation Act of 2010 on March 21, 2010, we have determined that the most recent estimate available from the Director of the Congressional Budget Office “before a vote in either House on the Health Care and Education Reconciliation Act of 2010 . . .” (emphasis added) appeared in a March 20, 2010 letter from the director of the CBO to the Speaker of the House. Therefore, we believe that only the estimates in this March 20, 2010 letter meet the statutory requirement under section 1886(r)(2)(B)(i)(I) of the Act. (To view the March 20, 2010 letter, we refer readers to the Web site at: http://www.cbo.gov/sites/default/files/cbofiles/ftpdocs/113xx/doc11379/amendreconprop.pdf.)

In its March 20, 2010 letter to the Speaker of the House of Representatives, the CBO provided two estimates of the “post-policy uninsured population.” The first estimate is of the “Insured Share of the Nonelderly Population Including All Residents” (82 percent) and the second estimate is of the “Insured Share of the Nonelderly Population Excluding Unauthorized Immigrants” (83 percent). In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50631), we used the first estimate that includes all residents, including unauthorized immigrants. We stated that we believe this estimate is most consistent with the statute which requires us to measure “the percent of individuals under the age of 65 who are uninsured,” and provides no exclusions except for individuals over the age of 65. In addition, we stated that we believe that this estimate more fully reflects the levels of uninsurance in the United States that influence uncompensated care for hospitals than the estimate that reflects only legal residents. The March 20, 2010 CBO letter reports these figures as the estimated percentage of individuals with insurance. However, because section 1886(r)(2)(B)(i) of the Act requires that we compare the percent of individuals who are uninsured in the applicable year with the percent of individuals who were uninsured in 2013, in the FY 2014 IPPS/LTCH PPS final rule, we used the CBO insurance rate figure and subtracted that amount from 100 percent (that is the total population without regard to insurance status) to estimate the 2013 baseline percent of individuals without insurance. Therefore, for FYs 2014 through 2017, our estimate of the uninsurance percentage for 2013 is 18 percent.

Section 1886(r)(2)(B)(i) of the Act requires that we compare the baseline uninsurance rate to the percent of such individuals “who are uninsured in the most recent period for which data is available (as so calculated).” In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50634), we used the same data source, CBO estimates, to calculate this percent of individuals without insurance. In response to public comments, we also agreed that we should normalize the CBO estimates, which are based on the calendar year, for the Federal fiscal years for which each calculation of Factor 2 is made (78 FR 50633). Therefore, in the FY 2014 IPPS/LTCH PPS final rule, we employed the most recently available estimate, specifically CBO's May 2013 estimates of the effects of the Affordable Care Act on health insurance coverage (which are available at: http://www.cbo.gov/sites/default/files/cbofiles/attachments/44190_EffectsAffordableCareActHealthInsuranceCoverage_2.pdf) as amended by CBO's July 2013 estimates of changes in estimates of the effects of insurance coverage provisions in the Affordable Care Act issued in conjunction with a memo regarding “Analysis of the Administration's Announced Delay of Certain Requirements Under the Affordable Care Act,” which are available at: http://www.cbo.gov/sites/default/files/cbofiles/attachments/44465-ACA.pdf. The CBO's May 2013 estimate of the rate of insurance for CY 2013 was 80 percent, and for CY 2014 was 84 percent. Therefore, the calculation of Factor 2 for FY 2014, employing a weighted average of the CBO projections for CY 2013 and CY 2014, was as follows:

  • CY 2013 rate of insurance coverage (May 2013 CBO estimate): 80 percent.
  • CY 2014 rate of insurance coverage (May 2013 CBO estimate, updated with July 2013 CBO estimate): 84 percent.
  • FY 2014 rate of insurance coverage: (80 percent * .25) + (84 percent * .75) = 83 percent.
  • Percent of individuals without insurance for 2013 (March 2010 CBO estimate): 18 percent.
  • Percent of individuals without insurance for FY 2014 (weighted average): 17 percent.

1 − |(0.17 − 0.18)/0.18| = 1 − 0.056 = 0.944 (94.4 percent).

0.944 (94.4 percent) − 0.001 (0.1 percentage points) = 0.943 (94.3 percent).

0.943 = Factor 2

Therefore, in the FY 2014 IPPS/LTCH PPS final rule, we adopted 0.943 as the final determination of Factor 2 for FY 2014. In conjunction with this determination, we also determined in the FY 2014 IPPS/LTCH PPS final rule and later revised in the FY 2014 IPPS interim final rule with comment period (78 FR 61195) that the amount available for uncompensated care payments for FY 2014 would be approximately $9.046 billion (0.943 times our Factor 1 estimate of $9.593 billion).
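The Factor 2 arithmetic described above can be expressed compactly as follows. This is an illustrative sketch of the published calculation, not a statutory formula; the function name and rounding conventions are ours.

```python
# Illustrative sketch of the Factor 2 calculation described above (FY 2014 values shown).
def factor_2(cy1_insured_rate, cy2_insured_rate, baseline_uninsured_rate, offset):
    # Weight the two calendar-year CBO insured-rate estimates into a fiscal-year rate
    # (one quarter of the first CY, three quarters of the second CY), convert to an
    # uninsured rate, and apply the statutory formula and percentage-point offset.
    fy_uninsured = 1.0 - (0.25 * cy1_insured_rate + 0.75 * cy2_insured_rate)
    change = abs((fy_uninsured - baseline_uninsured_rate) / baseline_uninsured_rate)
    return (1.0 - change) - offset

# FY 2014: CY 2013 insured 80 percent, CY 2014 insured 84 percent,
# 2013 baseline uninsured 18 percent, offset 0.1 percentage point.
print(round(factor_2(0.80, 0.84, 0.18, 0.001), 3))  # 0.943
```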

For the FY 2015 proposed rule, we used CBO's February 2014 estimates of the effects of the Affordable Care Act on health insurance coverage (which are available at http://www.cbo.gov/publication/43900?utm_source=feedblitz&utm_medium=FeedBlitzEmail&utm_content=812526&utm_campaign=0). The CBO's February 2014 estimate of individuals under the age of 65 with insurance in CY 2014 was 84 percent. Therefore, the CBO's most recent estimate of the rate of uninsurance in CY 2014 at the time of the FY 2015 IPPS/LTCH PPS proposed rule was 16 percent (that is, 100 percent minus 84 percent.) Similarly, the CBO's February 2014 estimate of individuals under the age of 65 with insurance in CY 2015 was 86 percent. Therefore, the CBO's most recent estimate of the rate of uninsurance in CY 2015 available at the time of the FY 2015 IPPS/LTCH PPS proposed rule was 14 percent (that is, 100 percent minus 86 percent.)

The calculation of the proposed Factor 2 for FY 2015, employing a weighted average of the CBO projections for CY 2014 and CY 2015, was as follows:

  • CY 2014 rate of insurance coverage (February 2014 CBO estimate): 84 percent.
  • CY 2015 rate of insurance coverage (February 2014 CBO estimate): 86 percent.
  • FY 2015 rate of insurance coverage: (84 percent * .25) + (86 percent * .75) = 85.5 percent.
  • Percent of individuals without insurance for 2013 (March 2010 CBO estimate): 18 percent.
  • Percent of individuals without insurance for FY 2015 (weighted average): 14.5 percent.

1 − |(0.145 − 0.18)/0.18| = 1 − 0.19444 = 0.80556 (80.556 percent)

0.80556 (80.556 percent) − 0.002 (0.2 percentage points for FY 2015 under section 1886(r)(2)(B)(i) of the Act) = 0.8036 (80.36 percent)

0.8036 = Factor 2

Therefore, we proposed that Factor 2 for FY 2015 would be 0.8036. We indicated that our proposal for Factor 2 was subject to change if more recent CBO estimates of the insurance rate became available at the time of the preparation of the final rule. We invited public comments on our proposed calculation of Factor 2 for FY 2015.

Comment: A number of commenters supported the use of the CBO estimates for determining Factor 2. However, other commenters objected to CMS' proposed calculation of Factor 2. Some commenters found that the calculation of Factor 2 appeared arbitrary. For example, some of the commenters complained that a 2-percent decrease in the percentage of uninsured does not seem reasonable based on current economic conditions. Other commenters asserted that, in their views, the Affordable Care Act was not implemented until January 1, 2014, so that such a large decrease in uninsured is very speculative and without historical data. Commenters requested additional information on how the CBO calculates its insurance estimates, including the assumptions in its estimates. Commenters also requested reconciliation of the Factor 2 estimates with actual data at the time of cost report settlements. While these commenters understood that estimates must be used for interim payments, they believed that more accurate numbers based on actual experience should be available for purposes of determining final payments at the time of cost report settlement.

Response: We note that, in the FY 2014 IPPS/LTCH PPS final rule, we finalized a policy to employ the most recent CBO estimates of the rates of uninsurance in the calculation of Factor 2 for FY 2014 and subsequent years, and did not adopt any policy for reconciling those estimates. In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50632), we stated that we believe that employing actual data as the basis for reconciling the projections employed to determine Factor 2 would impose an unacceptable delay in the final determination of uncompensated care payments. Actual data on the rates of insurance and uninsurance would not become available until several years after the payment year, and the initial data for the year would continue to be adjusted for several years after that as further data become available.

In its April 2014 report, the CBO and the Joint Committee on Taxation (JCT) estimated that the Affordable Care Act would result in insurance coverage for 12 million more nonelderly individuals in FY 2014 than in the absence of the Affordable Care Act. The coverage projections included the changes arising from participation in the health insurance exchanges, Medicaid and CHIP enrollment, and changes in employer-sponsored, nongroup and other insurance coverage. Included in the uninsured population are undocumented immigrants who are not eligible for Medicaid and exchange coverage and low-income residents of States not participating in the Medicaid expansion. In addition, other individuals will choose to remain uninsured, despite being eligible for Medicaid or having access through an employer, the exchange, or from an insurer.

Congressional Budget Office. Updated Estimates of the Insurance Coverage Provisions of the Affordable Care Act (April 2014). http://www.cbo.gov/sites/default/files/cbofiles/attachments/45231-ACA_Estimates_OneColumn.pdf.

The CBO and JCT estimate of the increase in insurance coverage represents the number of people who are expected to be insured this year under current law minus the number who would have been insured this year in the absence of the Affordable Care Act. More people are expected to obtain insurance through the exchanges over time due to subsidies and penalties for noncoverage. CBO and JCT expected more people to obtain insurance through Medicaid and CHIP because of increased eligibility due to the Medicaid expansion and more enrollments among those who were previously eligible for Medicaid or CHIP but would not have enrolled in the absence of the Affordable Care Act. Overall, the net coverage effect is a large decrease in the uninsured population.

Because not all States have expanded their Medicaid programs, the CBO and JCT revised their estimates for changes in the insured population due to Medicaid expansion. The table below presents the updated estimates of the change in insurance coverage under Medicaid and CHIP under the Affordable Care Act. The CBO and JCT revised their estimates to indicate a decrease in the number of insured individuals in CYs 2014 and 2015. In addition, CBO and JCT did not rely on State predictions about the Medicaid expansion under the Affordable Care Act. Instead, they projected the approximate shares of the affected population residing in States that will fall into different broad categories. The broad categories range from States that did not expand their Medicaid programs to States that chose to expand Medicaid. Due to the uncertainty of States' actions, estimates by the CBO and JCT reflected an assessment of the different outcome probabilities and the middle of the distribution of all possible outcomes. For instance, the CBO's and JCT's estimates considered multiple factors that are associated with a State's choice on whether to expand Medicaid eligibility: overall budgetary situation; current thresholds for Medicaid eligibility; the amounts that States and local governments spend to provide health care to the uninsured or to pay providers for uncompensated care; the number of people likely to enroll in the program after expansion; the Federal contributions toward the cost of their care; and other factors.

Congressional Budget Office. Estimates for the Insurance Coverage Provisions of the Affordable Care Act Updated for the Recent Supreme Court Decision (July 2012). http://www.cbo.gov/sites/default/files/cbofiles/attachments/43472-07-24-2012-CoverageEstimates.pdf.

Estimates of the Increase in Insurance Coverage Due to Medicaid and CHIP Under the Affordable Care Act *

Last updated date | 2013 | 2014 | 2015 | 2016 | 2017 | 2018 | 2019 | 2020 | 2021 | 2022
July 2012 | 1 | 7 | 9 | 10 | 10 | 11 | 11 | 11 | 11 | 11
February 2013 | 1 | 8 | 11 | 11 | 11 | 11 | 11 | 12 | 12 | 12
May 2013 | 1 | 9 | 12 | 12 | 12 | 12 | 12 | 13 | 13 | 13
February 2014 |  | 8 | 12 | 12 | 12 | 12 | 12 | 13 | 13 | 13
April 2014 |  | 7 | 11 | 12 | 12 | 13 | 13 | 13 | 13 | 13
Source: CBO reports on effects of the Affordable Care Act on health insurance coverage (July 2012-April 2014) http://www.cbo.gov/sites/default/files/cbofiles/attachments/43900-2014-04-ACAtables2.pdf.
* Millions of nonelderly people, by calendar year.

In their April 2014 report, CBO and JCT estimated that an average of 6 million people will be covered by insurance obtained through the exchanges by the end of CY 2014. The estimate was determined at the national level instead of at the level of individual States. Although CBO and JCT did not account for the variations of success in obtaining health insurance through the exchanges by State, they did account for the possibility of individuals moving in and out of insurance coverage over time due to changes in employment, family circumstances, and other factors.

The CBO and JCT estimates therefore do take into account some uncertainties and risks under the Affordable Care Act, including the probabilities of different outcomes of Medicaid expansions and changes in insurance coverage status over time.

For the FY 2015 final rule, we use the CBO's April 2014 estimates of the effects of the Affordable Care Act on health insurance coverage (which are available at http://www.cbo.gov/sites/default/files/cbofiles/attachments/43900-2014-04-ACAtables2.pdf). The CBO's April 2014 estimate of individuals under the age of 65 with insurance in CY 2014 is 84 percent. Therefore, the CBO's most recent estimate of the rate of uninsurance in CY 2014 is 16 percent (that is, 100 percent minus 84 percent.) Similarly, the CBO's April 2014 estimate of individuals under the age of 65 with insurance in CY 2015 is 87 percent. Therefore, the CBO's most recent estimate of the rate of uninsurance in CY 2015 available for this final rule is 13 percent (that is, 100 percent minus 87 percent.)

The calculation of the final Factor 2 for FY 2015, employing a weighted average of the CBO projections for CY 2014 and CY 2015, is as follows:

  • CY 2014 rate of insurance coverage (April 2014 CBO estimate): 84 percent.
  • CY 2015 rate of insurance coverage (April 2014 CBO estimate): 87 percent.
  • FY 2015 rate of insurance coverage: (84 percent * .25) + (87 percent * .75) = 86.25 percent.
  • Percent of individuals without insurance for 2013 (March 2010 CBO estimate): 18 percent.
  • Percent of individuals without insurance for FY 2015 (weighted average): 13.75 percent.

1 − |(0.1375 − 0.18)/0.18| = 1 − 0.2361 = 0.7639 (76.39 percent)

0.7639 (76.39 percent) − 0.002 (0.2 percentage points for FY 2015 under section 1886(r)(2)(B)(i) of the Act) = 0.7619 or 76.19 percent

0.7619 = Factor 2

Therefore, the final Factor 2 for FY 2015 is 76.19 percent.

The FY 2015 Final Uncompensated Care Amount is: $10,037,596,646.78 × 0.7619 = $7,647,644,885.18.

FY 2015 Final Uncompensated Care Total Available—$7,647,644,885.18.
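For readers who want to trace the arithmetic above end to end, the following short Python sketch restates the Factor 2 calculation and the resulting FY 2015 uncompensated care amount. It is only an illustration of the computation described in this section, not CMS source code; the inputs are the CBO coverage estimates and the dollar amount quoted above, and the 0.25/0.75 weights mirror the weighted average shown above (FY 2015 spans 3 months of CY 2014 and 9 months of CY 2015).

```python
# Illustrative restatement (not CMS source code) of the FY 2015 Factor 2
# arithmetic shown above. Inputs are taken directly from the text of this
# final rule.

cy2014_insured = 0.84       # April 2014 CBO estimate of insurance coverage, CY 2014
cy2015_insured = 0.87       # April 2014 CBO estimate of insurance coverage, CY 2015
baseline_uninsured = 0.18   # March 2010 CBO estimate of uninsurance for 2013

# Weight the calendar-year estimates to the Federal fiscal year
# (FY 2015 spans 3 months of CY 2014 and 9 months of CY 2015).
fy2015_insured = cy2014_insured * 0.25 + cy2015_insured * 0.75   # 0.8625
fy2015_uninsured = 1 - fy2015_insured                            # 0.1375

# Percent change in the uninsured rate relative to the 2013 baseline,
# followed by the 0.2 percentage point reduction for FY 2015 under
# section 1886(r)(2)(B)(i) of the Act.
change_in_uninsured = round(abs((fy2015_uninsured - baseline_uninsured) / baseline_uninsured), 4)
factor2 = (1 - change_in_uninsured) - 0.002                      # 0.7619

# Apply Factor 2 to the dollar amount shown above (the amount to which
# Factor 2 is applied) to obtain the FY 2015 uncompensated care amount.
factor1_amount = 10_037_596_646.78
uncompensated_care_pool = factor1_amount * factor2
print(f"Factor 2 = {factor2:.4f}")
print(f"FY 2015 uncompensated care amount = ${uncompensated_care_pool:,.2f}")
```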

(3) Calculation of Factor 3 for FY 2015

Section 1886(r)(2)(C) of the Act defines Factor 3 in the calculation of the uncompensated care payment. As we have discussed earlier, section 1886(r)(2)(C) of the Act states that Factor 3 is “equal to the percent, for each subsection (d) hospital, that represents the quotient of (i) the amount of uncompensated care for such hospital for a period selected by the Secretary (as estimated by the Secretary, based on appropriate data (including, in the case where the Secretary determines alternative data is available which is a better proxy for the costs of subsection (d) hospitals for treating the uninsured, the use of such alternative data)); and (ii) the aggregate amount of uncompensated care for all subsection (d) hospitals that receive a payment under this subsection for such period (as so estimated, based on such data).”

Therefore, Factor 3 is a hospital-specific value that expresses the proportion of the estimated uncompensated care amount for each subsection (d) hospital and each subsection (d) Puerto Rico hospital with the potential to receive DSH payments relative to the estimated uncompensated care amount for all hospitals estimated to receive DSH payments in the fiscal year for which the uncompensated care payment is to be made. Factor 3 is applied to the product of Factor 1 and Factor 2 to determine the amount of the uncompensated care payment that each eligible hospital will receive for FY 2014 and subsequent fiscal years. In order to implement the statutory requirements for this factor of the uncompensated care payment formula, it was necessary to determine: (1) The definition of uncompensated care or, in other words, the specific items that are to be included in the numerator (that is, the estimated uncompensated care amount for an individual hospital) and denominator (that is, the estimated uncompensated care amount for all hospitals estimated to receive DSH payments in the applicable fiscal year); (2) the data source(s) for the estimated uncompensated care amount; and (3) the timing and manner of computing the quotient for each hospital estimated to receive DSH payments. The statute instructs the Secretary to estimate the amounts of uncompensated care for a period “based on appropriate data.” In addition, we note that the statute permits the Secretary to use alternative data “in the case where the Secretary determines that alternative data is available,” which is a better proxy for the costs of subsection (d) hospitals for treating uninsured individuals.
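Stated compactly, and using notation introduced here only for illustration (the statute does not define these symbols), the quotient and its role in the payment calculation can be written as follows.

```latex
% Illustrative restatement of the Factor 3 quotient described above.
% UC_h denotes the estimated uncompensated care amount for hospital h;
% the sum runs over all subsection (d) hospitals estimated to receive
% DSH payments for the fiscal year.
\[
  \text{Factor 3}_h = \frac{UC_h}{\sum_{j} UC_j},
  \qquad
  \text{Uncompensated care payment}_h = \text{Factor 1} \times \text{Factor 2} \times \text{Factor 3}_h
\]
```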

In the course of considering how to determine Factor 3 during the rulemaking process for FY 2014, we considered defining the amount of uncompensated care for a hospital as the uncompensated care costs of each hospital and considered potential data sources for those costs. For purposes of selecting an appropriate data source for this possible definition of uncompensated care costs, we reviewed the literature and available data sources and determined that Worksheet S-10 of the Medicare cost report could potentially provide the most complete data for Medicare hospitals. (We refer readers to the report “Improvements to Medicare Disproportionate Share (DSH) Payments” for a full discussion and evaluation of the available data sources. The report is available on the CMS Web site at: http://www.cms.gov/Medicare/Medicare-Fee-for-Service-Payment/AcuteInpatientPPS/dsh.html.) However, we noted that Worksheet S-10 is a relatively new data source that has been used for specific payment purposes only in relatively restricted ways (for example, to provide a source of charity care charges in the computation of EHR incentive payments (75 FR 44456)). We also noted that some stakeholders have expressed concern that hospitals have not had enough time to learn how to submit accurate and consistent data through this reporting mechanism. Other stakeholders have maintained that some instructions for Worksheet S-10 still require clarification in order to ensure standardized and consistent reporting by hospitals. At the same time, we noted that Worksheet S-10 is the only national data source that includes data for all Medicare hospitals and is designed to elicit data on uncompensated care costs. We discussed the possible use of data reported on Worksheet S-10 to determine uncompensated care costs in more detail in the FY 2014 IPPS/LTCH PPS proposed rule (78 FR 27586).

Because of concerns regarding variations in the data reported on Worksheet S-10 of the Medicare cost report and the completeness of these data, we did not propose to use data from the Worksheet S-10 to determine the amount of uncompensated care. However, we stated our belief that Worksheet S-10 of the Medicare cost report would otherwise be an appropriate data source to determine uncompensated care costs. In particular, we noted that Worksheet S-10 was developed specifically to collect information on uncompensated care costs in response to interest by MedPAC and other stakeholders regarding the topic (for example, MedPAC's March 2007 Report to Congress) and that it is not unreasonable to expect information on the cost report to be used for payment purposes. Furthermore, hospitals attest to the accuracy and completeness of the information reported in the cost report at the time of submission. We indicated that we expect reporting on Worksheet S-10 to improve over time, particularly in the area of charity care which is already being used and audited for payment determinations related to the EHR Incentive Program, and that we will continue to monitor these data. Accordingly, we stated that we may proceed with a proposal to use data on the Worksheet S-10 to determine uncompensated care costs in the future, once hospitals are submitting accurate and consistent data through this reporting mechanism.

As a result of our concerns regarding the data reported on Worksheet S-10 of the Medicare cost report, we believed it was appropriate to consider the use of alternative data, at least in FY 2014, the first year that this provision is in effect, and possibly for additional years until hospitals have adequate experience reporting all of the data elements on Worksheet S-10. We noted that this approach is consistent with input we received from some stakeholders in response to the CMS National Provider Call in January 2013, who stated their belief that existing FY 2010 and FY 2011 data from the Worksheet S-10 should not be used for implementation of section 1886(r) of the Act and who requested the opportunity to resubmit the data once more specific instructions were issued by CMS. Accordingly, we examined alternative data sources that could be used to allow time for hospitals to gain experience with and to improve the accuracy of their reporting on Worksheet S-10 of the Medicare cost report. We stated in the FY 2014 IPPS/LTCH PPS final rule that we believe that data on utilization for insured low-income patients can be a reasonable proxy for the treatment costs of uninsured patients. Moreover, due to the concerns regarding the accuracy and consistency of the data reported on the Worksheet S-10, we also determined that these alternative data, which are currently reported on the Medicare cost report, would be a better proxy for the amount of uncompensated care provided by hospitals. Accordingly, in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50639), we adopted the policy of employing the utilization of insured low-income patients defined as inpatient days of Medicaid patients plus inpatient days of Medicare SSI patients as defined in 42 CFR 412.106(b)(4) and 412.106(b)(2)(i), respectively, to determine Factor 3. We also indicated that we remained convinced that the Worksheet S-10 could ultimately serve as an appropriate source of more direct data regarding uncompensated care costs for purposes of determining Factor 3 once hospitals are submitting more accurate and consistent data through this reporting mechanism. In the interim, we indicated that we would take steps such as revising and clarifying cost report instructions, as appropriate. We stated that it is our intention to propose introducing the use of the Worksheet S-10 data for purposes of determining Factor 3 within a reasonable amount of time.

Since the publication of the FY 2014 IPPS/LTCH PPS final rule, we have continued to evaluate and assess the comments we have received from stakeholders about Worksheet S-10 as well as to evaluate what changes might need to be made to the instructions to make the data hospitals submit more accurate and consistent across hospitals. Although we have not yet developed revisions to the Worksheet S-10 instructions, we remain committed to making improvements to Worksheet S-10. For that reason, we believe it would be premature to propose the use of Worksheet S-10 data for purposes of determining Factor 3 for FY 2015. Therefore, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28101), we proposed to continue to employ the utilization of insured low-income patients, defined as inpatient days of Medicaid patients plus inpatient days of Medicare SSI patients, as defined in § 412.106(b)(4) and § 412.106(b)(2)(i), respectively, to determine Factor 3 for FY 2015. Accordingly, we proposed to revise the regulations at 42 CFR 412.106(g)(1)(iii)(C) to state that, for FY 2015, CMS will base its estimates of the amount of hospital uncompensated care on the most recent available data on utilization for Medicaid and Medicare SSI patients, as determined by CMS in accordance with paragraphs (b)(2)(i) and (b)(4) of that section of the regulations. We invited public comments on this proposal and indicated that we will continue to work with the hospital community and others to develop the appropriate clarifications and revisions to Worksheet S-10 of the Medicare cost report for reporting uncompensated care data. In particular, we invited public comments on what would be a reasonable timeline for adopting Worksheet S-10 of the Medicare cost report as the data source for determining Factor 3.
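As a concrete illustration of the proxy described above, the sketch below computes Factor 3 for a handful of hypothetical hospitals as each hospital's share of combined Medicaid and Medicare SSI days and applies that share to the FY 2015 uncompensated care amount. The CCNs and day counts are invented for the example; only the structure of the calculation comes from this rule.

```python
# Hypothetical illustration of the FY 2015 Factor 3 proxy: each hospital's
# share of total insured low-income days (Medicaid days + Medicare SSI days)
# among hospitals projected to receive DSH payments. All hospital data below
# are invented for the example.
uncompensated_care_pool = 7_647_644_885.18  # FY 2015 amount (Factor 1 x Factor 2)

hospitals = {
    # CCN: (Medicaid days, Medicare SSI days) -- hypothetical values
    "010001": (12_000, 3_500),
    "020002": (4_800, 1_200),
    "030003": (25_300, 6_100),
}

total_low_income_days = sum(m + s for m, s in hospitals.values())

for ccn, (medicaid_days, ssi_days) in hospitals.items():
    factor3 = (medicaid_days + ssi_days) / total_low_income_days
    payment = uncompensated_care_pool * factor3
    print(f"CCN {ccn}: Factor 3 = {factor3:.6f}, uncompensated care payment = ${payment:,.2f}")
```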

As we did for the FY 2014 IPPS/LTCH PPS proposed rule, for the FY 2015 IPPS/LTCH PPS proposed rule, we published on the CMS Web site a table listing Factor 3 for all hospitals that we estimate would receive empirically justified Medicare DSH payments in a fiscal year (that is, hospitals that we project would receive interim uncompensated care payments during the fiscal year), and for the remaining subsection (d) and subsection (d) Puerto Rico hospitals that have the potential of receiving an uncompensated care payment in the event that they receive an empirically justified Medicare DSH payment for the fiscal year as determined at cost report settlement. Hospitals had 60 days from the date of public display of the FY 2015 IPPS/LTCH PPS proposed rule to review these tables and notify CMS in writing of a change in a hospital's subsection (d) hospital status, such as if a hospital has closed or converted to a CAH.

Comment: Most commenters agreed that the Worksheet S-10 data are not yet sufficiently consistent and reliable to be employed for purposes of determining each hospital's share of uncompensated care payments. The commenters therefore supported the proposal to continue employing SSI days and Medicaid days for this purpose in FY 2015. Some of these commenters did express support for eventually employing the Worksheet S-10 data for this purpose, once cost reporting instructions have been appropriately revised and the hospital community has been adequately instructed to render those data sufficiently consistent and reliable. Some commenters also requested a more specific timetable for adopting the Worksheet S-10 data. However, MedPAC and a few other commenters supported the use of the Worksheet S-10 data for FY 2015. MedPAC expressed disagreement with CMS' statement that the data on utilization for insured low-income patients can serve as a reasonable proxy for the treatment costs of uninsured patients. MedPAC specifically cited its 2007 analysis of data from the GAO and data from the American Hospital Association (AHA), which suggests that Medicaid days and low-income Medicare days are not a good proxy for uncompensated care costs. Given its prior findings that the Medicaid and SSI shares were poor predictors of uncompensated care costs, MedPAC argued that there is a need to transition to new measures. MedPAC therefore supported Worksheet S-10 in the Medicare cost report as an appropriate measure of uncompensated care that could begin to replace the reliance on Medicaid and SSI shares. Specifically, it recommended employing charity care for the uninsured, which is reported on the Worksheet S-10 (line 23, column 1) as a reasonable proxy for the costs of treating the uninsured. In response to concerns about whether the quality of the data reported on Worksheet S-10 is adequate for use in distributing uncompensated care payments, MedPAC argued that it is already better than using Medicaid and SSI days as a proxy for uncompensated care costs, and that the data on Worksheet S-10 will improve over time as they are actually used in making payments. MedPAC also expressed its view that the Worksheet S-10 data currently available should only establish an interim allocation of uncompensated care payments; the final allocation of payments to each hospital should be determined based on the Worksheet S-10 data available at year-end settlement. To prevent financial shocks to hospitals, some commenters suggested that CMS could transition to use of the Worksheet S-10 data over 3 years.

Response: As we stated in the FY 2014 IPPS/LTCH PPS final rule, we believe that data on utilization for insured low-income patients can be a reasonable proxy for the treatment costs of uninsured patients. Moreover, due to the concerns that continue to be expressed by a large majority of commenters regarding the accuracy and consistency of the data reported on the Worksheet S-10, we continue to believe that these alternative data on utilization for insured low-income patients, which are currently reported on the Medicare cost report, remain a better proxy for the amount of uncompensated care provided by hospitals. Accordingly, in this final rule, we are finalizing for FY 2015 the policy that we originally adopted in the FY 2014 IPPS/LTCH PPS final rule, of employing the utilization of insured low-income patients defined as inpatient days of Medicaid patients plus inpatient days of Medicare SSI patients as defined in 42 CFR 412.106(b)(4) and 412.106(b)(2)(i), respectively, to determine Factor 3. However, we also remain convinced that Worksheet S-10 could ultimately serve as an appropriate source of more direct data regarding uncompensated care costs for purposes of determining Factor 3 once hospitals are submitting more accurate and consistent data through this reporting mechanism. In the interim, we will continue to take steps to revise and clarify cost report instructions, as appropriate. We also are undertaking benchmarking analyses to compare available Worksheet S-10 data to other data sources on uncompensated care, such as on uncompensated care costs reported to the IRS on Form 990 by not-for-profit hospitals. Because the data submitted through Form 990 are audited and come from an external source, they represent a suitable standard of comparison. It remains our intention to propose introducing the use of the Worksheet S-10 data for purposes of determining Factor 3 within a reasonable amount of time.

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50639), we considered public comments which recommended that we use the wage index to adjust insured low-income days in determining Factor 3 in order to account for the differences in “purchasing power” in different regions of the country. With respect to these public comments, we agreed that there may be regional variation in uncompensated care costs due to regional variations in the costs of care generally. However, we stated that we did not believe that there was sufficient basis for believing that the wage index reflects the variations in uncompensated care costs well enough to adopt it as the basis for adjusting Factor 3. The wage index reflects the relative hospital wage level in the geographic area of the hospital compared to the national average hospital wage level. In computing the wage index, we derive an average hourly wage for each labor market area (total wage costs divided by total hours for all hospitals in the geographic area) and a national average hourly wage (total wage costs divided by total hours for all hospitals surveyed in the nation). A labor market area's wage index value is the ratio of the area's average hourly wage to the national average hourly wage. We note that, for FY 2014, 69.6 percent of the standardized amount is considered to be the labor-related share and, therefore, adjusted by the wage index. However, in addition to the labor-related share of the standardized amount being adjusted by the wage index, the entire standardized amount is also adjusted for the relative weight of the MS-DRG for each individual patient. In other words, the wage index only adjusts for a portion of the variation in costs, and does not address variations in resource use and patient severity. Therefore, we stated that we did not believe that there was sufficient basis for believing that adjusting low-income patient days by the wage index would better reflect variations in uncompensated care costs.
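In formula form (the AHW shorthand below is ours, introduced only for illustration, not notation from the regulation), the wage index computation described above is simply the following ratio.

```latex
% Shorthand restatement of the wage index computation described above.
\[
  \mathrm{AHW}_a = \frac{\text{total wage costs for all hospitals in labor market area } a}
                        {\text{total hours for all hospitals in area } a},
  \qquad
  \text{Wage index}_a = \frac{\mathrm{AHW}_a}{\mathrm{AHW}_{\text{national}}}
\]
```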

Since the publication of the FY 2014 IPPS/LTCH PPS final rule, we have continued to consider whether to propose employing the wage index to adjust insured low-income days in determining Factor 3. After this consideration, we continue to believe that a wage index adjustment to insured low-income days is not an appropriate measure to account for variations in the costs of uncompensated care among hospitals. The intensity of such care, and therefore the costs, may vary by hospital, but we still lack convincing evidence that the wage index data are an accurate measure of that intensity. Therefore, in the FY 2015 IPPS/LTCH PPS proposed rule, we did not propose to adopt such an adjustment to low-income days for purposes of calculating Factor 3 in FY 2015.

Comment: Several commenters agreed that applying the wage index to Factor 3 is not an appropriate measure of variations in uncompensated care costs. One commenter stated that CMS should apply a wage and case-mix adjustment to the Medicaid and SSI days using the hospital area wage index and hospital-specific case mix index. The commenter believed that this information is readily available, well-understood, and is appropriate for measuring cost variation among hospitals.

Response: We appreciate the comments and continue to believe it is not appropriate to adopt a wage index adjustment to low-income days to calculate Factor 3 for FY 2015. Although wage index information is readily available, for the reasons discussed in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50639), we continue to believe that it is not an accurate measure of the intensity of uncompensated care costs and would not serve as an appropriate basis for making adjustments to Factor 3.

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50639), we also considered public comments that requested that we include insured low-income days from exempt units (specifically, inpatient rehabilitation units paid under the IRF PPS and inpatient psychiatric units paid under the IPF PPS) of the hospital in the computation of Factor 3, in order to better capture the treatment costs of the uninsured by the hospital. In response to those public comments, we stated our belief that there may be some merit to including insured low-income days from exempt units of the hospital in order to better capture the full costs of the treatment of the uninsured by the hospital insofar as those data may be publicly available, subject to audit, and used for payment purposes. We also indicated that we believed it would be prudent to consider the degree to which these data meet these conditions before adopting this recommendation. Therefore, we stated that we would consider including this recommendation among our proposals in future rulemaking.

Since the publication of the FY 2014 IPPS/LTCH PPS final rule, we have conducted an analysis of the impact of adopting this recommendation. That analysis has indicated that the inclusion of Medicaid and Medicare-SSI days for exempt inpatient units does not significantly change the distribution of uncompensated care payments to hospitals, with the exception of a few hospitals with high utilization associated with those exempt units that would see increases in their uncompensated care payments. Furthermore, Medicaid and SSI days for inpatient rehabilitation units have been audited and are used for payment purposes under the IRF PPS; specifically, these data are used to calculate the low-income percentage (LIP) adjustment under the IRF PPS. However, the data for inpatient psychiatric units are not generally audited and have not been used previously for payment purposes. Therefore, in the FY 2015 IPPS/LTCH PPS proposed rule, we did not propose to include those days in the calculation of a hospital's share of uncompensated care payments for FY 2015. As we indicated earlier, we believe it would be appropriate to include such data in the calculation of uncompensated care payments only insofar as those data may be publicly available, subject to audit, and used for payment purposes. The use of data for inpatient psychiatric units would fail the second and third conditions. At the same time, we do not believe that including only inpatient rehabilitation unit days without inpatient psychiatric unit days would improve the accuracy of the uncompensated care payment calculation. We also observe, as we have previously noted, that the statutory references under section 1886(d)(5)(F) of the Act to “days” apply only to hospital acute care inpatient days. Section 412.106(a)(1)(ii) of the regulations therefore provides that, for purposes of DSH payments, “the number of patient days in the hospital includes only those days attributable to units or wards of the hospital providing acute care services generally payable under the prospective payment system and excludes” other days. In the absence of compelling reasons to do otherwise, we believe it is preferable to maintain consistency with this longstanding precedent in the context of this temporary method for determining uncompensated care payments. However, we invited public comments on this issue.

Comment: Several commenters supported the proposal to not include Medicaid and SSI days from excluded units in the calculation. One commenter believed it would be inconsistent to distribute uncompensated care payments based on non-IPPS days and unfair to providers that do not have exempt units. Some commenters supported including Medicaid and SSI days from excluded units in our calculation of Factor 3. One commenter stated that the inclusion of days for psychiatric and rehabilitation units that are exempt from IPPS would improve the accuracy of these data, as IPPS days and exempt unit days combined would function as a proxy for total hospital uncompensated care services.

Response: We thank the commenters for their feedback and continue to believe that we should finalize our proposal to calculate Factor 3 based on a DSH hospital's share of the Medicaid and SSI days associated with its acute care units. We believe that it would be inappropriate to include Medicaid and SSI days from psychiatric units, as those days are not audited for payment purposes, and we do not believe that including only inpatient rehabilitation unit days without inpatient psychiatric unit days would improve the accuracy of the uncompensated care payment calculation.

The statute also allows the Secretary the discretion to determine the time periods from which we will derive the data to estimate the numerator and the denominator of the Factor 3 quotient. Specifically, section 1886(r)(2)(C)(i) of the Act defines the numerator of the quotient as “the amount of uncompensated care for such hospital for a period selected by the Secretary. . . .” (emphasis added). Section 1886(r)(2)(C)(ii) of the Act defines the denominator as “the aggregate amount of uncompensated care for all subsection (d) hospitals that receive a payment under this subsection for such period” (emphasis added). In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50638), we adopted a process of making interim payments with final cost report settlement for both the empirically justified Medicare DSH payments and the uncompensated care payments required by section 3133 of the Affordable Care Act. Consistent with that process, we also determined the time period from which to calculate the numerator and denominator of the Factor 3 quotient in a way that would be consistent with making interim and final payments. Specifically, we must have Factor 3 values available for hospitals that we estimate will qualify for Medicare DSH payments using the most recently available historical data and for those hospitals that we do not estimate will qualify for Medicare DSH payments but that may ultimately qualify for Medicare DSH payments at the time of cost report settlement.

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50638), therefore, we adopted the policy to calculate the numerator and the denominator of Factor 3 for hospitals based on the most recently available full year of Medicare cost report data (including the most recently available data that may be used to update the SSI ratios) with respect to a Federal fiscal year. In other words, we use data from the most recently available full year cost report for the Medicaid days and the most recently available SSI ratios (that is, the latest available SSI ratios before the beginning of the Federal fiscal year) for the Medicare SSI days. We noted that these data are publicly available, subject to audit, and used for payment purposes. While we recognized that older data also meet these criteria, we often use the most recently available data for payment determinations. The data used are located in the HCRIS database for most hospitals, but the data for IHS hospitals are not included in that database. Accordingly, in the FY 2014 IPPS interim final rule with comment period (78 FR 61195), we revised our policy to also include cost report data submitted to CMS by IHS hospitals in order to allow their Medicaid days to be used to calculate Factor 3.

Therefore, for FY 2014, we used data from the most recently available full year cost report for the Medicaid days and the most recently available SSI ratios, which meant data from the 2010/2011 cost reports (that is, cost reports that have cost reporting periods that begin in either FY 2010 or FY 2011) for the Medicaid days taken from the March 2013 update of the HCRIS database, 2011 cost report data submitted to CMS by IHS hospitals by March 2013, and the FY 2011 SSI ratios for the Medicare SSI days to estimate Factor 3 for FY 2014. In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28102), for FY 2015, we again proposed to use data from the most recently available full year cost report for the Medicaid days (that is, we proposed to use the 2012 cost report, unless that cost report is unavailable or reflects less than a full 12-month year; in the event the 2012 cost report is for less than 12 months, we proposed to use the cost report from 2012 or 2011 that is closest to being a full 12-month cost report), cost report data submitted to CMS by IHS hospitals and the most recently available SSI ratios. For purposes of the proposed rule, we used data from the 2011/2012 Medicare cost reports (that is, from cost reports that have cost reporting periods that begin in either FY 2011 or FY 2012) taken from the December 2013 update of the HCRIS database for the Medicaid days and the FY 2011 SSI ratios for the Medicare SSI days. Consistent with our FY 2014 IPPS interim final rule with comment period (78 FR 61195), for FY 2015, we also used supplemental cost report data provided by IHS hospitals to CMS as of December 2013 in order to calculate the proposed Factor 3. We indicated that, for the FY 2015 IPPS/LTCH PPS final rule, we intended to use the March 2014 update of the HCRIS database for the 2011/2012 Medicare cost reports, cost report data submitted to CMS by IHS hospitals as of March 2014, and the most recently available SSI ratios (FY 2012 SSI ratios and, if not available, the FY 2011 SSI ratios) to calculate Factor 3. We stated that we believed the March update to the Medicare cost reports would be the most recently available data to calculate Factor 3 at the time of publication of the FY 2015 IPPS final rule. We also indicated that this proposal is consistent with CMS' historical policy to use the best available data when setting the payment rates and factors in both the proposed and final rules. Furthermore, we noted that this approach is consistent with our approach in other areas of IPPS, where we historically use the March update of cost report data and MedPAR claims data to calculate IPPS relative weights, budget neutrality factors, the outlier threshold, and the standardized amount for the IPPS final rule. If we were to wait for a later update of the cost report data to become available, this would cause delay of the publication of the IPPS final rule.

Comment: Several commenters questioned the data used to calculate the hospitals' Factor 3. Several commenters stated that their Medicaid days were understated. Furthermore, commenters stated that they submitted their updated cost report to be included in the March 2014 update of the Medicare cost report data but the contractor had not yet uploaded the information in the HCRIS database. In addition, some commenters indicated that they had updated Medicaid days and had submitted their cost report to their contractors after the March 2014 update of the Medicare hospital cost report data and wanted their updated data included. Some commenters requested use of the June update of cost report data to obtain Medicaid days to calculate Factor 3. Some commenters sought clarification of why some hospitals have their Medicaid days based on Worksheet S-2 and some hospitals have their Medicaid days based on Worksheet S-3. Some commenters stated that their Medicaid days were based on a 6-month cost report and they should be based on a 12-month cost report either by combining cost reports or annualizing the data. Some commenters questioned their DSH eligibility, stating that their hospitals had been listed as not being eligible for DSH for FY 2015, when they had previously received DSH on their cost report. Other commenters submitted corrections because their hospitals had been identified as SCHs, but were actually operating as MDHs. Finally, several commenters requested additional time after the publication of the final rule to review the data used to calculate Factor 3 and submit corrections.

Response: We are finalizing our proposal to use the most recently available full year cost report for the Medicaid days (that is, our proposal to use the 2012 cost report, unless that cost report is unavailable or reflects less than a full 12-month year; in the event the 2012 cost report is for less than 12 months, we will use the cost report from 2012 or 2011 that is closest to being a full 12-month cost report) and the most recently available SSI ratios. For this FY 2015 final rule, we are using the March 2014 update of the hospital cost report data in the HCRIS database and cost report data submitted to CMS by IHS hospitals as of March 2014 to obtain the Medicaid days to calculate Factor 3. In addition, we are using the FY 2012 SSI ratios published on the CMS Web site to calculate Factor 3 (http://www.cms.gov/Medicare/Medicare-Fee-for-Service-Payment/AcuteInpatientPPS/dsh.html).

We note that we are unable to use a later update of the cost report data, such as the June update, and still calculate the final Factor 3 in time for publication of the IPPS final rule. Any delay in the publication of the final rule would prevent changes and updates to payments under the IPPS from taking effect on October 1, the first day of the fiscal year. We are not able to accept supplemental data for hospitals, as we are not able to validate the information included in that supplemental data. We note that hospitals have ample time after the close of their fiscal year to submit the data that are used in this calculation. Specifically, Chapter I, section 104 of the Provider Reimbursement Manual, Part 2, generally allows a hospital 5 months after the close of its cost reporting period to file its cost report. In addition, CMS allows hospitals to request amendments of their cost report submissions before CMS issues a Notice of Program Reimbursement. In response to the commenters that indicated they had submitted their updated cost reports, but that the MAC had not yet uploaded the information, we note that MACs follow guidelines to upload revised cost report information. In accordance with the Medicare Financial Management Manual, Chapter 8, Section 10.4 (Submission of Cost Report Data to CMS), the MACs are required to submit an extract of the Medicare cost reports to CMS in accordance with the HCRIS specifications within 210 days of the cost reporting period ending date or 60 days after receipt of the cost report, whichever is later.
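As a worked illustration of that timing requirement, the HCRIS submission deadline is simply the later of the two dates. The dates in the sketch below are hypothetical and are used only to show the arithmetic.

```python
# Illustrative calculation of the HCRIS submission deadline described above:
# the later of (a) 210 days after the cost reporting period ending date and
# (b) 60 days after receipt of the cost report. Dates are hypothetical.
from datetime import date, timedelta

cost_report_period_end = date(2012, 12, 31)   # hypothetical cost reporting period ending date
cost_report_received = date(2013, 5, 31)      # hypothetical date the MAC received the cost report

deadline = max(
    cost_report_period_end + timedelta(days=210),
    cost_report_received + timedelta(days=60),
)
print(deadline)  # 2013-07-30, the later of the two dates
```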

With respect to the comments requesting clarification on whether Worksheet S-2 or Worksheet S-3 is used to obtain Medicaid days, we addressed this concern in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50642) and reiterate that we use the Medicaid days reported on Worksheet S-2 of the Medicare Hospital Cost Report version 2552-10 for hospitals projected to receive Medicare DSH because the Medicaid days reported on Worksheet S-2 are used in the computation of the Medicaid fraction for Medicare DSH payments. Therefore, because they are used for payment of Medicare DSH, we believe that these data are more reliable than data not used for payment purposes. Hospitals that were not eligible to receive Medicare DSH payments on that cost report were unable to report Medicaid days on Worksheet S-2, but could report their Medicaid days on Worksheet S-3. Therefore, for hospitals that we project to not be eligible for Medicare DSH payments, we are using the Medicaid days reported on Worksheet S-3 to calculate their Factor 3. A transmittal has been issued to allow hospitals that are not receiving DSH payments to report their Medicaid days on Worksheet S-2, and, if we continue to use these data on low-income insured days, we hope in the future to rely solely on the data reported on Worksheet S-2.

With regard to the comments from hospitals that found that their Factor 3 was calculated using a cost report that was less than 12 months, we are finalizing our proposal to use the 2012 cost report, unless that cost report is unavailable or reflects less than a full 12-month year. In the event the 2012 cost report is for less than 12 months, we would use the cost report from 2012 or 2011 that is closest to being a full 12-month cost report. In the case where a less than 12-month cost report was used to calculate a hospital's Factor 3, this would indicate that both the 2012 and 2011 cost reports were less than 12 months. In such a case, we would use the longer of the two cost reports to calculate a hospital's Factor 3. We did not make a proposal to annualize or combine cost reports to calculate Factor 3. We note that section 1886(r)(2)(C) of the Act specifies that Factor 3 is equal to the percent that represents “the amount of uncompensated care for such hospital for a period selected by the Secretary (as estimated by the Secretary, based on appropriate data . . .” divided by “the aggregate amount of uncompensated care for all subsection (d) hospitals that receive a payment under this subsection for such period (as so estimated . . .” In implementing this provision, as we did through rulemaking in FY 2014, we believe it is appropriate to first select the period (in this case, the period for which we have the most recently available data) and then to select the data from a cost report that aligns best with that period. However, we acknowledge that the situations presented by commenters, where a hospital remains in operation in both Federal fiscal years for which we analyze cost report data but submits cost reports for both Federal fiscal years that reflect substantially less than a full year of data, pose unique challenges in the context of estimating Factor 3. As a result, this is an issue that we intend to consider further and may address in future rulemaking.
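A minimal sketch of that selection logic follows, under the assumption that the only inputs needed are the number of months covered by each of the 2012 and 2011 cost reports; the helper name and its inputs are illustrative and do not correspond to any actual CMS system.

```python
# Illustrative encoding of the cost report selection rule described above:
# use the 2012 cost report unless it is unavailable or covers less than a
# full 12 months; otherwise use whichever of the 2012 or 2011 cost reports
# is closest to a full 12-month period (the longer of two short reports).
def select_cost_report(months_2012, months_2011):
    """Return the cost report year to use for Medicaid days (hypothetical helper).

    Arguments are the number of months each report covers, or None if the
    report is unavailable.
    """
    if months_2012 is not None and months_2012 >= 12:
        return 2012                      # full-year 2012 report is available
    if months_2012 is None:
        return 2011                      # fall back to the 2011 report
    if months_2011 is None:
        return 2012
    # Both reports exist but 2012 is short: use whichever is closest to a
    # full 12-month period, preferring 2012 on a tie.
    return 2012 if abs(12 - months_2012) <= abs(12 - months_2011) else 2011

print(select_cost_report(12, 12))   # 2012
print(select_cost_report(6, 12))    # 2011
print(select_cost_report(6, 4))     # 2012 (longer of two short reports)
```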

With regard to the comments from hospitals stating that their DSH eligibility is inaccurate, we note that we used the FY 2012 SSI ratios and the Medicaid fraction listed in the March 2014 update of the Provider Specific File in order to identify which hospitals are projected to receive DSH for FY 2015, and thus are eligible to receive uncompensated care payments and interim uncompensated care payments for FY 2015. We did not use historical cost report data to make this determination. We believe that the FY 2012 SSI ratios and the Medicaid fraction in the March 2014 update of the Provider Specific File are the most recently available information regarding whether a hospital is currently being paid Medicare DSH on an interim basis, and therefore, we believe they are an appropriate data source to make our determination of which hospitals are projected to receive DSH for FY 2015, and thus are eligible to receive uncompensated care payments, as presented in Table 18. As we have stated previously, final determinations of DSH eligibility and uncompensated care payments are made at cost report settlement.

In making our DSH projections for FY 2015, we also identify which hospitals are SCHs that we estimate will be paid the hospital-specific rate and not the Federal rate and, therefore, will not receive a Medicare DSH payment and will be ineligible to receive the uncompensated care payment. In the FY 2015 IPPS/LTCH PPS proposed rule, we inadvertently identified several MDHs as SCHs in our projections and have updated our list of SCHs for the final rule accordingly.

Finally, we accept the recommendation of many commenters to provide the public with an additional 30 days subsequent to the publication of the final rule in order to review and submit comments limited to whether any hospitals should be added to the list of hospitals eligible to receive interim empirically justified DSH payments and uncompensated care payments or whether any hospitals should be removed from the list based on changes in their subsection (d) status, as we did in the FY 2014 IPPS/LTCH PPS final rule. Commenters can submit their comments to our inbox at Section3133DSH@cms.hhs.gov. After receiving and reviewing comments, if we make any changes to the list, we will post on the Web site a revised table showing the final Factor 3 for each hospital prior to October 1, 2014. This timetable will give MACs sufficient time to enter the final data into the provider specific file and make timely payments for discharges occurring on or after October 1, 2014.

Comment: Several commenters asked whether the Medicaid days used to calculate Factor 3 can be reconciled based on audit by the Medicare contractor and whether any recouped uncompensated care payments would be redistributed to the providers receiving an uncompensated care payment at cost report settlement.

Response: As we discussed in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50645), at this time, we do not intend to reconcile Factor 3 because we believe the statute provides the authority to make uncompensated care payments on the basis of estimates of Factors 1, 2, and 3 and that it is preferable to do so in order to avoid unacceptable delays in the final determination of uncompensated care payments.

Comment: One commenter objected to the proposal to calculate Factor 3 based on a hospital's share of total Medicaid days and Medicare SSI days as a proxy for measuring a hospital's share of uncompensated care. The commenter believed that this proxy does not appropriately target hospitals with the highest burden of uncompensated care costs and instead rewards hospitals in states where Medicaid has expanded.

Response: For the reasons discussed in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50639), we continue to believe that our methodology to calculate Factor 3 based on a hospital's share of Medicaid days and SSI days does not inappropriately reward States that expand Medicaid coverage. Furthermore, as discussed above and in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50634 through 50639), we believe that using the low-income insured days as a proxy for uncompensated care costs provides a reasonable basis to determine Factor 3 on a temporary basis as we work to improve Worksheet S-10 to accurately and consistently capture uncompensated care costs.

Comment: Several commenters requested that hospitals have the opportunity to request to have the SSI days recalculated on the basis of their cost reporting period, not Federal fiscal year, as part of their Factor 3 calculation.

Response: We do not believe that this would improve our estimates for Factor 3. For the DSH calculation, CMS generally issues SSI ratios based on a Federal fiscal year to be used to determine a hospital's Medicare DSH payments at cost report settlement. For the purpose of the Medicare DSH payment, a provider may request a realignment under § 412.106(b)(3) such that its SSI ratio is recalculated based on the hospital's specific cost-reporting period. The choice to request a realignment and the timing of this choice may vary. Therefore, a hospital's decision whether to have its SSI ratio calculated on the basis of its cost reporting period may not have been made at the time we determine Factor 3 for a specific Federal fiscal year. Furthermore, we do not believe that allowing hospitals the option of having their SSI days calculated on the basis of their cost reporting period would improve our estimates of Factor 3. Therefore, to preserve consistency and administrative efficiency, we continue to believe it is appropriate to use SSI ratios based on the Federal fiscal year.

Comment: Several commenters asked how the decision in Allina v. Sebelius would affect the calculation of Factor 3. Commenters stated that the SSI days should exclude MA days, and MA dual-eligible days should be included as Medicaid days in the calculation of Factor 3 for FY 2015 and that CMS should reconcile the FY 2014 Factor 3 estimates based on the decision in Allina v. Sebelius.

Response: Similar to what we stated earlier in this final rule, we do not believe the Allina decision has any bearing on our estimate of Factor 3 for either FY 2014 or FY 2015. The decision in Allina did not address our decision to readopt the policy of counting Medicare Advantage days in the SSI ratio for FY 2014 and subsequent fiscal years. Nor did the decision address the issue of how patient days should be counted for purposes of estimating uncompensated care. Moreover, section 1886(r)(2)(C) of the Act provides discretion for the Secretary to determine how to estimate uncompensated care costs, and for FY 2015, we are finalizing our proposal to continue to apply the methodology adopted in the FY 2014 IPPS/LTCH PPS final rule to define uncompensated care based on the proxy of utilization by low-income insured patients. Specifically, Factor 3 will be based on a hospital's share of total Medicaid days and SSI days. Consistent with the policy that we finalized in the FY 2014 IPPS/LTCH PPS final rule regarding the counting of SSI days, we believe that, for purposes of determining uncompensated care payments, SSI days should include both MA and FFS SSI days.

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50642), we discussed several specific issues concerning the use of cost report data to determine Factor 3. One issue concerned the process and data to be employed in determining Factor 3 in the case of hospital mergers. Specifically, two hospitals that merged in 2011 with one surviving provider number requested that we account for the merger by including data from both hospitals' cost reports immediately prior to the merger in the calculation of the Factor 3 amount. In that final rule, we had calculated Factor 3 using only the surviving hospital's cost report data and SSI ratio data. In the final rule (78 FR 50602), we responded to the public comment that Factor 3 would be calculated based on the low-income insured patient days (that is, Medicaid days and SSI days) under the surviving CCN, based on the most recent available data for that CCN (for FY 2014, from the cost report for 2011 or 2010). We noted that this was consistent with the treatment of other IPPS payment factors, where data used to calculate a hospital's Medicare DSH payment adjustment, CCRs for outlier payments, and wage index values are tied to a hospital's CCN. Data associated with a CCN that is no longer in use are not used to determine those IPPS hospital payments under the surviving CCN.

Since the publication of the FY 2014 IPPS/LTCH PPS final rule, we have received additional input from hospitals that have undergone mergers suggesting that using only the surviving CCN produces an estimate of the surviving hospital's uncompensated care burden that is lower than warranted. For FY 2015, for example, Factor 3 of the uncompensated care payment calculation would be determined using 2011/2012 cost reports. As a result, for any mergers occurring between FY 2011 and FY 2015, Factor 3 of the uncompensated care payment for FY 2015 would reflect only the data of the hospital with the surviving CCN, not the combination of the data from the two hospitals that merged. We believe that revising our methodology to incorporate data from both of the hospitals that merged could improve our estimate of the uncompensated care burden of the merged hospital. Accordingly, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28103 through 28104), we proposed to revise our methodology for determining Factor 3 to incorporate data from both merged hospitals until data for the merged hospitals become available under the surviving CCN.

In addition, because the data systems used to calculate Factor 3 do not identify hospitals that have merged, we also proposed to establish a process to identify hospitals that have merged after the period of the historical data that are being used to calculate Factor 3, up to a point in time during ratesetting for that Federal fiscal year. Under this approach, we would combine the data for the merged hospitals to calculate Factor 3 of the uncompensated care payment. Specifically, we proposed that we would identify the hospitals that merged after the period from which data are being used to calculate Factor 3 (for FY 2015, 2012 and 2011) but before the publication of each year's final rule. For purposes of the proposal, we defined a merger to be an acquisition where the Medicare provider agreement of one hospital is subsumed into the provider agreement of the surviving provider. We would not consider an acquisition where the new owner voluntarily terminates the Medicare provider agreement of the hospital it purchased by rejecting assignment of the previous owner's provider agreement to be a merger. We believe it is appropriate to combine data to calculate Factor 3 for a merged hospital where the Medicare provider agreement of one hospital is subsumed into the provider agreement of the surviving provider because, in this type of acquisition as described in the September 6, 2013 Survey & Certification Memorandum S&C: 13-60-ALL (http://www.cms.gov/Medicare/Provider-Enrollment-and-Certification/SurveyCertificationGenInfo/Downloads/Survey-and-Cert-Letter-13-60.pdf), the buyer is subject to all applicable statutes and regulations and to the terms and conditions under which the assigned agreement was originally issued. These include, but are not limited to, Medicare requirements to adjust payments to account for prior overpayments and underpayments, even if they relate to a pre-acquisition period (successor liability), and to adjust payments to collect civil monetary penalties. Therefore, we believe it is appropriate to also retain the data of the subsumed hospital to calculate the uncompensated care payment for the merged hospital. Conversely, by rejecting assignment of the Medicare provider agreement of the subsumed hospital, the surviving provider has voluntarily terminated the Medicare provider agreement and is precluded from having successor liability for Medicare overpayments or underpayments that would have otherwise been made to the subsumed provider. Furthermore, when the surviving hospital rejects automatic assignment of the existing provider agreement, but wishes to participate in the Medicare program, the merged hospital is considered an initial applicant to the Medicare program. In an instance in which the surviving provider has rejected assignment of the Medicare provider agreement of the subsumed provider, it would not seem appropriate to use data from the subsumed provider for purposes of Medicare payment, including for the calculation of a hospital's uncompensated care payment.

For FY 2015, we proposed to identify mergers by querying the Medicare contractors. We believe it is appropriate to obtain merger information from the Medicare contractors, as a copy of each final sales agreement/transaction indicating the effective date of the acquisition is generally submitted to the Medicare contractors once an acquisition is finalized. For the purpose of the proposed rule, we requested that the Medicare contractors provide us with a list of mergers that occurred from October 1, 2010 (the first day of FY 2011, which is the earliest date that would be included in any 2011 cost report data that are used to calculate a hospital's Factor 3) through January 2014 (when we started preparing for the FY 2015 IPPS/LTCH PPS proposed rule). On the basis of this information, we would then combine the data elements of any hospitals that had merged to calculate the uncompensated care payment for the merged hospital. Specifically, we proposed to combine the Medicaid days from the most recently available full year cost reports and the SSI days from the most recently available SSI ratios tied to the two CCNs prior to the merger to calculate the merged hospital's Factor 3. For FY 2015, we proposed to combine the Medicaid days from either the 2011 or 2012 cost reports and would use the most recently available SSI ratios available at the time the final rule is developed.

In order to confirm these mergers and the accuracy of the data used to determine each merged hospital's uncompensated care payment, we proposed to publish a table on the CMS Web site, in conjunction with the issuance of the proposed and final rules for a fiscal year, containing a list of the mergers that we are aware of and the computed uncompensated care payment for each merged hospital. A copy of this table was published on the CMS Web site in conjunction with the issuance of the FY 2015 proposed rule. The affected hospitals had the opportunity to comment during the public comment period on the accuracy of this information.

We proposed to treat hospitals that merge after the development of the final rule similar to new hospitals. For these newly merged hospitals, we would not have data currently available to calculate a Factor 3 amount that accounts for the merged hospital's uncompensated care burden. In addition, we would not have data to determine if the newly merged hospital is eligible for Medicare DSH payment and, therefore, eligible for uncompensated care payments for the applicable fiscal year because the only data we would have to make this determination are those for the surviving CCN. Accordingly, we proposed to treat newly merged hospitals in a similar manner as new hospitals, such that the newly merged hospital's final uncompensated care payment would be determined at cost report settlement where the numerator of the newly merged hospital's Factor 3 would be based on the Medicaid days and SSI days reported on the cost report used for the applicable fiscal year. We proposed that the interim uncompensated care payments for the newly merged hospitals would be based on only the data of the surviving hospital's CCN at the time of the preparation of the final rule for the applicable fiscal year. In other words, for newly merged hospitals, eligibility to receive interim uncompensated care payments and the amount of any interim uncompensated care payments would be based on the Medicaid days from either the 2011 or 2012 cost reports and the most recently available SSI ratios available at the time the final rule is developed for only the surviving CCN. However, at cost report settlement, we would determine the newly merged hospital's final uncompensated care payments based on the Medicaid days and SSI days reported on the cost report used for the applicable fiscal year. That is, we would revise the numerator of Factor 3 for the newly merged hospital to reflect the Medicaid and SSI days reported on the cost report for the applicable fiscal year. We invited public comment on our proposed change to the treatment of hospital mergers in the calculation of a hospital's uncompensated care payment.

Comment: Commenters uniformly supported the proposal to establish a process to identify the hospitals that have merged so CMS can calculate the merged hospital's share of the total uncompensated care amount available using the low-income patient days from all hospitals that existed prior to the merger. Several commenters identified additional hospitals that had undergone a merger that were not included on the list of mergers identified in the FY 2015 IPPS/LTCH PPS proposed rule. A number of commenters requested that the public have additional time after the publication of the final rule to review and submit corrections to CMS' list of identified mergers. One commenter asked CMS to clarify that, under the proposal, CMS would recalculate the hospital's uncompensated care payments by combining the Medicaid days and SSI days published with the final rule from the applicable “data year” for the surviving CCN, as well as for any acquired CCNs that were retired through the merger process.

Response: We appreciate the commenters' support and are finalizing our proposal as proposed. We have updated our list of mergers based on information submitted by the Medicare contractor as of June 2014. In addition, we have reviewed the commenters' submissions of mergers not previously identified in the proposed rule and have updated our list accordingly. In response to the request from one commenter, for the hospitals that we have listed as undergoing a merger, we are confirming that we would recalculate the hospital's uncompensated care payments by combining the Medicaid days and SSI days published with the final rule from the applicable “data year” for the surviving CCN, as well as for any acquired CCNs that were retired through the merger process. For example, to calculate the FY 2015 Factor 3 using the FY 2012 SSI ratio and the full year cost report from 2012 or 2011, we would combine the FY 2012 SSI days and Medicaid days from the 2012 or 2011 cost report from the surviving and retiring providers. We would not update the merged hospital's Factor 3 after that. For a newly merged hospital, defined for the purpose of this policy as a hospital that we do not identify as undergoing a merger until after the public comment period and additional review period after the final rule or that undergoes a merger during the fiscal year, the final Factor 3 would be recalculated based on the Medicaid days and SSI days reported on the cost report used for the applicable fiscal year since the Factor 3 that we are publishing in this final rule would not reflect the merger. For example, for a newly merged hospital that merged in FY 2015, the numerator of its Factor 3 would be recalculated based on the FY 2015 SSI days and the Medicaid days reported on its 2015 cost report. Finally, in response to the comments seeking additional review of CMS' list of identified mergers, we are providing an additional 30 days after the publication of this final rule for hospitals to review and submit comments on the accuracy of the list of mergers that we have identified in this final rule. Comments can be submitted to our inbox at Section3133DSH@cms.hhs.gov and any changes to Factor 3 will be posted on the CMS Web site prior to October 1, 2014.
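For example, combining the pre-merger data before computing the merged hospital's Factor 3 numerator might look like the following sketch. The CCNs and day counts are invented for illustration; only the idea of adding the retired CCN's days to the surviving CCN's days comes from the policy described above.

```python
# Hypothetical illustration of combining low-income days for a merger: the
# Medicaid and SSI days tied to the CCN retired through the merger are added
# to those of the surviving CCN before Factor 3 is computed. All values are
# invented for the example.
days_by_ccn = {
    # CCN: {"medicaid_days": ..., "ssi_days": ...}
    "140001": {"medicaid_days": 9_000, "ssi_days": 2_100},   # surviving hospital
    "140099": {"medicaid_days": 3_400, "ssi_days": 800},     # CCN retired through the merger
}

surviving_ccn, retired_ccn = "140001", "140099"

combined = {
    key: days_by_ccn[surviving_ccn][key] + days_by_ccn[retired_ccn][key]
    for key in ("medicaid_days", "ssi_days")
}

# The merged hospital's Factor 3 numerator is the combined total of Medicaid
# and SSI days from both pre-merger CCNs.
numerator = combined["medicaid_days"] + combined["ssi_days"]
print(f"Merged hospital Factor 3 numerator: {numerator} low-income days")
```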

Comment: One commenter expressed concern about the process of requesting that MACs provide information to CMS on mergers/acquisitions. The commenter noted that the MACs may not have the most up-to-date information on mergers to provide to CMS because documentation of the merger, such as tie-in or tie-out notices, can be delayed in getting to a contractor.

Response: We appreciate the commenter's concerns. We believe that we have implemented several safeguards in the event that a merger is not identified by the MACs, including allowing an opportunity for comment on the accuracy of the mergers that we have identified during the comment period for the proposed rule and after the publication of the final rule. In addition, as described earlier, for any newly merged hospital or for a hospital that we do not identify as having undergone a merger, we will recalculate the merged hospital's Factor 3 at the end of the applicable fiscal year based on the Medicaid days and SSI days reported on the cost report used for the applicable fiscal year, since the Factor 3 published in the final rule would not reflect the merger.

Comment: MedPAC and one other commenter expressed concern about our policy of distributing the uncompensated care payments as a per-discharge add-on. They believed this policy is problematic because the per-discharge add-on varies widely from hospital to hospital. The variability of the add-on payments in turn distorts the MS-DRG prices and creates problematic incentives for ACOs and MA plans. Therefore, MedPAC and the other commenter believed that it would be better to provide a common interim add-on payment for all DSH hospitals in a county. Any underpayments or overpayments to an individual hospital could be corrected at year-end settlement or on an interim basis during the year (as is already necessary under the current system). One commenter also suggested applying a growth factor based on CBO projections to CMS' historical discharge data to calculate the interim per-discharge uncompensated care payments to mitigate overpayments and stabilize cash flow. Another commenter opposed MedPAC's recommendation and supported CMS' current methodology to calculate interim uncompensated care payments, stating that MedPAC's recommendation could cause cash-flow problems for providers.

Response: We consider these comments to be outside the scope of the proposed rule, as we did not propose any revision to our method of making interim payments for uncompensated care. However, we would like to offer two preliminary observations on this recommendation. The first observation is that we have received very few comments from the hospital industry indicating that the problem cited by these two commenters actually exists. We would expect that, if hospitals were truly disadvantaged by our methodology for making interim uncompensated care payments in the manner cited by these commenters, we would have received many more comments to that effect. The second observation is that adopting the recommendation may pose, for some hospitals, serious problems that may conceivably exceed the problem that the recommendation is designed to solve. For example, reducing the interim uncompensated care payments of high DSH hospitals to a county-wide average payment may cause serious cash flow problems during the period before the interim payments can be adjusted or settled. Similarly, low DSH hospitals may receive significantly higher interim payments than would be warranted by their actual uncompensated care data. As a result, these hospitals would have to take financial management steps to ensure that they are capable of making significant repayments when interim payments are adjusted or settled.

Comment: Commenters suggested that CMS implement a stop-loss and stop-gain policy to limit the amount by which a hospital's DSH payment could change in a single year.

Response: As we previously stated in a response to a similar comment in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50622), we do not believe that the statute provides authority for adopting a stop-loss and stop-gain policy designed to limit changes in DSH payments.

Comment: Commenters expressed concern that the statute does not permit administrative or judicial review of the uncompensated care factors.

Response: Section 1886(r)(3) of the Act provides that there will be no administrative or judicial review under section 1869 of the Act, section 1878 of the Act, or otherwise of any of the following:

  • Any estimate of the Secretary for purposes of determining the factors described in paragraph (2) of section 1886(r) of the Act.
  • Any period selected by the Secretary for such purposes.

The regulation at § 412.106(g)(2), which was finalized in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50643), is consistent with these statutory limitations on review.

G. Medicare-Dependent, Small Rural Hospital (MDH) Program (§ 412.108) and Sole Community Hospitals (SCHs) (§ 412.92)

1. Background for MDH Program

Section 1886(d)(5)(G) of the Act provides special payment protections, under the IPPS, to a Medicare-dependent, small rural hospital (MDH). (For additional information on the MDH program and the payment methodology, we refer readers to the FY 2012 IPPS/LTCH PPS final rule (76 FR 51683 through 51684).) As we discussed in the FY 2011 IPPS/LTCH PPS final rule (75 FR 50287) and in the FY 2012 IPPS/LTCH PPS final rule (76 FR 51683 through 51684), section 3124 of the Affordable Care Act extended the expiration of the MDH program from the end of FY 2011 (that is, for discharges occurring before October 1, 2011) to the end of FY 2012 (that is, for discharges occurring before October 1, 2012). Under prior law, as specified in section 5003(a) of Public Law 109-171 (DRA 2005), the MDH program was to be in effect through the end of FY 2011 only.

Since the extension of the MDH program through FY 2012 provided by section 3124 of the Affordable Care Act, the MDH program has been further extended multiple times. First, section 606 of the ATRA (Pub. L. 112-240) extended the MDH program through FY 2013 (that is, for discharges occurring before October 1, 2013). Second, section 1106 of the Pathway for SGR Reform Act of 2013 (Pub. L. 113-67) extended the MDH program through the first half of FY 2014 (that is, for discharges occurring before April 1, 2014). In the interim final rule with comment period (IFC) that appeared in the Federal Register on March 18, 2014 (the “March 2014 IFC”) (79 FR 15025 through 15027), we discussed the expiration of the MDH program on March 31, 2014. (In section IV.P. of the preamble of this final rule, we are responding to any public comments we received on the March 2014 IFC and are stating our finalized policy for the extension of the MDH program for the first half of FY 2014, through March 31, 2014, under the Pathway for SGR Reform Act of 2013.) In the March 2014 IFC, we explained how providers may be affected by the 6-month extension of the MDH program under the Pathway for SGR Reform Act of 2013 and described the steps to reapply for MDH status for FY 2014, as applicable. Generally, a provider that was classified as an MDH as of September 30, 2013, was reinstated as an MDH effective October 1, 2013, with no need to reapply for MDH classification. However, if the MDH had classified as an SCH or cancelled its rural classification under § 412.103(g) effective on or after October 1, 2013, the effective date of MDH status may not be retroactive to October 1, 2013. In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50647 through 50649) and the March 2014 IFC (79 FR 15025 through 15027), we made conforming changes to the regulations at § 412.108(a)(1) and (c)(2)(iii) to reflect the extensions of the MDH program provided for by the ATRA and Pathway for SGR Reform Act of 2013, respectively. Lastly, under current law, section 106 of the PAMA (Pub. L. 113-93) provides for a 1-year extension of the MDH program effective from April 1, 2014 through March 31, 2015. Specifically, section 106 of the PAMA amended sections 1886(d)(5)(G)(i) and 1886(d)(5)(G)(ii)(II) of the Act by striking “April 1, 2014” and inserting “April 1, 2015”. Section 106 of the PAMA also made conforming amendments to sections 1886(b)(3)(D)(i) and 1886(b)(3)(D)(iv) of the Act.

We addressed the extension of the MDH program for the second half of FY 2014 (that is, from April 1, 2014 through September 30, 2014) under the PAMA in a notice that appeared in the Federal Register on June 17, 2014 (79 FR 34446 through 34449). For additional information on the extensions of the MDH program after FY 2012, we refer readers to the following: the FY 2013 IPPS/LTCH PPS final rule that appeared in the Federal Register on August 31, 2012 (77 FR 53404 through 53405 and 53413 through 53414); the FY 2013 IPPS notice that appeared in the Federal Register on March 7, 2013 (78 FR 14689); the FY 2014 IPPS/LTCH PPS final rule that appeared in the Federal Register on August 19, 2013 (78 FR 50647 through 50649); the FY 2014 interim final rule with comment period that appeared in the Federal Register on March 18, 2014 (the "March 2014 IFC") (79 FR 15025 through 15027); and the FY 2014 notice that appeared in the Federal Register on June 17, 2014 (79 FR 34446 through 34449).

2. PAMA Provisions for FY 2015 for MDHs

Prior to the enactment of the PAMA, under section 1106 of the Pathway for SGR Reform Act of 2013, the MDH program authorized by section 1886(d)(5)(G) of the Act was set to expire midway through FY 2014. Section 106 of the PAMA amended sections 1886(d)(5)(G)(i) and 1886(d)(5)(G)(ii)(II) of the Act to provide for an additional 1-year extension of the MDH program, effective from April 1, 2014 through March 31, 2015. Section 106 of the PAMA also made conforming amendments to sections 1886(b)(3)(D)(i) and 1886(b)(3)(D)(iv) of the Act.

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28104), we proposed to make conforming changes to the regulations at §§ 412.108(a)(1) and (c)(2)(iii) to reflect the statutory extension of the MDH program for the first 6 months of FY 2015 made by section 106 of the PAMA.

We did not receive any public comments on our proposed conforming changes to the existing regulations text at §§ 412.108(a)(1) and (c)(2)(iii) to reflect the statutory extension of the MDH program through the first half of FY 2015 (that is, through March 31, 2015) in accordance with section 106 of the PAMA. Therefore, in this final rule, we are adopting our proposed revisions to the regulations at §§ 412.108(a)(1) and (c)(2)(iii) as final without modification. We note that these regulatory provisions supersede the conforming changes to §§ 412.108(a)(1) and (c)(2)(iii) made in the March 2014 IFC to reflect the extension of the MDH program through March 31, 2014, under the Pathway for SGR Reform Act, as discussed in section IV.P. of the preamble of this final rule.

3. Expiration of the MDH Program

Because section 106 of the PAMA extends the MDH program only through the first half of FY 2015, the MDH program is not authorized by statute beyond March 31, 2015. Accordingly, beginning April 1, 2015, the MDH program will no longer be in effect, and all hospitals that previously qualified for MDH status will no longer have MDH status.

In the FY 2013 IPPS/LTCH PPS final rule (77 FR 53404 through 53405), we revised our SCH policies to allow MDHs to apply for SCH status and be paid as such under certain conditions, following expiration of the MDH program at the end of FY 2012. We codified these changes in the regulations at § 412.92(b)(2)(i) and § 412.92(b)(2)(v). For additional information, we refer readers to the FY 2013 IPPS/LTCH PPS final rule (77 FR 53404 through 53405 and 53674). We note that those same conditions apply to MDHs that intend to apply for SCH status with the expiration of the MDH program on March 31, 2015. Specifically, the existing regulations at § 412.92(b)(2)(i) and (b)(2)(v) allow for an effective date of approval of SCH status that is the day following the expiration date of the MDH program. In accordance with these regulations, in order for an MDH to receive SCH status effective April 1, 2015, it must apply for SCH status at least 30 days before the end of the MDH program; that is, the MDH must apply for SCH status by March 1, 2015. The MDH also must request that, if approved as an SCH, the SCH status be effective with the expiration of the MDH program provision; that is, the MDH must request that the SCH status, if approved, be effective April 1, 2015, immediately after its MDH status expires with the expiration of the MDH program on March 31, 2015. We note that an MDH that applies for SCH status in anticipation of the expiration of the MDH program would not qualify for the April 1, 2015 effective date upon approval if it does not apply by the March 1, 2015 deadline. The provider would instead be subject to the usual effective date for SCH classification, that is, 30 days after the date of CMS' written notification of approval as specified at § 412.92(b)(2)(i).

Although we made no proposals related to the expiration of the MDH program, we received the following comments.

Comment: A few commenters expressed concern about the financial impact of the expiration of the MDH program. Some of these commenters urged CMS to continue the MDH program permanently, while other commenters urged CMS to support legislative efforts to extend these provisions beyond the current March 31, 2015 statutory expiration date. Some commenters urged CMS to work with Congress to extend the MDH provision because these hospitals are vitally needed in serving elderly persons with health care needs. Other commenters stated that the MDH program provides needed funding for hospitals with Medicare as their predominant payor. The commenters stated that many of these hospitals provide primarily outpatient services, and the low Medicare OPPS rates, which pay less than cost, threaten the financial viability of these hospitals without the added funding that Medicare-dependent status provides. In order to maintain access to care for Medicare beneficiaries and others in many rural communities, the commenters urged CMS to continue the MDH program permanently.

Response: While we appreciate the commenters' concerns about the expiration of the MDH program, CMS does not have the authority under current law to extend the MDH program beyond the March 31, 2015 statutory expiration date. These comments are similar to comments we received previously, prior to the statutory extensions of the MDH program for FYs 2013 and 2014 provided by subsequent legislation, and discussed in both the FY 2013 IPPS/LTCH PPS final rule (77 FR 53413 through 53414) and the FY 2014 IPPS/LTCH PPS final rule (78 FR 50647 through 50649).

Therefore, under current law, beginning April 1, 2015, all hospitals that previously qualified for MDH status will no longer have MDH status.

4. Effect on MDHs of Adoption of New OMB Delineations

We received some comments regarding the effect of the implementation of the new OMB delineations (discussed in section III.H.5. of the preamble of this final rule) on MDHs, including requests for a transition period for MDHs to adapt to the changes that would result from the new OMB delineations; in particular, changes from rural to urban status. We refer readers to section III.H.5. (Update of Application of Urban to Rural Classification Criteria) of the preamble of this final rule for our summary of public comments received and our responses to those comments.

5. Effect on SCHs of Adoption of New OMB Delineations and 2-Year Transition for CAHs

Section 1886(d)(5)(D)(iii) of the Act defines a sole community hospital (SCH) generally as a hospital that is located more than 35 road miles from another hospital or that, by reason of factors such as isolated location, weather conditions, travel conditions, or absence of other like hospitals (as determined by the Secretary), is the sole source of inpatient hospital services reasonably available to Medicare beneficiaries. The regulations at 42 CFR 412.92 set forth the criteria that a hospital must meet to be classified as an SCH. For more information on SCHs, we refer readers to the FY 2009 IPPS/LTCH PPS final rule (74 FR 43894 through 43897).

In connection with the implementation of the new OMB delineations for urban and rural areas, as discussed in section III.H.5. of the preamble of this final rule, we received public comments requesting a transition period for SCHs affected by implementation of the new OMB delineations, similar to the 2-year transition period for affected CAHs, as discussed in section VI.D.2. of the preamble of this final rule, during which the CAH must reclassify as rural in order to retain its CAH status after the 2-year transition period ends. We refer readers to section III.H.5. of the preamble of this final rule for the discussion of and responses to those public comments.

We also were asked to clarify the status of a CAH during the 2-year transition period and its effect on SCHs.

Comment: One commenter requested that CMS clarify that a hospital's SCH status would not be affected by a CAH that is now located in an urban area as a result of a new OMB delineation while that CAH is in its 2-year transition period to reclassify as rural.

Response: We are clarifying that during an affected CAH's 2-year transition period, the facility will continue to be considered a CAH and, therefore, would not fall under the definition of "like hospital" at § 412.92(c)(2). Therefore, an affected CAH will not impact the status of an SCH during that CAH's 2-year transition period. For purposes of determining whether an SCH is located near a CAH affected by the most recent change in OMB delineations implemented in this final rule effective October 1, 2014, we plan to post a list of the affected CAHs on the CMS Web site. We refer readers to section VI.D.2. of the preamble of this final rule for a discussion related to the CAH 2-year transition period.

H. Hospital Readmissions Reduction Program: Changes for FY 2015 Through FY 2017 (§§ 412.150 Through 412.154)

1. Statutory Basis for the Hospital Readmissions Reduction Program

Section 3025 of the Affordable Care Act, as amended by section 10309 of the Affordable Care Act, added a new section 1886(q) to the Act. Section 1886(q) of the Act establishes the “Hospital Readmissions Reduction Program,” effective for discharges from an “applicable hospital” beginning on or after October 1, 2012, under which payments to those applicable hospitals may be reduced to account for certain excess readmissions.

Section 1886(q)(1) of the Act sets forth the methodology by which payments to “applicable hospitals” will be adjusted to account for excess readmissions. In accordance with section 1886(q)(1) of the Act, payments for discharges from an “applicable hospital” will be an amount equal to the product of the “base operating DRG payment amount” and the adjustment factor for the hospital for the fiscal year. That is, “base operating DRG payments” are reduced by a hospital-specific adjustment factor that accounts for the hospital's excess readmissions. Section 1886(q)(2) of the Act defines the base operating DRG payment amount as “the payment amount that would otherwise be made under subsection (d) (determined without regard to subsection (o) [the Hospital VBP Program]) for a discharge if this subsection did not apply; reduced by . . . any portion of such payment amount that is attributable to payments under paragraphs (5)(A), (5)(B), (5)(F), and (12) of subsection (d).” Paragraphs (5)(A), (5)(B), (5)(F), and (12) of subsection (d) refer to outlier payments, IME payments, DSH adjustment payments, and add-on payments for low-volume hospitals, respectively.
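As a rough illustration of the payment mechanics described in the preceding two paragraphs, the following Python sketch applies a hospital's readmissions adjustment factor to a base operating DRG payment amount from which the outlier, IME, DSH adjustment, and low-volume add-on portions have been removed. The variable names are hypothetical, and the sketch omits the special rules discussed below; it is not a definitive implementation of the statute.

```python
# A minimal sketch (hypothetical names), assuming the subsection (d) payment
# and its separately identified add-on components are already known.

def base_operating_drg_payment(subsection_d_payment, outlier, ime,
                               dsh_adjustment, low_volume_add_on):
    """Base operating DRG payment amount: the subsection (d) payment less the
    outlier, IME, DSH adjustment, and low-volume add-on portions."""
    return subsection_d_payment - (outlier + ime + dsh_adjustment + low_volume_add_on)

def readmissions_adjusted_payment(base_payment, adjustment_factor):
    """Payment for a discharge: the base operating DRG payment amount
    multiplied by the hospital's adjustment factor for the fiscal year."""
    return base_payment * adjustment_factor
```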

Furthermore, section 1886(q)(2)(B) of the Act specifies special rules for defining “the payment amount that would otherwise be made under subsection (d)” for certain hospitals, including policies for SCHs and for MDHs for FY 2013. In the FY 2013 IPPS/LTCH PPS final rule (77 FR 53374), we finalized policies to implement the statutory provisions related to the definition of “base operating DRG payment amount” with respect to those hospitals.

Section 1886(q)(3)(A) of the Act defines the “adjustment factor” for an applicable hospital for a fiscal year as equal to the greater of “(i) the ratio described in subparagraph (B) for the hospital for the applicable period (as defined in paragraph (5)(D)) for such fiscal year; or (ii) the floor adjustment factor specified in subparagraph (C).” Section 1886(q)(3)(B) of the Act, in turn, describes the ratio used to calculate the adjustment factor. It states that the ratio is “equal to 1 minus the ratio of—(i) the aggregate payments for excess readmissions . . . and (ii) the aggregate payments for all discharges. . . .” Section 1886(q)(3)(C) of the Act establishes the floor adjustment factor, which is set at 0.99 for FY 2013, 0.98 for FY 2014, and 0.97 for FY 2015 and subsequent fiscal years.

Section 1886(q)(4) of the Act defines the terms “aggregate payments for excess readmissions” and “aggregate payments for all discharges” for an applicable hospital for the applicable period. The term “aggregate payments for excess readmissions” is defined in section 1886(q)(4)(A) of the Act as “the sum, for applicable conditions . . . of the product, for each applicable condition, of (i) the base operating DRG payment amount for such hospital for such applicable period for such condition; (ii) the number of admissions for such condition for such hospital for such applicable period; and (iii) the excess readmissions ratio . . . for such hospital for such applicable period minus 1.” The “excess readmissions ratio” is a hospital-specific ratio based on each applicable condition. Specifically, section 1886(q)(4)(C) of the Act defines the excess readmissions ratio as the ratio of actual-over-expected readmissions; specifically, the ratio of “risk-adjusted readmissions based on actual readmissions” for an applicable hospital for each applicable condition, to the “risk-adjusted expected readmissions” for the applicable hospital for the applicable condition.
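The statutory formulas in the preceding two paragraphs can be combined into a short sketch. In the Python sketch below, all inputs are hypothetical; conditions whose excess readmissions ratio does not exceed 1 are treated as contributing nothing to the sum, and the FY 2015 floor adjustment factor of 0.97 is applied. This is an illustration of the formulas as summarized above, not the full regulatory calculation.

```python
# A minimal sketch of the formulas summarized above; all inputs are hypothetical.

FLOOR_FY2015 = 0.97  # floor adjustment factor for FY 2015 and subsequent years

def aggregate_payments_for_excess_readmissions(conditions):
    """conditions: iterable of (base_operating_drg_payment, admissions,
    excess_readmissions_ratio) tuples, one per applicable condition. Conditions
    whose ratio does not exceed 1 are treated here as contributing nothing."""
    return sum(
        base * admissions * (ratio - 1.0)
        for base, admissions, ratio in conditions
        if ratio > 1.0
    )

def readmissions_adjustment_factor(conditions, aggregate_payments_all_discharges,
                                   floor=FLOOR_FY2015):
    """Adjustment factor: the greater of (1 - excess/all) and the floor."""
    ratio = 1.0 - (aggregate_payments_for_excess_readmissions(conditions)
                   / aggregate_payments_all_discharges)
    return max(ratio, floor)
```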

Section 1886(q)(5) of the Act provides definitions of “applicable condition,” “expansion of applicable conditions,” “applicable hospital,” “applicable period,” and “readmission.” The term “applicable condition” (which is addressed in detail in section IV.C.3.a. of the FY 2012 IPPS/LTCH PPS final rule (76 FR 51665 through 51666)) is defined as a “condition or procedure selected by the Secretary among conditions and procedures for which: (i) Readmissions . . . represent conditions or procedures that are high volume or high expenditures . . . and (ii) measures of such readmissions . . . have been endorsed by the entity with a contract under section 1890(a) [of the Act] . . . and such endorsed measures have exclusions for readmissions that are unrelated to the prior discharge (such as a planned readmission or transfer to another applicable hospital).” Section 1886(q)(5)(B) of the Act also requires the Secretary, beginning in FY 2015, “to the extent practicable, [to] expand the applicable conditions beyond the 3 conditions for which measures have been endorsed . . . to the additional 4 conditions that have been identified by the Medicare Payment Advisory Commission in its report to Congress in June 2007 and to other conditions and procedures as determined appropriate by the Secretary.”

Section 1886(q)(5)(C) of the Act defines “applicable hospital,” that is, a hospital subject to the Hospital Readmissions Reduction Program, as a “subsection (d) hospital or a hospital that is paid under section 1814(b)(3) [of the Act], as the case may be.” The term “applicable period,” as defined under section 1886(q)(5)(D) of the Act, “means, with respect to a fiscal year, such period as the Secretary shall specify.” As explained in the FY 2012 IPPS/LTCH PPS final rule (76 FR 51671), the “applicable period” is the period during which data are collected in order to calculate various ratios and payment adjustments under the Hospital Readmissions Reduction Program.

Section 1886(q)(6) of the Act sets forth the public reporting requirements for hospital-specific readmission rates. Section 1886(q)(7) of the Act limits administrative and judicial review of certain determinations made pursuant to section 1886(q) of the Act. Finally, section 1886(q)(8) of the Act requires the Secretary to collect data on readmission rates for all hospital inpatients (not just Medicare patients) for a broad range of both subsection (d) and non-subsection (d) hospitals, in order to calculate the hospital-specific readmission rates for all such hospital inpatients and to publicly report these "all-patient" readmission rates.

2. Regulatory Background

The payment adjustment factor set forth in section 1886(q) of the Act did not apply to discharges until FY 2013. In the FY 2012 IPPS/LTCH PPS final rule (76 FR 51660 through 51676), we addressed the issues of the selection of readmission measures and the calculation of the excess readmissions ratio, which will be used, in part, to calculate the readmissions adjustment factor. Specifically, in that final rule, we finalized policies that relate to the portions of section 1886(q) of the Act that address the selection of and measures for the applicable conditions, the definitions of “readmission” and “applicable period,” and the methodology for calculating the excess readmissions ratio. We also established policies with respect to measures for readmission for the applicable conditions and our methodology for calculating the excess readmissions ratio.

In the FY 2013 IPPS/LTCH PPS final rule (77 FR 53374 through 53401), we finalized policies that relate to the portions of section 1886(q) of the Act that address the calculation of the hospital readmission payment adjustment factor and the process by which hospitals can review and correct their data. Specifically, in that final rule, we addressed the base operating DRG payment amount, aggregate payments for excess readmissions and aggregate payments for all discharges, the adjustment factor, applicable hospital, limitations on review, and reporting of hospital-specific information, including the process for hospitals to review readmission information and submit corrections. We also established a new Subpart I under 42 CFR Part 412 (§§ 412.150 through 412.154) to codify rules for implementing the Hospital Readmissions Reduction Program.

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50649 through 50676), we finalized our policies that relate to refinement of the readmissions measures and related methodology for the current applicable conditions, expansion of the "applicable conditions" beginning for FY 2015, and clarification of the process for reporting hospital-specific information, including the opportunity to review and submit corrections. We also established policies related to the calculation of the adjustment factor for FY 2014.

3. Overview of Policies for the FY 2015 Hospital Readmissions Reduction Program

In this final rule, we are—

  • Making refinements to the readmissions measures and related methodology for FY 2015 and subsequent years (section IV.H.4. of the preamble of this final rule);
  • Expanding the scope of “applicable conditions” for FY 2017 to include coronary artery bypass graft (CABG) (section IV.H.6. of the preamble of this final rule);
  • Discussing the maintenance of technical specifications for quality measures (section IV.H.7. of the preamble of this final rule);
  • Describing a waiver from the Hospital Readmissions Reduction Program for hospitals formerly paid under section 1814(b)(3) of the Act (§ 412.154(d)) (section IV.H.8. of the preamble of this final rule);
  • Specifying the adjustment factor floor for FY 2015 (section IV.H.9. of the preamble of this final rule);
  • Specifying the applicable period for FY 2015 (section IV.H.10. of the preamble of this final rule);
  • Making changes to the calculation of the aggregate payments for excess readmissions to include two additional readmissions measures (chronic obstructive pulmonary disease (COPD) and THA/TKA) (section IV.H.11. of the preamble of this final rule); and
  • Discussing whether to establish an exceptions process to address hospitals with extraordinary circumstances (section IV.H.12. of the preamble of this final rule).

4. Refinement of the Readmission Measures and Related Methodology for FY 2015 and Subsequent Years' Payment Determinations

We note that, during the comment period following the proposed rule, we received comments that were not related to our specific proposals and that we considered out of scope for the Hospital Readmissions Reduction Program in the FY 2015 IPPS/LTCH PPS proposed rule. Some of the out-of-scope comments were related to a wide range of aspects of the Hospital Readmissions Reduction Program and its readmission measures. For example, there were recommendations for statutory changes to the program payment structure and previously finalized program definitions, changes to the program goals, changes to the frequency of assessing and reporting performance on measures, and changes to the 30-day window for measuring readmissions. Notably, there were many comments on risk adjustment for socioeconomic status (SES) at the patient and hospital levels. While we appreciate the commenters' feedback, these topics are outside the scope of this rule, and we will not be specifically addressing them, with the exception of risk adjustment for SES.

Among the out-of-scope topics, we are addressing the risk-adjustment for SES because of the volume of comments and the importance of this topic for outcome measures in payment programs. All other out-of-scope topics not specifically addressed in this rule will be taken into consideration when developing policies and program requirements for future years.

Comment: Many of the commenters on CMS quality programs and those specifically commenting on the Hospital Readmissions Reduction Program expressed concern that these programs do not risk-adjust for SES. Many commenters expressed concern that the lack of risk adjustment for SES leads to the unintended consequences of unfair payment adjustments which: (1) Disproportionately penalize hospitals serving high proportions of low-SES patients; (2) penalize hospitals for patient characteristics outside of their control; and (3) decrease financial resources of the hospitals most likely to serve low-SES patients which would lead to lower quality of care for these patients. Many commenters outlined specific SES characteristics that are not adjusted for in the current readmission measures, including Medicare dual-eligible status, life circumstances, access to health care post-discharge, literacy, education level, community factors, language, income, poverty level, living conditions and support in the home (that is, post-discharge support structure), complex medical histories, and having chronic conditions.

One commenter noted that claims data cannot be used to identify SES, making it difficult for the commenter to support the Hospital Readmissions Reduction Program, which uses claims-based measures. Other commenters believed that risk adjustment for SES “levels the playing field” among all hospitals while still illuminating disparities.

Response: We appreciate the commenters' concerns and note that these concerns were addressed in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50653 through 50654 and 50673 through 50674). As described in prior rulemaking, we do not currently risk-adjust for SES in the Hospital Readmissions Reduction Program. However, we do risk-adjust for comorbidities (that is, co-occurring illnesses) and other factors to ensure that hospitals are not penalized for serving populations that are sicker or have higher incidences of chronic disease.

We are aware that there are differing opinions regarding this approach. We appreciate the commenters' suggestions on the importance of addressing SES in the Hospital Readmissions Reduction Program. We have continued to consider and evaluate stakeholder concerns regarding the influence of patients' SES on readmission rates.

Comment: One commenter noted that the purpose of the Hospital Readmissions Reduction Program is to transform the Medicare payment and delivery system. Other commenters supported this belief and urged CMS not to adjust the readmission measures for SES.

Response: We appreciate the feedback and the support for not adjusting the readmission measures for SES.

Comment: Some commenters urged that CMS not risk-adjust the readmission measures for SES until it is proven that the program is biased against hospitals serving high proportions of low-SES patients. These commenters noted that the Hospital Readmissions Reduction Program is designed to provide incentives for changes that also enhance the quality of health care and that the same care protocols that work for patients who are not of low SES may also work for low-SES patients.

Response: We appreciate the support for the Hospital Readmissions Reduction Program's goal of encouraging improved health care. Like the commenters, we continue to believe that the same care protocols and processes that are successful in caring for non-low-SES patient populations may also be successful in caring for low-SES patient populations.

Comment: Many commenters provided recommendations on how to risk-adjust for SES and specifically recommended adopting the recommendations of the draft report issued by NQF's Expert Panel on Risk-Adjustment for Sociodemographic Factors (Draft Report available at: http://www.qualityforum.org/Risk_Adjustment_SES.aspx). A few commenters supported risk adjustment for SES as recently proposed in two bills in the 113th Congress (S. 2501, “The Hospital Readmissions Program Accuracy and Accountability Act,” and H.R.4188, the “Establishing Beneficiary Equity in the Hospital Readmission Program Act”). Both bills attempt to address the potential disproportionate impact of payment penalties on hospitals that serve high proportions of low-SES patients.

Response: We appreciate these comments and recognize the importance of the role that SES plays in the care of patients. We are aware that there are differing opinions regarding our current approach to risk adjustment for SES in the Hospital Readmissions Reduction Program measures. We note that the readmission measures aim to reveal differences related to the quality of care provided. We believe that the quality of care received by patients of lower SES contributes at least in part to the observed association between SES and readmission rates. We continue to have concerns about holding hospitals to different standards for the outcomes of their patients of low SES; we do not want to mask potential disparities or minimize incentives to improve the outcomes of disadvantaged populations.

We routinely monitor the impact of SES on hospitals' results. To date, we have found that hospitals that care for large proportions of patients of low SES are capable of performing well on our measures (we refer readers to the 2013 Medicare Hospital Quality Chart Book on pages 46 through 53 at: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/HospitalQualityInits/Downloads/-Medicare-Hospital-Quality-Chartbook-2013.pdf). Previous analyses presented at the NQF during endorsement proceedings of the Hospital-Wide All-Cause Unplanned Readmission Measure (available at: http://www.qualityforum.org/WorkArea/linkit.aspx?LinkIdentifier=id&ItemID=70813) also show that adding SES to the risk-adjustment has a negligible impact on hospitals' risk-standardized rates. The risk adjustment for clinical factors likely captures much of the variation due to SES, therefore resulting in an attenuation of the impact of SES factors on hospitals' results.

We continue to monitor related activities at NQF, such as the July 23, 2014 decision by the NQF Board in which the Board approved a trial period to test the impact of sociodemographic factor risk adjustment of performance measures (available at: http://www.qualityforum.org/Press_Release/2014/NQF_Board_Approves_Trial_Risk_Adjustment.aspx), and in Congress. As we stated in the past, we are committed to working with the NQF and other stakeholder communities to continuously refine our measures and to address the concerns associated with SES and risk adjustment. We believe that continued collaboration with the stakeholder communities will enable us to identify feasible ways to appropriately address any unintended consequences for providers serving high proportions of low-SES patients.

Comment: Many commenters provided recommendations on how to risk-adjust for SES and specifically referenced MedPAC's recommendation to use “peer group stratification,” that is, stratifying hospitals by the hospital's proportion of low-SES patients, as a method to correlate readmission rates and penalties with patient income. These commenters also recognized that this new method would require legislative changes because the current payment adjustment formula used to compute the readmission penalty is set in law.

Response: We appreciate the suggestion for risk-adjustment by “peer group stratification” as a method to address SES. We will take MedPAC's recommendations into consideration for the Hospital Readmissions Reduction Program, but also note that MedPAC recognizes that statutory changes would be required for us to adopt this recommendation because the current payment adjustment formula used to compute the readmission penalty is established by statute.

Comment: A few commenters supported the use of an unplanned hospital-wide readmission measure (some of these commenters specifically asked CMS to consider adding the Hospital-Wide All-Cause Readmission Measure (NQF #1789)) as this type of measure would capture a global perspective on hospital performance and urged CMS to consider these measures instead of CABG.

Response: We thank the commenters for this input and will continue to take the recommendation into consideration, as we stated previously in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50658). We developed the Hospital-Wide All-Cause Readmission Measure (NQF #1789) (HWR measure) that has been implemented in the Hospital IQR Program. The HWR measure estimates the hospital-level, risk-standardized rate of all-cause, unplanned readmission within 30 days of hospital discharge with any eligible condition. The measure reports a single composite risk-standardized readmission rate (RSRR), derived from the volume-weighted results of five different models, one for each of the following specialty cohorts (groups of related discharge condition categories or procedure categories): surgery/gynecology; general medicine; cardiorespiratory; cardiovascular; and neurology.
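For readers unfamiliar with composite measures of this kind, the following is a simplified Python sketch of a volume-weighted composite across the five specialty cohorts named above. It is an illustration only; the full NQF #1789 specification involves additional steps (for example, risk standardization within each cohort model) that are not reproduced here, and all inputs are hypothetical.

```python
# Simplified illustration only (hypothetical inputs); not the exact NQF #1789
# calculation.

def volume_weighted_composite(cohort_results):
    """cohort_results maps cohort name -> (risk_standardized_rate, volume)."""
    total_volume = sum(volume for _, volume in cohort_results.values())
    return sum(rate * (volume / total_volume)
               for rate, volume in cohort_results.values())

# Hypothetical example inputs: five cohort rates and volumes for one hospital.
example = {
    "surgery/gynecology": (0.14, 400),
    "general medicine": (0.16, 900),
    "cardiorespiratory": (0.18, 350),
    "cardiovascular": (0.15, 300),
    "neurology": (0.14, 150),
}
composite = volume_weighted_composite(example)  # a single composite rate
```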

While we appreciate the commenters' recommendations to consider this measure for the Hospital Readmissions Reduction Program, we believe that section 1886(q)(5)(A) of the Act (defining “applicable condition”) prohibits us from adopting the many categories of diagnoses and procedures comprising the HWR measure as a single “condition.” Based on the limitations of the current statutory provisions for the Hospital Readmissions Reduction Program, we have not implemented the HWR measure in the Hospital Readmissions Reduction Program.

Comment: One commenter opposed the addition of the 30-day ischemic stroke readmission measure to the Hospital Readmissions Reduction Program because it is not risk-adjusted using the National Institutes of Health Stroke Scale (NIHSS).

Response: We thank the commenter for this feedback and note that we did not propose this measure for the Hospital Readmissions Reduction Program. We note that, in the FY 2014 IPPS/LTCH PPS final rule, we discussed this issue with respect to the Hospital IQR Program (78 FR 50801). At that time, we stated that we appreciated the concerns of the stakeholders on this issue. We also stated that we are committed not only to working with the stakeholder communities to continuously refine our measures, which for the stroke outcome measures includes risk adjustment for patient severity, but also to working with the stroke community and other stakeholders to seek feasible ways to incorporate additional severity adjustment as suggested. Finally, we highlighted that stroke is the fifth leading cause of adult mortality in the United States, and therefore we believe it would be a disservice to patients to delay inclusion of these current stroke outcome measures in quality reporting and quality improvement initiatives. We are committed to making these measures better and working with stakeholders to do so, and we will take these comments into consideration.

Comment: A few commenters noted that heart failure readmission rates are inversely related to heart failure mortality rates.

Response: We appreciate the commenters' concerns and note that this issue was addressed in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50650).

Comment: A few commenters recommended that the Hospital Readmissions Reduction Program be improved by excluding admissions that are part of the natural disease or treatment progression, in order to fairly assess hospitals and avoid unintended consequences for patients and their families. One commenter specifically highlighted that readmissions due to ongoing care for patients suffering traumatic injury and requiring staged operative therapies should also be excluded.

Response: We appreciate these suggestions and agree that admissions that are part of planned management to address disease progression should not be counted in the outcome. We identify admissions that are planned readmissions for ongoing care management and do not count them in the measure results or the readmission outcomes. For example, ongoing treatments such as maintenance chemotherapy for cancer or cardiac device placement for cardiovascular disease patients are excluded from the calculation of the measure result for readmission rates.

Comment: Many commenters requested that CMS be more transparent and collaborative in its approach to all measures in the Hospital Readmissions Reduction Program.

Response: We appreciate this feedback regarding our proposed changes to the planned readmission algorithm and the proposed refinements to the measure cohort in the Elective Primary Total Hip Arthroplasty (THA) and/or Total Knee Arthroplasty (TKA) All-Cause Unplanned 30-Day Risk-Standardized Readmission Measure. We strive to collaborate with stakeholders, as well as be transparent about the direction of the Hospital Readmissions Reduction Program and the measures proposed for the program. We previously discussed in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50657 through 50658) that we use multiple methods to communicate with stakeholders; for example, through press releases, open door forums, as well as through the Federal rulemaking process. We also post all measure methodology documents online for broad public access at our Web site (http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/HospitalQualityInits/Measure-Methodology.html).

In addition, during measure development, we seek public comments at various stages of development, and we also have the measures undergo the NQF's endorsement processes and measure maintenance reviews. We also adhere to the Affordable Care Act's provision that requires that all measures that will be proposed in future rulemaking be reviewed by NQF's Measure Applications Partnership (MAP) as part of the pre-rulemaking process. All of these processes are open to the public for comment. While we believe all of these processes help to inform the public of our plans for and decisions regarding the Hospital Readmissions Reduction Program, we will strive to collaborate with all of our stakeholders to identify more effective ways of communicating our plans and decisions.

a. Refinement of Planned Readmission Algorithm for Acute Myocardial Infarction (AMI), Heart Failure (HF), Pneumonia (PN), Chronic Obstructive Pulmonary Disease (COPD), and Total Hip Arthroplasty and Total Knee Arthroplasty (THA/TKA) 30-Day Readmission Measures

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50651 through 50655), we finalized for 2014 and subsequent years' payment determinations the use of the CMS Planned Readmission Algorithm Version 2.1 in the AMI, HF, PN, COPD, and THA/TKA readmission measures. The algorithm identifies readmissions that are planned and occur within 30 days of discharge from the hospital. A complete description of the CMS Planned Readmission Algorithm Version 2.1, which includes lists of planned diagnoses and procedures, can be found on our Web site (available at: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/HospitalQualityInits/Measure-Methodology.html). NQF has endorsed the use of the algorithm for these measures.

Last year's stakeholder comments supported the incorporation of the CMS Planned Readmission Algorithm Version 2.1 and suggested that we update it on a regular basis. In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50652), we agreed to continually review the CMS Planned Readmission Algorithm and make updates as needed. In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28106 through 28108) we proposed to use a revised version, the CMS Planned Readmission Algorithm Version 3.0, for the AMI, HF, PN, COPD, and THA/TKA readmission measures for FY 2015 and subsequent payment determinations. We also proposed to use this algorithm for the CABG readmission measure proposed for inclusion in the Hospital Readmissions Reduction Program starting in FY 2017.

Version 3.0 incorporates improvements that were made based on a validation study of the algorithm. Researchers reviewed 634 patients' charts at 7 hospitals, classified readmission as planned or unplanned based on the chart review, and compared the results to the claims-based algorithm's classification of the readmissions. The findings suggested the algorithm was working well but could be refined.

Specifically, the study suggested the need to make small changes to the tables of procedures and conditions used in the algorithm to classify readmissions as planned or unplanned. The algorithm uses the Agency for Healthcare Research and Quality's (AHRQ's) Clinical Classification Software (CCS) to group thousands of procedure and diagnosis codes into fewer categories of related procedures or diagnoses. The algorithm then uses four tables of procedure and diagnosis categories and a flow diagram to classify readmissions as planned or unplanned. For all measures, the first table lists procedures that, if present in a readmission, classify the readmission as planned. The second table identifies primary discharge diagnoses that always classify readmissions as planned. Because almost all planned admissions are for procedures or surgeries, a third table identifies procedures for which patients are typically admitted; if any of these procedures are coded in the readmission, we classify the readmission as planned as long as that readmission does not have an acute (unplanned) primary discharge diagnosis. The fourth table lists the acute (unplanned) primary discharge diagnoses that disqualify readmissions that include one or more of the potentially planned procedures in the third table from being classified as planned. These tables are structured the same across all measures, but the specific procedures and conditions they contain vary slightly for certain measures based on clinical considerations for each cohort. The final proposed tables for each measure can be found on our Web site under the Measure Methodology reports (available at: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/HospitalQualityInits/Measure-Methodology.html).
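The table-based flow just described can be summarized in a short sketch. The Python code below is only an illustration of the classification logic as described in this section; the actual CCS code lists are those in the measure specification tables, and the input structures here are hypothetical.

```python
# Illustrative sketch of the planned readmission classification flow described
# above; the table contents (CCS categories) are not reproduced here.

def classify_readmission(procedure_categories, primary_diagnosis_category,
                         always_planned_procedures,       # first table
                         always_planned_diagnoses,        # second table
                         potentially_planned_procedures,  # third table
                         acute_unplanned_diagnoses):      # fourth table
    """Return 'planned' or 'unplanned' for a single readmission.

    procedure_categories is the set of CCS procedure categories coded on the
    readmission; the four table arguments are sets of CCS categories.
    """
    # Procedures that always indicate a planned readmission.
    if procedure_categories & always_planned_procedures:
        return "planned"
    # Primary discharge diagnoses (for example, maintenance chemotherapy) that
    # always indicate a planned readmission.
    if primary_diagnosis_category in always_planned_diagnoses:
        return "planned"
    # Potentially planned procedures count as planned only when the primary
    # discharge diagnosis is not acute (unplanned).
    if (procedure_categories & potentially_planned_procedures
            and primary_diagnosis_category not in acute_unplanned_diagnoses):
        return "planned"
    return "unplanned"
```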

Version 3.0 modifies two of these tables by removing or adding procedures or conditions to improve the accuracy of the algorithm. First, a validation study revealed that the algorithm could be improved by removing two procedure CCS categories from the third table, the potentially planned procedure table: CCS 211—Therapeutic Radiation and CCS 224—Cancer Chemotherapy. Typically, patients do not require admission for scheduled Therapeutic Radiation treatments (CCS 211). The study found that readmissions that were classified as planned because they included Therapeutic Radiation were largely unplanned.

The algorithm was also more accurate when CCS 224—Cancer Chemotherapy was removed from the potentially planned procedure table. The second table of the algorithm classifies all readmissions with a principal diagnosis of Maintenance Chemotherapy as planned. Most patients who receive cancer chemotherapy have both a code for Cancer Chemotherapy (CCS 224) and a principal discharge diagnosis of Maintenance Chemotherapy (CCS 45). In the validation study, the readmissions for patients who received Cancer Chemotherapy (CCS 224) but who did not have a principal diagnosis of Maintenance Chemotherapy were largely unplanned; therefore, removing CCS 224 from the potentially planned procedure table improved the algorithm's accuracy. Therefore, Version 3.0 removes CCS 211 and CCS 224 from the list of potentially planned procedures to improve the accuracy of the algorithm.

As noted above, the algorithm uses a table of acute principal discharge diagnoses to help identify unplanned readmissions. Readmissions that have a principal diagnosis listed in the table are classified as unplanned, regardless of whether they include a procedure in the potentially planned procedure table. The validation study identified one diagnosis CCS that should be added to the table of acute diagnoses to more accurately identify truly unplanned admissions as unplanned: Hypertension with Complications (CCS 99). Hypertension with Complications is a diagnosis that is rarely associated with planned readmissions.

In addition, the validation study identified a subset of ICD-9-CM diagnosis codes within two CCS diagnosis categories that should be added to the acute diagnosis table to improve the algorithm. CCS 152, Pancreatic Disorders, includes the code for acute pancreatitis; clinically, there is no situation in which a patient with this acute condition would be admitted for a planned procedure. Therefore, Version 3.0 adds the ICD-9-CM code for acute pancreatitis, 577.0, to the acute primary diagnosis table to better identify unplanned readmissions. Finally, CCS 149, Biliary Tract Disease, is a mix of acute and nonacute diagnoses. Adding the subset of ICD-9-CM codes within this CCS group that are for acute diagnoses to the list of acute conditions improves the accuracy of the algorithm for these acute conditions while still ensuring that readmissions for planned procedures, like cholecystectomies, are counted accurately as planned. For more detailed information on how the algorithm is structured and the use of tables to identify planned procedures and diagnoses, we refer readers to the discussion of the CMS Planned Readmission Algorithm Version 2.1 in our reports (available at: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/HospitalQualityInits/Measure-Methodology.html). As noted above, readers can find the specific Version 3.0 tables for each measure in the measure updates and specifications reports at the above link.

We invited public comment on these proposals.

Comment: Several commenters specifically supported all of the proposed modifications to the planned readmissions algorithm. Some commenters supported the use of the algorithm in general and others specifically supported the inclusion of the algorithm in the Hospital-Level 30-Day Readmission Following Admission for an Acute Exacerbation of Chronic Obstructive Pulmonary Disease.

Response: We appreciate the support for the inclusion of the planned readmission algorithm in the Hospital Readmissions Reductions Program measures.

Comment: Several commenters supported the periodic updating of the Hospital Readmissions Reduction Program's planned readmission algorithm to ensure that its lists of inclusions and exclusions are accurate.

Response: We appreciate the comment and, as discussed in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28106), we have revised the planned readmission algorithm based on a validation study conducted at 7 hospitals. During the revision of the algorithm, we also collaborated with technical and medical experts.

Comment: Several commenters commended CMS for including the planned readmission algorithm updates, specifically the proposed exclusions, in the FY 2015 IPPS/LTCH PPS proposed rule. They also suggested that CMS exclude unrelated readmissions.

Response: We appreciate the commenters' support for the updates, including the removal of the two procedure Clinical Classification Software (CCS) categories of Therapeutic Radiation (CCS 211) and Cancer Chemotherapy (CCS 224), and we will continue to strive to be transparent with stakeholders. We note that we addressed the concern regarding the exclusion of unrelated readmissions in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50654). We indicated last year that unrelated readmissions are addressed through the planned readmission algorithm and that, in coordination with medical experts, we expanded the list of conditions considered planned. Generally, planned readmissions are not a signal of quality of care. Therefore, we have worked with experts in the medical community, as well as other stakeholders, to carefully identify procedures and treatments that should be considered "planned" and, therefore, not counted as readmissions.

Comment: Several commenters were concerned with the exclusion of the two cancer-related Clinical Classification Software (CCS) groups. Some commenters were specifically concerned that removal of CCS groups CCS 211—Therapeutic Radiation and CCS 224—Cancer Chemotherapy from the potentially planned procedure table of the planned readmission algorithm would have the unintended consequence of discouraging needed cancer care. These commenters requested that CMS therefore initiate an ad hoc review of this change. One commenter was unconvinced that the validation study findings for Maintenance Chemotherapy hold true for all hospitals and was therefore concerned that hospitals that deliver a large amount of cancer services could be affected by this change.

Response: We note that removal of CCS 211 is appropriate because patients are not typically admitted for therapeutic radiation, and admissions involving this treatment were generally found to be unplanned. In addition, we removed CCS 224 because the validation study showed that admissions for individuals who receive cancer chemotherapy but do not have a principal diagnosis of maintenance chemotherapy are typically unplanned admissions. All admissions for patients with a principal diagnosis of Maintenance Chemotherapy (that is, CCS 45) will continue to be considered planned and will not be counted in the measure outcome. Therefore, we expect removal of CCS 211 and CCS 224 to improve the accuracy of the planned readmission algorithm and do not anticipate it will have the unintended consequence of discouraging needed cancer care. We appreciate the concern that the validation study findings may not apply to all hospitals and will consider further evaluation of this concern.

Comment: One commenter noted that he/she was aware of methodologies that separate preventable from nonpreventable readmissions, while the measures in the Hospital Readmissions Reduction Program continue to penalize hospitals for circumstances outside of their control. The commenter asserted that "well researched and documented methodologies" exist to separate potentially preventable from nonpreventable readmissions.

Response: We note that it is difficult, and can be subjective, to categorize a readmission as preventable or unpreventable. The difficulty, and the risk of subjectivity, arises because a readmission cannot be determined to be preventable or unpreventable based on the reason or diagnoses for the admission alone. For this reason, we have chosen to categorize readmissions not as preventable or unpreventable, but rather as planned or unplanned. The planned readmission algorithm uses the diagnosis codes identified by medical experts in multiple specialties as those most likely to be associated with planned reasons for a readmission. By categorizing readmissions as planned or unplanned, we are trying to remove the subjectivity of deeming readmissions preventable or unpreventable.

Finally, we are not aware of any publicly known NQF-approved methodology for identifying preventable versus unpreventable readmissions. The goal of the Hospital Readmissions Reduction Program is to lower the risk of all types of admissions through the most appropriate care and care transitions. We believe this goal can best be achieved through measuring and reporting a risk-standardized metric of excess readmissions that reflects how well hospitals are doing in decreasing unplanned readmissions relative to hospitals with similar patients.

Comment: One commenter believed that CMS' measures would benefit from refinement, including exclusion of planned readmissions and unrelated readmissions. Other commenters were disappointed that CMS did not propose a process for excluding readmissions unrelated to the initial reason for admission in calculating the measures, which they characterize as being mandated by the Affordable Care Act. Several commenters continued to urge CMS to exclude from the Hospital Readmissions Reduction Program admissions unrelated to the prior hospital stay, including, for example, admissions for chemotherapy, trauma, burns, end-stage renal disease, maternity, and substance abuse, because, the commenters stated, by their nature, they are not preventable readmissions.

Response: We note that we have been responsive to stakeholder suggestions to not include planned readmissions in the calculations, as discussed in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50654), and as evidenced by multiple versions of the planned readmission algorithm since 2012. As with other aspects of any measure, we continue to review and revise the treatment of unrelated readmissions through our refinement of the planned readmission algorithm. Regarding other types of unrelated readmissions, we currently do not seek to differentiate between related and unrelated readmissions because readmissions not directly related to the index condition may still be a result of the care received during the index hospitalization. For example, a patient hospitalized for COPD who develops a hospital-associated infection may ultimately be readmitted for sepsis. It would be inappropriate to treat this readmission as unrelated to the care the patient received during the index hospitalization.

Furthermore, the range of potentially avoidable readmissions also includes those not directly related to the initial hospitalization, such as those resulting from poor communication at discharge or inadequate follow-up. Therefore, we believe that creating a comprehensive list of potential complications related to the index hospitalization would be arbitrary, incomplete, and, ultimately, extremely difficult to implement. However, in coordination with medical experts, we created a planned readmission algorithm to determine which conditions, and therefore which readmissions, are generally considered planned. Because planned readmissions are generally not indicative of inferior quality of care, they are not counted as readmissions.

Regarding the suggestion to remove “extreme circumstances [such as] chemotherapy, trauma, burns, end stage renal disease, maternity, and substance abuse because, by their nature, they are not preventable readmissions,” we addressed this comment in the FY 2013 IPPS/LTCH PPS final rule. In that rule and in the current rulemaking, commenters requested that circumstances like those listed in the above comment be excluded from the index hospitalizations. In the FY 2013 IPPS/LTCH PPS final rule (77 FR 53377), we stated that “we appreciate the concern expressed by some commenters that patients of these `extreme circumstances' clinically could be sicker and more likely to be readmitted. The measures address clinical differences in hospitals' case-mix through risk adjustment rather than through excluding patients from the measure as suggested by the commenter. The goal in developing outcomes measures is to create a clinically cohesive cohort that includes as many patients as possible admitted with the given condition. Greatly expanding our list of exclusions would result in a measure that was less useful and meaningful because it would reflect the care of fewer patients. In addition, we believe that, by excluding patients with significant comorbidities, the measure would not assess the quality of care for those patients. To fairly profile hospitals' performance, it is critical to place hospitals on a level playing field and account for their differences in the patients that present for care. This is accomplished through adequate risk-adjustment for patients' clinical presentation rather than exclusion of patients.”

Comment: One commenter urged CMS to work with the physician and hospital communities to identify other planned readmissions that should be excluded.

Response: We will continue to involve all stakeholders in the process of measure development and measure maintenance. We also collaborate with various medical specialty societies and associations whenever feasible and appropriate to ensure that their input and feedback are considered in real time during measure development and maintenance, which also includes input from expert panels and public comment periods. We will consider the comment in future revisions to the algorithm.

Comment: Many commenters believed that CMS should have had the proposed Planned Readmission Algorithm Version 3.0 changes reviewed by the NQF before finalizing them and using them in the readmission measures. One commenter believed the changes to be substantive and did not support adopting the Planned Readmission Algorithm Version 3.0 for the measures until the revised measures have been recommended by the MAP. One commenter stated that the size of the validation study for the Planned Readmission Algorithm Version 3.0 was limited, and that making recommendations based on this information, without external review from the NQF, could create unintended consequences.

Response: We note that the NQF has reviewed the Planned Readmission Algorithm Version 3.0 multiple times over the past 6 to 8 months as it was submitted for review as part of the NQF's annual measure maintenance review for re-endorsement of the Hospital-Level 30-Day Readmission Following Admission for Heart Failure, Pneumonia, Chronic Obstructive Pulmonary Disease, and Total Hip Arthroplasty/Total Knee Arthroplasty measures. As of July 2014, all of these measures are still under review by NQF. NQF also reviewed the Planned Readmission Algorithm Version 3.0 with the Coronary Artery Bypass Surgery readmission measure during its endorsement proceedings for that measure, which led to the measure being recommended for endorsement. We will consider the comment in future revisions to the algorithm.

Comment: Several commenters requested that CMS clarify what is a “related” readmission or a “planned” readmission, while others noted the measures fail to distinguish between a planned and unplanned readmission. Other commenters expressed appreciation for the proposed exclusions for certain readmissions, but requested exclusion of unrelated readmissions.

Response: We note that the issue of excluding unrelated readmissions from the Hospital Readmissions Reduction Program was addressed in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50654 through 50655). Regarding clarification of what constitutes a planned readmission, we refer readers to the technical reports for each measure, which specify how planned readmissions are defined for that measure. The technical reports and the planned readmission algorithm can be found at the following Web site: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/HospitalQualityInits/Measure-Methodology.html. Finally, we continue to review and revise our algorithm for planned readmissions to improve its accuracy.

Comment: One commenter emphasized the need to continuously improve and evaluate the accuracy of a signal provided by a specific readmission measure.

Response: We thank the commenter for this feedback. We believe that unplanned readmissions, in general, are a signal of the quality of care that hospitals provide to their patients. The commenter is concerned with the accuracy of the readmission measures; we note that these measures have been NQF-endorsed and widely vetted by technical experts during measure development and annual measure maintenance. We will continue to monitor and update the measures to ensure their accuracy.

Comment: Some commenters recommended that CMS create a system to monitor unintended consequences related to planned readmissions and implement an audit function that will accurately account for true planned readmissions.

Response: We note that we have been concerned about the unintended consequence of hospitals' increased use of observation stays and emergency department visits to avoid counting a patient as having been readmitted, and we are tracking these incidences in the Medicare Hospital Quality Chartbook available at: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/HospitalQualityInits/Downloads/-Medicare-Hospital-Quality-Chartbook-2013.pdf. Regarding the recommendation to create an audit function that will accurately account for “true planned readmissions,” we understand this to mean that the commenter is concerned about the validity of the planned readmission algorithm. We note that, during development and maintenance of the planned readmission algorithm, there have been several iterations of the algorithm as a result of review by medical experts and other stakeholders like the NQF. We believe that the constant review and update of the algorithm by medical experts and other stakeholders provide a planned readmissions algorithm that accurately identifies truly planned readmissions.

After consideration of the public comments we received, we are finalizing our proposal to update the planned readmission algorithm. We believe the updated Planned Readmission Algorithm Version 3.0 continues to fulfill statutory requirements to remove planned readmissions, as well as addresses stakeholder recommendations to continually refine and adjust the algorithm.

b. Refinement of Total Hip Arthroplasty and Total Knee Arthroplasty (THA/TKA) 30-Day Readmission Measure Cohort

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28107), for FY 2015 and subsequent years, we proposed to refine the measure cohort for the Elective Primary Total Hip Arthroplasty (THA) and/or Total Knee Arthroplasty (TKA) All-Cause Unplanned 30-Day Risk-Standardized Readmission Measure.

Currently, the THA/TKA Readmission Measure adopted for the Hospital Readmissions Reduction Program is intended to only include patients who have an elective THA or TKA. Therefore, this measure excludes patients who have a principal discharge diagnosis of femur, hip, or pelvic fracture on their index admission because hip replacement for hip fracture is not an elective procedure. However, after hospitals reviewed their hospital-specific THA/TKA Readmission Measure data during the national dry run conducted during September and October 2012, we learned that hospitals code hip fractures that occur during the same admission as a THA as either a principal or secondary diagnosis. According to feedback received from hospitals participating in the dry run, the measure methodology failed to identify and therefore appropriately exclude a small number of patients (that is, 0.42 percent of patients in 2009-2010 data) with hip fracture who had nonelective total hip arthroplasty.

To ensure that all such hip fracture patients are excluded from the measure, we proposed to refine the measure to exclude patients with hip fracture coded as either principal or secondary diagnosis during the index admission. We believe this refinement is responsive to comments from hospitals and will allow us to accurately exclude patients who were initially admitted for a hip fracture and then underwent total hip arthroplasty, making their procedure nonelective.
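
As a concrete illustration of this refinement, the short Python sketch below shows the change from checking only the principal diagnosis to checking both principal and secondary diagnoses on the index admission. The field names and the fracture code set are hypothetical placeholders, not the measure's actual ICD-9-CM specification.

# Illustrative only: field names and the hip/femur/pelvic fracture code set are
# placeholders, not the measure's actual specification.
HIP_FRACTURE_CODES = {"820.8", "820.9"}  # hypothetical subset of fracture codes


def excluded_from_tha_tka_cohort(index_admission):
    """Exclude an index THA/TKA admission when a hip/femur/pelvic fracture is
    coded as either the principal or a secondary diagnosis (the refinement),
    rather than as the principal diagnosis only (the prior approach)."""
    all_dx = [index_admission["principal_dx"]] + index_admission["secondary_dx"]
    return any(code in HIP_FRACTURE_CODES for code in all_dx)


# Under the prior methodology this admission was retained because the fracture
# was coded as a secondary diagnosis; under the refinement it is excluded.
admission = {"principal_dx": "V54.81", "secondary_dx": ["820.8"]}
print(excluded_from_tha_tka_cohort(admission))  # True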

Comment: Several commenters supported the refinements to the Total Hip Arthroplasty and Total Knee Arthroplasty 30-day readmission cohort.

Response: We appreciate support of these refinements to this measure cohort.

Comment: Several commenters acknowledged that the change to the THA/TKA measure is likely appropriate, but recommended that CMS submit the proposal to NQF in an ad hoc review of updates to the planned readmission algorithm and this measure cohort, before finalizing this proposal.

Response: We appreciate this feedback, especially the acknowledgement that the change to the THA/TKA measure is likely to be appropriate. We appreciate the suggestion for an ad hoc NQF review of this change to the THA/TKA measure cohort. We also understand the importance of seeking broad stakeholder review during routine measure maintenance and note that this measure was submitted to NQF for annual maintenance review in June 2014. We note that the proposed changes to the cohort were the result of feedback from hospitals that had participated in the dry run of this measure and noted that the prior measure methodology failed to identify, and therefore appropriately exclude, a small number of patients (that is, 0.42 percent of patients in the 2009-2010 data) with hip fracture who had nonelective total hip arthroplasty. The recommendations resulting from the hospitals participating in the dry run were also reviewed by a group of medical experts working with our measure developer.

Notwithstanding this expert medical opinion, we realize that broader stakeholder review is necessary as we continue to strive for transparency with management of the Hospital Readmissions Reduction Program. We will work towards improving and broadening stakeholder review of measure updates; we will take this recommendation under consideration.

Comment: One commenter encouraged CMS to continue to work with appropriate clinicians and stakeholder groups (for example, the American Association of Hip and Knee Surgeons and the American Academy of Orthopaedic Surgeons) to identify planned readmissions that may occur within 30 days of discharge from the hospital that are unrelated to the quality of care received during the initial admission.

Response: We continually work towards improving and broadening stakeholder review of measure updates.

Comment: One commenter encouraged CMS to work with the Yale-New Haven Hospital Health Services Corporation, Center for Outcomes Research and Evaluation (YNHHSC/CORE) to determine the most appropriate method for excluding or risk-adjusting for cases that involve conversion of previous hip surgery to total hip arthroplasty (represented by CPT code 27132).

Response: We note that the commenter's concerns focused on having us revise our Elective Primary Total Hip Arthroplasty (THA) and/or Total Knee Arthroplasty (TKA) All-Cause Unplanned 30-Day Risk-Standardized Readmission Measure to exclude additional specific groups of patients with prior hip surgeries that place them at a significantly increased risk of complications, including revision procedures and those requiring removal of implanted devices from the femur (ICD-9-CM code 78.65). We will continue to work closely with the YNHHSC/CORE to determine the most appropriate method for exclusions or risk-adjustment for these cases for this measure.

Suter L.G., Parzynski C.S., Searfoss R., Dorsey K.B., Grady J.N., Keenan M., Okai M., Nwosu M., Ngo C., Lin Z., Bhat K.R., Krumholz H.M., Bernheim S.M. 2014 Procedure-Specific Readmission Measures Updates and Specifications Report: Report prepared for the Centers for Medicare & Medicaid Services. 2014.

After consideration of the public comments we received, we are finalizing our proposed refinements to the THA/TKA readmission measure cohort.

c. Anticipated Effect of Proposed Refinements on Measures

The proposed refinement of the CMS Planned Readmission Algorithm Version 2.1 to Version 3.0 would have had the following effects on the measures based on our analyses of discharges between July 2009 and June 2012, if these changes had been applied for FY 2014. We note that these statistics are for illustrative purposes only, and we did not propose to revise the measure calculations for the FY 2014 payment determination. Rather, we proposed to apply these changes to the readmission measures for the FY 2015 payment determination and subsequent years.

Among hospitals that were subject to the Hospital Readmissions Reduction Program in FY 2014, the numbers of eligible discharges based on the July 2009 through June 2012 data were 494,121 discharges for AMI; 1,165,606 discharges for HF; 954,033 discharges for PN; 926,433 discharges for COPD; and 858,266 discharges for hip/knee (as shown in Table IV.H.1. below).

The proposed 30-day readmission rate (excluding the planned readmissions) would remain constant for AMI and COPD; increase by 0.1 percentage points for HF and PN; and increase by 0.04 percentage points for hip/knee.

The new national readmission (unplanned) rate for each condition would have been 17.9 percent for AMI; 23.0 percent for HF; 17.7 percent for PN; 21.1 percent for COPD; and 5.27 percent for hip/knee.

The number of readmissions considered planned (and, therefore, not counted as a readmission) would decrease by 319 for AMI; 1,313 for HF; 866 for PN, 547 for COPD; and 298 for hip/knee.

The proposed modification of the hip/knee measure cohort would have had the following effects on the measure: The measure cohort would have been reduced by 0.37 percent; the crude readmission rate would have been reduced by 0.02 absolute percentage points; and the mean RSRR would have been reduced by 0.03 absolute percentage points.

Table IV.H.1.—Comparison of Planned Readmission Algorithms V 2.1 and 3.0 for AMI/HF/PN/COPD/THA/TKA Readmission Measures

[Based on July 2009 through June 2012 discharges from 3,025 hospitals]

                                            AMI                      HF                       PN                      COPD                   THA/TKA
                                       V 3.0 / V 2.1           V 3.0 / V 2.1            V 3.0 / V 2.1           V 3.0 / V 2.1           V 3.0 / V 2.1
Number of Discharges                 494,121 / 494,121   1,165,606 / 1,165,606      954,033 / 954,033       926,433 / 926,433       858,266 / 858,266
Number of Unplanned Readmissions      88,567 / 88,248       268,072 / 266,759        169,213 / 168,347       195,595 / 195,048        45,205 / 44,907
Readmission Rate                       17.9% / 17.9%           23.0% / 22.9%            17.7% / 17.6%           21.1% / 21.1%           5.27% / 5.23%
Number of Planned Readmissions        11,577 / 11,896        15,293 / 16,606            5,867 / 6,733            5,858 / 6,405           2,283 / 2,581
Planned Readmission Rate                2.3% / 2.4%             1.3% / 1.4%              0.6% / 0.7%              0.6% / 0.7%             0.3% / 0.3%
% of Readmissions that are Planned     11.6% / 11.9%            5.4% / 5.9%              3.4% / 3.8%              2.9% / 3.2%             4.8% / 5.4%
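
The rates shown in Table IV.H.1. follow directly from the counts. The short Python check below reproduces, for the AMI column under Version 3.0, the unplanned readmission rate, the planned readmission rate, and the share of readmissions that are planned.

# Reproducing the AMI (Version 3.0) figures in Table IV.H.1 from the counts.
discharges = 494_121
unplanned = 88_567
planned = 11_577

readmission_rate = unplanned / discharges        # ~0.179 -> 17.9%
planned_rate = planned / discharges              # ~0.023 -> 2.3%
planned_share = planned / (planned + unplanned)  # ~0.116 -> 11.6%

print(f"{readmission_rate:.1%}, {planned_rate:.1%}, {planned_share:.1%}")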

5. No Expansion of the Applicable Conditions for FY 2016

In the FY 2014 IPPS/LTCH PPS final rule, we finalized for FY 2015 two new condition-specific readmission measures: (1) Hospital-level 30-day all-cause risk-standardized readmission rate following elective total hip arthroplasty (THA) and total knee arthroplasty (TKA) (NQF #1551); and (2) Hospital-level 30-day all-cause risk-standardized readmission rate following chronic obstructive pulmonary disease (COPD) (NQF #1891). This brought the total number of finalized applicable conditions to five over the past 2 years of implementation. We also noted in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50657) that commenters requested that we delay adding other condition-specific measures. In view of these requests and our belief that it is reasonable to allow more time for hospitals to become familiar with these five applicable conditions before adding others, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28108) we did not propose any new applicable conditions for FY 2016.

Comment: Several commenters encouraged CMS to strengthen the Hospital Readmissions Reduction Program through the inclusion of new measures for FY 2016 and FY 2017 so that the momentum of recent successes in reducing readmission rates is not lost. Other commenters not only supported CMS' decision not to expand measures in FY 2016 for the Hospital Readmissions Reduction Program, but also recommended that CMS delay expanding the program in FY 2017.

Response: We agree that it is important for the nation's hospitals to continue to be successful in reducing readmission rates and to use this momentum to implement other condition-specific readmission measures. However, over the past 2 years we have noticed a persistent dichotomy in stakeholder recommendations, in which some recommended expansion of the program with new measures each fiscal year and others recommended not expanding the program in FY 2016 and FY 2017.

In response to last year's proposed rule (78 FR 50657), stakeholders requested that they be given time to become familiar with the measures and the program. For this reason, we did not propose expanding the program in FY 2016. However, we proposed to expand the program in FY 2017 with the Hospital-Level, 30-Day, All-Cause, Unplanned Readmission Following Coronary Artery Bypass Graft (CABG) Surgery measure. We will continue to review condition-specific readmission rate performance gaps in conjunction with our Quality Improvement Strategy (available at: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/QualityInitiativesGenInfo/CMS-Quality-Strategy.html) and the availability of robust risk-adjusted readmission measures. As we indicated in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50657), we will continue to ensure that hospitals are aware of future proposed program expansion through press releases, open door forums, as well as through the Federal rulemaking process. We also continue to strive to ensure our measure selection process for the Hospital Readmissions Reduction Program is transparent and allows the public several opportunities to comment on measures being selected for the Hospital Readmissions Reduction Program.

Comment: Several commenters suggested expanding the Hospital Readmissions Reduction Program by adding the Society of Thoracic Surgeons' (STS) Risk-Adjusted Coronary Artery Bypass Graft (CABG) Readmission Rate measure (NQF #2514) in conjunction with CMS' Hospital 30-Day, All-Cause, Unplanned, Risk-Standardized Readmission Rate (RSRR) Following Coronary Artery Bypass Graft (CABG) Surgery measure (NQF #2515). These commenters believed that having the STS registry-based measure in addition to CMS' claims-based measure would help providers and patients fully understand CABG care. Other commenters recommended not only including the Hospital 30-Day Risk-Standardized Readmission Rates following Percutaneous Coronary Intervention (PCI) measure (NQF #0695), believing it would drive improvements in quality and patient outcomes while simultaneously realizing significant cost savings for Medicare, but also implementing that measure no later than FY 2017.

Response: We note that both the STS and the CMS Coronary Artery Bypass Graft (CABG) Readmission measures (NQF #2514 and #2515, respectively) were recommended for endorsement by NQF in May 2014, and a final decision on their endorsement status will be forthcoming in the third quarter of 2014. We note that the two measures are fully harmonized, despite using different data sources. The STS' Risk-Adjusted Coronary Artery Bypass Graft (CABG) Readmission Rate measure (NQF #2514) uses the STS National Database, while CMS' Hospital 30-Day, All-Cause, Unplanned, Risk-Standardized Readmission Rate (RSRR) following Coronary Artery Bypass Graft (CABG) Surgery measure (NQF #2515) uses administrative claims. We believe having two fully harmonized measures that use two different data sources provides the greatest flexibility for stakeholders to identify which measure best fits their current capabilities for data collection and submission.

We also note that we believe the use of the administrative claims-based measure would be less burdensome for participating hospitals in the Hospital Readmissions Reduction Program. Regarding the recommendation to expand the program in FY 2017 with the Hospital 30-Day Risk-Standardized Readmission Rates following Percutaneous Coronary Intervention (PCI) measure (NQF #0695), we note that this issue was addressed in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50657). We stated that the addition of measures for the other vascular and PCI conditions is not feasible for two reasons: (1) Inpatient admissions for PCI and other vascular conditions appear to be decreasing; and (2) hospitals are increasingly performing procedures relating to these conditions in outpatient departments. For these reasons, addition of these measures to the Hospital Readmissions Reduction Program is not practical.

After consideration of the public comments we received, we are finalizing our proposal not to expand the applicable conditions in the Hospital Readmissions Reduction Program in FY 2016.

6. Expansion of the Applicable Conditions for FY 2017 To Include the Patients Readmitted Following Coronary Artery Bypass Graft (CABG) Surgery Measure

a. Background

Under section 1886(q)(5)(B) of the Act, “[b]eginning with FY 2015, the Secretary shall, to the extent practicable, expand the applicable conditions beyond the 3 conditions for which measures have been endorsed as described in subparagraph (A)(ii)(I) . . . to the additional 4 conditions that have been identified by the Medicare Payment Advisory Commission [MedPAC] in its report to Congress in June 2007, and to other conditions and procedures as determined appropriate by the Secretary.” The four conditions and procedures recommended by MedPAC are: (1) Coronary artery bypass graft (CABG) surgery; (2) chronic obstructive pulmonary disease (COPD); (3) percutaneous coronary intervention (PCI); and (4) other vascular conditions. Section 1886(q)(5)(A)(i) of the Act directs the Secretary, in selecting an “applicable condition,” to choose from among readmissions “that represent conditions or procedures that are high volume or high expenditures under this title (or other criteria specified by the Secretary).”

In accordance with section 1886(q)(5)(A) of the Act, effective for the calculation of the readmissions payment adjustment factors in FY 2017, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28108 through 28111), we proposed to expand the scope of applicable conditions and procedures to include patients readmitted following CABG surgery. This proposal is consistent with the prior FY 2014 IPPS/LTCH PPS final rule (78 FR 50657) where we indicated our intent to explore quality measures that address CABG readmission rates. We describe this measure in detail below.

We proposed the inclusion of the condition of CABG readmissions to the Hospital Readmissions Reduction Program based on MedPAC's recommendations. For this condition, we developed a Hospital-Level 30-Day All-Cause Unplanned Readmission Following Coronary Artery Bypass Graft (CABG) Surgery measure. The National Quality Forum (NQF) Measure Applications Partnership (MAP) Hospital workgroup conditionally supported this measure for use in the Hospital Readmissions Reduction Program. The condition for support is based on attainment of NQF endorsement. On February 5, 2014, we submitted the Hospital-Level 30-Day All-Cause Unplanned Readmission Following Coronary Artery Bypass Graft (CABG) Surgery measure to NQF for endorsement.

The rationale for expanding the applicable conditions and the measures used to estimate the excess readmissions ratio is described in detail below.

b. Overview of the CABG Readmissions Measure: Hospital-Level, 30-Day, All-Cause, Unplanned Readmission Following Coronary Artery Bypass Graft (CABG) Surgery

Among the seven conditions MedPAC identified in its 2007 Report to Congress as having the highest potentially preventable readmission rate, CABG had the highest rate of readmissions within 15 days following discharge (13.5 percent) and the second highest average Medicare payment per readmission ($8,136). The annual cost to Medicare for potentially preventable CABG readmissions was estimated at $151 million.

Medicare Payment Advisory Commission. Report to the Congress: Promoting Greater Efficiency in Medicare, 2007.

Ibid.

Evidence also shows variation in readmission rates for patients with CABG surgery, supporting the finding that opportunities exist for improving care. The median, 30-day, risk-standardized readmission rate among Medicare fee-for-service patients aged 65 or older hospitalized for CABG in 2009 was 17.2 percent, and ranged from 13.9 percent to 22.1 percent across 1,160 hospitals. Although data documenting readmission reductions in CABG are limited, there are data that support CABG readmission as an important quality metric. Studying readmission rates after CABG surgery in New York, Hannan et al. found: (1) wide variation in readmission rates; (2) that the most common cause of readmission after CABG is a complication related to the surgery; and (3) that hospital-level variables such as use of cardiac rehabilitation and length of stay influenced readmission rates. The authors also noted that readmission rates were not closely correlated to mortality rates, and thus measuring readmission rates likely offers a complementary metric intended to assess a different domain of quality. Mortality measures are more likely to encourage improvements in clinical quality, including rapid triage, effective safety practices, and early intervention and coordination in the hospital. Readmission measures place an increased emphasis on aspects of quality related to effective transitions to the outpatient setting, clear communication with patients and caregivers, and collaboration across communities and providers. Together, these data suggest that reducing readmission rates following CABG surgery is an important target for quality improvement. In addition, inclusion of this measure in the Hospital Readmissions Reduction Program aligns with CMS' Quality Strategy objectives to promote successful transitions of care for patients from the acute care setting to the outpatient setting, and to reduce short-term readmission rates. In its final recommendations for rulemaking, the MAP conditionally supported the inclusion of the proposed CABG measure pending NQF endorsement and implementation. In order to address this concern, we submitted the CABG readmission measure to NQF for endorsement on February 5, 2014.

Suter L.G., Wang, C., Vellanky S., Potteiger J., Curtis J., Lin Z., Geary L.L., Krumholz H.M., Drye E.D. Hospital-Level 30-Day All-Cause Unplanned Readmission Following Coronary Artery Bypass Graft Surgery: Report prepared for the Centers for Medicare & Medicaid Services. 2012.

Rumsfield J, Allen L. Reducing Readmission Rates: Does Coronary Artery Bypass Graft Surgery Provide Clarity? JACC Cardiovasc Interv. May 2011;4(5):2.

Hannan EL, Zhong Y, Lahey SJ, et al. 30-day readmissions after coronary artery bypass graft surgery in New York State. JACC Cardiovasc Interv. 2011;4(5):569-576.

We believe the proposed Hospital-Level, 30-Day, All-Cause, Unplanned Readmission Following CABG Surgery measure warrants inclusion in the Hospital Readmissions Reduction Program for FY 2017 because it meets the criteria in section 1886(q)(5)(A) of the Act as a high-cost, high-volume condition and was recognized in MedPAC's 2007 Report to Congress as a specific condition on which to focus for improving readmission rates. As with other readmission measures, this measure also excludes unrelated readmissions such as planned readmissions and transfers to other hospitals. For these reasons, we believe this measure is appropriate for the Hospital Readmissions Reduction Program.

We invited public comments on this proposal.

Comment: Many commenters supported the use of the Hospital-Level, 30-Day All-Cause Unplanned Readmission Following Coronary Artery Bypass Graft Surgery (NQF #2515) measure unconditionally, while other commenters only supported the use of the measure if it was endorsed by the NQF. Some commenters supported the measure because it will add more conditions and procedures to the Hospital Readmissions Reduction Program. Other commenters supported the measure but encouraged CMS to implement the measure in FY 2016 instead of FY 2017 because the measure will be NQF-endorsed by FY 2016 and the MAP conditionally recommended the measure upon endorsement.

Response: We appreciate the commenters' support for the inclusion of the Hospital-Level, 30-Day All-Cause Unplanned Readmission Following Coronary Artery Bypass Graft Surgery measure (NQF #2515) in the Hospital Readmissions Reduction Program. We also appreciate the recommendation to expand the program by another condition-specific measure a year earlier than proposed. We note that, in last year's final rule, we stated we would allow the stakeholders time to become familiar with the current finalized measures, and for this reason, we proposed to implement the measure in FY 2017 rather than FY 2016.

Finally, on May 5-6, 2014, both the STS Risk-Adjusted Coronary Artery Bypass Graft (CABG) Readmission Rate measure (NQF #2514) and the CMS Hospital 30-Day, All-Cause, Unplanned, Risk-Standardized Readmission Rate (RSRR) Following Coronary Artery Bypass Graft (CABG) Surgery measure (NQF #2515) were recommended for endorsement by the NQF (Draft Report for Commenting at: http://www.qualityforum.org/ProjectMaterials.aspx?projectID=73619).

Comment: Many commenters did not support the use of the CMS Hospital 30-Day, All-Cause, Unplanned, Risk-Standardized Readmission Rate (RSRR) Following Coronary Artery Bypass Graft (CABG) Surgery measure (NQF #2515) for the Hospital Readmissions Reduction Program. Some of the reasons commenters gave for not supporting the CABG readmission measure (NQF #2515) included:

  • Not being given enough time to establish their quality improvement programs before having to incorporate additional medical conditions into those programs. These commenters indicated that expansion of the Hospital Readmissions Reduction Program through additional conditions (that is, readmission measures) and penalties while these hospital programs are being established will place additional strain on hospitals before they are given a chance to succeed in reducing their readmission rates;
  • Not being given time to become familiar with the CABG readmission measure (NQF #2515) through the Hospital IQR Program;
  • CMS not addressing Hospital Readmissions Reduction Program policies related to a lack of risk-adjustment for SES and excessive payment penalties for a single readmission;
  • Belief that there is the potential negative consequence of unfairly targeting hospitals that do perform CABG surgical procedures, when CABG is not a universally performed procedure; and
  • Belief that there is the potential negative unintended consequence of reducing access for high-risk, older patients to CABG procedures due to their increased potential for complications and readmissions. This commenter asked that CMS monitor CABG utilization in high-risk, older patients to ensure these patients are offered medically indicated care.

Finally, one commenter did not support the CABG readmission measure (NQF #2515) until concerns over the limitations of the readmissions exclusions, risk adjustment, and access to information on hospital performance on the readmission measures were resolved.

Response: We acknowledge that there is a balance between allowing time for stakeholders to initiate and establish programs to improve readmission rates and expanding the Hospital Readmissions Reduction Program to narrow the performance gaps noted throughout the nation with various medical conditions. We take into account many factors when we decide how and when to expand the readmission measure set, and believe that addition of the CABG readmission measure (NQF #2515) for FY 2017 is reasonable, especially considering that we had signaled in the FY 2014 IPPS/LTCH PPS proposed rule (78 FR 27597) that we were considering how to expand the Hospital Readmissions Reduction Program based on the recommendations in MedPAC's June 2007 report (available at: http://www.medpac.gov/documents/jun07_entirereport.pdf), which included CABG surgical procedures. We understand that hospitals prefer time to become familiar with new measures and, for this reason, we posted the CABG readmission measure (NQF #2515) methodology reports in April 2014 on the CMS Web site (http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/HospitalQualityInits/Measure-Methodology.html), and alerted the public to these reports in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28109). We also have intermittently performed dry runs for certain measures and may consider plans to have a dry run for the CABG readmission measure (NQF #2515) in order to allow hospitals and other stakeholders to become more familiar with the measure.

We also provide the opportunity for hospitals to review and correct their readmissions data relating to these measures prior to their release to the public on the Hospital Compare Web site. We anticipate the review and correction period to be in late July 2014. Because we have instituted a sequential pattern of implementing a readmission measure in the Hospital IQR Program before implementing it in the Hospital Readmissions Reduction Program, we believe that stakeholders have sufficient time to become familiar with this measure because it will not be implemented until FY 2017. We also note suggestions we received from some commenters that we take advantage of the readmissions improvement momentum, as evidenced by nationwide reductions in the rate of hospital readmissions, by expanding the Hospital Readmissions Reduction Program measure set beginning in FY 2016 instead of FY 2017. We will continue to take into consideration comments regarding expansion of the Hospital Readmissions Reduction Program during future deliberations on when to expand the readmission measure set.

Regarding the concern about a lack of SES risk adjustment in the CABG readmission measure (NQF #2515), we refer readers to section IV.H.4. of the preamble of this final rule for our discussion of SES.

We note the commenter's view that CMS imposes excessive payment penalties for a single readmission, and we recognize that not all hospitals perform CABG procedures. However, we also note that, in the January 2009-September 2011 Medicare FFS data, there were over 150,000 CABG procedures eligible for inclusion in this measure, and that hospital-level risk-standardized readmission rates after CABG surgical procedures ranged broadly, from 12.0 percent to 23.1 percent. We also note that MedPAC's June 2007 report (available at: http://www.medpac.gov/documents/jun07_entirereport.pdf) found that CABG had the highest potentially preventable readmission rate within 15 days following discharge (13.5 percent) and the second highest average Medicare payment per readmission ($8,136). For these reasons, and because of the physical and emotional burden of readmissions on patients themselves, we seek to ensure that potentially preventable readmissions following these common and costly procedures are adequately monitored and that hospitals are provided with performance data to allow quality improvement.

Suter L.G., Wang, C., Vellanky S., Potteiger J., Curtis J., Lin Z., Geary L.L., Krumholz H.M., Drye E.D. Hospital-level 30-day All-Cause Unplanned Readmission Following Coronary Artery Bypass Graft Surgery: Report prepared for the Centers for Medicare & Medicaid Services. 2012.

Finally, regarding the concern for a potential negative unintended consequence of reducing access for high-risk, older patients to CABG procedures due to such patients' increased potential for complications and readmissions, we note that the readmission measures take into account the care of older patients in the risk-adjustment model in order not to create disincentives to care for older patients. We also note that the goal of the readmissions measures is not to have readmission rates of zero, but rather to evaluate hospitals relative to hospitals with similar patients for excess readmissions. We will consider ways to monitor for this unintended consequence as well.

Comment: Many commenters stated that the CMS Hospital 30-Day, All-Cause, Unplanned, Risk-Standardized Readmission Rate (RSRR) Following Coronary Artery Bypass Graft (CABG) Surgery measure (NQF #2515) is unreliable due to a small number of CABG surgeries performed during the measurement period. One commenter suggested that hospitals may be unfairly penalized because of variation in readmission rates that results from a small number of cases during the measurement period.

Response: We appreciate the commenters' feedback, and note that reliability is related to sample size. We do not agree that the CABG readmission measure is unreliable. We note that the same statistical approach to reliability that is used for the CMS Hospital 30-Day, All-Cause, Unplanned, Risk-Standardized Readmission Rate (RSRR) Following Coronary Artery Bypass Graft (CABG) Surgery measure (NQF #2515) is used and established for all other CMS NQF-approved, hospital risk-adjusted outcome measures, including the mortality and readmissions measures. We adopted a risk adjustment modeling methodology for our outcome measures that takes into account sample size.

We note that this issue was raised and responded to in part in the FY 2013 and FY 2014 IPPS/LTCH PPS final rules (77 FR 53379 and 78 FR 50659, respectively) in our discussion of the readmission measures for the Hospital Readmissions Reduction Program. In the former rule, we stated that “[w]e determined the 25-case threshold for public reporting based on a reliability statistic that is calculated from the intercluster correlation, a parameter of the model [we refer readers to pages 14 through 17 of the document “PulmonaryAdditionalComment.pdf” which can be retrieved at: https://www.google.com/url?sa=t&rct=j&q=&esrc=s&source=web&cd=1&ved=0CB0QFjAA&url=https%3A%2F%2Fwww.qualityforum.org%2FWorkArea%2Flinkit.aspx%3FLinkIdentifier%3Did%26ItemID%3D71385&ei=CRDYU4D6BYPhsASm4oDACA&usg=AFQjCNEvsLpiX23smKpBINVSv-91IAsXGA&sig2=UMJeMe1LdVq3lP69ks-1Hg&bvm=bv.71778758,d.cWc&cad=rja]. We are maintaining the minimum 25-case threshold that we adopted through rulemaking last year.”

We acknowledge that smaller hospitals typically have less certain estimates because they have fewer cases for use in assessing quality. Our approach to modeling addresses the concern that small hospitals will be penalized due to random variation, and this challenge is inherent in outcome measurements. However, one advantage of the statistical model used for the CMS outcome measures is that it allows for the inclusion of small hospitals while characterizing the certainty of their estimates. The hierarchical logistic regression model that we use to calculate the risk-standardized outcome measures allows the inclusion of hospitals with relatively few observations, but takes into account the uncertainty associated with sample size in estimating their risk-standardized outcome rates. The model takes into account the uncertainty in the estimate of outcome rates for low-volume hospitals by assuming that each hospital is a typically performing hospital. It weighs that assumption along with the outcomes for the particular hospital in calculating the outcome rate. Therefore, the estimated outcome rates for smaller hospitals will likely be closer to the national rate because the limited number of eligible cases in the hospital indicated little about that hospital's true outcome rate.
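
The stabilizing effect described above can be illustrated with a deliberately simplified, empirical-Bayes-style calculation. This is not the CMS hierarchical logistic regression specification, only a sketch of how a low-volume hospital's estimate is pulled toward the national rate in proportion to how little information its case count provides; the national rate and the tuning constant below are hypothetical.

# Simplified illustration of shrinkage toward the national rate; the actual
# measures use hierarchical logistic regression as described in the
# methodology reports, not this formula.

def shrunken_rate(observed_rate, n_cases, national_rate, prior_strength=25.0):
    """Weight a hospital's observed rate against the national rate.

    prior_strength is a hypothetical tuning constant: the smaller the hospital's
    case count relative to it, the closer the estimate sits to the national rate.
    """
    weight = n_cases / (n_cases + prior_strength)
    return weight * observed_rate + (1.0 - weight) * national_rate


national = 0.155  # hypothetical national unplanned readmission rate
# A 10-case hospital with 3 readmissions (30% observed) lands near the national
# rate; a 400-case hospital with the same observed rate barely moves.
print(shrunken_rate(0.30, 10, national))   # ~0.196
print(shrunken_rate(0.30, 400, national))  # ~0.291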

c. Methodology for the CABG Measure: Hospital-Level, 30-Day, All-Cause, Unplanned Readmission Following Coronary Artery Bypass Graft (CABG) Surgery

The proposed CABG readmission measure assesses hospitals' 30-day, all-cause risk-standardized rate of unplanned readmission following admission for a CABG procedure. In general, the measure uses the same approach to risk-adjustment and hierarchical logistic modeling (HLM) methodology that is specified for the AMI, HF, PN, COPD, and THA/TKA readmission measures that we previously adopted for this program. Information on how the measure employs HLM can be found in the 2012 CABG Readmission Measure Methodology Report (available at: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/HospitalQualityInits/Measure-Methodology.html). This approach appropriately accounts for the types of patients a hospital treats (that is, hospital case-mix), the number of patients it treats, and the quality of care it provides. The HLM methodology is an appropriate statistical approach to measuring quality based on patient outcomes when the patients are clustered within hospitals (and, therefore, the patients' outcomes are not statistically independent) and sample sizes vary across hospitals. The measure methodology defines hospital case-mix based on the clinical diagnoses in the hospitals' claims for their patients' inpatient and outpatient visits during the 12 months prior to the hospitalization for CABG, as well as those present in the claims for care at admission. However, the methodology specifically does not account for diagnoses present in the index admission that may indicate complications rather than patient comorbidities.
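
At a high level, risk standardization of this kind compares the readmissions predicted for a hospital given its own case mix and its estimated hospital-specific effect with the readmissions expected if the same patients were treated at an average hospital. The Python sketch below illustrates that ratio-based standardization in simplified form; it is not the measure's exact computation, which is specified in the methodology report cited above, and the input values are hypothetical.

# Simplified illustration of ratio-based risk standardization; the measure's
# exact computation is specified in the CABG Readmission Measure Methodology
# Report referenced above.

def risk_standardized_rate(predicted, expected, national_rate):
    """Scale the national rate by the hospital's predicted-to-expected ratio.

    predicted: model-based readmission count using the hospital's own effect.
    expected:  model-based count for the same patients at an average hospital.
    """
    return (predicted / expected) * national_rate


# A hospital predicted to have 30 readmissions where 25 would be expected for
# its case mix is standardized above a hypothetical 15.5% national rate.
print(risk_standardized_rate(predicted=30.0, expected=25.0, national_rate=0.155))  # 0.186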

Comment: One commenter supported the minimum case size and believed that hospitals that are included in this measure will far exceed the minimum case volume, which will result in better measurement of a predicted readmission rate.

Response: We appreciate the commenter's support.

Comment: One commenter believed that, for the CABG measure, there should be areas for accountability on both the index and discharge hospitals. For example, if the discharge hospital does not perform accurate medication reconciliation, an error resulting in readmission should not reasonably be attributed to the index hospital.

Response: We acknowledge that, unlike our other readmission measures in the Hospital Readmissions Reduction Program, the CABG readmission measure (NQF #2515) attributes the readmission outcome to the hospital that performed the initial CABG procedure, even if that hospital was not responsible for discharging the patient home or to a postacute setting for care. We took this approach for the CABG readmission measure (NQF #2515) because, unlike for medical conditions, transfer to another acute care facility following CABG surgery is most likely due to a complication of the initial CABG procedure or the peri-operative care the patient received. Therefore, the care provided by the hospital performing the CABG procedure likely dominates readmission risk, even among transferred patients. We believe that the transferring hospital has control over the hospital to which it transfers its CABG patients and will be encouraged by this measure to work closely with those institutions to provide optimal continuity of care for its patients. We note that this approach is supported by the high proportion of CABG readmissions for diagnoses such as heart failure, pleural effusion, and pneumonia and is endorsed by clinical experts from the Society of Thoracic Surgeons and the nationally convened Technical Expert Panel members who helped develop this measure.

Hannan EL, Racz MJ, Walford G, Ryan TJ, Isom OW, Bennett E, Jones RH. Predictors of Readmission for Complications of Coronary Artery Bypass Graft Surgery. JAMA. 2003;290:773-780.

Suter L.G., Wang, C., Vellanky S., Potteiger J., Curtis J., Lin Z., Geary L.L., Krumholz H.M., Drye E.D. Hospital-level 30-day All-Cause Unplanned Readmission Following Coronary Artery Bypass Graft Surgery: Report prepared for the Centers for Medicare & Medicaid Services. 2012.

We discuss the measure methodology below.

(1) Data Sources

The proposed CABG readmission measure is based on data derived from administrative claims. It uses Medicare administrative data from hospitalizations for Medicare FFS beneficiaries hospitalized for a CABG procedure.

(2) Definition of Outcome

The proposed CABG readmission measure defines 30-day, all-cause readmission as an unplanned subsequent inpatient admission to any applicable acute care facility, for any cause, within 30 days of the date of discharge from the index hospitalization. A number of studies demonstrate that improvements in care at the time of discharge can reduce 30-day readmission rates. Thirty days is a meaningful timeframe for hospitals because readmissions during this period are more likely attributable to the care received during the index hospitalization and the transition to the outpatient setting.

Sharma G, Kuo YF, Freeman JL, Zhang DD, Goodwin JS. Outpatient Follow-up Visit and 30-Day Emergency Department Visit and Readmission in Patients Hospitalized for Chronic Obstructive Pulmonary Disease. Arch Intern Med. 2010;170:1664-1670.

Nelson EA, Maruish ME, Axler JL. Effects of Discharge Planning and Compliance with Outpatient Appointments on Readmission Rates. Psychiatr Serv. 2000;51(7):885-889.
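
The 30-day outcome window described above amounts to a simple comparison between the index discharge date and the subsequent admission date. The Python sketch below illustrates the check with hypothetical dates; edge-case handling in the actual measure follows its specification.

from datetime import date

# Illustrative check of the 30-day window: a readmission counts toward the
# outcome when it occurs within 30 days of the index discharge date.

def within_30_days(index_discharge, readmission_admit):
    days = (readmission_admit - index_discharge).days
    return 0 <= days <= 30


print(within_30_days(date(2014, 3, 1), date(2014, 3, 28)))  # True  (day 27)
print(within_30_days(date(2014, 3, 1), date(2014, 4, 5)))   # False (day 35)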

The proposed CABG readmission measure assesses all-cause unplanned readmissions (excluding planned readmissions) rather than readmissions for CABG only. We include all unplanned readmissions for several reasons. First, from the patient perspective, a readmission for any reason is likely to be an undesirable outcome of care, even though not all readmissions are preventable. Second, limiting the measure to CABG-related readmissions may focus quality improvement efforts too narrowly rather than encouraging broader initiatives aimed at improving the overall care within the hospital and care transitions from the hospital setting. Moreover, it is often hard to exclude quality issues and accountability for a readmission based on the documented cause of readmission. For example, a patient who underwent a CABG surgery and developed a hospital associated infection might ultimately be readmitted for sepsis. It would be inappropriate to consider such a readmission to be unrelated to the care the patient received for their CABG surgery. Finally, while the measure does not presume that each readmission is preventable, quality improvement interventions generally have shown reductions in all types of readmissions.

The proposed measure does not count planned readmissions as readmissions. Planned readmissions are identified in claims data using the CMS Planned Readmission Algorithm Version 3.0 that detects planned readmissions that may occur within 30 days of discharge from the hospital. Version 2.1 of the algorithm was finalized for use in the Hospital Readmissions Reduction Program in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50651 through 50655). We have since updated the algorithm to Version 3.0 as part of yearly measure maintenance. The proposed CABG readmission measure uses the planned readmission algorithm, tailored for CABG patients. We adapted the algorithm for this group of patients with input from cardiothoracic surgeons and other experts, narrowing the types of readmissions considered planned because planned readmissions following CABG are less common and less varied than among patients discharged from the hospital following a medical admission. More detailed information on how the proposed CABG readmission measure incorporates the CMS Planned Readmission Algorithm Version 3.0 can be found in the 2012 CABG Readmission Measure Methodology Report on the CMS Web site (available at: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/HospitalQualityInits/Measure-Methodology.html). For the proposed CABG readmission measure, unplanned readmissions that fall within the 30-day post-discharge timeframe from the index admission would not be counted as readmissions for the index admission if they were preceded by a planned readmission.
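
One consequence of the rule stated above is that the outcome depends on the first readmission in the 30-day window: if that readmission is planned, a later unplanned readmission in the window is not attributed to the index admission. A minimal sketch of that sequencing logic follows; the data structure is hypothetical, and the authoritative rules are in the measure methodology report.

# Illustrative sequencing logic only; the authoritative rules are in the
# CABG Readmission Measure Methodology Report.

def counts_as_unplanned_readmission(readmissions):
    """readmissions: list of (days_after_discharge, is_planned) tuples.

    The index admission is assigned an unplanned-readmission outcome only when
    the first readmission within 30 days is unplanned; if the first readmission
    in the window is planned, subsequent admissions are not counted against it.
    """
    in_window = sorted(r for r in readmissions if 0 <= r[0] <= 30)
    if not in_window:
        return False
    _, first_is_planned = in_window[0]
    return not first_is_planned


# A planned readmission on day 10 followed by an unplanned one on day 20
# does not count as an unplanned readmission for the index admission.
print(counts_as_unplanned_readmission([(10, True), (20, False)]))  # False
print(counts_as_unplanned_readmission([(12, False)]))              # True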

(3) Cohort of Patients

In order to include a clinically coherent set of patients in the measure, we sought input from clinical experts regarding the inclusion of other concomitant cardiac and noncardiac procedures, such as valve replacement and carotid endarterectomy. Adverse clinical outcomes following such procedures are higher than those following “isolated” CABG procedures; that is, CABG procedures performed without concomitant high-risk cardiac and noncardiac procedures. Limiting the measure cohort to “isolated” CABG patients is consistent with published reports of CABG outcomes. Therefore, the proposed measure cohort considers only patients undergoing isolated CABG as eligible for inclusion in the measure. We defined isolated CABG patients as those undergoing CABG procedures without concomitant valve or other major cardiac, vascular or thoracic procedures. In addition, our clinical experts, consultants, and Technical Expert Panel (TEP) members agreed that an isolated CABG cohort is a clinically coherent cohort suitable for a risk-adjusted outcome measure. For detailed information on the cohort definition, we refer readers to the 2012 CABG Readmission Measure Methodology Report on the CMS Web site (available at: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/HospitalQualityInits/Measure-Methodology.html).

Hannan EL, Zhong Y, Lahey SJ, et al. 30-day readmissions after coronary artery bypass graft surgery in New York State. JACC Cardiovasc Interv. 2011;4(5):569-576.

Comment: One commenter supported focusing on isolated CABG cases for the measure because it reflects a much more cohesive clinical population.

Response: We appreciate the commenter's support.

(4) Inclusion and Exclusion Criteria

The proposed CABG readmission measure includes hospitalizations for patients who are 65 years of age or older at the time of index admission and for whom there was a complete 12 months of Medicare FFS enrollment to allow for adequate data for risk adjustment. The measure excludes the following admissions from the measure cohort: (1) Admissions for patients who are discharged against medical advice (excluded because providers do not have the opportunity to deliver full care and prepare the patient for discharge); (2) admissions for patients who die during the initial hospitalization (these patients are not eligible for readmission); (3) admissions for patients with subsequent qualifying CABG procedures during the measurement period (a repeat CABG procedure during the measurement period very likely represents a complication of the original CABG procedure and is a clinically more complex and higher risk surgery; therefore, we select the first CABG admission for inclusion in the measure and exclude subsequent CABG admissions from the cohort); and (4) admissions for patients without at least 30 days post-discharge enrollment in Medicare FFS (excluded because the 30-day readmission outcome cannot be assessed in this group).
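
The cohort logic described in this paragraph can be summarized as a sequence of filters. The sketch below applies the age, enrollment, and exclusion criteria in simplified form; the field names are hypothetical, and the authoritative inclusion and exclusion rules are those in the measure methodology report.

# Simplified cohort filter; field names are hypothetical placeholders.

def in_cabg_measure_cohort(adm):
    if adm["age_at_admission"] < 65:
        return False                      # include patients 65 or older
    if not adm["ffs_enrolled_prior_12_months"]:
        return False                      # need 12 months of FFS data for risk adjustment
    if adm["discharged_against_medical_advice"]:
        return False                      # exclusion (1)
    if adm["died_during_hospitalization"]:
        return False                      # exclusion (2)
    if adm["prior_qualifying_cabg_in_period"]:
        return False                      # exclusion (3): only the first CABG admission is used
    if not adm["ffs_enrolled_30_days_post_discharge"]:
        return False                      # exclusion (4): outcome cannot be assessed
    return True


example = {
    "age_at_admission": 72,
    "ffs_enrolled_prior_12_months": True,
    "discharged_against_medical_advice": False,
    "died_during_hospitalization": False,
    "prior_qualifying_cabg_in_period": False,
    "ffs_enrolled_30_days_post_discharge": True,
}
print(in_cabg_measure_cohort(example))  # True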

Comment: One commenter did not support the CABG measure because it does not exclude readmissions unrelated to the initial reason for admission.

Response: We addressed a similar comment in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50654). We continue to review and revise the area of unrelated readmissions through our expansion of planned readmissions. Regarding other types of unrelated readmissions, we currently do not seek to differentiate between related and unrelated readmissions because readmissions not directly related to the index condition may still be a result of the care received during the index hospitalization. For example, a patient hospitalized for CABG who develops a hospital associated infection may ultimately be readmitted for sepsis. It would be inappropriate to treat this readmission as unrelated to the care the patient received during the index hospitalization. Furthermore, the range of potentially avoidable readmissions also includes those not directly related to the initial hospitalization, such as those resulting from poor communication at discharge or inadequate follow-up. Therefore, we believe that creating a comprehensive list of potential complications related to the index hospitalization would be arbitrary, incomplete, and, ultimately, extremely difficult to implement. However, in coordination with medical experts, we created a planned readmission algorithm to determine conditions considered planned. Generally, planned readmissions are not a signal of quality of care. Therefore, we have worked with experts in the medical community, as well as other stakeholders, to carefully identify procedures and treatments that should be considered “planned” and, therefore, not counted as readmissions.

(5) Transferred Patients and Attribution of Readmission Outcome

Among medical conditions, such as AMI, heart failure, and pneumonia, transfers between acute care facilities can occur for a variety of reasons, and it is likely that the discharging hospital has the most influence over a patient's risk of readmission; therefore, the readmission outcome is appropriately assigned to the hospital that discharges the patient. For that reason, the currently publicly reported AMI, HF, and PN readmission measures attribute the readmission outcome to the hospital discharging the patient, even if that is not the hospital that initially admitted the patient.

In contrast, a transfer to another acute care facility following CABG surgery is most likely due to a complication of the CABG procedure or of the peri-operative care the patient received. Therefore, the care provided by the hospital performing the CABG procedure likely dominates readmission risk, even among transferred patients. This viewpoint is supported by the high proportion of CABG readmissions for diagnoses such as heart failure, pleural effusion, and pneumonia, and was endorsed by the clinical experts on the YNHHSC/CORE and STS CABG readmission measure development working groups and by our TEP. Therefore, for this measure, the readmission outcome is attributed to the hospital performing the first (“index”) CABG, even if this is not the discharging hospital. For example, a patient may be admitted to hospital A for a CABG that qualifies the patient for inclusion in the measure and then be transferred to hospital B. The initial admission to hospital A and the admission to hospital B are considered one acute episode of care, made up of two inpatient admissions. The measure identifies transferred patients as those who are admitted to an acute care hospital on the same day as, or the day following, discharge from an eligible admission.
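
The transfer rule described above can be illustrated with a short sketch. This is a minimal illustration under assumed inputs; the function name and dates are hypothetical, not CMS specifications.

    from datetime import date

    def is_transfer(first_discharge: date, next_admission: date) -> bool:
        """Flag a transfer per the rule above: the patient is admitted to an
        acute care hospital on the same day as, or the day following,
        discharge from an eligible admission."""
        return 0 <= (next_admission - first_discharge).days <= 1

    # Discharged from hospital A on June 1 and admitted to hospital B on
    # June 2: the two stays form one acute episode of care, and the
    # readmission outcome is attributed to the index (CABG) hospital A.
    print(is_transfer(date(2012, 6, 1), date(2012, 6, 2)))  # True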

Comment: One commenter supported attributing the readmission following a CABG procedure to the hospital performing the first CABG procedure.

Response: We appreciate the commenter's support.

(6) Risk-Adjustment

The proposed CABG readmission measure adjusts for differences across hospitals in the level of risk their patients have for readmission relative to patients cared for by other hospitals. The measure uses administrative claims data to identify patient clinical conditions and comorbidities to adjust patient risk for readmission across hospitals, but does not adjust for potential complications of care. We refer readers to section IV.H.4 of the preamble of this final rule for further discussion of risk-adjustment for socioeconomic factors.

Comment: One commenter was concerned with the CABG readmission measure's predictive ability, but the commenter did not provide additional details of its concern.

Response: We believe the commenter's primary concern is with the c-statistic of the measure, and we would like to clarify the important difference between predictive models intended for patient-level risk-stratification and models used to profile hospital performance. First, in a patient-level predictive model, the objective is to predict patient outcomes, and the risk-adjustment variables are selected as a means to best predict those outcomes. As an example, a patient who has a serious complication of care may be at higher risk of mortality and readmission, and therefore complications might be useful to include in a model used for patient-level prediction. Second, and in contrast, the role of risk-adjustment in hospital profiling models is to level the playing field for hospitals in measures that assess hospitals on their relative performance—that is, on how well a hospital is doing compared to other hospitals with similar patients. The risk-adjustment variables should only include those that are inherent to the patient and are present at the start of the time period. Although risk-adjusting for complications of care could increase the statistical power of a profiling model, it would not make sense to do so because it could lead hospitals with high rates of complications to appear to be performing better than hospitals that admitted similar patients, even though their quality of care is worse. We note that, in addition to this clarification, the CABG readmission measure (NQF #2515) risk model has been validated using registry data from the STS' Adult Cardiac Surgery Database and produced nearly identical c-statistics in a matched set of patients, with correlation coefficients between 0.92 and 0.96, depending upon the statistic used.

Suter L.G., Wang C., Vellanky S., Potteiger J., Curtis J., Lin Z., Geary L.L., Krumholz H.M., Drye E.D. Hospital-level 30-day All-Cause Unplanned Readmission Following Coronary Artery Bypass Graft Surgery: Report prepared for the Centers for Medicare & Medicaid Services. 2012.

Comment: One commenter encouraged CMS to ensure measures risk-adjust for comorbidities and preexisting conditions for vascular patients as these are major determinants of patient outcomes.

Response: We agree with the commenter that vascular comorbidities and preexisting conditions for vascular patients are important determinants of CABG patient outcomes. The CABG readmission measure adjusts for a range of preexisting comorbidities, including vascular and circulatory conditions, stroke and cerebrovascular disease, and other cardiac disorders such as congestive heart failure and arrhythmias, as well as comorbidities that place patients at risk for these conditions, such as diabetes and end-stage renal disease.

Suter L.G., Wang C., Vellanky S., Potteiger J., Curtis J., Lin Z., Geary L.L., Krumholz H.M., Drye E.D. Hospital-level 30-day All-Cause Unplanned Readmission Following Coronary Artery Bypass Graft Surgery: Report prepared for the Centers for Medicare & Medicaid Services. 2012.

(7) Calculating the Excess Readmissions Ratio

The proposed CABG readmission measure uses the same methodology and statistical modeling approach as the other Hospital Readmissions Reduction Program measures. We published a detailed description of how the readmission measures estimate the excess readmissions ratio in the FY 2013 IPPS/LTCH PPS final rule (77 FR 53380 through 53381).

In summary, we proposed to adopt the Hospital-Level, 30-Day, All-Cause, Unplanned Readmission Following Coronary Artery Bypass Graft (CABG) Surgery measure in the Hospital Readmissions Reduction Program beginning in FY 2017.

We note that the set of hospitals for which this measure is calculated for the Hospital Readmissions Reduction Program differs from the set of hospitals used in calculations for the Hospital IQR Program. The Hospital Readmissions Reduction Program includes only subsection (d) hospitals as defined in section 1886(d)(1)(B) of the Act, while the Hospital IQR Program calculations include non-IPPS hospitals such as CAHs, cancer hospitals, and hospitals located in the Territories of the United States. However, we believe that the CABG readmissions measure is appropriate for use in both programs.

After consideration of the public comments we received, we are finalizing our proposal to adopt the Hospital-Level, 30-Day, All-Cause, Unplanned Readmission Following CABG Surgery measure for inclusion in the Hospital Readmissions Reduction Program for FY 2017.

7. Maintenance of Technical Specifications for Quality Measures

Technical specifications of the readmission measures are provided on our Web site in the Measure Methodology Reports (available at: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/HospitalQualityInits/Measure-Methodology.html). Additional resources about the Hospital Readmissions Reduction Program and measure technical specifications and methodology are on the QualityNet Web site on the Resources Web page (available at: https://www.qualitynet.org/dcs/ContentServer?c=Page&pagename=QnetPublic%2FPage%2FQnetTier3&cid=1228772412995).

Many of the quality measures used in different Medicare and Medicaid reporting programs are NQF endorsed. As part of its regular maintenance process for NQF-endorsed performance measures, the NQF requires measure stewards to submit annual measure maintenance updates and undergo maintenance of endorsement review every 3 years. In the measure maintenance process, the measure steward (owner/developer) is responsible for updating and maintaining the currency and relevance of the measure and will confirm existing or minor specification changes with NQF on an annual basis. NQF solicits information from measure stewards for annual reviews, and it reviews measures for continued endorsement in a specific 3-year cycle.

We note that NQF's annual or triennial maintenance processes for endorsed measures may result in the NQF requiring updates to the measures. We believe that it is important to have in place a subregulatory process to incorporate nonsubstantive updates required by the NQF into the measure specifications we have adopted for the Hospital Readmissions Reduction Program so that these measures remain up-to-date. We note that, for this calendar year, the AMI readmission measure is undergoing the NQF maintenance of endorsement process.

For the Hospital Readmissions Reduction Program, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28111), we proposed to follow the finalized processes outlined for addressing changes to adopted measures in the Hospital IQR Program “Maintenance of Technical Specifications for Quality Measures” section found in section IX.A.1.b. of the preamble of the proposed rule.

We believe this proposal adequately balances our need to incorporate NQF updates to NQF-endorsed Hospital Readmissions Reduction Program measures in the most expeditious manner possible while preserving the public's ability to comment on updates that so fundamentally change an endorsed measure that it is no longer the same measure that we originally adopted. We invited public comment on this proposal.

Comment: One commenter commended the proposal to follow the finalized processes outlined for addressing changes to adopted measures in the Hospital IQR Program “Maintenance of Technical Specifications for Quality Measures” section found in section IX.A.1.b. of the preamble of the proposed rule (79 FR 28218). The commenter noted that this policy for handling substantive and nonsubstantive changes that arise through measure maintenance processes allows CMS two mechanisms to address measure updates: (1) the use of future proposed rules and review periods for substantive changes; and (2) subregulatory processes for nonsubstantive changes, which also preserve CMS' autonomy and flexibility to rapidly implement nonsubstantive updates to measures.

No commenters opposed or recommended changes to the proposal.

Response: We appreciate the commenter's support.

Comment: One commenter indicated that any change to a measure that was developed for adults but now includes patients under 18 years of age should not be considered nonsubstantive.

Response: We appreciate this comment and note that this concern was addressed in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50776).

After consideration of the public comments we received, we are finalizing the proposed maintenance of technical specifications for quality measures for the Hospital Readmissions Reduction Program.

8. Waiver From the Hospital Readmissions Reduction Program for Hospitals Formerly Paid Under Section 1814(b)(3) of the Act (§ 412.152 and § 412.154(d))

The definition of “applicable hospital” under section 1886(q)(5)(C) of the Act also includes hospitals paid under section 1814(b)(3) of the Act. Section 1886(q)(2)(B)(ii) of the Act, however, allows the Secretary to exempt such hospitals from the Hospital Readmissions Reduction Program, provided that the State submit an annual report to the Secretary describing how a similar program to reduce hospital readmissions in that State achieves or surpasses the measured results in terms of health outcomes and cost savings established by Congress for the program as applied to “subsection (d) hospitals.”

The State of Maryland entered into an agreement with CMS, effective January 1, 2014, to participate in CMS' new Maryland All-Payer Model, a 5-year hospital payment model. This model is being implemented under section 1115A of the Act, as added by section 3021 of the Affordable Care Act, which authorizes the testing of innovative payment and service delivery models, including models that allow States to “test and evaluate systems of all-payer payment reform for the medical care of residents of the State, including dual-eligible individuals.” Section 1115A of the Act authorizes the Secretary to waive such requirements of titles XI and XVIII of the Act as may be necessary solely for purposes of carrying out section 1115A of the Act with respect to testing models.

As part of this agreement, Medicare will no longer pay Maryland hospitals in accordance with section 1814(b)(3) of the Act. Therefore, section 1886(q)(2)(B)(ii) of the Act is no longer applicable to Maryland hospitals. The effect of Maryland hospitals no longer being paid under section 1814(b)(3) of the Act is that they are not entitled to be exempted from the Hospital Readmissions Reduction Program under section 1886(q)(2)(B)(ii) of the Act and, absent the model, would be included in the Hospital Readmissions Reduction Program. In other words, the exemption from the Hospital Readmissions Reduction Program for hospitals paid under section 1814(b)(3) of the Act no longer applies. However, Maryland hospitals will not be participating in the Hospital Readmissions Reduction Program because section 1886(q) of the Act and its implementing regulations have been waived for purposes of the model, under the terms of the agreement.

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28111 through 28112), we proposed to make conforming changes to the implementing regulations to reflect this change. Under § 412.152, we proposed to delete from the definition of an “applicable hospital” the following language: “or a hospital in Maryland that is paid under section 1814(b)(3) of the Act and that, absent the waiver specified by section 1814(b)(3) of the Act, would have been paid under the hospital inpatient prospective payment system.” Under § 412.154, we proposed to delete § 412.154(d) in its entirety.

We invited public comment on these proposals.

Comment: Several commenters supported CMS' proposal to continue to exempt Maryland hospitals, now being paid under the Maryland All-Payer Model, from the Hospital Readmissions Reduction Program and the proposed conforming changes to the Hospital Readmissions Reduction Program regulations.

Response: We appreciate the commenters' support.

After consideration of the public comments we received, we are finalizing the changes to the Hospital Readmissions Reduction Program regulations as proposed without modification. Specifically, we are finalizing our proposal to make conforming changes to our regulations at § 412.152 and § 412.154(d) to reflect that Maryland elected to no longer have Medicare pay Maryland hospitals in accordance with section 1814(b)(3) of the Act.

9. Floor Adjustment Factor for FY 2015 (§ 412.154(c)(2))

Section 1886(q)(3)(A) of the Act defines the “adjustment factor” for an applicable hospital for a fiscal year as equal to the greater of “(i) the ratio described in subparagraph (B) for the hospital for the applicable period (as defined in paragraph (5)(D)) for such fiscal year; or (ii) the floor adjustment factor specified in subparagraph (C).” Section 1886(q)(3)(B) of the Act, in turn, describes the ratio used to calculate the adjustment factor. Specifically, it states that the ratio is “equal to 1 minus the ratio of—(i) the aggregate payments for excess readmissions . . . and (ii) the aggregate payments for all discharges . . . .” The calculation of this ratio is codified at § 412.154(c)(1) of the regulations. Section 1886(q)(3)(C) of the Act specifies the floor adjustment factor, which is set at 0.99 for FY 2013, 0.98 for FY 2014, and 0.97 for FY 2015 and subsequent fiscal years. We codified the floor adjustment factor at § 412.154(c)(2) of the regulations (77 FR 53386).

Consistent with section 1886(q)(3) of the Act, as codified at § 412.154(c)(2), the adjustment factor is the greater of the ratio or, for FY 2015 and subsequent fiscal years, the floor adjustment factor of 0.97. Under our established policy, the ratio is rounded to the fourth decimal place. In other words, for FY 2015 and subsequent fiscal years, a hospital subject to the Hospital Readmissions Reduction Program will have an adjustment factor between 1.0 (no reduction) and 0.9700 (greatest possible reduction).
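
To make the arithmetic above concrete, the following is a minimal sketch of the floor calculation; it is illustrative only, and the function name and inputs are our assumptions rather than part of the regulation.

    def readmissions_adjustment_factor(aggregate_excess: float,
                                       aggregate_all_discharges: float,
                                       floor: float = 0.97) -> float:
        """Return 1 minus the ratio of aggregate payments for excess
        readmissions to aggregate payments for all discharges, rounded to
        four decimal places and limited by the statutory floor (0.97 for
        FY 2015 and subsequent fiscal years)."""
        ratio = round(1 - (aggregate_excess / aggregate_all_discharges), 4)
        return max(ratio, floor)

    # A hospital whose payments for excess readmissions equal 5 percent of
    # its aggregate payments for all discharges would have a ratio of 0.95,
    # but the floor limits the reduction to 3 percent (a factor of 0.9700).
    print(readmissions_adjustment_factor(5.0, 100.0))  # 0.97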

Comment: One commenter expressed concern that the maximum reduction has been raised from 2 percent to 3 percent and that, in conjunction with adding two new measures to the program, this change will only increase harm to safety net hospitals.

Response: We recognize the commenter's concern regarding the magnitude of the maximum payment reduction for FY 2015 provided under the statute. Section 1886(q)(3) of the Act requires that, effective for discharges occurring in FY 2015 and beyond, the floor adjustment factor be 0.97, which corresponds to a maximum 3 percent reduction applied to a hospital's base operating DRG payment amount. We note that we estimate that only 39 hospitals will be subject to the maximum reduction for FY 2015.

After consideration of the public comments we received, we are finalizing our proposal that the floor adjustment factor be 0.97 for FY 2015, consistent with section 1886(q)(3) of the Act, as codified at § 412.154(c)(2).

10. Applicable Period for FY 2015

Under section 1886(q)(5)(D) of the Act, the Secretary has the authority to specify the applicable period with respect to a fiscal year under the Hospital Readmissions Reduction Program. We finalized our policy to use 3 years of claims data to calculate the readmission measures in the FY 2012 IPPS/LTCH PPS final rule (76 FR 51671). In the FY 2013 IPPS/LTCH PPS final rule (77 FR 53675), we codified the definition of “applicable period” in the regulations at 42 CFR 412.152 as the 3-year period from which data are collected in order to calculate excess readmissions ratios and adjustments for the fiscal year, which includes aggregate payments for excess readmissions and aggregate payments for all discharges used in the calculation of the payment adjustment.

Consistent with the definition at § 412.152, we established that the applicable period for FY 2014 under the Hospital Readmissions Reduction Program is the 3-year period from July 1, 2009, to June 30, 2012. That is, we determined the excess readmissions ratios and calculated the payment adjustment (including aggregate payments for excess readmissions and aggregate payments for all discharges) for FY 2014 using data from the 3-year time period of July 1, 2009 to June 30, 2012, as this was the most recent available 3-year period of data upon which to base these calculations (78 FR 50669).

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28112), for FY 2015, consistent with the definition at § 412.152, we proposed that the “applicable period” for the Hospital Readmissions Reduction Program be the 3-year period from July 1, 2010 to June 30, 2013. In other words, we proposed that the excess readmissions ratios and the payment adjustment (including aggregate payments for excess readmissions and aggregate payments for all discharges) for FY 2015 would be calculated based on data from the 3-year time period of July 1, 2010 to June 30, 2013. We invited public comment on this proposal.

Comment: Several commenters requested that CMS make real-time reporting of readmission rates accessible to hospitals, while other commenters suggested that CMS monitor reported data for correlation and trends to identify if hospitals are making unacceptable trade-offs by reducing readmissions at the expense of increasing post discharge mortality.

Response: We note that these requests are out of scope for the Hospital Readmissions Reduction Program provisions of the FY 2015 IPPS/LTCH PPS proposed rule, and we will take them under consideration during future rulemaking.

Comment: Several commenters requested that CMS revise the applicable time period to only include the most recent year. One commenter believed that it is unfair to penalize hospitals for performance from 2 or 3 years ago, especially if they have improved in the most recent year.

Response: We note that we addressed this concern in the FY 2013 IPPS/LTCH PPS final rule (77 FR 53380), and that we use a 3-year period of index admissions to increase the number of cases per hospital used for measure calculation, which improves the precision of each hospital's readmission estimate. Although this approach utilizes older data, it also identifies more variation in hospital performance and still allows for improvement from one year of reporting to the next.

After consideration of the public comments we received, we are finalizing as proposed the applicable period of the 3-year time period of July 1, 2010 to June 30, 2013 to calculate the excess readmission ratios and the readmission payment adjustment factors for FY 2015.

11. Inclusion of THA/TKA and COPD Readmissions Measures To Calculate Aggregate Payments for Excess Readmissions Beginning in FY 2015

Under the Hospital Readmissions Reduction Program, the “base operating DRG payment amount” defined at § 412.152 is used both to determine the readmission adjustment factor that accounts for excess readmissions under section 1886(q)(3) of the Act and to determine which payment amounts will be adjusted to account for excess readmissions under section 1886(q) of the Act. Consistent with section 1886(q)(2) of the Act, in the FY 2013 IPPS/LTCH PPS final rule (77 FR 53374 through 53383), under the regulations at § 412.152, we defined the “base operating DRG payment amount” and specified that it does not include adjustments or add-on payments for IME, DSH, outliers, and low-volume hospitals, as required by section 1886(q)(2) of the Act. Furthermore, consistent with section 1886(q)(2)(B)(i) of the Act, for SCHs and for MDHs for FY 2013, the definition of “base operating DRG payment amount” at § 412.152 excludes the difference between the hospital's applicable hospital-specific payment rate and the Federal payment rate.

For FY 2015 and subsequent years, for purposes of calculating the payment adjustment factors and applying the payment methodology, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28112 through 28117), we proposed that the base operating DRG payment amount for MDHs include the difference between the hospital-specific payment rate and the Federal payment rate (as applicable).

Section 1886(q)(3)(B) of the Act specifies the ratio used to calculate the adjustment factor under the Hospital Readmissions Reduction Program. It states that the ratio is “equal to 1 minus the ratio of—(i) the aggregate payments for excess readmissions . . . and (ii) the aggregate payments for all discharges. . . .” The definitions of “aggregate payments for excess readmissions” and “aggregate payments for all discharges,” as well as the methodology for calculating the numerator of the ratio (aggregate payments for excess readmissions) and the denominator of the ratio (aggregate payments for all discharges), are codified at § 412.154(c)(2) of the regulations (77 FR 53387).

Section 1886(q)(4) of the Act sets forth the definitions of “aggregate payments for excess readmissions” and “aggregate payments for all discharges” for an applicable hospital for the applicable period. The term “aggregate payments for excess readmissions” is defined in section 1886(q)(4)(A) of the Act as “for a hospital for an applicable period, the sum, for applicable conditions . . . of the product, for each applicable condition, of (i) the base operating DRG payment amount for such hospital for such applicable period for such condition; (ii) the number of admissions for such condition for such hospital for such applicable period; and (iii) the excess readmissions ratio . . . for such hospital for such applicable period minus 1.” We codified this definition of “aggregate payments for excess readmissions” under the regulations at § 412.152 as the product, for each applicable condition, of: (1) The base operating DRG payment amount for the hospital for the applicable period for such condition; (2) the number of admissions for such condition for the hospital for the applicable period; and (3) the excess readmissions ratio for the hospital for the applicable period minus 1 (77 FR 53675).

The excess readmissions ratio is a hospital-specific ratio calculated for each applicable condition. Specifically, section 1886(q)(4)(C) of the Act defines the excess readmissions ratio as the ratio of “risk-adjusted readmissions based on actual readmissions” for an applicable hospital for each applicable condition, to the “risk-adjusted expected readmissions” for the applicable hospital for the applicable condition. The methodology for the calculation of the excess readmissions ratio was finalized in the FY 2012 IPPS/LTCH PPS final rule (76 FR 51673). “Aggregate payments for excess readmissions” is the numerator of the ratio used to calculate the adjustment factor under the Hospital Readmissions Reduction Program (as described in further detail later in this section).

The term “aggregate payments for all discharges” is defined at section 1886(q)(4)(B) of the Act as “for a hospital for an applicable period, the sum of the base operating DRG payment amounts for all discharges for all conditions from such hospital for such applicable period.” “Aggregate payments for all discharges” is the denominator of the ratio used to calculate the adjustment factor under the Hospital Readmissions Reduction Program. We codified this definition of “aggregate payments for all discharges” under the regulations at § 412.152 (77 FR 53387).

We finalized the inclusion of two additional applicable conditions, COPD and THA/TKA, in the Hospital Readmissions Reduction Program beginning in FY 2015 in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50657 through 50664). In section IV.H.11. of the preamble of the proposed rule, we discussed the proposed methodology to include these two additional measures in the calculation of the readmissions payment adjustment for FY 2015. Specifically, we proposed how the COPD and THA/TKA applicable conditions would be included in the calculation of aggregate payments for excess readmissions, which is the numerator of the ratio used to calculate the readmissions payment adjustment. We note that this proposal does not alter our established methodology for calculating aggregate payments for all discharges, that is, the denominator of the ratio (77 FR 53387).

As discussed above, when calculating the numerator (aggregate payments for excess readmissions), we determine the base operating DRG payments for the applicable period. “Aggregate payments for excess readmissions” (the numerator) is defined as “the sum, for applicable conditions . . . of the product, for each applicable condition, of (i) the base operating DRG payment amount for such hospital for such applicable period for such condition; (ii) the number of admissions for such condition for such hospital for such applicable period; and (iii) the excess readmissions ratio . . . for such hospital for such applicable period minus 1.”
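
Stated compactly, and only as a restatement of the statutory definition quoted above (the symbols are ours, not the statute's), the numerator can be written as:

    \[
      \text{Aggregate payments for excess readmissions}
        = \sum_{c \in \text{applicable conditions}}
          P_c \times n_c \times \left(\mathrm{ERR}_c - 1\right)
    \]

where, for each applicable condition c, P_c is the base operating DRG payment amount, n_c is the number of admissions, and ERR_c is the excess readmissions ratio. As noted later in this section, a condition contributes to the sum only when its excess readmissions ratio exceeds 1, and in the calculation chart later in this section the product P_c x n_c is implemented by summing the base operating DRG payment amounts over the qualifying admissions for the condition.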

When determining the base operating DRG payment amount for an individual hospital for such applicable period for such condition, we use Medicare inpatient claims from the MedPAR file with discharge dates that are within the same applicable period used to calculate the excess readmissions ratios. We use MedPAR claims data as our data source for determining aggregate payments for excess readmissions and aggregate payments for all discharges, as this data source is consistent with the claims data source used in IPPS rulemaking to determine IPPS rates.

For FY 2015, we proposed to use MedPAR claims with discharge dates that are on or after July 1, 2010, and no later than June 30, 2013. Under our established methodology, we use the update of the MedPAR file for each Federal fiscal year within the applicable period, which is updated 6 months after the end of that Federal fiscal year, as our data source for the final rules (that is, the March updates of the respective Federal fiscal year MedPAR files).

The FY 2010 through FY 2013 MedPAR data files can be purchased from CMS. Use of these files allows the public to verify the readmissions adjustment factors. Interested individuals may order these files through the CMS Web site at: http://www.cms.hhs.gov/LimitedDataSets/ by clicking on MedPAR Limited Data Set (LDS)-Hospital (National). This Web page describes the files and provides directions and further detailed instructions for how to order the data sets. Persons placing an order must send the following: A Letter of Request, the LDS Data Use Agreement and Research Protocol (refer to the Web site for further instructions), the LDS Form, and a check for $3,655 to:

  • If using the U.S. Postal Service: Centers for Medicare and Medicaid Services, RDDC Account, Accounting Division, P.O. Box 7520, Baltimore, MD 21207-0520.
  • If using express mail: Centers for Medicare and Medicaid Services, OFM/Division of Accounting-RDDC, Mailstop C#07-11, 7500 Security Boulevard, Baltimore, MD 21244-1850.

In the proposed rule, we proposed to determine aggregate payments for excess readmissions and aggregate payments for all discharges using data from MedPAR claims with discharge dates that are on or after July 1, 2010, and no later than June 30, 2013. However, we note that, for the purpose of modeling the proposed FY 2015 readmissions payment adjustment factors for the proposed rule, we used excess readmissions ratios for applicable hospitals from the FY 2014 Hospital Readmissions Reduction Program applicable period. For the final rule, applicable hospitals will have had the opportunity to review and correct data from the proposed FY 2015 applicable period of July 1, 2010 to June 30, 2013, before they are made public under our policy regarding the reporting of hospital-specific information, which is discussed later in this section.

In the proposed rule, for FY 2015, we proposed to use MedPAR data from July 1, 2010 through June 30, 2013. Specifically, in the proposed rule, we used the March 2011 update of the FY 2010 MedPAR file to identify claims within FY 2010 with discharge dates that are on or after July 1, 2010, the March 2012 update of the FY 2011 MedPAR file to identify claims within FY 2011, the March 2013 update of the FY 2012 MedPAR file to identify claims within FY 2012, and the December 2013 update of the FY 2013 MedPAR file to identify claims within FY 2013 with discharge dates no later than June 30, 2013. For the final rule, we proposed to use the same MedPAR files as listed above for claims within FY 2010, FY 2011, and FY 2012. For claims within FY 2013, we proposed to use in the final rule the March 2014 update of the FY 2013 MedPAR file.

In order to identify the admissions for each condition, including the two additional conditions THA/TKA and COPD, to calculate the aggregate payments for excess readmissions for an individual hospital, for FY 2015, we proposed to identify each applicable condition using the ICD-9-CM codes used to identify applicable conditions to calculate the excess readmissions ratios. Under our existing policy, we identify eligible hospitalizations and readmissions of Medicare patients discharged from an applicable hospital having a principal diagnosis for the measured condition in an applicable period (76 FR 51669). The discharge diagnoses for each applicable condition are based on a list of specific ICD-9-CM codes for that condition. These codes are posted on the QualityNet Web site at: http://www.QualityNet.org > Hospital-Inpatient > Claims-Based Measures > Readmission Measures > Measure Methodology.

In order to identify the applicable conditions to calculate the aggregate payments for excess readmissions, for FY 2015, we proposed to identify the claim as an applicable condition consistent with the methodology to identify conditions to calculate the excess readmissions ratio. In other words, the applicable conditions of AMI, HF and PN are identified for the calculation of aggregate payments for excess readmissions if the ICD-9-CM code for that condition is listed as the principal diagnosis on the claim.

In order to identify claims with the applicable condition of THA/TKA, we proposed that any claim that has the procedure codes for THA/TKA listed in any diagnosis/procedure field of the claim would be included in the calculation of aggregate payments for excess readmissions, consistent with the methodology to calculate the excess readmissions ratio for THA/TKA. In order to identify claims with the applicable condition of COPD, we proposed to identify claims that either have an ICD-9-CM code for COPD listed as the principal diagnosis on the claim or have a principal diagnosis of respiratory failure along with a secondary diagnosis of COPD.

Under our established methodology for calculating aggregate payments for excess readmissions, admissions that are not considered index admissions for the purpose of the readmissions measures are excluded from the calculation of the excess readmissions ratio, and therefore also are not considered admissions for the purposes of determining a hospital's aggregate payments for excess readmissions (78 FR 50670 through 50876). With the addition of THA/TKA and COPD as applicable conditions beginning in FY 2015, we proposed to modify our current methodology to identify the admissions included in the calculation of “aggregate payments for excess readmissions” for THA/TKA and COPD in the same manner as for the original applicable conditions (AMI, HF, and PN). That is, THA/TKA and COPD admissions that would not be considered index admissions in the readmissions measures also would not be considered admissions for purposes of calculating a hospital's aggregate payments for excess readmissions.

In the proposed rule, for FY 2015, we proposed to continue to apply the same exclusions to the claims in the MedPAR file as we applied for FY 2014 (78 FR 50670 through 50673), and we proposed to apply those exclusions for the two additional applicable conditions, THA/TKA and COPD. For FY 2015, in order to have the same types of admissions to calculate aggregate payments for excess readmissions as is used to calculate the excess readmissions ratio, we proposed to identify admissions for all five applicable conditions, AMI, HF, PN, THA/TKA and COPD, for the purposes of calculating aggregate payments for excess readmissions as follows:

  • We would exclude admissions that are identified as an applicable condition if the patient died in the hospital, as identified by the discharge status code on the MedPAR claim.
  • We would exclude admissions identified as an applicable condition for which the patient was transferred to another provider that provides acute care hospital services (that is, a CAH or an IPPS hospital), as identified through examination of contiguous stays in MedPAR at other hospitals.
  • We would exclude admissions identified as an applicable condition for patients who are under the age of 65, as identified by linking the claim information to the information provided in the Medicare Enrollment Database.
  • For conditions identified as AMI, we would exclude claims that are same day discharges, as identified by the admission date and discharge date on the MedPAR claim.
  • We would exclude admissions for patients who did not have Medicare Parts A and B FFS enrollment in the 12 months prior to the index admission, based on the information provided in the Medicare Enrollment Database.
  • We would exclude admissions for patients without at least 30 days post-discharge enrollment in Medicare Parts A and B fee-for-service, based on the information provided in the Medicare Enrollment Database.
  • We would exclude all multiple admissions within 30 days of a prior index admission's discharge date, as identified in the MedPAR file, consistent with how multiple admissions within 30 days of an index admission are excluded from the calculation of the excess readmissions ratio.

These exclusions are consistent with our current methodology, which was established in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50671).
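
As an illustration only, the exclusions listed above could be applied along the following lines. This is a hedged sketch; the field names are assumptions chosen for readability and do not correspond to actual MedPAR or Medicare Enrollment Database variable names.

    def is_excluded(claim: dict) -> bool:
        """Return True if an admission identified as an applicable condition
        should be dropped from the calculation of aggregate payments for
        excess readmissions, per the exclusions listed above."""
        if claim["died_in_hospital"]:                         # discharge status indicates death
            return True
        if claim["transferred_to_acute_care"]:                # transfer to an IPPS hospital or CAH
            return True
        if claim["age_at_admission"] < 65:                    # patient under age 65
            return True
        if claim["condition"] == "AMI" and claim["same_day_discharge"]:
            return True                                       # AMI same-day discharge
        if not claim["ffs_enrolled_12_months_prior"]:         # no Parts A and B FFS enrollment before the index admission
            return True
        if not claim["ffs_enrolled_30_days_post"]:            # no 30-day post-discharge FFS enrollment
            return True
        if claim["within_30_days_of_prior_index_discharge"]:  # readmission within 30 days of a prior index stay
            return True
        return False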

In addition to the exclusions described above for all five applicable conditions, for FY 2015, we proposed the following steps to identify admissions specifically for THA/TKA for the purposes of calculating aggregate payments for excess readmissions:

  • We proposed to exclude admissions for THA/TKA for all transfer cases regardless of whether the discharge was a transfer to another hospital or from another hospital, consistent with the calculation of the excess readmissions ratio for THA/TKA.
  • We proposed to exclude admissions for THA/TKA for cases where the discharge includes a femur, hip, or pelvic fracture coded in the principal or secondary diagnosis fields, consistent with the calculation of the excess readmissions ratio for THA/TKA.
  • We proposed to exclude admissions for THA/TKA for cases where the discharge includes a mechanical complication coded in the principal diagnosis field, consistent with the calculation of the excess readmissions ratio for THA/TKA.
  • We proposed to exclude admissions for THA/TKA for cases where the discharge includes a malignant neoplasm of the pelvis, sacrum, coccyx, lower limbs, or bone/bone marrow or a disseminated malignant neoplasm coded in the principal diagnosis field, consistent with the calculation of the excess readmissions ratio for THA/TKA.
  • We proposed to exclude admissions for THA/TKA for cases where the discharge includes more than two hip/knee procedures.
  • We proposed to exclude admissions for THA/TKA for cases that include any of the following conditions or procedures concurrent with the THA/TKA: revision procedures; partial hip arthroplasty (PHA) procedures; resurfacing procedures; and removal of implanted devices/prostheses.

Furthermore, we proposed to only identify Medicare FFS claims that meet the criteria (that is, claims paid for under Medicare Part C (Medicare Advantage) would not be included in this calculation), consistent with the methodology to calculate excess readmissions ratios based solely on admissions and readmissions for Medicare FFS patients. Therefore, consistent with our established methodology, for FY 2015, we would exclude admissions for patients enrolled in Medicare Advantage as identified in the Medicare Enrollment Database. This proposal is consistent with how admissions for Medicare Advantage patients are identified in the calculation of the excess readmissions ratios under our established methodology. The tables below list the ICD-9-CM codes we proposed to use to identify each applicable condition to calculate the aggregate payments for excess readmissions under this proposal for FY 2015. The tables include the ICD-9-CM codes we proposed to use to identify the two conditions, THA/TKA and COPD, added to the Hospital Readmissions Reduction Program beginning for FY 2015. These ICD-9-CM codes also would be used to identify the applicable conditions to calculate the excess readmissions ratios, consistent with our established policy (76 FR 51673 through 51676).

ICD-9-CM Codes To Identify Pneumonia (PN) Cases

ICD-9-CM Code  Description of Code
480.0  Pneumonia due to adenovirus.
480.1  Pneumonia due to respiratory syncytial virus.
480.2  Pneumonia due to parainfluenza virus.
480.3  Pneumonia due to SARS-associated coronavirus.
480.8  Viral pneumonia: pneumonia due to other virus not elsewhere classified.
480.9  Viral pneumonia unspecified.
481  Pneumococcal pneumonia [streptococcus pneumoniae pneumonia].
482.0  Pneumonia due to klebsiella pneumoniae.
482.1  Pneumonia due to pseudomonas.
482.2  Pneumonia due to hemophilus influenzae [h. influenzae].
482.30  Pneumonia due to streptococcus unspecified.
482.31  Pneumonia due to streptococcus group a.
482.32  Pneumonia due to streptococcus group b.
482.39  Pneumonia due to other streptococcus.
482.40  Pneumonia due to staphylococcus unspecified.
482.41  Pneumonia due to staphylococcus aureus.
482.42  Methicillin resistant pneumonia due to staphylococcus aureus.
482.49  Other staphylococcus pneumonia.
482.81  Pneumonia due to anaerobes.
482.82  Pneumonia due to escherichia coli [e.coli].
482.83  Pneumonia due to other gram-negative bacteria.
482.84  Pneumonia due to legionnaires' disease.
482.89  Pneumonia due to other specified bacteria.
482.9  Bacterial pneumonia unspecified.
483.0  Pneumonia due to mycoplasma pneumoniae.
483.1  Pneumonia due to chlamydia.
483.8  Pneumonia due to other specified organism.
485  Bronchopneumonia organism unspecified.
486  Pneumonia organism unspecified.
487.0  Influenza with pneumonia.
488.11  Influenza due to identified novel H1N1 influenza virus with pneumonia.

ICD-9-CM Codes To Identify Heart Failure (HF) Cases

ICD-9-CM Code  Description of Code
402.01  Hypertensive heart disease, malignant, with heart failure.
402.11  Hypertensive heart disease, benign, with heart failure.
402.91  Hypertensive heart disease, unspecified, with heart failure.
404.01  Hypertensive heart and chronic kidney disease, malignant, with heart failure and with chronic kidney disease stage I through stage IV, or unspecified.
404.03  Hypertensive heart and chronic kidney disease, malignant, with heart failure and with chronic kidney disease stage V or end stage renal disease.
404.11  Hypertensive heart and chronic kidney disease, benign, with heart failure and with chronic kidney disease stage I through stage IV, or unspecified.
404.13  Hypertensive heart and chronic kidney disease, benign, with heart failure and with chronic kidney disease stage V or end stage renal disease.
404.91  Hypertensive heart and chronic kidney disease, unspecified, with heart failure and with chronic kidney disease stage I through stage IV, or unspecified.
404.93  Hypertensive heart and chronic kidney disease, unspecified, with heart failure and chronic kidney disease stage V or end stage renal disease.
428.xx  Heart failure.

ICD-9-CM Codes To Identify Acute Myocardial Infarction (AMI) Cases

ICD-9-CM Code  Description of Code
410.00  AMI (anterolateral wall)—episode of care unspecified.
410.01  AMI (anterolateral wall)—initial episode of care.
410.10  AMI (other anterior wall)—episode of care unspecified.
410.11  AMI (other anterior wall)—initial episode of care.
410.20  AMI (inferolateral wall)—episode of care unspecified.
410.21  AMI (inferolateral wall)—initial episode of care.
410.30  AMI (inferoposterior wall)—episode of care unspecified.
410.31  AMI (inferoposterior wall)—initial episode of care.
410.40  AMI (other inferior wall)—episode of care unspecified.
410.41  AMI (other inferior wall)—initial episode of care.
410.50  AMI (other lateral wall)—episode of care unspecified.
410.51  AMI (other lateral wall)—initial episode of care.
410.60  AMI (true posterior wall)—episode of care unspecified.
410.61  AMI (true posterior wall)—initial episode of care.
410.70  AMI (subendocardial)—episode of care unspecified.
410.71  AMI (subendocardial)—initial episode of care.
410.80  AMI (other specified site)—episode of care unspecified.
410.81  AMI (other specified site)—initial episode of care.
410.90  AMI (unspecified site)—episode of care unspecified.
410.91  AMI (unspecified site)—initial episode of care.

ICD-9-CM Codes To Identify Chronic Obstructive Pulmonary Disease (COPD) Cases

ICD-9-CM Code  Description of Code
491.21  Obstructive chronic bronchitis; with (acute) exacerbation; acute exacerbation of COPD, decompensated COPD, decompensated COPD with exacerbation.
491.22  Obstructive chronic bronchitis; with acute bronchitis.
491.8  Other chronic bronchitis. Chronic: tracheitis, tracheobronchitis.
491.9  Unspecified chronic bronchitis.
492.8  Other emphysema; emphysema (lung or pulmonary): NOS, centriacinar, centrilobular, obstructive, panacinar, panlobular, unilateral, vesicular. MacLeod's syndrome; Swyer-James syndrome; unilateral hyperlucent lung.
493.20  Chronic obstructive asthma; asthma with COPD, chronic asthmatic bronchitis, unspecified.
493.21  Chronic obstructive asthma; asthma with COPD, chronic asthmatic bronchitis, with status asthmaticus.
493.22  Chronic obstructive asthma; asthma with COPD, chronic asthmatic bronchitis, with (acute) exacerbation.
496  Chronic: nonspecific lung disease, obstructive lung disease, obstructive pulmonary disease (COPD) NOS. NOTE: This code is not to be used with any code from categories 491-493.
518.81*  Other diseases of lung; acute respiratory failure; respiratory failure NOS.
518.82*  Other diseases of lung; acute respiratory failure; other pulmonary insufficiency, acute respiratory distress.
518.84*  Other diseases of lung; acute respiratory failure; acute and chronic respiratory failure.
799.1*  Other ill-defined and unknown causes of morbidity and mortality; respiratory arrest, cardiorespiratory failure.
* Principal diagnosis when combined with a secondary diagnosis of AECOPD (491.21, 491.22, 493.21, or 493.22).

ICD-9-CM Codes To Identify Total Hip Arthroplasty/Total Knee Arthroplasty (THA/TKA) Cases

ICD-9-CM Code  Description of Code
81.51  Total hip arthroplasty.
81.54  Total knee arthroplasty.
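
As a hedged illustration of how the code lists above might be applied, the sketch below classifies a claim into an applicable condition. The code sets are abbreviated (the heart failure and pneumonia lists are omitted for brevity), the ordering of the checks is our assumption, and the function and field names are illustrative rather than CMS specifications.

    from typing import Optional

    AMI_PRINCIPAL = {f"410.{site}{episode}" for site in range(10) for episode in (0, 1)}
    COPD_PRINCIPAL = {"491.21", "491.22", "491.8", "491.9", "492.8",
                      "493.20", "493.21", "493.22", "496"}
    RESP_FAILURE_PRINCIPAL = {"518.81", "518.82", "518.84", "799.1"}
    AECOPD_SECONDARY = {"491.21", "491.22", "493.21", "493.22"}
    THA_TKA_PROCEDURES = {"81.51", "81.54"}

    def applicable_condition(principal_dx: str,
                             secondary_dxs: set,
                             procedure_codes: set) -> Optional[str]:
        """Classify a claim into an applicable condition, or return None."""
        if procedure_codes & THA_TKA_PROCEDURES:
            return "THA/TKA"   # THA/TKA procedure code in any procedure field
        if principal_dx in AMI_PRINCIPAL:
            return "AMI"
        if principal_dx in COPD_PRINCIPAL:
            return "COPD"
        if principal_dx in RESP_FAILURE_PRINCIPAL and (secondary_dxs & AECOPD_SECONDARY):
            return "COPD"      # respiratory failure with a secondary AECOPD diagnosis
        # Heart failure and pneumonia would be checked the same way against
        # their principal-diagnosis code lists.
        return None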

For FY 2015, we proposed to calculate aggregate payments for excess readmissions using MedPAR claims from July 1, 2010 to June 30, 2013, to identify applicable conditions based on the same ICD-9-CM codes used to identify the conditions for the readmissions measures, and to apply the proposed exclusions for the types of admissions discussed above. To calculate aggregate payments for excess readmissions, we proposed to calculate the base operating DRG payment amounts for all claims in the 3-year applicable period for each applicable condition (AMI, HF, PN, COPD, and THA/TKA) based on the claims we identified as described above. Once we have calculated the base operating DRG payment amounts for all the claims for the five applicable conditions, we proposed to sum the base operating DRG payment amounts by condition, resulting in five summed amounts, one for each of the five applicable conditions. We proposed to then multiply the amount for each condition by the respective excess readmissions ratio minus 1 when that excess readmissions ratio is greater than 1, which indicates that a hospital has performed, with respect to readmissions for that applicable condition, worse than the average hospital with similar patients. Each product in this computation represents the payments for excess readmissions for that condition. We proposed to then sum the resulting products, which represent a hospital's proposed “aggregate payments for excess readmissions” (the numerator of the ratio). Because this calculation is performed separately for each of the five conditions, an excess readmissions ratio greater than 1 on any single measure results in aggregate payments for excess readmissions (and thus a payment reduction under the Hospital Readmissions Reduction Program); to avoid a payment reduction, a hospital's excess readmissions ratio must be less than or equal to 1 on every measure. We note that we did not propose any changes to our existing methodology to calculate “aggregate payments for all discharges” (the denominator of the ratio).

We proposed the following methodology for FY 2015 as displayed in the chart below.

Formulas To Calculate the Readmissions Adjustment Factor

Aggregate payments for excess readmissions = [sum of base operating DRG payments for AMI × (Excess Readmissions Ratio for AMI - 1)] + [sum of base operating DRG payments for HF × (Excess Readmissions Ratio for HF - 1)] + [sum of base operating DRG payments for PN × (Excess Readmissions Ratio for PN - 1)] + [sum of base operating DRG payments for COPD × (Excess Readmissions Ratio for COPD - 1)] + [sum of base operating DRG payments for THA/TKA × (Excess Readmissions Ratio for THA/TKA - 1)].
*Note: if a hospital's excess readmissions ratio for a condition is less than or equal to 1, then there are no aggregate payments for excess readmissions for that condition included in this calculation.
Aggregate payments for all discharges = sum of base operating DRG payments for all discharges.
Ratio = 1 - (Aggregate payments for excess readmissions/Aggregate payments for all discharges).
Proposed Readmissions Adjustment Factor for FY 2015 is the higher of the ratio or 0.9700.
*Based on claims data from July 1, 2010 to June 30, 2013 for FY 2015.
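
The chart above can be read as the following sketch. It is illustrative only; the dollar figures are made up, and the function and variable names are our assumptions.

    def fy2015_readmissions_adjustment_factor(payments_by_condition: dict,
                                              err_by_condition: dict,
                                              payments_all_discharges: float) -> float:
        """Apply the formulas in the chart above: sum, over the applicable
        conditions whose excess readmissions ratio exceeds 1, the summed base
        operating DRG payments for the condition times (ratio minus 1); then
        take 1 minus that sum divided by aggregate payments for all
        discharges, subject to the FY 2015 floor of 0.9700."""
        excess = sum(payments_by_condition[c] * (err_by_condition[c] - 1)
                     for c in payments_by_condition
                     if err_by_condition.get(c, 0) > 1)
        ratio = round(1 - (excess / payments_all_discharges), 4)
        return max(ratio, 0.9700)

    # Made-up figures with excess readmissions on the HF and COPD measures only.
    factor = fy2015_readmissions_adjustment_factor(
        {"AMI": 2.0e6, "HF": 3.0e6, "PN": 2.5e6, "COPD": 1.0e6, "THA/TKA": 4.0e6},
        {"AMI": 0.98, "HF": 1.06, "PN": 0.99, "COPD": 1.10, "THA/TKA": 1.00},
        150.0e6,
    )
    print(factor)  # 0.9981, that is, a 0.19 percent reduction to base operating DRG payments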

We invited public comment on these proposals.

Comment: Several commenters supported the inclusion of the Elective Primary Total Hip Arthroplasty (THA) and/or Total Knee Arthroplasty (TKA) All-Cause Unplanned 30-Day Risk-Standardized Readmission measure and the Hospital-Level 30-Day Readmission Following Admission for an Acute Exacerbation of Chronic Obstructive Pulmonary Disease measure. Other commenters supported the modified exclusions for both of these measures, as well as the payment adjustment factor and calculation of aggregate payments.

Response: We thank the commenters for their support of the exclusions, payment adjustment factor, and calculation of aggregate payments for the Elective Primary Total Hip Arthroplasty (THA) and/or Total Knee Arthroplasty (TKA) All-Cause Unplanned 30-Day Risk-Standardized Readmission and the Hospital-Level 30-Day Readmission Following Admission for an Acute Exacerbation of Chronic Obstructive Pulmonary Disease measures, and for their support for expanding the Hospital Readmissions Reduction Program with these measures.

Comment: Regarding CMS' proposed methodology to identify THA/TKA admissions to include in the calculation of aggregate payments for excess readmissions, one commenter recommended that CMS expand the list of exclusions to specifically exclude conversion of a previous hip surgery to total hip arthroplasty (represented by CPT code 27132). The commenter noted that, while the current granularity of the ICD-9-CM coding framework may complicate isolating these cases, the commenter believed that previous hip surgery is a specific risk factor for complications (for example, infection, fracture), and therefore these cases should be identified for purposes of the readmission measure.

Response: As discussed earlier in this final rule, in order to calculate aggregate payments for excess readmissions, consistent with our existing policy, we proposed to identify each applicable condition using the ICD-9-CM codes used to identify applicable conditions to calculate the excess readmissions ratios. We do not believe it would be appropriate to apply an exclusion to the set of admissions used to calculate the aggregate payments that is not applied in the measure cohort definition, that is, in the calculation of the excess readmissions ratio. The current measure for THA/TKA excludes specific groups of patients with prior hip surgeries that place them at a significantly increased risk of complications, including revision procedures and those requiring removal of implanted devices from the femur (ICD-9-CM code 78.65). We are currently exploring the specificity of ICD-9-CM versus CPT codes for prior hip surgery to assess whether the measure cohort definition could be further refined by including CPT codes. If we determine that any changes to the measure cohort may be appropriate, we would propose such changes through future rulemaking.

Comment: Several commenters recommended changes to the methodology to calculate the readmission payment adjustment factors. Several commenters stated that the proposed calculation of the readmission payment adjustment factor creates excessive payment reductions. Commenters noted that the calculation of the readmissions payment adjustment factors is flawed because the excess readmission ratio should be applied to the number of a hospital's readmissions, not admissions, in order to determine the hospital's excess payments for readmissions.

Furthermore, these commenters asserted that CMS has the authority through rulemaking to apply the excess readmission ratio to a hospital's readmissions to determine a hospital's excess payments for readmissions, which they believed would be consistent with Congressional intent. Commenters noted that CMS' estimated savings exceed the Congressional Budget Office (CBO) score for the provision, which commenters believed demonstrates that CMS' literal reading of the statute is not consistent with Congressional intent. Commenters also suggested that CMS could determine the magnitude of the readmission reduction using the 25th percentile of hospital performance on the readmission measures rather than assuming average hospital performance, which is the assumption of the current methodology used to determine the number of expected readmissions.

Response: We received a similar comment in response to the FY 2013 IPPS/LTCH PPS proposed rule (77 FR 53393) and to the FY 2014 IPPS/LTCH PPS proposed rule (78 FR 50673). We continue to believe that the statute is prescriptive with respect to the calculation of “aggregate payments for excess readmissions,” which the statute specifies is the sum, for each condition, of the product of “the base operating DRG payment amount for such hospital for such applicable period for such condition,” “the number of admissions for such condition,” and “the excess readmissions ratio” minus 1. We believe that section 1886(q)(4)(A) of the Act requires us to include all admissions for a condition in the calculation of “aggregate payments for excess readmissions.” We do not believe we have the discretionary authority to implement an alternative methodology under the existing statute. We continue to believe that we are implementing the provision as required by law.

Comment: Several commenters stated that the Hospital Readmissions Reduction Program does not account for improvement in readmission rates. One commenter asserted that there is no incentive for improvement under the Hospital Readmissions Reduction Program as there is in the Hospital VBP Program and stated that penalties under this program are due to issues out of the control of the hospital. One commenter suggested that the penalty should equal the cost of excess readmissions over a fixed target level of readmissions, as opposed to a hospital being measured against the national average.

Response: We appreciate the comments on various ways to change the calculations of the readmissions payment adjustment factors and readmissions measures to account for improvement in readmission rates or to provide incentives for reducing readmissions, as opposed to penalties. We received similar comments in previous rulemaking (77 FR 53394 and 78 FR 50673). The Hospital Readmissions Reduction Program under section 1886(q) of the Act is structured to compare a hospital's performance for certain conditions to that of the average hospital. If a hospital improves over time and those improvements result in performance on readmissions for the applicable conditions that is better than the average hospital, the hospital has the potential to reduce its penalty or not be subject to a penalty at all. As we have stated in previous rules, the statute does not provide us with the authority to reward hospitals for improvement, as is allowed under section 1886(o) of the Act for the Hospital VBP Program.

Comment: MedPAC provided several recommendations to change the Hospital Readmissions Reduction Program related to the calculation of the readmissions payment adjustment factor, which MedPAC acknowledged would require statutory changes. Specifically, MedPAC stated that the readmission penalty formula is flawed because aggregate penalties remain constant even as national readmission rates decline. In addition, MedPAC pointed out that the condition-specific penalty per excess readmission is higher for conditions with low readmission rates, which becomes more important with the inclusion of elective total hip and total knee arthroplasty (relatively low readmission rate conditions) to the Hospital Readmissions Reduction Program. Lastly, MedPAC believed the readmissions multiplier should be removed from the formula and replaced with a penalty that roughly equals the cost of excess readmissions over a fixed target level of readmissions. Given a fixed target, under this approach penalties would decline if hospitals' collective performance improves.

Response: We appreciate the comments and suggestions made by MedPAC. We note that these comments are similar to comments submitted last year for the FY 2014 IPPS/LTCH PPS final rule (78 FR 50674), and we agree that implementing these recommendations would require statutory changes.

Comment: Several commenters requested that CMS clarify whether admissions denied by the CMS Recovery Audit Contractors (RACs) are excluded from either the numerator or the denominator in the calculation of the excess readmissions ratios or in the calculation of the readmissions payment adjustment factors. Commenters believed that, by including admissions denied by the RACs, a hospital would be penalized twice for the same admission—once by the RAC denial and a second time by having the admission included in the readmission payment penalty.

Response: As we explained in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50675), we use MedPAR claims data as our data source to calculate the readmissions payment adjustment factors, specifically the aggregate payments for excess readmissions and the aggregate payments for all discharges. In this final rule, for FY 2015, we are finalizing a policy to use MedPAR data for discharges from July 1, 2010 through June 30, 2013, consistent with our historical practice. We also are finalizing the policy to use the March 2011 update of the FY 2010 MedPAR file, the March 2012 update of the FY 2011 MedPAR file, the March 2013 update of the FY 2012 MedPAR file, and the March 2014 update of the FY 2013 MedPAR file to identify the discharges occurring from July 1, 2010 through June 30, 2013. In addition, the Standard Analytic File (SAF) is the data source used to calculate the excess readmission ratios. We use the June 2011 update of the 2010 SAF file, the June 2012 update of the 2011 file, the June 2013 update of the 2012 file, and the September 2013 update of the 2013 file.

RACs have up to 3 years to review claims to determine whether a claim was inappropriately billed as inpatient when it should have been an outpatient claim. If a claim is denied as an inpatient stay, the claim is adjusted through the standard Medicare claims processing systems, going through the CWF, SAF, and MedPAR files. However, given the timing of the RAC audits and the updates of the SAF and MedPAR files used to calculate the readmissions measures and readmissions payment adjustment factors, it is not certain that all denied claims will be reflected in our claims files at the time of our calculations. Nevertheless, we continue to believe that using these updates of the MedPAR and SAF files is consistent with IPPS ratesetting and allows for transparency, because the public can obtain this dataset for replication. Furthermore, inpatient stays that are denied payment under Medicare Part A typically remain classified as inpatient stays and can be billed to Medicare Part B as Medicare Part B inpatient services. These inpatient stays that are denied payment under Medicare Part A will typically continue to count as a qualifying inpatient stay for other payment purposes, such as qualifying for SNF benefits and Medicare DSH patient days. Therefore, we continue to believe that it is appropriate to include these admissions in the Hospital Readmissions Reduction Program.

Comment: One commenter opposed the proposal that the base operating DRG payment amount for MDHs include the difference between the hospital-specific rate payment and the Federal rate payment in FY 2015, noting that, for teaching MDHs, the hospital-specific rate add-on payment is inclusive of costs associated with teaching and that the inclusion of such payment would violate the Affordable Care Act. This commenter requested that CMS maintain the current definition of “base operating DRG payment amount,” which excludes this additional hospital-specific payment rate amount.

Response: We disagree with the commenter. The “base operating DRG payment amount” is generally defined as the wage-adjusted DRG operating payment plus any applicable new technology add-on payments (§ 412.152 and § 412.160). For years prior to FY 2014, the statutory provisions related to the definition of “base operating DRG payment amount” under section 1886(q)(2)(B)(i) of the Act excluded the difference between an MDH's applicable hospital-specific payment rate and the Federal payment rate (referred to as the hospital-specific add-on) from the definition of the base operating DRG payment amount. (MDHs are paid based on the Federal rate or, if higher, the Federal rate plus 75 percent of the amount by which the Federal rate is exceeded by the updated hospital-specific rate from certain specified base years.) However, section 1886(q)(2)(B)(i) of the Act states that the exclusion of the hospital-specific add-on from the base operating DRG payment amount is only effective for MDHs with respect to discharges occurring during FYs 2012 and 2013. Furthermore, section 1886(q)(2)(B)(ii) of the Act requires that the definition of base operating DRG payment amount exclude payments made under section 1886(d)(5)(B) of the Act (IME payments). While a portion of the hospital-specific rate is related to teaching services provided by teaching MDHs, we do not consider that amount to be a payment under section 1886(d)(5)(B) of the Act. We otherwise do not have authority to exclude the difference between the hospital-specific payment rate and the Federal payment rate for MDHs from the definition of base operating DRG payment amount for discharges occurring after FY 2013.

Therefore, in accordance with the statute, beginning in FY 2014, the definition of “base operating DRG payment amount” includes the difference between an MDH's applicable hospital-specific rate payment and Federal rate payment (that is, the hospital-specific add-on). As a result, in the calculation of the readmissions payment adjustment factor, which is a ratio of a hospital's “aggregate payments for excess readmissions” to its “aggregate payments for all discharges”, the base operating DRG payment amounts used in this calculation for MDHs also include the hospital-specific add-on, if applicable. Furthermore, the statute specifies that the readmissions payment adjustment factor is applied to the base operating DRG payment amount for each Medicare FFS discharge in a Federal fiscal year.
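As a simplified sketch of how this ratio enters the payment adjustment (the notation is ours; the 0.97 floor shown below reflects the statutory maximum reduction of 3 percent for FY 2015 and subsequent years):

```latex
% Sketch only: the readmissions payment adjustment factor is the higher of
% (1 minus the ratio of excess-readmission payments to all-discharge payments)
% and the statutory floor (0.97 for FY 2015 and subsequent years). For MDHs,
% the base operating DRG payment amounts in both aggregates include the
% hospital-specific add-on, where applicable.
\[
\text{adjustment factor}
  \;=\; \max\!\left(1 - \frac{\text{aggregate payments for excess readmissions}}
                             {\text{aggregate payments for all discharges}},\; 0.97\right)
\]
```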

Therefore, we are adopting our proposal as final, and for FY 2015, the readmissions payment adjustment factor will be applied to the base operating DRG payment amount, including the hospital-specific add-on for MDHs, as applicable. This is consistent with the policy established for the treatment of MDHs under the Hospital Readmissions Reduction Program and the Hospital VBP Program for FY 2014 in the notice that appeared in the Federal Register on June 17, 2014 (79 FR 34448 through 34449), which implemented the extension of the MDH program through March 31, 2015, as provided by PAMA. In that notice, we explained that this change in the determination of the base operating DRG payment amount for MDHs is consistent with section 1886(q)(2)(B)(i) of the Act, and affects both the calculation of the readmissions payment adjustment factor and the payments reduced by the readmissions payment adjustment factor for MDHs that receive the hospital-specific add-on payment.

As noted previously, MDHs are paid the higher of the Federal rate payment or Federal rate payment plus the hospital-specific add-on payment on a per claim basis. At cost report settlement, the MAC determines which of the payment options yields a higher aggregate payment for an MDH, and also determines the final hospital-specific add-on payment (if applicable) for that MDH for each cost reporting period. Because a final payment determination for an MDH's cost reporting period is not done until cost report settlement, if an MDH ultimately receives the hospital-specific add-on (that is, its final payment is determined to be the Federal rate payment plus 75 percent of the amount by which the Federal rate payment is exceeded by the updated hospital-specific rate payment), then additional adjustments under the Hospital Readmissions Reduction Program will be made during cost report settlement and not on the claim. If at cost report settlement an MDH ultimately does not receive a hospital-specific add-on for the cost reporting period (that is, its final payment is determined to be the Federal rate payment only), then no additional adjustment (if otherwise applicable) under the Hospital Readmissions Reduction Program will be made.
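To illustrate the per-claim versus settlement comparison described above, the following is a minimal sketch; the function name and the example dollar amounts are hypothetical and are not taken from the rule.

```python
def mdh_final_payment(federal_rate_payment, updated_hospital_specific_rate_payment):
    """Illustrative sketch: an MDH is paid the higher of (a) the Federal rate payment
    or (b) the Federal rate payment plus 75 percent of the amount by which the updated
    hospital-specific rate payment exceeds the Federal rate payment."""
    add_on = 0.75 * max(updated_hospital_specific_rate_payment - federal_rate_payment, 0.0)
    if add_on > 0.0:
        # Any additional Hospital Readmissions Reduction Program adjustment attributable
        # to the add-on portion would be applied at cost report settlement, not on the claim.
        return federal_rate_payment + add_on, add_on
    return federal_rate_payment, 0.0

# Hypothetical example: the updated hospital-specific rate exceeds the Federal rate.
payment, add_on = mdh_final_payment(10_000.00, 12_000.00)
print(payment, add_on)  # 11500.0 1500.0
```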

Comment: Some commenters supported the proposed series of changes to calculate the aggregate payments for excess readmissions for FY 2015, including the two additional conditions of COPD and TKA/THA. Specifically, some commenters supported CMS' exclusions of admissions from the calculation of aggregate payments for excess readmissions, most of which conformed to the calculation exclusions of the individual measures. Commenters supported CMS' proposals under which index admissions that are not considered readmissions for the purpose of the readmissions measures, and that are excluded from the calculation of the excess readmission ratio, would also be excluded from the admissions used to determine a hospital's aggregate payments for excess readmissions, such as exclusions of admissions for patients who did not have Medicare Part A and B for 12 months prior to the admission or 30 days after the admission, as identified by linking MedPAR claims files to the Medicare Enrollment Database (EDB). Some commenters supported CMS' proposal to use MedPAR data to calculate the readmissions payment adjustment factors.

Response: We thank the commenters for their support of our proposed methodology to calculate the readmissions payment adjustment factors with the inclusion of the two additional readmissions measures of THA/TKA and COPD, and we are finalizing the policies as proposed. The MedPAR data that we are finalizing for use in calculating the readmissions payment adjustment factors for FY 2015 are specified above.
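The enrollment-based exclusion referenced in the comment above can be sketched as follows; the function, its parameters, and the coverage-interval representation are our illustrative assumptions rather than the measure specification.

```python
from datetime import date, timedelta

def excluded_for_enrollment(admission_date, part_a_and_b_intervals):
    """Illustrative sketch: exclude an index admission when the beneficiary did not
    have both Medicare Part A and Part B for the 12 months prior to the admission
    and for 30 days after the admission. part_a_and_b_intervals is an assumed list
    of (start, end) dates of continuous Part A and B enrollment, as might be drawn
    from the Medicare Enrollment Database (EDB)."""
    window_start = admission_date - timedelta(days=365)
    window_end = admission_date + timedelta(days=30)
    covered = any(start <= window_start and end >= window_end
                  for start, end in part_a_and_b_intervals)
    return not covered

# Hypothetical example: coverage begins only a few months before the admission.
print(excluded_for_enrollment(date(2012, 6, 1),
                              [(date(2012, 1, 1), date(2013, 12, 31))]))  # True
```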

We note that we stated in the proposed rule (79 FR 28113) that, for the final rule, applicable hospitals will have had the opportunity to review and correct data from the proposed FY 2015 applicable period of July 1, 2010 to June 30, 2013 before they are made public under our policy regarding the reporting of hospital-specific information. In previous years, the review and correction period occurred prior to the publication of the final rule, and we published the final excess readmission ratios and readmission payment adjustment factors on the CMS IPPS Web site and the final readmission payment adjustment factors in Table 15 in conjunction with the issuance of the final rule. Since the publication of the proposed rule, we experienced unexpected delays in the production of the excess readmission ratios, which has resulted in a later than expected start to the 30-day review and corrections period. For the data from the FY 2015 applicable period, the review and corrections period will still be ongoing through August 19, 2014, which extends beyond the issuance of this FY 2015 IPPS/LTCH PPS final rule. As a result, in Table 15A listed in the Addendum of this final rule (which is available only via the Internet on the CMS Web site), we are providing proxy FY 2015 readmission payment adjustment factors, and are posting the corresponding proxy excess readmission ratios, which are based on the FY 2015 applicable period of July 1, 2010 to June 30, 2013, on the CMS IPPS Web site. After the completion of the review and corrections process, we will publish the final FY 2015 readmissions payment adjustment factors in Table 15B that will be effective for determining payments for discharges occurring on or after October 1, 2014, and the corresponding final excess readmission ratios on the CMS IPPS Web site. We expect the final FY 2015 readmissions payment adjustment factors in Table 15B and the corresponding final excess readmission ratios to be posted on the CMS IPPS Web site prior to October 1, 2014.

After consideration of the public comments we received, we are finalizing without modification our proposals pertaining to the inclusion of THA/TKA and COPD readmissions measures to calculate aggregate payments for excess readmissions beginning in FY 2015.

12. Hospital Readmissions Reduction Program Extraordinary Circumstances Exceptions

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50676), we indicated that commenters had requested a potential waiver or exemption process for hospitals located in areas that experience disasters or other extraordinary circumstances, even though we had not proposed an extraordinary circumstance exceptions/exemptions (ECE) policy for the Hospital Readmissions Reduction Program. We noted that there are several policy and operational considerations in developing a disaster exemption process for the Hospital Readmissions Reduction Program.

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28117) we welcomed public comment on whether an exemption process should be implemented, and the policy and operational considerations for a potential Hospital Readmissions Reduction Program ECE policy.

Comment: A few commenters supported the creation of an extraordinary circumstance exemption process. The commenters recommended that an extraordinary circumstance exemption process should be allowed for hospitals that experience a natural disaster and should also be applied to the payment year in which the date of the disaster occurs because the Hospital Readmissions Reduction Program uses 2 years of performance data that also overlaps with subsequent payment years. Two commenters specifically indicated that the extraordinary circumstance exemption process should be similar to the existing Hospital VBP Program exceptions process. Finally, a commenter suggested establishing a 90-day period, beginning with the date of the disaster, for hospitals to submit a request for an exemption from the Hospital Readmissions Reduction Program for a specific fiscal year. No commenters made other recommendations on how to operationalize an extraordinary circumstance exemption policy and supporting processes.

Response: We appreciate the input from the commenters. We will take into consideration these recommendations as we consider whether an exemption process for the Hospital Readmissions Reduction Program should be implemented.

I. Hospital Value-Based Purchasing (VBP) Program

1. Statutory Background

Section 1886(o) of the Act, as added by section 3001(a)(1) of the Affordable Care Act, requires the Secretary to establish a hospital value-based purchasing program (the Hospital Value-Based Purchasing (VBP) Program) under which value-based incentive payments are made in a fiscal year to hospitals that meet performance standards established for a performance period for such fiscal year. Both the performance standards and the performance period for a fiscal year are to be established by the Secretary.

Section 1886(o)(1)(B) of the Act states that the Hospital VBP Program applies to payments for hospital discharges occurring on or after October 1, 2012. In accordance with section 1886(o)(6)(A) of the Act, we are required to make value-based incentive payments under the Hospital VBP Program to hospitals that meet or exceed performance standards for a performance period for a fiscal year. As further required by section 1886(o)(6)(C)(ii)(I) of the Act, we base each hospital's value-based payment percentage on the hospital's Total Performance Score (TPS) for a specified performance period. In accordance with section 1886(o)(7) of the Act, the total amount available for value-based incentive payments for a fiscal year will be equal to the total amount of the payment reductions for all participating hospitals for such fiscal year, as estimated by the Secretary. For FY 2014, the available funding pool was equal to 1.25 percent of the base-operating DRG payments to all participating hospitals, as estimated by the Secretary. The size of the applicable percentage has increased to 1.50 percent for FY 2015 and will increase to 1.75 percent for FY 2016, and to 2.0 percent for FY 2017 and successive fiscal years.
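A brief sketch of the funding mechanics described above follows; the function and any example figures are illustrative assumptions, not program estimates.

```python
# Applicable percent of base operating DRG payments used to fund value-based
# incentive payments, by fiscal year (earlier years omitted; 2.0 percent applies
# to FY 2017 and later).
APPLICABLE_PERCENT = {2014: 0.0125, 2015: 0.0150, 2016: 0.0175}

def estimated_incentive_pool(fiscal_year, estimated_base_operating_drg_payments):
    """Illustrative sketch: the total amount available for value-based incentive
    payments in a fiscal year equals the estimated total payment reductions for
    all participating hospitals (the applicable percent times base operating DRG
    payments), as estimated by the Secretary."""
    pct = APPLICABLE_PERCENT.get(fiscal_year, 0.0200)
    return pct * estimated_base_operating_drg_payments
```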

Section 1886(o)(1)(C) of the Act generally defines the term “hospital” for purposes of the Hospital VBP Program as a subsection (d) hospital (as that term is defined in section 1886(d)(1)(B) of the Act), but excludes from the definition of the term “hospital,” with respect to a fiscal year: (1) A hospital that is subject to the payment reduction under section 1886(b)(3)(B)(viii)(I) of the Act (the Hospital IQR Program) for such fiscal year; (2) a hospital for which, during the performance period for the fiscal year, the Secretary has cited deficiencies that pose immediate jeopardy to the health or safety of patients; and (3) a hospital for which there are not a minimum number (as determined by the Secretary) of measures that apply to the hospital for the performance period for the fiscal year involved, or for which there are not a minimum number (as determined by the Secretary) of cases for the measures that apply to the hospital for the performance period for such fiscal year.

2. Overview of Previous Hospital VBP Program Rulemaking

We refer readers to the Hospital Inpatient VBP Program final rule (76 FR 26490 through 26547), FY 2012 IPPS/LTCH PPS final rule (76 FR 51653 through 51660), CY 2012 OPPS/ASC final rule with comment period (76 FR 74527 through 74547), FY 2013 IPPS/LTCH PPS final rule (77 FR 53567 through 53614), FY 2014 IPPS/LTCH PPS final rule (78 FR 50676 through 50707), and CY 2014 OPPS/ASC final rule with comment period (78 FR 75120 through 75121) for further descriptions of our policies for the Hospital VBP Program.

We have also codified certain requirements for the Hospital VBP Program at 42 CFR 412.160 through 412.167.

3. FY 2015 Payment Details

a. Payment Adjustments

Section 1886(o)(7)(B) of the Act instructs the Secretary to reduce the base operating DRG payment amount for a hospital for each discharge in a fiscal year by an applicable percent. Under section 1886(o)(7)(A) of the Act, the sum total of these reductions in a fiscal year must equal the total amount available for value-based incentive payments for all eligible hospitals for the fiscal year, as estimated by the Secretary. We finalized details on how we would implement these provisions in the FY 2013 IPPS/LTCH PPS final rule (77 FR 53571 through 53573) and refer readers to that rule for further details.

Under section 1886(o)(7)(C)(iii) of the Act, the applicable percent for the FY 2015 Hospital VBP Program is 1.50 percent. In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28117 through 28118), using the methodology we adopted in the FY 2013 IPPS/LTCH PPS final rule (77 FR 53571 through 53573), we estimated that the total amount available for value-based incentive payments for FY 2015 was $1.4 billion, based on the December 2013 update of the FY 2013 MedPAR file. We stated that we intended to update this estimate for the FY 2015 IPPS/LTCH PPS final rule, using the March 2014 update of the FY 2013 MedPAR file. Based on the March 2014 update of the FY 2013 MedPAR file, we continue to estimate that the amount available for value-based incentive payments for FY 2015 is $1.4 billion.

As finalized in the FY 2013 IPPS/LTCH PPS final rule, we will utilize a linear exchange function to translate this estimated amount available into a value-based incentive payment percentage for each hospital, based on its TPS. We will then calculate a value-based incentive payment adjustment factor that will be applied to the base operating DRG payment amount for each discharge occurring in FY 2015, on a per-claim basis. We noted in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28117-28118) that we were publishing proxy value-based incentive payment adjustment factors in Table 16 of that proposed rule (which is available via the Internet on the CMS Web site). The proxy factors are based on the TPSs from the FY 2014 Hospital VBP Program. These FY 2014 performance scores are the most recently available performance scores that hospitals have been given the opportunity to review and correct. The slope of the linear exchange function used to calculate those proxy value-based incentive payment adjustment factors was 2.0952951561. This slope, along with the estimated amount available for value-based incentive payments, was also published in Table 16.

We stated that we intended to update this table as Table 16A in this final rule (which will be available via the Internet on the CMS Web site) to reflect changes based on the March 2014 update to the FY 2013 MedPAR file. We also stated that we intended to update the slope of the linear exchange function used to calculate those updated proxy value-based incentive payment adjustment factors. The slope of the linear exchange function used to calculate those updated proxy value-based incentive payment adjustment factors is 2.0950773214. The updated proxy value-based incentive payment adjustment factors for FY 2015 continue to be based on historic FY 2014 Program TPSs because hospitals will not have been given the opportunity to review and correct their actual TPSs for the FY 2015 Hospital VBP Program until after this FY 2015 IPPS/LTCH PPS final rule is published. After hospitals have been given an opportunity to review and correct their actual TPSs for FY 2015, we will add Table 16B (which will be available via the Internet on the CMS Web site) to display the actual value-based incentive payment adjustment factors, exchange function slope, and estimated amount available for the FY 2015 Hospital VBP Program. We expect that Table 16B will be posted on the CMS Web site in October 2014.
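As a rough sketch of how a linear exchange function might translate a TPS into a per-claim adjustment (the functional form, variable names, and example TPS below are our simplified assumptions; the actual factors are those published in Tables 16A and 16B):

```python
def value_based_incentive_percentage(tps, slope, applicable_percent=0.015):
    """Sketch only: under a linear exchange function, the incentive payment
    percentage a hospital earns back is proportional to its Total Performance
    Score (expressed here as a fraction of the maximum score, 0.0 to 1.0)."""
    return slope * tps * applicable_percent

def proxy_adjustment_factor(tps, slope, applicable_percent=0.015):
    """Sketch only: the per-claim factor nets the applicable percent reduction
    against the incentive percentage earned back."""
    return 1.0 - applicable_percent + value_based_incentive_percentage(tps, slope, applicable_percent)

# Hypothetical TPS of 0.50 with the FY 2015 proxy slope noted above:
print(round(proxy_adjustment_factor(0.50, 2.0950773214), 7))  # about 1.0007131
```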

We received a number of public comments on our stated intention to update Table 16 as Table 16A for the final rule:

Comment: Commenters found Table 16 misleading and urged CMS to adopt a change in the process that would allow for a more meaningful release of information in the proposed rule on Hospital VBP performance. Specifically, commenters stated that Table 16 is not useful to hospitals that attempt to assess their performance in comparison to others when CMS has added or removed new measures and changed the domain weights. As a result, commenters urged CMS to calculate proxy factors using the updated measures and domain weights finalized in last year's rule for FY 2015 so that hospitals are not forced to rely on data provided to them from other entities, such as State hospital associations that provide updated information to their members.

Response: While we understand commenters' concerns with comparing Hospital VBP performance information across program years, we make these calculations using the most recently-available performance data that hospitals have had the opportunity to review, which at the time of the IPPS/LTCH PPS rule's publication does not include the scoring data for the next fiscal year. We do not believe it would be useful to publish proxy factors using domain weights finalized for the next fiscal year without the corresponding performance scoring data from the same program year because that action would mix policies between fiscal years, which is why we have adopted the practice of calculating proxy factors from the previous year. We believe that these calculations represent the most accurate data available at the time of the final rule's publication and appropriately reflect policies for a single program year.

b. Base Operating DRG Payment Amount Definition for Medicare-Dependent, Small Rural Hospitals (MDHs)

Section 106 of Public Law 113-93, the Protecting Access to Medicare Act of 2014 (PAMA), extended the MDH program through March 31, 2015. We note that the special treatment for MDHs under section 1886(o)(7)(D)(ii)(I) of the Act, with regard to the definition of the base operating DRG payment amount, does not apply to discharges occurring after FY 2013.

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28118), for FY 2015 and subsequent years, for purposes of calculating the payment adjustment factors and applying the payment methodology, we proposed that the base operating DRG payment amount for MDHs will include the difference between the hospital-specific payment rate and the Federal payment rate (as applicable). We also proposed to revise the definition of “base operating DRG payment amount” in section 412.160 paragraph (2) of our regulations to reflect this change. We welcomed comments on this proposal.

Comment: One commenter opposed CMS' proposal to revise the definition of base operating DRG payment amount for MDHs to include the difference between the hospital-specific payment rate and the Federal payment rate, noting that, for teaching MDHs, the hospital-specific rate add-on payment amount is inclusive of costs associated with teaching and that the inclusion of such payment would violate the Affordable Care Act. This commenter requested that CMS maintain the current definition of base operating DRG payment amount, which excludes this additional hospital-specific payment rate amount.

Response: We disagree with this comment. Section 1886(o)(7)(D)(i)(II) of the Act requires that the definition of base operating DRG payment amount exclude payments made under section 1886(d)(5)(B) of the Act. While a portion of the hospital-specific rate is related to teaching services provided by teaching MDHs, we do not consider that amount to be a payment under section 1886(d)(5)(B) of the Act. We do not believe that we have authority to exclude the difference between the hospital-specific payment rate and the Federal payment rate for MDHs from the definition of base operating DRG payment amount for discharges after FY 2013.

We did not receive any public comments on the corresponding proposed regulatory revision at 42 CFR 412.160.

After consideration of the public comments we received, we are finalizing our policy, as proposed, to revise the definition of “base operating DRG payment amount” for MDHs to include the difference between the hospital-specific payment rate and the Federal payment rate (as applicable). We also are finalizing the revision to the definition of “base operating DRG payment amount” in section 412.160, paragraph (2), of our regulations, as proposed.

We also received a number of general comments on the Hospital VBP Program:

Comment: Commenters asked CMS to clarify why it did not address FY 2018 Hospital VBP Program requirements in the proposed rule.

Response: We adopted certain FY 2018 policies related to claims-based measures that require a long performance period in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50692 through 50694 and 50698 through 50699). For the same reason, we are adopting certain policies related to FY 2019 and FY 2020 measures in this final rule. We intend to propose additional FY 2018 policies, including additional measures, performance periods, performance standards, and other policies in future rulemaking.

Comment: One commenter expressed concern about the instability and changing requirements of the Hospital VBP Program. The commenter was especially concerned that 60 percent of the measures are calculated based on coding that could result in inaccurate measure rates. The commenter suggested that there be some sort of validation for hospitals performing well to assure that coding practices are being met.

Response: As discussed in the Hospital Inpatient VBP Program final rule (76 FR 26537 through 26538), we have finalized a policy under which we will use the validation process that we use for the Hospital IQR Program to ensure that Hospital VBP data are accurate. As we described in that final rule, we view the Hospital IQR Program's validation processes as sufficient to ensure that Hospital VBP Program data are accurate, and we intend to continue working with stakeholders to develop additional validation processes as necessary to ensure data accuracy for the Hospital VBP Program.

Comment: One commenter urged CMS to put measures in place prior to affecting Medicare payments. The commenter suggested the best way to improve patient care is to “put into practice” a measure and track it over time. According to the commenter, if there is no improvement in the results, the measure could then be included in the Hospital VBP Program.

Response: We interpret the comment as suggesting that we adopt measures for reporting purposes prior to adopting them under the Hospital VBP Program. We note that we can only select measures for the Hospital VBP Program that have been specified under the Hospital IQR Program and publicly reported on the Hospital Compare Web site. However, we appreciate the suggestion that we track measures over time before adopting them for the Hospital VBP Program to ensure that these measures will serve the goals of the program, and we will take the suggestion into consideration as we develop future policies.

Comment: Commenters strongly supported CMS' removal of process measures that use chart-abstracted data and supported the use of outcomes measures.

Response: We thank the commenters for their support.

Comment: One commenter urged CMS to return a hospital's “carve-out” if the hospital is deemed ineligible for the Hospital VBP Program as a result of the policy by which CMS requires that hospitals submit a minimum number of cases and measures across domains in order to receive a Total Performance Score.

Response: Hospitals that are excluded from the Hospital VBP Program for a fiscal year for any reason do not have the applicable percentage withheld from their base operating DRG payment amounts.

Comment: Several commenters stated that they do not believe 2 percent of the amount of Medicare hospital payments is significant enough to drive value-based change in the system. A few commenters suggested that CMS consider alternative ways to align Medicare payments with the policies developed in the Hospital VBP Program to promote more change.

Response: The statute ultimately caps the Hospital VBP Program's funding at 2 percent of base-operating DRG payment amounts, and we view this amount as substantial enough to provide significant incentives to hospitals to improve the quality of care they provide to Medicare beneficiaries.

Comment: Several commenters supported CMS' efforts to align the Hospital VBP Program with existing hospital and physician quality reporting initiatives, including the Physician Value-Based Modifier (VM) Program. One commenter stated that the programs should encourage consistent quality throughout the continuum of care.

However, one commenter cautioned CMS in its goal of increasing alignment between the Hospital VBP and physician quality reporting initiatives because, despite generally supporting alignment between Medicare reporting requirements to decrease the administrative burden on providers, the commenter expressed concern that the Medicare Spending per Beneficiary (MSPB) measure is inappropriate for inclusion in the physician quality reporting programs.

Response: We will consider possible policies aimed at aligning our quality programs across different care settings in future rulemaking. We disagree, however, that the MSPB measure is generally inappropriate for inclusion in physician quality reporting programs. We view measures of efficiency like MSPB as critical components of quality measurement and pay-for-performance programs.

Comment: One commenter suggested that CMS adopt more specific achievement thresholds and benchmarks to draw comparisons between hospitals of similar size, with similar access to technology, specialized staff, and patient populations.

Response: We do not believe that these types of specific adjustments to Hospital VBP Program performance standards are feasible at this time. To implement this change, we would need to incorporate detailed adjustment methodologies in each of the measures that we have adopted for the Hospital VBP Program. We do not believe we have sufficient data on the various comparison points that the commenter suggests to create separate Hospital VBP Program performance standards for different types of hospitals at this time.

Moreover, the Hospital VBP Program's scoring methodology, based on several years' research and policy development, is designed to provide incentives to hospitals based on national performance metrics. As discussed further below, we continue to believe that the scoring methodology appropriately holds hospitals accountable based on established and well-understood metrics. However, we may consider adjustments of the type the commenter suggests in the future as more data becomes available for analyses.

4. Measures for the FY 2017 Hospital VBP Program

a. Measures Previously Adopted

In the FY 2013 IPPS/LTCH PPS final rule, we finalized our proposal to readopt measures from the prior program year for each successive program year, unless proposed and finalized otherwise (for example, because one or more of the measures is “topped-out” or for other policy reasons). We stated our belief that this policy would facilitate measure adoption for the Hospital VBP Program for future years, as well as align the Hospital VBP Program with the Hospital IQR Program (77 FR 53592). The FY 2016 Hospital VBP Program includes the following measures:

Finalized Measures for the FY 2016 Hospital VBP Program

Clinical Process of Care Domain
AMI-7a: Fibrinolytic Therapy Received Within 30 Minutes of Hospital Arrival.
IMM-2: Influenza Immunization.
PN-6: Initial Antibiotic Selection for CAP in Immunocompetent Patient.
SCIP-Inf-2: Prophylactic Antibiotic Selection for Surgical Patients.
SCIP-Inf-3: Prophylactic Antibiotics Discontinued Within 24 Hours After Surgery End Time.
SCIP-Inf-9: Urinary Catheter Removed on Postoperative Day 1 or Postoperative Day 2.
SCIP-Card-2: Surgery Patients on Beta-Blocker Therapy Prior to Arrival Who Received a Beta-Blocker During the Perioperative Period.
SCIP-VTE-2: Surgery Patients Who Received Appropriate Venous Thromboembolism Prophylaxis Within 24 Hours Prior to Surgery to 24 Hours After Surgery.

Patient Experience of Care Domain
HCAHPS: Hospital Consumer Assessment of Healthcare Providers and Systems Survey.

Outcomes Domain
CAUTI: Catheter-Associated Urinary Tract Infection.
CLABSI: Central Line-Associated Blood Stream Infection.
MORT-30-AMI: Acute Myocardial Infarction (AMI) 30-day mortality rate.
MORT-30-HF: Heart Failure (HF) 30-day mortality rate.
MORT-30-PN: Pneumonia (PN) 30-day mortality rate.
PSI-90: Complication/patient safety for selected indicators (composite).
SSI: Surgical Site Infection:
• Colon
• Abdominal Hysterectomy

Efficiency Domain
MSPB-1: Medicare Spending per Beneficiary.

We received a number of comments on measures that we have previously adopted for the Hospital VBP Program.

Comment: Several commenters urged CMS to consider updating and researching the HCAHPS Survey as a whole because the measure has been used for over a decade and the technology and tools have changed in this period of time. Several commenters stated that less expensive survey administration modes should be available to minimize survey costs for participating hospitals. One commenter noted that the methods for delivering the survey are outdated given today's Internet-based society.

Response: While the HCAHPS Survey has been in use for nearly a decade, we continually review the survey and, when warranted, make changes to improve its content, implementation and data submission processes, and public reporting of its results. For instance, in recent years we added five new survey items, including the Care Transition Measure, made the patient-mix adjustment for ‘language spoken at home’ more granular to account for differences among speakers of major languages, investigated the suitability of new modes of survey administration, and made survey results and analytical tools available to the public via downloadable databases on CMS Web sites. We continually examine and refine HCAHPS protocols for survey implementation, oversight, and public reporting to maintain the integrity of the survey and increase the usefulness and accessibility of its results. We will continue to assess, analyze, and improve the HCAHPS Survey to increase its value to consumers and hospitals.

With regard to comments urging us to update the HCAHPS tool, we note that the HCAHPS Survey was purposely designed to accommodate, to the degree possible, the variety of patient survey methodologies hospitals employed prior to the introduction of HCAHPS. Thus, the HCAHPS Survey was made available in four modes of survey administration (mail only; telephone only; mail with telephone follow-up; and Active Interactive Voice Response modes). Hospitals are given the option to either self-administer the survey or engage an approved survey vendor, of which several dozen are listed on the official HCAHPS On-Line Web site, www.HCAHPSonline.org. In addition, hospitals are permitted to add their own supplemental items to the survey.

We are sensitive to the costs of survey administration, especially as patient experience surveys become a standard element of quality improvement and public reporting programs for other types of healthcare providers. In 2008, we conducted a large-scale mode experiment to test the suitability of a Web-based mode of the HCAHPS Survey and concluded that a number of factors, including unavailability of email addresses for a substantial portion of the hospital patient population and low response rates, preclude the adoption of a Web-based mode at this time. We will continue to monitor and periodically evaluate the suitability of alternative, electronic survey modes. We are continuing to look at this issue. In particular, we are tracking access to the Internet among the elderly and minority populations since currently access to the Internet is lower for these critical populations that participate in our surveys.

“A Randomized Experiment Investigating the Suitability of Speech-Enabled IVR and Web Modes for Publicly Reported Surveys of Patients' Experience of Hospital Care.” M.N. Elliott, J.A. Brown, W.G. Lehrman, M.K. Beckett, K. Hambarsoomian, L.A. Giordano and E. Goldstein. Medical Care Research and Review. 70: 165-184. 2013.

Comment: Several commenters expressed concerns about the sufficiency of the risk adjustment of the HCAHPS composite measures. One commenter pointed out that research shows that high-acuity patients score their patient experience at a lower level, systematically disadvantaging hospitals that take on complex and sicker patients, and suggested that CMS incorporate additional adjustments to account for patients' illness severity. One commenter urged CMS to further research broad improvements to the HCAHPS survey delivery and adjustment methodologies. A few commenters suggested that CMS exclude HCAHPS scores from the Hospital VBP Program until risk-adjustments are updated and its validity has been determined.

Response: Research on health care providers indicates that a number of quality measures differ on a regional basis, which is indicative of true differences that should not be obscured by data adjustment.

CMS and the HCAHPS Project Team are familiar with the studies the commenter cited. We also are aware of a number of studies published in peer-reviewed journals that have found that patient experience of care, as measured by the HCAHPS Survey, is strongly and positively related to clinical process measures, outcomes, readmissions, and mortality. For brief reviews of these findings, we refer readers to: “The Patient Experience and Health Outcomes” and “What does the patient know about quality?”

Matthew Manary, William Boulding, Richard Staelin, and Seth Glickman. New England Journal of Medicine, 368 (3): 201-203. 2013.

Karen Luxford. International Journal for Quality in Health Care. 24 (5): 439-440. 2012.

With respect to the articles cited by the commenter, we note that other researchers have cited flaws in the approach, data and methodology employed in the Fenton, et al., study, which did not directly examine the HCAHPS Survey. The study by Lyu, et al. is premised upon the misunderstanding that we use patient experience as the sole criterion for measuring and assessing hospital quality. In addition, their findings, based on examination of 31 hospitals, may insufficiently represent the over 3,000 hospitals that participate in the Hospital VBP Program and the approximately 4,000 hospitals that participate in the Hospital IQR Program.

In addition, a recent national study found a significant positive relationship between patient experience of care and surgical quality, which suggests that incentives to improve surgical patient experience and surgical quality are aligned.

Tsai, et al. “Patient Satisfaction and Quality of Surgical Care in US Hospitals.” Annals of Surgery, 2014.

Comment: One commenter suggested that CMS separate the Cleanliness & Quietness dimension on the HCAHPS Survey, stating that it would be more helpful for consumers to know which element is driving hospitals' performance and improvement in those areas.

Response: “Hospital Environment” is one of eight equally-weighted dimensions in the Patient Experience of Care Domain of the Hospital VBP Program. The Hospital Environment dimension is itself composed of two equally-weighted measures from the HCAHPS Survey: Percent of patients who responded “Always” to the hospital cleanliness item, and percent of patients who responded “Always” to the hospital quietness item. Therefore, the Hospital Environment dimension assigns 5 points to each of the environment measures. The Hospital Environment dimension is given the same weight in Hospital VBP Program as other key HCAHPS measures, such as Communication with Nurses, and Discharge Information (“A Step-by-Step Guide to Calculating the Patient Experience of Care Domain Score in the Hospital Value-Based Purchasing FY 2013 Actual Percentage Payment Summary Report,” available on HCAHPS On-Line Web site at: http://www.hcahpsonline.org/HospitalVBP.aspx.) While the two environment measures have been combined in the Hospital VBP Program, consumers can see how hospitals perform on cleanliness and quietness separately by examining the measure scores posted on the Hospital Compare Web site.
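A minimal sketch of the equal weighting described above follows; the simple averaging shown is an illustrative assumption and omits the program's achievement and improvement scoring rules.

```python
def hospital_environment_dimension(cleanliness_always_pct, quietness_always_pct):
    """Sketch only: the Hospital Environment dimension combines two equally weighted
    HCAHPS measures (the percent of patients responding "Always" to the cleanliness
    item and to the quietness item), so each contributes half of the dimension."""
    return 0.5 * cleanliness_always_pct + 0.5 * quietness_always_pct

# Hypothetical "Always" percentages for cleanliness and quietness:
print(hospital_environment_dimension(74.0, 62.0))  # 68.0
```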

Comment: One commenter urged CMS to reevaluate the validity of questions used on the HCAHPS Survey related to pain management, including whether the survey appropriately reflects patient satisfaction and whether or not it may encourage inappropriate treatment. The commenter expressed concern about the abuse of opioid pain relievers in hospital settings. The commenter explained that the HCAHPS Survey principally focuses on effective use of pharmacotherapy, which may be consistent with the patient's wishes but is not always in his or her best interest.

Response: The Pain Management domain is derived from three items on the HCAHPS Survey. It is important to note that the HCAHPS Survey is designed to capture and report patient experience of care at the hospital level, not at the level of the individual physician, and that only adult inpatients are eligible for the HCAHPS Survey (emergency room patients would be eligible for the survey only if they were subsequently admitted as inpatients). The HCAHPS sampling protocol does not support reliable measurement of performance at the physician level. Any use of the HCAHPS Survey to evaluate individual physicians is inconsistent with our guidance.

We understand and share the commenter's concerns about the rising level of abuse of opioid pain relievers in the United States. The HCAHPS Survey includes three questions about pain control to measure and publicly report patient experience with this common, yet critical, aspect of hospitalization; and neither the patient nor the physician(s) is identified in survey data submitted to CMS. Pain control is an important part of patient care in a hospital and should be evaluated at the hospital level. There are non-opioid options for pain control that many hospitals use.

All items on the HCAHPS Survey have been carefully constructed and tested, both in the field and in focus groups of patients and caregivers. The statistical reliability of the Pain Management domain was 0.80 in 2013.

Sofaer, et al., Health Services Research, 40: 6, 2018-2036. 2005.

We share the commenter's commitment to reducing abuse of opioids and will reach out to hospitals and physicians to help them more fully understand the capacities and limits of the HCAHPS Survey in this regard and will pursue further research on the wording of the pain management items in the HCAHPS Survey.

Comment: One commenter proposed replacing the MSPB measure with the NQF-endorsed Relative Resource Use (RRU) measure, or a measure designed to track health care resource use by providers, health plans, or other units for individuals having one of five chronic diseases (COPD, cardiovascular disease, diabetes, asthma, and hypertension).

Response: We disagree that the MSPB measure should be replaced with an RRU measure. We note that the MSPB measure is also NQF-endorsed. Inclusion of an overall measure of cost is an essential complement to the condition-specific measures included in the clinical process of care and outcomes domains. Relying on condition-specific measures alone, such as RRU measures, would disregard differences in overall cost. The MSPB measure is reported as a ratio of the payment-standardized, risk-adjusted MSPB amount for each hospital divided by the weighted median MSPB amount across all hospitals. As discussed in section IV.I.6.b. of the preamble of this final rule (Possible Future Efficiency and Cost Reduction Domain Measure Topics), we are considering expansion of the Efficiency and Cost Control domain to include six condition-specific Medicare payment measures (three medical and three surgical condition-specific episodes) in addition to the MSPB measure and would do so through public notice and comment rulemaking.
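The ratio form of the MSPB measure described above can be written as follows (notation ours):

```latex
% Sketch of the reported MSPB measure: a value below 1.0 indicates spending
% below the national weighted median, after payment standardization and risk
% adjustment.
\[
\text{MSPB measure for hospital } h
  \;=\; \frac{\text{payment-standardized, risk-adjusted MSPB amount for hospital } h}
             {\text{weighted median MSPB amount across all hospitals}}
\]
```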

Comment: Several commenters recommended that CMS refine its policies on risk-adjustment in the MSPB and other measures to include socioeconomic status because a patient's socioeconomic status affects clinical outcomes.

Commenters explained that comorbidities, socioeconomic status, and sociodemographic factors are major determinants of outcomes, and that penalizing physicians and hospitals for readmissions of the most chronically ill patients without proper risk adjustment could produce unintended negative consequences. Commenters stated that, without a risk-adjustment factor, hospitals treating these patients become subject to penalties for readmissions not related to the care provided, as well as penalties for extending an inpatient stay in order to better optimize the patient's health status. Further, commenters suggested that hospitals that serve a disproportionate share of these patients could conceivably do the most to improve their health status but are disproportionately penalized without a risk adjustment.

One commenter suggested that CMS examine NQF's Risk Adjustment for Socioeconomic Status or Other Sociodemographic Factors Draft Report for determining the appropriate risk adjustment methodology for the Hospital VBP Program. (Draft Report available at: http://www.qualityforum.org/Risk_Adjustment_SES.aspx.) Another commenter strongly supported the recommendations contained in the draft report and urged CMS to accordingly modify its risk-adjustment methodology to include such factors.

Response: We appreciate these comments and the importance of the role that SES plays in the care of patients. With regard to the MSPB measure's risk adjustment specifically, we note that the MSPB measure was finalized in the FY 2012 IPPS/LTCH PPS final rule (76 FR 51619 through 51627). In that rule, we addressed concerns about risk adjustment. We are aware that there are differing opinions regarding our current approach in risk-adjusting measures in the Hospital Readmissions Reduction Program for SES. We note that the readmission measures aim to reveal differences related to the quality of care provided. We believe that quality of care received by patients of lower SES contributes at least in part to the observed association between SES status and the readmissions rate. We continue to have concerns about holding hospitals to different standards for the outcomes of their patients of low SES—we do not want to mask potential disparities or minimize incentives to improve the outcomes of disadvantaged populations.

We routinely monitor the impact of SES on hospitals' results. To date, we have found that hospitals that care for large proportions of patients of low SES are capable of performing well on our measures (see the 2013 Medicare Hospital Quality Chart Book on pages 46 through 53 at: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/HospitalQualityInits/Downloads/-Medicare-Hospital-Quality-Chartbook-2013.pdf). Previous analyses presented at the NQF during endorsement proceedings of the Hospital-Wide All-Cause Unplanned Readmission Measure (available at: http://www.qualityforum.org/WorkArea/linkit.aspx?LinkIdentifier=id&ItemID=70813) also show that adding SES to the risk-adjustment has a negligible impact on hospitals' risk-standardized rates. The risk adjustment for clinical factors likely captures much of the variation due to SES, therefore resulting in an attenuation of the impact of SES factors on hospitals' results.

We continue to monitor related activities at NQF, such as the July 23, 2014 decision by the NQF Board in which the Board approved a trial period to test the impact of sociodemographic factor risk adjustment of performance measures (available at: http://www.qualityforum.org/Press_Release/2014/NQF_Board_Approves_Trial_Risk_Adjustment.aspx), and in Congress. As we stated in the past, we are committed to working with the NQF and other stakeholder communities to continuously refine our measures and to address the concerns associated with SES and risk adjustment. We believe that continued collaboration with the stakeholder communities will enable us to identify feasible ways to appropriately address any unintended consequences for providers serving high proportions of low-SES patients.

Comment: Some commenters expressed concern about the influence of factors that are outside the hospital's control on the MSPB measure and the lack of associated quality or outcome measures. One of these commenters stated that any measures focusing exclusively on cost such as the MSPB measure create incentives to reduce services in ways that adversely affect patient outcomes and that such cost measures also create disincentives to adopt new technologies.

One commenter expressed concern that the measure does not track the frequency of hospitalization, noting that a community that reduces avoidable hospitalizations may experience higher per-hospitalization costs, even if overall costs go down.

Response: Regarding the commenters' concern with the degree of the hospital's control over the MSPB measure, we continue to disagree that care furnished to beneficiaries after they are discharged from an acute care hospital is outside of the hospital's control. As we stated in the FY 2012 IPPS/LTCH PPS final rule, we believe that hospitals that provide quality inpatient care, conduct appropriate discharge planning, and work with providers and suppliers on appropriate follow-up care can achieve efficiencies and perform well on the measure (76 FR 51621).

Regarding the comment that the MSPB measure does not account for quality, we continue to agree that it is beneficial to view a cost measure in light of other quality measures. We do not believe that including a measure of cost, independent of quality, in the Hospital VBP Program would result in a reduction of needed services or in a disincentive to develop new technologies because, as we stated in the FY 2012 IPPS/LTCH PPS final rule, for purposes of the Hospital VBP Program, we will weight and combine the Efficiency and Cost Control domain with the other domain scores in order to calculate each hospital's TPS, ensuring that the MSPB measure and any other Efficiency and Cost Control domain measures we adopt make up only a portion of the TPS and that the remainder is based on hospitals' performance on the other quality measures (76 FR 51622). As we stated in the FY 2013 IPPS/LTCH PPS final rule, section 1886(o)(2)(B)(ii) of the Act expressly requires the inclusion of “measures of Medicare spending per beneficiary” in the Hospital VBP Program. We do not believe that the MSPB measure itself should assess both cost and quality. We believe that the inclusion of a distinct measure of cost, independent of quality, as part of the Hospital VBP Program enables us to identify hospitals involved in the provision of high quality care at a lower cost to Medicare (77 FR 53586).

With regard to tracking the frequency of hospital admissions, we do not believe that the measure would adversely affect communities involved in minimizing hospitalizations because the risk adjustment takes into account the severity of illness of hospitalized beneficiaries so that hospitals admitting more complex patients would have their Medicare spending compared to the expected spending for similarly complex patients.

Comment: One commenter recommended that CMS delay any further implementation of the MSPB measure until after the Physician VM Program is implemented, stating that hospitals should not be expected to bear the consequences of physicians' decisions.

Response: We agree that alignment of incentives across programs is important. In the CY 2014 Physician Fee Schedule final rule (78 FR 74774 through 74780), we finalized the inclusion of the MSPB amount in the cost composite portion of the physician value-based modifier (VM), beginning with the 2016 VM. We do not believe that it would be appropriate to suspend the further use of the MSPB measure until after the VM is implemented.

We continue to believe, as we stated in the FY 2012 IPPS/LTCH PPS final rule, that the MSPB measure is an important step in encouraging hospitals to redesign and coordinate care with other providers and suppliers of care, and that its timely implementation is critical to incentivizing hospitals to provide the highest-quality, most efficient care possible to Medicare beneficiaries (76 FR 51657).

Comment: One commenter expressed concern that the MSPB measure overlaps conceptually with the Hospital Readmissions Reduction Program, as hospitals are already being penalized for excessive readmissions under that program. The commenter urged CMS to reevaluate the MSPB measures so that CMS does not place disproportionate domain weighting on spending outside of hospitals' control.

Response: We disagree that the MSPB measure inappropriately overlaps with measures used in the Hospital Readmissions Reduction Program. As we stated in the FY 2012 IPPS/LTCH PPS final rule, the MSPB measure is not a measure of readmission rates, but rather it is a measure of total Medicare spending per beneficiary, relative to a hospital stay. A Medicare spending per beneficiary measure is required by section 1886(o)(2)(B)(ii) of the Act to be included in the Hospital VBP Program, and therefore, in the Hospital IQR Program. We also continue to believe that the Medicare payments made for readmissions must be attributable to the index hospital stay, in order to: fully capture Medicare spending relative to a hospital stay; encourage the provision of comprehensive inpatient care, discharge planning, and follow-up; and strengthen incentives to reduce readmissions (76 FR 51621).

We further disagree, as we stated earlier, that the MSPB measure represents services that are outside of the hospital's control. As we stated above, and in the FY 2012 IPPS/LTCH PPS final rule, we believe that hospitals that provide quality inpatient care, conduct appropriate discharge planning, and work with providers and suppliers on appropriate follow-up care can achieve efficiencies and perform well on the measure (76 FR 51621).

We thank commenters for this feedback.

b. Changes Affecting “Topped-Out” Measures

(1) Removal of Six “Topped-Out” Measures

For the FY 2017 Hospital VBP Program measure set, we evaluated whether any measures that we previously adopted are now “topped-out” by focusing on two criteria: (1) national measure data showing statistically indistinguishable performance levels at the 75th and 90th percentiles; and (2) national measure data showing a truncated coefficient of variation (TCV) less than 0.10. We refer readers to the Hospital Inpatient VBP Program final rule (76 FR 26496 through 26497) for further discussion of these current “topped-out” criteria and to our proposal below to modify the second criterion.
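A minimal sketch of applying these two screens follows; the way the 75th and 90th percentiles are compared (an exact-equality tolerance) and the truncation rule used for the coefficient of variation (trimming the top and bottom 5 percent of hospital rates) are our illustrative assumptions, not the program's precise specification.

```python
import numpy as np

def truncated_cv(rates, trim=0.05):
    """Sketch only: coefficient of variation computed after trimming the top and
    bottom 5 percent of hospital performance rates (truncation rule assumed)."""
    rates = np.sort(np.asarray(rates, dtype=float))
    k = int(len(rates) * trim)
    trimmed = rates[k:len(rates) - k] if k > 0 else rates
    return trimmed.std(ddof=1) / trimmed.mean()

def looks_topped_out(rates, percentile_tolerance=0.0):
    """Sketch only: flag a measure when performance at the 75th and 90th
    percentiles is effectively indistinguishable and the truncated coefficient
    of variation is below 0.10."""
    p75, p90 = np.percentile(rates, [75, 90])
    return (p90 - p75) <= percentile_tolerance and truncated_cv(rates) < 0.10
```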

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28119), based on our evaluation of the most recently available data, we stated our belief that PN-6, SCIP-Card-2, SCIP-Inf-2, SCIP-Inf-3, SCIP-Inf-9, and SCIP-VTE-2 are all now “topped-out.” Therefore, we proposed to remove these six measures from the FY 2017 Hospital VBP measure set because measuring hospital performance on these measures will have no meaningful effect on a hospital's TPS. We believe that removing these “topped-out” measures will continue to ensure that we make valid statistical comparisons through our finalized scoring methodology and will reduce the reporting burden on participating hospitals.

We welcomed public comments on this proposal.

Comment: Many commenters supported CMS' proposal to remove “topped-out” measures, expressing appreciation for our efforts to streamline the program.

Response: We thank the commenters for their support.

Comment: One commenter suggested that CMS flag additional measures that are approaching “topped-out” status in future rulemaking.

Response: We thank the commenter for this suggestion and will take it into consideration in future rulemaking.

Comment: Commenters urged caution with CMS' proposed removal of “topped-out” measures, stating that several are only recently “topped-out.” Commenters also suggested that CMS consider adding more measures to the Hospital VBP Program to make up for the proposed removal of “topped-out” measures and to ensure that no single measure has a disproportionate impact on hospital performance in more than one program.

Response: We will consider new measures as they become eligible for inclusion in the Hospital VBP Program.

Comment: One commenter suggested that identified “topped-out” measures remain available in other reporting programs because the commenter believes that reporting these six specific measures has contributed to recent increases and emphasis on improved healthcare quality in hospitals, with a significant impact on local improvement efforts.

Response: While we appreciate the commenter's observation that quality reporting has contributed to improved healthcare quality in hospitals, we believe that a measure's "topped-out" status should be assessed statistically to supplement a clinically based assessment of the measure's impact on its clinical topic or domain.

Comment: Commenters supported the removal of the “topped-out” measures but expressed confusion at why the measures will not be removed sooner than 2017.

Response: We evaluate the Clinical Care—Process domain measures for "topped-out" status on an annual basis so that we can propose changes, if necessary, through rulemaking, and we do not believe it would be helpful to participating hospitals to remove measures in the middle of a performance period that was adopted in previous rulemaking. For example, we are currently in the middle of the Clinical Process of Care domain's performance period for the FY 2016 Hospital VBP Program, which was adopted with a CY 2014 performance period. We do not believe it would be helpful to hospitals to attempt to retire a measure in the middle of their performance period, barring substantial extenuating circumstances. We believe removing these measures for the FY 2017 Hospital VBP Program, which was adopted with a CY 2015 performance period, is the most feasible approach.

Comment: Some commenters suggested that CMS retire measures when their evidentiary basis has changed, when the collection and measurement costs exceed their utility, or when measures have been demonstrated to have minimal impact on health outcomes and status.

Response: We thank the commenters for their suggestions, and may consider additional “topped-out” criteria in future rulemaking.

After consideration of the public comments we received, we are finalizing our proposal to remove PN-6, SCIP-Card-2, SCIP-Inf-2, SCIP-Inf-3, SCIP-Inf-9, and SCIP-VTE-2 from the FY 2017 measure set due to their being “topped-out.”

(2) Change to Truncated Coefficient of Variation Criterion to Determine Whether a Measure is “Topped-Out”

As stated above, we have adopted two criteria for determining the “topped-out” status of Hospital VBP Program measures:

  • Statistically indistinguishable performance at the 75th and 90th percentiles; and
  • Truncated coefficient of variation < 0.10.

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28119), we proposed to modify the second criterion to the following:

  • Truncated coefficient of variation ≤ 0.10.

The coefficient of variation (CV) is a common statistic that expresses the standard deviation as a percentage of the sample mean in a way that is independent of the units of observation. Applied to this analysis, a large CV would indicate a broad distribution of individual hospital scores, with large and presumably meaningful differences between hospitals in relative performance. A small CV would indicate that the distribution of individual hospital scores is clustered tightly around the mean value, suggesting that it is not useful to draw distinctions among individual hospitals' measure performance. By proposing to change the truncated CV criterion from "less than" to "less than or equal to" 0.10 under our "topped-out" test, we will be better able to distinguish measures with significant variation in performance among hospitals and more accurately determine which measures are "topped-out" for purposes of the Program.
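To make the two screening criteria above concrete, the following Python sketch applies them to a vector of hospital scores. It is illustrative only: the 5 percent trim fraction used for truncation and the simple closeness test standing in for "statistically indistinguishable" 75th and 90th percentiles are assumptions for the example, not the program's official specification.

```python
import numpy as np

def truncated_cv(scores, trim=0.05):
    """Coefficient of variation after trimming both tails.

    The 5 percent trim fraction is an illustrative assumption, not the
    program's official truncation rule.
    """
    lo, hi = np.quantile(scores, [trim, 1 - trim])
    truncated = scores[(scores >= lo) & (scores <= hi)]
    return truncated.std(ddof=1) / truncated.mean()

def looks_topped_out(scores, tcv_threshold=0.10):
    """Apply the two screening criteria described above.

    "Statistically indistinguishable" percentiles are approximated here by a
    simple closeness test; this passage does not specify a particular test.
    """
    p75, p90 = np.quantile(scores, [0.75, 0.90])
    percentiles_indistinguishable = np.isclose(p75, p90, atol=1e-4)
    tcv_small = truncated_cv(scores) <= tcv_threshold  # finalized criterion: <=, not <
    return bool(percentiles_indistinguishable and tcv_small)

# Hypothetical process measure on which nearly every hospital scores near 1.0
rng = np.random.default_rng(0)
scores = np.clip(rng.normal(0.998, 0.004, size=3000), 0, 1)
print(looks_topped_out(scores))  # True for this tightly clustered distribution
```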

We welcomed public comments on this proposal.

Comment: Commenters agreed with the methodology regarding calculations to determine whether a measure is topped-out, and agreed with the proposal to alter the threshold from “less than 0.10” to “less than or equal to 0.10.”

Response: We thank the commenters for their support.

After consideration of the public comments we received, we are finalizing our modification to the truncated coefficient of variation criterion for determining whether a measure is “topped-out” as proposed.

c. New Measures for the FY 2017 Hospital VBP Program

We considered whether we should adopt additional measures for the FY 2017 Hospital VBP Program. We considered which measures are eligible for adoption based on the statutory requirements, including specification under the Hospital IQR Program and posting dates on the Hospital Compare Web site, and our priorities for quality improvement as outlined in the National Quality Strategy (NQS) (available for download at: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/QualityInitiativesGenInfo/Downloads/CMS-Quality-Strategy.pdf).

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28119 through 28121) we stated that we believe that the following three proposed measures meet the statutory requirements for inclusion in the FY 2017 Hospital VBP Program. We also believe that these measures represent important components of quality improvement in the acute inpatient hospital setting.

We received a number of general comments on quality measures for the Hospital VBP Program:

Comment: Many commenters supported CMS' proposals to adopt MRSA, C. difficile Infection, and PC-01 for the FY 2017 Program. These commenters believed that the measures are appropriate for the Program and will have been publicly posted on Hospital Compare in accordance with the Hospital VBP Program's statute.

Response: We agree and thank the commenters.

Comment: Commenters supported CMS' proposal to readopt the IMM-2 measure for FY 2017 and suggested that CMS consider adopting additional immunization measures in the future.

Response: As with other suggested measure topics, we will consider new measures as they become available to us under the statutory requirements for the Hospital VBP Program.

Comment: A few commenters recommended that the Hospital VBP Program should include a mix of measures, including measures that would test adherence to evidence-based medical interventions.

Response: We agree, and we have attempted to introduce a variety of quality measure types into the Hospital VBP Program, including measures of processes, outcomes, and efficiency.

Comment: One commenter believed that all measures in the Hospital IQR, HAC Reduction, and Hospital VBP Programs should be NQF-endorsed before their adoption by CMS, because NQF-endorsement ensures that the measures have been evaluated by a panel of experts in quality measurement. The commenter therefore supported the removal of measures that have lost NQF-endorsement.

Response: We note that the Hospital VBP Program relies on data submitted under the Hospital IQR Program, and the Hospital IQR Program's statute enables us to select measures that have not been endorsed by NQF, as long as due consideration is given to measures that have been endorsed or adopted by a consensus organization identified by the Secretary. Our statistical and clinical assessment of the measures chosen for adoption in the Hospital VBP Program supports our belief that the measures are sufficiently valid and reliable. Each measure has been used in the Hospital IQR Program for at least one year, and we believe each measure we adopt will improve patient outcomes.

Comment: Commenters suggested that CMS consider exploring measures related to sepsis mortality as an alternative to current proposals. Commenters recommended that CMS prioritize the development of quality measures that promote nutrition screening and assessment of nutrition interventions. Additional commenters recommended that CMS consider measures of advance care planning, malnutrition care, measures related to diabetes, atrial fibrillation, COPD, and oncology, additional process measures, immunization measures, and a measure of all-cause readmission. Other commenters suggested that CMS consider PSI-4: Death among surgical inpatients with serious treatable complications, COPD 30-day mortality, and AMI Payment per Episode for the Hospital VBP Program.

Additional commenters suggested that CMS consider adopting STK-1 (venous thromboembolism (VTE) prophylaxis); STK-2 (discharged on antithrombotic therapy); and STK-4 (percentage of eligible patients receiving thrombolytic therapy within 0-3 hours of symptom onset). One commenter specifically recommended the STK-4 measure for the Hospital VBP Program because it is e-specified and has not been deemed "topped-out."

Response: We will consider new measures for the Hospital VBP Program as they become eligible for inclusion in the measure set. We note, however, that section 1886(o)(2)(A) of the Act specifically excludes measures of readmissions from the Hospital VBP Program.

Comment: One commenter urged CMS to expand the Surgical Site Infection list within the Outcomes domain to include Major Joint Replacement Surgeries and Spine procedures so that surgical specialty hospitals are able to participate in future Hospital VBP Programs. Otherwise, the commenter believed, hospitals that qualify for the Hospital VBP Program, and whose excellent performance records bolster the overall quality and efficacy of the program, may be excluded because the SSI list does not include these common procedures which make up the majority of the procedures they perform.

Response: We thank the commenter for this suggestion. We are continuously evaluating the program and working to identify new, potentially suitable measures to fill measure gaps. We appreciate the commenter's input for measure selection and will take this feedback into consideration in future rulemaking. We note that CDC maintains ongoing collaborations with a number of professional surgical organizations and is currently in the process of developing additional SSI metrics for higher volume surgical procedures. Once these measures are finalized, we may consider them for future inclusion in our quality reporting and pay for performance programs.

Comment: Many commenters expressed concern that measures in both the Hospital VBP and HAC Reduction Programs overlap. Commenters pointed to a wide variety of concerns, including: Multiple competing benchmarks, various penalty calculation methodologies, wasting precious resources, and the potential for confusion among hospitals and beneficiaries.

Many commenters noted that using measures in both HAC Reduction and the Hospital VBP Programs potentially penalizes participating hospitals twice, or could result in instances where hospitals perform well in one program and are penalized in the other. Another commenter stated that the overlap inappropriately magnifies the impact and importance of the measures.

Some commenters were concerned that the overlap between measures in the HAC Reduction and Hospital VBP Programs may create a defeatist attitude among certain hospitals that are disproportionately affected, such as safety net hospitals. Commenters noted that such duplication between quality programs could draw needed dollars away from the very organizations that need to be focusing in this area.

Response: We acknowledge that there is some overlap in quality measures between the Hospital VBP Program and the HAC Reduction Program. While we are aware that commenters object to the possibility of scoring hospitals on certain measures under both programs, we note that these measures cover topics of critical importance to quality improvement in the inpatient hospital setting and to patient safety. We selected these quality measures because we believe that HAC measures address some of the most critical patient safety areas, thereby justifying the use of these measures in more than one program. The MRSA Bacteremia and C. difficile Infection measures that we proposed to adopt track infections that could cause significant health risks to Medicare patients, and we believe it is appropriate to provide incentives for hospitals to avoid them under more than one program.

We further stress that the HAC Reduction Program and the Hospital VBP Program are separate programs with different purposes and policy goals. For example, the HAC Reduction Program is a program that reduces payments to hospitals for excess HACs to increase patient safety in hospitals. On the other hand, the Hospital VBP Program is an incentive program that redistributes a portion of the Medicare payments made to hospitals based on their performance on various measures. Therefore, although the measures exist in more than one program, the measures are used and calculated for very distinct purposes. Accordingly, as stated above, we believe that the critical importance of these measures to patient safety warrants their inclusion in both programs. We will, in the future, monitor the HAC Reduction and Hospital VBP Programs and analyze the impact of our measures selection, including any unintended consequences with having a measure in more than one program, and will revise the measure set in one or both programs if needed.

Comment: Many commenters stated that CMS' proposed measures for FY 2017, despite appearing to have the potential to be positive additions to the program, have not been publicly reported on the Hospital Compare Web site for 1 year as required by the Act.

Response: Section 1886(o)(2)(C)(i) of the Act requires that measures must have been “included on the Hospital Compare Internet Web site for at least 1 year prior to the beginning of the performance period.” As commenters noted, we first reported these measures' data in December 2013, and have proposed an FY 2017 performance period for these measures of CY 2015, which complies with the statutory requirement in section 1886(o)(2)(C)(i) of the Act. Accordingly, we believe that the three proposed measures meet the statutory requirements for inclusion in the FY 2017 Hospital VBP Program. We also believe that these measures represent important components of quality improvement in the acute inpatient hospital setting. However, to the extent that there remains any question regarding our interpretation of section 1886(o)(2)(C)(i) of the Act, we are finalizing that the effective date of the new FY 2017 measures, PC-01, MRSA Bacteremia, and C. difficile Infection, will be January 1, 2015, consistent with the beginning of the performance period for those measures.

Comment: One commenter believed that the AMI-7a measure is inappropriate for the Hospital VBP Program because it does not apply to most hospitals due to a low volume of cases.

Response: While we understand that many hospitals do not provide services that would be measured by the AMI-7a measure, the finalized Hospital VBP Program scoring methodology does not penalize hospitals that do not have sufficient cases for that measure, or any measures that we have adopted. Even if the measure will only apply to a small number of hospitals, we believe that this measure accomplishes the goals of the Hospital VBP Program and will improve patient outcomes in the hospitals where the measure will apply. We will consider proposing removal of this measure in future policy making.

Comment: Commenters expressed continued concern about the three 30-day mortality measures that we have adopted and placed into the Clinical Care—Outcomes domain. Commenters stated that the measures do not meaningfully reflect hospital performance because they do not meet the lower limit of moderate reliability identified by CMS' analytical contractor in a 2012 report. Commenters expressed their appreciation for our adoption of longer performance periods for these measures, but noted that even at 24 months, the measures' reliability is significantly less than we require for chart-abstracted measures. Commenters suggested that we consider a plan to improve or replace the mortality measures and consider reducing the domain weighting allocated to the Clinical Care—Outcomes domain in the meantime.

Response: As we stated in the FY 2013 IPPS/LTCH PPS final rule (77 FR 53591) and the FY 2014 IPPS/LTCH PPS final rule (78 FR 50693), we believe that the mortality measures capture important quality data for purposes of the Hospital VBP Program. We believe that the three 30-day mortality measures are sufficiently reliable for inclusion in the Hospital VBP Program, particularly in light of our finalized policies to set a 25 case minimum and to extend the performance period's duration for these measures over successive years to reach 36 months.

Comment: One commenter requested that CMS change the mortality measures' populations to ensure that the same patient is not counted under more than one measure. The commenter explained that its mortality measure scores had been adversely affected by a patient that had been counted under both pneumonia and AMI mortality.

Response: If a patient was hospitalized for AMI and for pneumonia on different dates and died within 30 days of the first hospitalization during the three-year time frame we used to calculate the mortality measures, the patient could be included in both the AMI and pneumonia mortality measures. However, the mortality measure cohorts are determined by the principal diagnosis on the index hospitalization claim (that is, the denominator is defined in terms of discharges/admissions, not patients). Because there is only one principal diagnosis on each claim, it is not likely that a specific patient's claim or admission would be counted in both the AMI and pneumonia measures.
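As an illustration of the cohort logic described in this response, the short Python sketch below assigns each index admission to at most one mortality cohort based on its single principal diagnosis; the diagnosis groupings are hypothetical stand-ins for the measures' full ICD code lists.

```python
# Hypothetical principal-diagnosis groupings; the actual measures rely on
# detailed ICD code lists maintained in the measure specifications.
COHORT_BY_PRINCIPAL_DX = {
    "AMI": "MORT-30-AMI",
    "HF": "MORT-30-HF",
    "PNEUMONIA": "MORT-30-PN",
}

def cohort_for_admission(principal_dx_group):
    """Each index admission carries exactly one principal diagnosis, so it
    is assigned to at most one mortality cohort."""
    return COHORT_BY_PRINCIPAL_DX.get(principal_dx_group)

# One patient with two separate index admissions on different dates can
# appear in two measures, but each admission is counted only once because
# the denominator is defined in terms of admissions, not patients.
admissions = [
    {"patient": "A", "date": "2013-02-01", "principal_dx_group": "AMI"},
    {"patient": "A", "date": "2013-06-15", "principal_dx_group": "PNEUMONIA"},
]
for adm in admissions:
    print(adm["date"], cohort_for_admission(adm["principal_dx_group"]))
```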

(1) Methicillin-Resistant Staphylococcus aureus (MRSA) Bacteremia (NQF #1716)

Methicillin-Resistant Staphylococcus aureus (MRSA) Bacteremia (NQF #1716) is a risk-adjusted outcome measure monitoring hospital onset of MRSA Bacteremia bloodstream infection events using the standardized infection ratio (MRSA Bacteremia SIR) among all inpatients in the facility. The MRSA Bacteremia SIR is reported via the Centers for Disease Control and Prevention's (CDC) National Healthcare Safety Network (NHSN). We adopted this measure beginning with the FY 2015 payment determination under the Hospital IQR Program in the FY 2012 IPPS/LTCH PPS final rule (76 FR 51630). Initial measure data were posted on Hospital Compare in December 2013.
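For readers unfamiliar with the SIR construct referenced in this section, the arithmetic is a ratio of observed to predicted infection events. The sketch below uses made-up numbers and an assumed minimum-predicted-events cutoff for illustration; in practice, the predicted count comes from CDC's NHSN baseline risk models.

```python
def standardized_infection_ratio(observed_events, predicted_events, min_predicted=1.0):
    """SIR = observed infection events / predicted infection events.

    A value of 1.0 means the facility reported about as many events as
    predicted for its patient and location mix; values below 1.0 mean fewer,
    values above 1.0 mean more. The min_predicted cutoff is an assumption
    used here to illustrate why very small facilities may not receive an SIR.
    """
    if predicted_events < min_predicted:
        return None  # too few predicted events for a stable ratio
    return observed_events / predicted_events

# Example: 4 hospital-onset MRSA bacteremia events against 6.2 predicted
print(standardized_infection_ratio(4, 6.2))  # approximately 0.65
```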

We remain concerned about the persistent public health threat presented by MRSA Bacteremia infections. According to a 2013 study available at the National Institutes of Health's Web site, MRSA Bacteremia "results in longer hospitalization, increased expenses, and poorer patient prognosis" and "has been swiftly increasing worldwide over the past several decades." As we noted in the FY 2012 IPPS/LTCH PPS final rule (76 FR 51630), invasive MRSA infections may cause about 18,000 deaths during a hospital stay each year.

Tatokoro et al. BMC Urology 2013, 13:35. Available at http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3720197/pdf/1471-2490-13-35.pdf

Catherine Liu, Arnold Bayer, et al., Clinical Practice Guidelines for the Treatment of Methicillin-Resistant Staphylococcus aureus Infections in Adults and Children. Infectious Diseases Society of America 2011; 52:e18.

The Measure Application Partnership (MAP) supported the direction of the MRSA Bacteremia measure for inclusion in the Hospital VBP Program in the MAP Pre-Rulemaking Report: 2013 Recommendations on Measures Under Consideration by HHS, found at: https://www.qualityforum.org/WorkArea/linkit.aspx?LinkIdentifier=id&ItemID=72746. The MAP noted that the measure addresses an NQS priority not adequately addressed in the program measure set, that the measure should be applied following public reporting on Hospital Compare, and that the most recent version of the NQF-endorsed measure should be applied.

We believe that this measure is eligible for the Hospital VBP Program based on the MAP recommendation, our adoption of the most recent NQF-endorsed version under the Hospital IQR Program, and our posting of measure data on Hospital Compare. Based on the continued danger that MRSA Bacteremia infections present to patients and to public health, we further believe that this measure is appropriate for the Hospital VBP Program. Therefore, we proposed to adopt the MRSA Bacteremia measure for the FY 2017 Hospital VBP Program, and we proposed to place the measure into the Safety domain.

We invited public comment on this proposal.

Comment: Commenters supported CMS' proposal to adopt MRSA Bacteremia and C. difficile infection measures for the FY 2017 Program, stating that the measures will provide incentives for hospitals to employ appropriate infection control and prevention and antimicrobial stewardship programs. (CMS discusses C. difficile infection in more detail in the next section). Another commenter noted that the measure is a first step towards encouraging hospitals to focus on prevention and appropriate treatment of these infections.

One commenter noted that quality measures implemented in the U.K. had a positive effect on C. difficile infections and treatments and that appropriate treatment of C. difficile infections have important implications for patient outcomes, society, and the reduction of healthcare expenditures. Another commenter noted that MRSA Bacteremia and C. difficile infections are both largely preventable diseases. Another commenter expressed specific support for CMS' proposal to adopt the C. difficile Infection measure, stating that stoma care management is necessary at all clinical stages to avoid life threatening and costly infections.

Response: We thank the commenters for their support.

Comment: One commenter urged CMS to delay use of the MRSA Bacteremia and C. difficile measures until FY 2018 because, while the measures are NQF-endorsed, the MAP did not fully support them for the Hospital VBP Program. The commenter stated that the MAP voted to “support direction” and noted that the measure should be publicly reported for a sufficient amount of time prior to being added to the Hospital VBP Program.

Response: We disagree. We view the MRSA Bacteremia and C. difficile Infection measures as important quality measures to be added to the Safety domain because they track infections that present significant danger to patients. We believe that tracking hospitals on these measures—and providing incentives for better performance—will result in reduced harm to patients, better health care quality, and an improved health care system.

Comment: One commenter urged caution with the C. difficile Infection and MRSA Bacteremia measures, arguing that they must track only hospital-onset infections. Commenters suggested that many infections emerge in the community, meaning hospitals are not at fault for the origination of the infection. One commenter noted that MRSA infections vary widely geographically and that there has been a rise in the frequency of community-associated MRSA skin/soft tissue infections, many of which are likely best treated with direct interventions at the site of infection and do not require antibiotics. The commenter believed that as community-associated strains become predominant, hospitals will have less ability to have any appreciable impact on their frequency.

A few commenters requested that the MRSA Bacteremia and C. difficile Infection measures control for known regional variation in the infection rates so that hospitals that care for high-risk populations are not inadvertently targeted or encouraged to limit access to care by such high-risk patients. Some commenters suggested that a better way to track MRSA Bacteremia and C. difficile infections is to include measures that focus on best practices and guidelines for patients who contract MRSA Bacteremia or C. difficile infections.

One commenter also asked CMS to consider that C. difficile infection rates are higher in surgical patients than in non-surgical patients, and are particularly high in gastrointestinal surgery patients. Therefore, the commenter believed that hospitals that perform a greater number of colorectal procedures will have higher rates of C. difficile infections in their patients, even if they are perfectly compliant with all applicable guidelines and practices.

Response: The MRSA Bacteremia and C. difficile Infection measures differentiate between community-acquired and hospital-onset events based on a patient's date of admission and date(s) of specimen collection, and they include adjustments for several risk factors, specifically facility size, medical teaching hospital affiliation, prevalence of community-onset infection, and, for the C. difficile measure, CDI test type. Therefore, we do not believe the measures need to be revised to account for these factors because the current approach already addresses many of the commenters' concerns. However, we will collaborate with CDC to evaluate whether there is a need to consider additional risk adjustment factors suggested by the commenters, such as the occurrence of gastrointestinal surgeries, for future policy development. While we are willing to consider other risk factors, the additional adjustment gained must be weighed against the extra burden of collecting more required data elements. The issue of the same measures being included in multiple programs is addressed further below.

After consideration of the public comments we received, we are finalizing our proposal to adopt the MRSA Bacteremia measure for the FY 2017 Hospital VBP Program.

(2) Clostridium difficile (C. difficile) Infection (NQF #1717)

C. difficile Infection (NQF #1717) is a risk-adjusted outcome measure monitoring hospital onset of C. difficile Infection events using the standardized infection ratio (C. difficile SIR) among all inpatients in the facility. The C. difficile SIR is reported via CDC's NHSN. We adopted this measure beginning with the FY 2015 payment determination under the Hospital IQR Program in the FY 2012 IPPS/LTCH PPS final rule (76 FR 51630 through 51631). Initial measure data were posted on Hospital Compare in December 2013.

As with MRSA Bacteremia infections, we are concerned about the seriousness of C. difficile infections. According to a 2012 study, “infection with Clostridium difficile is associated with poor outcomes for patients. Previous work has determined that, regardless of baseline risk of death, for every 10 patients that acquire C. difficile in the hospital, 1 patient will die. Clostridium difficile is also associated with increased health care costs. One of the primary mechanisms by which C. difficile increases costs is by increasing the length of time patients spend in hospital.”

Forster et al. “The effect of hospital-acquired infection with Clostridium difficile on length of stay in hospital.” Canadian Medical Association Journal, January 10, 2012. Available at http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3255231/pdf/1840037.pdf.

As we stated in the FY 2012 IPPS/LTCH PPS final rule (76 FR 51630 through 51631), C. difficile infections have become more frequent, more severe, and more difficult to treat in recent years. Each year, tens of thousands of people in the United States get sick from C. difficile, including some otherwise healthy people who are not hospitalized or taking antibiotics.

The MAP noted that the measure addresses an NQS priority not adequately addressed in the program measure set, the measure should be applied following public reporting on Hospital Compare, and that the most recent version of the NQF-endorsed measure should be applied.

We believe that this measure is eligible for the Hospital VBP Program based on the MAP recommendation, our adoption of the most recent NQF-endorsed version under the Hospital IQR Program, and our posting of measure data on Hospital Compare, as well as the continued danger that C. difficile infections present to patients and the public health. Therefore, we proposed to adopt the C. difficile SIR measure for the FY 2017 Hospital VBP Program, and we proposed to place the measure into the Safety domain.

We invited public comment on this proposal.

Comment: One commenter urged CMS to delay use of the C. difficile Infection measure until FY 2018 because C. difficile infections have diverse sources and are not always associated with the symptomatic cases at which infection control interventions are primarily targeted. Further, the commenter had concerns about the current lab identification definitions used for public reporting because (1) asymptomatic cases with positive lab identification events are included, (2) recurrences are counted as new cases if tested again after two weeks, and (3) patients may be asymptomatically colonized prior to admission and develop the disease, resulting in attribution of a healthcare-associated infection regardless of the hospital's infection prevention strategies. Finally, the commenter noted that there is no standard strategy for testing patients for C. difficile infections.

Response: The CDC Web site includes posted information for appropriate clinical practice, testing, and identification of C. difficile infections at: http://www.cdc.gov/HAI/organisms/cdiff/Cdiff_clinicians.html. These practices are strongly suggested when tracking C. difficile cases for reporting to NHSN, and include guidance to perform the test for C. difficile and its toxins only on diarrheal (unformed) stool from symptomatic patients, unless ileus due to C. difficile is suspected, and to avoid testing stool from asymptomatic patients, for routine identification of asymptomatic carriers, or as a test of cure. Following this guidance as a standard of practice will avoid reporting of asymptomatic, colonized patients, who are not to be reported per the NHSN protocol. Recurrent cases are counted separately from incident cases and are not included in the hospital-onset, incident C. difficile metric reported to CMS. Per published research and the NHSN protocol, a recurrent C. difficile LabID Event is defined as a specimen obtained more than 2 weeks and no more than 8 weeks after the most recent CDI LabID Event for that patient (a specimen obtained 2 weeks or less after the most recent event is a duplicate and is not reported). An incident case is defined and counted as a specimen obtained more than 8 weeks after the most recent CDI LabID Event for that patient (McDonald LC, et al. Infect Control Hosp Epidemiol 2007; 28:140-145).
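The counting rules summarized in this response lend themselves to a simple illustration. The Python sketch below classifies a C. difficile LabID event relative to the patient's most recent prior event; the day thresholds of 14 and 56 days are assumed equivalents of the 2-week and 8-week windows described above, and the sketch is not the NHSN protocol itself.

```python
from datetime import date

def classify_cdi_labid_event(specimen_date, most_recent_event_date):
    """Classify a C. difficile LabID event against the patient's most recent
    prior event, following the counting rules described above (14 and 56
    days are assumed equivalents of 2 and 8 weeks)."""
    if most_recent_event_date is None:
        return "incident"      # no prior event on record
    gap_days = (specimen_date - most_recent_event_date).days
    if gap_days <= 14:
        return "duplicate"     # not reported
    if gap_days <= 56:
        return "recurrent"     # excluded from the hospital-onset incident metric
    return "incident"          # counted toward the metric reported to CMS

print(classify_cdi_labid_event(date(2014, 3, 10), date(2014, 3, 1)))  # duplicate
print(classify_cdi_labid_event(date(2014, 3, 10), date(2014, 2, 1)))  # recurrent
print(classify_cdi_labid_event(date(2014, 6, 1), date(2014, 2, 1)))   # incident
```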

Comment: One commenter cautioned that the C. difficile Infection measure may result in discouraging healthcare professionals from screening for or attempting to diagnose mild cases of CDI because the measure focuses on rates of infection rather than screening. The commenter suggested that CMS consider rewarding hospitals for limiting prolonged periods of multiple antibiotic use among patients, for optimizing antimicrobial therapy, and for instituting CDI prevention programs.

Response: We will consider this feedback; however, we do not think that this measure will discourage healthcare professionals from testing for C. difficile when clinically indicated, particularly given the potential for serious harm that C. difficile infections present to patients. Although healthcare professionals may have incentives to avoid diagnostic testing, they also have incentives to confirm the diagnosis before treating, in part because of the danger of overprescribing antibiotics and the associated complications for patients. We will consider the commenter's suggestions in the future.

After consideration of the public comments we received, we are finalizing our proposal to adopt the C. difficile infection measure for the FY 2017 Hospital VBP Program.

(3) PC-01: Elective Delivery Prior to 39 Completed Weeks Gestation (NQF #0469)

PC-01: Elective Delivery Prior to 39 Completed Weeks Gestation (NQF #0469) is a chart-abstracted measure that we adopted beginning with the FY 2015 payment determination for the Hospital IQR Program in the FY 2013 IPPS/LTCH PPS final rule (77 FR 53528 through 53530). Initial measure data were posted on Hospital Compare in December 2013. Although this is a chart-abstracted measure, we finalized our policy in the FY 2013 IPPS/LTCH PPS final rule (77 FR 53528 through 53529) that this measure would be collected in aggregated numerator, denominator, and exclusion counts per hospital via a Web-based tool, instead of collecting patient-level data from hospitals.

As we described in the FY 2013 IPPS/LTCH PPS final rule referenced above, the Strong Start Initiative (http://www.innovation.cms.gov/initiatives/strong-start/) was launched to help reduce early elective births. At launch, the HHS Secretary stated that more than half a million infants are born prematurely in America each year. Fortunately, the preterm birth rate has steadily decreased; in 2012, the number of preterm births had fallen to approximately 456,000, or 11.55 percent of the total number of births. Early elective births may require additional medical attention and early intervention services. Research indicates that elective deliveries before 39 weeks increase the risk of significant complications for mother and baby, as well as long-term health problems. Early elective births are a public health problem that has significant consequences for families well into a child's life.

Martin, JA, Hamilton, BE, Osterman, MJK, Curtin, SC, Mathews, TJ. (2013). Births: Final data for 2012. Natl Vital Stat Rpt. 62(9). Retrieved from http://www.cdc.gov/nchs/data/nvsr/nvsr62/nvsr62_09.pdf.

Glantz, J. (Apr. 2005). Elective induction vs. spontaneous labor associations and outcomes. J Reprod Med. 50(4):235-40.

Vardo, J., Thornburg, L., Glantz, J. (2011). Maternal and neonatal morbidity among nulliparous women undergoing elective induction of labor. J Reprod Med. 56(1-2):25-30.

Tita, A., Landon, M., Spong, C., Lai, Y., Leveno, K., Varner, M., et al. (2009). Timing of elective repeat cesarean delivery at term and neonatal outcomes. [Electronic Version]. NEJM. 360:2, 111-120.

Clark, S., Miller, D., Belfort, M., Dildy, G., Frye, D., & Meyers, J. (2009). Neonatal and maternal outcomes associated with elective delivery. [Electronic Version]. Am J Obstet Gynecol. 200:156.e1-156.e4.

As a public campaign to reduce early elective births, the Strong Start Initiative's objective is to test ways to reverse this trend by helping provide expectant mothers with the care they need for a healthy delivery and a healthy baby, and by focusing on reducing early elective deliveries, which can lead to a variety of health problems for mothers and infants. The Strong Start Initiative cuts across many agencies within HHS and involves external organizations including the March of Dimes and the American College of Obstetricians and Gynecologists (ACOG). We believe that a reduction in the number of nonmedically indicated elective deliveries at ≥37 to <39 weeks gestation will result in a substantial decrease in neonatal morbidity and mortality, as well as a significant savings in health care costs. In addition, the rate of cesarean sections should decrease with fewer elective inductions, resulting in decreased length of stay and health care costs.

The MAP supported adoption of the PC-01 Elective Delivery measure for inclusion in the Hospital VBP Program in the MAP Pre-Rulemaking Report: 2013 Recommendations on Measures Under Consideration by HHS found at https://www.qualityforum.org/WorkArea/linkit.aspx?LinkIdentifier=id&ItemID=72746. The MAP noted that the measure addresses an NQS priority not adequately addressed in the program measure set.

We proposed to adopt this measure for the Hospital VBP Program, and we proposed to place the measure into the Clinical Care—Process domain, because we believe this measure furthers the NQS's three-part aim of better healthcare for individuals, better health for populations, and lower costs of healthcare. In addition, although the PC-01 Elective Delivery measure captures data from all applicable patients, we also believe that the measure is specifically relevant to the nearly 2 million Medicare beneficiaries aged 44 and under, most of whom are dually eligible beneficiaries, who have the potential to be affected by early elective births. In 2011, Medicare paid for roughly 14,000 births.

We welcomed public comment on this proposal.

Comment: Many commenters supported CMS' proposal to include the PC-01 measure in the Hospital VBP Program, noting that many hospitals continue to have rates of early elective delivery in excess of 15 percent despite the American College of Obstetricians and Gynecologists recommendations that no elective delivery be performed before the gestational age of 39 weeks without a medical indication. One commenter believed that this measure will reduce costs and also have the potential to greatly improve newborn outcomes of care.

Response: We thank the commenters for their support.

Comment: Several commenters expressed opposition to the PC-01 measure for the FY 2017 Hospital VBP Program because the measure is Web-based, and there has not been any chart validation for accuracy and consistency of data collection across hospitals. Some commenters specifically opposed using any data that are not validated under the Hospital VBP Program, stating that PC-01 should therefore not be finalized for the program based on data accuracy concerns. Commenters stated that, while hospitals are working diligently to collect accurate data for this measure, it is possible that hospitals collecting the most accurate data will have the lowest scores.

Commenters stated that the benchmark of 0 percent is not realistic considering that justifications for an elective delivery are based on ICD-9-CM codes, and The Joint Commission (TJC) has stated that not all of the justifications for an elective delivery are included on the ICD-9-CM Justification Table. Further, commenters noted that TJC has stated that the purpose of this measure is to enable hospitals to establish a baseline for their performance, which in turn serves as a determinant of whether improvement efforts are effective over time.

One commenter suggested that CMS wait to adopt this type of measure until the electronic clinical quality measure version is available. One commenter did not support the recommendation to add the PC-01 measure to the Clinical Care—Process domain because the measure algorithm exclusions are applied prior to denominator selection. The commenter stated that these exclusions make the quarterly denominators very low, even for a large facility, and that, therefore, the measure does not truly assess the quality of care provided.

Response: We disagree with the suggestion that this measure may be inherently invalid because not all justifications for an elective delivery are included in the ICD-9-CM Justification Table, or invalid because of the volume of exclusions. All NQF-endorsed measures must meet strict reliability and validity criteria to gain endorsement; PC-01 is NQF-endorsed, and therefore the measure as defined is clinically valid. Regarding the accuracy of the submitted data, hospitals are required to acknowledge the accuracy of the data submitted through the Hospital IQR Program's Data Accuracy and Completeness Acknowledgment statement on an annual basis. To validate the accuracy of submitted data, we employ logic checks as we do for other measures. For example, the number of cases entered in the numerator cannot be greater than the number of cases entered in the denominator.

As explained in section IX.A.11 of this preamble, because the PC-01 data are collected in aggregate instead of for individual patients, we cannot use the same mechanism to assess reliability of PC-01 as we use for chart-abstracted clinical process of care measures reported at the patient level. The approach for other clinical process of care measures involves sampling, whereas the analogous approach for aggregate data would involve collecting all data from a hospital. We believe that the benefits of validating aggregate data in this way are outweighed by the burden to hospitals in submitting potentially hundreds of records to validate one measure, and also believe that this approach would be cost-prohibitive for CMS.

However, we are exploring different options to assess the general validity of PC-01 data more robustly. For the reasons outlined in the proposed rule, we continue to believe this measure is appropriate for the Hospital VBP Program. We have adopted the e-CQM version of this measure under our voluntary electronic reporting option for the Hospital IQR Program.
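The logic checks described above for Web-based aggregate submissions can be expressed very simply. The sketch below is illustrative only; the field names and the specific checks shown are assumptions, not the Hospital IQR Program's submission schema or validation specification.

```python
def aggregate_counts_pass_logic_checks(numerator, denominator, exclusions):
    """Basic plausibility checks for aggregate measure counts of the kind
    described above. Field names and checks are illustrative assumptions."""
    counts_are_whole = all(
        isinstance(n, int) and n >= 0 for n in (numerator, denominator, exclusions)
    )
    # The check called out in the response: the numerator cannot exceed the denominator.
    return counts_are_whole and numerator <= denominator

print(aggregate_counts_pass_logic_checks(numerator=3, denominator=40, exclusions=12))  # True
print(aggregate_counts_pass_logic_checks(numerator=41, denominator=40, exclusions=0))  # False
```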

Comment: Some commenters opposed CMS' proposed adoption of the PC-01 measure, stating that CMS should first determine that there is sufficient room for making additional substantive improvements that would result in better patient care.

Response: The NQF notes that pre-term births are a rapidly escalating public health problem, and that early elective delivery contributes to this problem. As stated above, many commenters have noted that many hospitals continue to have rates of early elective delivery in excess of 15 percent despite the American College of Obstetricians and Gynecologists recommendations that no elective delivery be performed before the gestational age of 39 weeks without a medical indication. Therefore, we believe that hospitals have the opportunity to improve upon a detrimental practice that was until very recently rapidly expanding.

Osterman MJK, Martin JA. Changes in cesarean delivery rates by gestational age: United States, 1996-2011. NCHS data brief, no 124. Hyattsville, MD: National Center for Health Statistics. 2013.

Comment: Commenters expressed concerns about CMS' proposal to adopt the PC-01 measure, noting that many hospitals do not provide perinatal care services and stating that the volume of Medicare births is not high enough to justify this measure's placement into the Hospital VBP Program. Commenters suggested that CMS remove PC-01 from the proposed measure set.

Response: We continue to believe this measure is appropriate for the Hospital VBP Program, as we described in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28120). The measure is NQF-endorsed and was supported for the Hospital VBP Program by the MAP, and addresses an NQS priority not adequately addressed in the Program's measure set to date. In addition, as we noted, nearly 2 million Medicare beneficiaries are aged 44 and under, and in 2011, Medicare paid for roughly 14,000 births.

After consideration of the public comments we received, we are finalizing our proposal to adopt the PC-01 measure for the FY 2017 Hospital VBP Program.

d. Adoption of the Current Central Line-Associated Bloodstream Infection (CLABSI) Measure (NQF #0139) for the FY 2017 Hospital VBP Program

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50682 and 50686), we adopted the CLABSI measure for the FY 2016 Hospital VBP Program. We stated our belief that adopting the current CLABSI measure is consistent with the MAP's recommendation in the MAP Pre-Rulemaking Report: 2013 Recommendations on Measures Under Consideration by HHS, found at https://www.qualityforum.org/WorkArea/linkit.aspx?LinkIdentifier=id&ItemID=72746, to use the standardized infection ratio version of the measure until the reliability-adjusted CLABSI measure is NQF-endorsed. We have stated our intent to consider adopting the reliability-adjusted CLABSI measure in future rulemaking.

The reliability-adjusted standardized infection ratio (SIR) is an outcome measure that summarizes the healthcare-associated infection experience by type of infection (for example, central-line associated bloodstream infection, surgical site infection) for individual hospitals. The reliability-adjusted measure enables more meaningful statistical differentiation between hospitals by accounting for differences in patient case-mix, exposures to medical devices or procedures (for example, central line days, surgical procedure volume), and unmeasured factors that are not reflected in the unadjusted SIR and that cause variation in outcomes between hospitals. Accounting for these sources of variability enables better measure discrimination between hospitals and leads to more reliable quality measurements.
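Reliability adjustment of this kind generally works by pulling a noisy facility-level estimate toward a broader reference value in proportion to how much of the observed variation reflects true differences rather than chance. The sketch below shows a generic shrinkage calculation of that type; the weighting scheme and the variance inputs are assumptions for illustration, not the specification of the reliability-adjusted CLABSI measure.

```python
def reliability_weight(signal_variance, noise_variance):
    """Fraction of observed variation attributable to true between-hospital
    differences (a generic reliability formula)."""
    return signal_variance / (signal_variance + noise_variance)

def reliability_adjusted_sir(hospital_sir, reference_sir, signal_variance, noise_variance):
    """Shrink a hospital's unadjusted SIR toward a reference value in
    proportion to how unreliable the hospital's own estimate is. This is a
    generic illustration, not the measure's actual methodology."""
    w = reliability_weight(signal_variance, noise_variance)
    return w * hospital_sir + (1 - w) * reference_sir

# A small hospital (noisy estimate) is pulled strongly toward the reference;
# a large hospital (precise estimate) keeps most of its own SIR.
print(reliability_adjusted_sir(1.8, 1.0, signal_variance=0.04, noise_variance=0.36))  # 1.08
print(reliability_adjusted_sir(1.8, 1.0, signal_variance=0.04, noise_variance=0.01))  # 1.64
```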

However, in the absence of NQF endorsement of the reliability-adjusted CLABSI measure or any additional MAP recommendations, and unless and until the Hospital IQR Program adopts the reliability adjustments, we believe we may only consider the current version of the CLABSI measure for adoption under the Hospital VBP Program. We continue to believe that the CLABSI measure encourages hospitals to minimize infection events that present significant health risks to patients. Therefore, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28120 through 28121), we proposed to adopt the current version of the CLABSI measure for the FY 2017 Hospital VBP Program and subsequent years. If a reliability-adjusted version of the measure becomes available to us in the future, we will consider adopting it.

We welcomed public comment on this proposal.

Comment: Commenters requested that CMS clarify whether the CLABSI and CAUTI measures will include non-ICU locations. Commenters also requested that CMS clarify whether hospitals that report CLABSI and CAUTI to NHSN as Mixed Acuity Units instead of ICUs will receive SIRs for the Hospital VBP Program, or if the measures will not be applicable for hospitals that do not report for ICUs.

Response: For the CLABSI and CAUTI measures, we will score hospitals using adult, pediatric, and neonatal ICU data only for the FY 2017 and FY 2018 Hospital VBP Programs, because the baseline periods for FY 2017 and FY 2018 are CY 2013 and 2014 respectively. These baseline periods are prior to the Hospital IQR Program requirement that hospitals report data on selected non-ICU locations (78 FR 50787). Therefore, we will have no data on non-ICU locations to use for performance or improvement benchmarks for these program years.

Beginning with the FY 2019 Program, we intend to publicly report the CLABSI and CAUTI SIR data reported to the Hospital IQR Program on selected non-ICU locations (that is, adult or pediatric medical ward, surgical ward, and medical/surgical ward). We will consider inclusion of these locations in the Hospital VBP Program as soon as applicable reliable baseline data are available.

Mixed acuity units do not meet NHSN definitions for the six select non-ICU locations, and therefore are not required to be reported for Hospital IQR Program purposes, so we will not use data from those units for the Hospital VBP Program for any of the baseline or performance periods. We refer readers to the NHSN Helpdesk Mailbox (nhsn@cdc.gov) with any specific questions about correctly defining and mapping patient care locations into NHSN.

Comment: One commenter supported the continued inclusion of the existing risk-adjusted, rate-based ICU-only NHSN CLABSI measure in the FY 2017 Hospital VBP Program. The commenter also urged CMS to calculate the CLABSI measure using the ICU-only specifications until the facility-wide measure is available for both the baseline and performance periods of the Hospital VBP Program. The commenter was unaware of how CMS will deal with the CLABSI measure once it transitions to a facility-wide measure and expressed concern that CMS might dispense with the improvement score when the baseline and performance periods do not match. The commenter noted that the CDC has sufficiently granular data to continue reporting ICU-only results to CMS despite the collection moving to facility wide.

Response: We agree that improvement scores are an important part of the Hospital VBP Program. We refer readers to our response to the previous commenter, in which we explain our intention to follow the commenter's suggestion, and provide the timelines for transitioning from the ICU-only measure to the broader measure of CLABSI in ICU and select non-ICU locations.

Comment: One commenter urged CMS to rapidly incorporate a reliability-adjusted Standardized Infection Ratio (SIR) calculation for the CLABSI measure because it provides a more robust calculation to identify differences among hospital rates.

Response: We continue to believe that the CLABSI measure encourages hospitals to minimize infection events that present significant health risks to patients. However, in the absence of NQF-endorsement of the reliability-adjusted measure and any additional MAP recommendations, and unless we decide to adopt the reliability adjustments in the Hospital IQR Program, we believe we may only consider the current version of the CLABSI measure for adoption under the Hospital VBP Program. If a reliability-adjusted version of the measure becomes available to us in the future, we will consider adopting it.

Comment: One commenter stated that CMS should not finalize the CLABSI measure and should wait until the reliability-adjusted version of the measure is endorsed by NQF. The commenter explained that many hospitals are having difficulty reporting the current measure, resulting in deviations in accuracy that may create profound differences in hospital performance.

Response: We will consider adopting the new version of the measure if it is endorsed by NQF. However, reliability adjustment is a methodology designed to address hospitals with small numerators and denominators. The methodology is not designed to assist hospitals in reporting CLABSI data accurately. To assist hospitals in accurately reporting CLABSI, CMS and CDC have been working collaboratively to clarify NHSN protocol specifications and to educate hospitals on these protocols.

Comment: One commenter was pleased with CMS' proposal to adopt the CLABSI measure, stating that it measures important safety outcomes for consumers and purchasers.

Response: We agree and thank the commenter.

After consideration of the public comments we received, we are finalizing our proposal to adopt the current CLABSI measure for the FY 2017 Hospital VBP Program.

e. Summary of Previously Adopted and New Measures for the FY 2017 Hospital VBP Program

The following table outlines the measures for the FY 2017 Hospital VBP Program, including those that we are readopting and those measures we are adopting for the first time. As discussed further below, this table also includes the FY 2017 domains into which we are placing the readopted measures, as well as the domains into which we are placing the newly adopted measures.

Previously Adopted and New Measures for the FY 2017 Hospital VBP Program

Measure | Description | Domain
CAUTI * | Catheter-Associated Urinary Tract Infection (NQF #0138) | Safety
CLABSI ** | Central Line-Associated Blood Stream Infection (NQF #0139) | Safety
C. difficile *** | Clostridium difficile Infection (NQF #1717) | Safety
MRSA *** | Methicillin-Resistant Staphylococcus aureus Bacteremia (NQF #1716) | Safety
PSI-90 * | Complication/patient safety for selected indicators (composite) (NQF #0531) | Safety
SSI * | Surgical Site Infection (NQF #0753): Colon; Abdominal Hysterectomy | Safety
MORT-30-AMI * | Acute Myocardial Infarction (AMI) 30-day mortality rate (NQF #0230) | Clinical Care—Outcomes
MORT-30-HF * | Heart Failure (HF) 30-day mortality rate (NQF #0229) | Clinical Care—Outcomes
MORT-30-PN * | Pneumonia (PN) 30-day mortality rate (NQF #0468) | Clinical Care—Outcomes
AMI-7a * | Fibrinolytic Therapy Received Within 30 Minutes of Hospital Arrival (NQF #0164) | Clinical Care—Process
IMM-2 * | Influenza Immunization (NQF #1659) | Clinical Care—Process
PC-01 *** | Elective Delivery Prior to 39 Completed Weeks Gestation (NQF #0469) | Clinical Care—Process
MSPB-1 * | Medicare Spending per Beneficiary (NQF #2158) | Efficiency and Cost Reduction
HCAHPS * | Hospital Consumer Assessment of Healthcare Providers and Systems Survey (NQF #0166) | Patient and Caregiver Centered Experience of Care/Care Coordination
* Measures readopted for the FY 2017 Hospital VBP Program.
** Measure adopted for the FY 2017 Hospital VBP Program that was not previously subject to automatic readoption.
*** Measures newly adopted for the FY 2017 Hospital VBP Program in this final rule.

5. Additional Measures for the FY 2019 Hospital VBP Program

a. Hospital-Level Risk-Standardized Complication Rate (RSCR) Following Elective Primary Total Hip Arthroplasty (THA) and Total Knee Arthroplasty (TKA)

Hospital-level Risk-Standardized Complication Rate (RSCR) Following Elective Primary Total Hip Arthroplasty (THA) and Total Knee Arthroplasty (TKA) (NQF #1550) is an outcome measure that we adopted beginning with the FY 2015 payment determination under the Hospital IQR Program in the FY 2013 IPPS/LTCH PPS final rule (77 FR 53516 through 53518). The measure assesses complications occurring after THA and TKA surgery from the date of the index admission to 90 days post date of the index admission. The outcome is one or more of the following complications: Acute myocardial infarction, pneumonia, or sepsis/septicemia within 7 days of admission; surgical site bleeding, pulmonary embolism or death within 30 days of admission; or mechanical complications, periprosthetic joint infection or wound infection within 90 days of admission. We posted THA/TKA measure data on the Hospital Compare Web site in December 2013. We refer readers to the FY 2013 IPPS/LTCH PPS final rule and to the THA/TKA complication methodology report (http://qualitynet.org/dcs/BlobServer?blobkey=id&blobnocache=true&blobwhere=1228890067881&blobheader=multipart%2Foctet-stream&blobheadername1=Content-Disposition&blobheadervalue1=attachment%3Bfilename%3DTHK_CmpMsrUpdtSpecs_080113.pdf&blobcol=urldata&blobtable=MungoBlobs) for additional details on the THA/TKA measure.

We continue to believe that measuring and reporting risk-standardized complication rates will inform health care providers about opportunities to improve care, strengthen incentives for quality improvement, and promote improvements in the quality of care received by patients and in the outcomes they experience. We believe that THA/TKA is an important measure of clinical outcomes, and, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28121 through 28122), we proposed to adopt it for the FY 2019 Hospital VBP Program and subsequent years. The MAP supported the adoption of the measure for inclusion in the Hospital VBP Program in its MAP Pre-Rulemaking Report: 2013 Recommendations on Measures Under Consideration by HHS, found at https://www.qualityforum.org/WorkArea/linkit.aspx?LinkIdentifier=id&ItemID=72746, noting that it addresses a high-volume elective procedure with variation in performance. We proposed to adopt this measure for FY 2019 now based on the length of the measure's reporting period and the time necessary to complete scoring calculations. Because it is an outcome measure, we proposed to place it in the Clinical Care—Outcomes domain.

We welcomed public comments on this proposal.

Comment: Several commenters supported CMS' proposal to adopt THA/TKA for the FY 2019 Program, stating that the measure will further drive hospitals to boost their care quality initiatives focused on this procedure. Some commenters urged CMS to consider adopting it as early as FY 2018.

Response: We believe that the time periods necessary to collect sufficiently reliable performance data on this measure preclude us from adopting the measure sooner than FY 2019.

Comment: Several commenters opposed adoption of the THA/TKA measure, stating that it has not met the one-year public reporting requirement outlined in the Hospital VBP Program statute.

Response: As described above with respect to the measures proposed for FY 2017, section 1886(o)(2)(C)(i) of the Act requires that measures must have been "included on the Hospital Compare Internet Web site for at least 1 year prior to the beginning of the performance period." We first reported this measure's data in December 2013, and the proposed FY 2019 performance period for this measure begins July 1, 2015, which complies with the statutory requirement in section 1886(o)(2)(C)(i) of the Act. We believe that this measure meets the statutory requirements for inclusion in the FY 2019 Hospital VBP Program. We also believe that this measure represents an important component of quality improvement in the acute inpatient hospital setting. However, to the extent that there remains any question regarding our interpretation of section 1886(o)(2)(C)(i) of the Act, we are finalizing that the effective date of the THA/TKA measure will be July 1, 2015, consistent with the beginning of the performance period for that measure.

Comment: One commenter supported the addition of the THA/TKA quality measure because it is MAP-approved and will further drive hospitals to boost their quality of care initiatives around these high-volume procedures that reduce pain and increase mobility for hundreds of thousands of Medicare beneficiaries each year. The commenter noted that this measure is particularly important because it captures multiple complications and adverse events at various post-operative time intervals and would give hospitals a common benchmark around which to organize their quality improvement efforts.

Response: We agree and thank the commenter for its support.

Comment: One commenter expressed concern about the accuracy of the administrative data sets that are the basis for the THA/TKA measure, stating that the coding data have been known to underreport significant comorbidities that may therefore skew quality measurement.

Response: We believe that the administrative claims data used for the Hip/Knee Complication measure is sufficiently accurate for purposes of Hospital VBP Program inclusion. We have validated the AMI, HF, and pneumonia mortality measures by building comparable models using medical record data for risk adjustment for heart failure patients (National Heart Failure data), AMI patients (Cooperative Cardiovascular Project data), and pneumonia patients (National Pneumonia Project dataset). When the medical record-based models were applied to the corresponding patient population, the hospital risk-standardized rates estimated using the claims-based risk adjustment models had a high level of agreement with the results based on the medical record model, thus supporting the use of the claims-based models for public reporting.

Regarding the commenter's concern about underreporting significant comorbidities, during measure development, we also conducted a medical record validation study of the THA/TKA complications measure. The goal of that study was to determine the overall agreement between arthroplasty patients identified as having a complication (or no complication) in the claims-based measure and those who had a complication (or no complication) also documented in the medical record. Overall measure agreement was 93 percent (598/644 patients) before any changes were made to the model specifications. After the measure specifications were changed based upon the results of this validation study, the measure agreement between claims data and the medical record was 99 percent (635/644 patients).

Comment: Some commenters opposed the proposed adoption of the THA/TKA measure, stating that CMS should verify that the measure is properly risk-adjusted across patient populations to ensure that hospitals are not deterred from performing these surgeries for older, high-risk beneficiaries. One commenter opposed the adoption of the THA/TKA measure because it uses the same hierarchical logistic modeling methodology that is specified for the mortality measures included in the Hospital VBP Program, and the commenter continued to have concerns about the ability of this model to accurately distinguish between hospitals' performance. The commenter suggested instead that the model should include an adjustment for socioeconomic status, which the commenter believes is an important predictor of complication rates. For these reasons, the commenter believes the measure is not suitable for inclusion in payment policies. Another commenter expressed support for the proposed THA/TKA measure, conditioned on CMS' adoption of a sociodemographic adjustment to the measure.

Response: We refer readers to our earlier discussion of risk adjustment based on socioeconomic status with respect to the MSPB measure in section IV.I.4. of the preamble of this final rule, which also is relevant for this measure. As discussed in the previous section, we believe that the THA/TKA measure's risk adjustment methodology appropriately considers and adjusts for clinical factors.

After consideration of the public comments we received, we are finalizing the THA/TKA measure for the FY 2019 Hospital VBP Program and subsequent years.

b. PSI-90 Measure

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50698), we declined to finalize the PSI-90 measure for the FY 2019 Hospital VBP Program in order to adopt a more recent baseline period than would have been possible at that time. However, we did not intend to signal that we would not adopt the PSI-90 measure for FY 2019 and subsequent years. We continue to believe that adopting this Agency for Healthcare Research and Quality (AHRQ) Patient Safety Indicator (PSI) composite measure provides strong incentives for hospitals to ensure that patients are not harmed by the medical care they receive, which is a critical consideration in quality improvement. In order to clarify the measure's status under the Hospital VBP Program and ensure that there is no confusion about our intent, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28122), we proposed to readopt the PSI-90 measure for the FY 2019 Hospital VBP Program and subsequent years.

We welcomed public comments on this proposal.

Comment: Several commenters supported CMS' proposal to adopt the PSI-90 measure for the FY 2019 Program. One commenter noted that the measure captures important patient safety outcomes for consumers and purchasers.

Response: We thank the commenters for their support.

Comment: Several commenters suggested that CMS publish hospitals' performance on both the full composite measure and its individual indicators. One commenter suggested that CMS consider separate patient safety indicators for the Hospital VBP Program rather than the composite.

Response: With respect to commenters' suggestions that we publish hospitals' performance on individual indicators, we may consider doing so in the future. However, since we have adopted the composite measure for the Hospital VBP Program, we believe it is appropriate to publish hospitals' performance on that measure, rather than its components, as a reflection of performance measured and scored under the Program. The composite measure is the basis for awarding achievement and improvement points under the Hospital VBP Program, not its underlying indicators, and we believe it is appropriate to focus the Program's public reporting on the measures that receive points under the Program.

Comment: Some commenters urged CMS to remove the PSI-90 measure from the Hospital VBP, Hospital IQR, and HAC Reduction Programs immediately based on NQF's recent report on patient safety measures. Several commenters noted that the NQF's Patient Safety Standing Committee did not recommend the measure for endorsement during maintenance review.

Commenters also noted that the PSI-90 measure is undergoing maintenance review by the NQF. One commenter stated that AHRQ's proposed changes to the measure to regain NQF's endorsement may be significant and suggested that CMS consider whether it should continue to adopt the measure for the Hospital VBP, HAC Reduction, and Hospital IQR Programs.

Response: We would like to clarify the status of the PSI-90 measure with regard to NQF endorsement. As part of the routine NQF measure maintenance process, the Patient Safety Committee expressed concerns about the weighting of the PSI-90 component measures and requested additional measure information related to re-weighting PSI-90 with three additional components (PSI-9, PSI-10, and PSI-11) before deciding whether it would recommend continued endorsement of the measure. AHRQ has submitted the requested data for the NQF Patient Safety Committee's consideration.

If, during the NQF review process, significant changes are made to the measure, we will evaluate those changes, including whether the measure remains appropriate for the Hospital VBP Program.

Comment: Several commenters stated that the PSI-90 measure lacks robust risk-adjustment and tends to penalize hospitals with larger case volumes. Several commenters argued that the measure relies on inadequately validated claims data. Commenters stated that claims-based measures are not necessarily reliable for Hospital VBP Program purposes. Commenters argued that the measure's basis in administrative claims data presents significant limitations and that using administrative claims data is a less accurate method of identifying patient severity than clinical data abstracted from medical records.

Another commenter was opposed to further adoption of the PSI-90 measure for the Hospital VBP Program, stating that composite measures calculated using retrospective claims data create many problems for quality improvement activities, as the commenter believes claims-based data create inherent difficulties that are not present in non-claims data. The commenter was also opposed to rebalancing the PSI-90 measure by adding new metrics or shifting weighting to better measures in the composite, and stated that non-claims data should be considered for future composites when feasible.

Response: Each of the PSI-90 composite component measures includes detailed risk adjustment for clinical factors (for example, modified diagnosis-related groupings, major diagnostic categories, and comorbidities), age, and gender that influence the risk of experiencing a patient safety event during hospitalization. AHRQ's Quality Indicators program continually updates and refines the indicators so that the composite reflects the best available quality indicators.

We also note that validation studies have previously examined the relationship between billing or claims data and medical records.

In addition, AHRQ has advised us that the NQF convened a group of experts to determine what criteria should be used for evaluating the indicators in the PSI-90 measure. The Technical Expert Panel provided clear guidance on the relationship between the individual component indicators and the composite in the Composite Performance Measure Evaluation Guidance document (NQF, April 2013), available at http://www.qualityforum.org/Publications/2013/04/Composite_Performance_Measure_Evaluation_Guidance.aspx. Specifically, individual component measures that are included in the composite performance measure: (1) Should be justified based on the clinical evidence; (2) do not need to be NQF endorsed; (3) generally should demonstrate a gap in performance; and (4) may not be sufficiently reliable independently, but contribute to the reliability of the composite performance measure.

AHRQ convened a Composite Measure Workgroup of experts in the field to determine the best weighting strategy. The methodology of the PSI-90 measure is detailed in the original technical report on the AHRQ composite, available at: http://qualityindicators.ahrq.gov/Downloads/Modules/PSI/PSI_Composite_Development.pdf. Several alternative approaches were discussed with the AHRQ Composite Measure Workgroup and the first NQF Composite Measure Steering Committee. Factor analysis was considered as one approach and was deemed to have no clear advantages over less complex, more intuitively clear weighting schemes. In brief, the numerator weighting used in PSI-90 was preferred for its greater simplicity and clarity.

Comment: A few commenters strongly opposed the duplicative use of PSI-90 in both the Hospital VBP and HAC Reduction Programs.

Response: As discussed further above, while we are aware that commenters object to the possibility of scoring hospitals on certain measures, including PSI-90, under both the Hospital VBP and HAC Reduction Programs, we note that these measures cover topics of critical importance to quality improvement in the inpatient hospital setting, and to patient safety.

Comment: Some commenters opposed adoption of the PSI-90 composite measure, stating that its component indicators have serious flaws. Commenters stated, for example, that the PSI-15 indicator (accidental puncture or laceration) does not clearly define what constitutes an “accidental puncture.” Commenters also stated that PSI-12 (postoperative PE/DVT rate) relies on risk adjustment criteria that could lead to unintended consequences, such as tagging every lower-extremity (LE) thrombophlebitis, whether or not it is clinically significant. One commenter stated that emergent cases and patients with a prior history of PE or DVT should also be excluded from that measure.

Response: We continue to believe the PSI-90 measure is an important measure of patient safety, and therefore warrants inclusion in the Hospital VBP Program. PSI-15 (Accidental Puncture or Laceration) and PSI-12 (Perioperative Pulmonary Embolism or Deep Vein Thrombosis Rate) are endorsed as valid and reliable measures (NQF #0345 and NQF #0450, respectively). Expert panels have found these to be scientifically sound measures.

Comment: One commenter expressed concern about the reliability and reproducibility of the PSI-90 claims-based composite measure because of generally poor agreement between these and NHSN-based surveillance criteria, with the exception of surgical site infection (SSI). The commenter encouraged AHRQ and other independent researchers to examine the value, validity, reliability, and reproducibility of PSI-90 by comparing it to epidemiologic measures within NHSN's domain. The commenter recommended that CMS study how these measures correlate with SSI and NHSN-based surveillance criteria.

Response: We agree with the commenter that studying the correlation between PSI-90 and SSI and other NHSN-based surveillance criteria would provide additional insight into the PSI-90 measure's validity, and we will consider this in the future. We note that we are finalizing a policy to access certain NHSN data reported to the Hospital IQR Program, which would make it possible to conduct this type of alignment analysis between the PSI-90 measure and the NHSN measures.

Comment: Commenters suggested that CMS consider removing the PSI-12 indicator from the PSI-90 composite for the FY 2015 Program until stakeholder concerns with the indicator's validity have been resolved.

Response: We do not believe the PSI-12 indicator should be removed from the PSI-90 composite measure because it is designed to improve surveillance and awareness of post-operative deep vein thrombosis and pulmonary embolism. We believe that monitoring these conditions is important to protect patients from post-operative complications.

Comment: A few commenters asked CMS not to finalize several proposed new measures for the FY 2019 Hospital VBP Program until they are NQF-endorsed, recommended by the MAP, and hospitals have experience in reporting and understanding the measures.

Response: We believe that we have complied with the Hospital VBP Program's statutory requirements with respect to endorsement from NQF, MAP recommendations, and reporting through the Hospital IQR Program prior to adopting these measures under the Hospital VBP Program. Further, for the reasons we described in the proposed rule and in our responses to comments on that proposed rule, we continue to believe that the proposed measures represent improvements to the Hospital VBP Program's measure set by expanding to new clinical topics and addressing public health concerns.

After consideration of the public comments we received, we are finalizing our proposal to adopt the PSI-90 composite measure for the FY 2019 Hospital VBP Program.

6. Possible Measure Topics for Future Program Years

a. Care Transition Measure (CTM-3) Items for Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS) Survey

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28122), we stated that we are considering proposing to add the Care Transition Measure (CTM) from the HCAHPS Survey to the Patient and Caregiver Centered Experience of Care/Care Coordination (PEC/CC) domain of the FY 2018 Hospital VBP Program. We sought public comments on this topic.

The CTM was added to the HCAHPS Survey of hospital inpatients in January 2013 (77 FR 53513 through 53516). Three items were added to the HCAHPS Survey to create the new Care Transition Measure composite. After collecting four quarters of data on these items (January 2013 through December 2013), we intend to publicly report CTM scores for the first time on our Hospital Compare Web site in October 2014.

Once the CTM has been publicly reported on Hospital Compare for one year, in accordance with the statutory requirements of the Hospital VBP Program, we are considering proposing to adopt CTM as the ninth dimension of the HCAHPS survey in the PEC/CC domain for the FY 2018 Hospital VBP Program. We intend to propose that the PEC/CC domain in the FY 2018 Hospital VBP Program would have a baseline period of January 1, 2014 through December 31, 2014, and a performance period of January 1, 2016 through December 31, 2016.

Currently, the PEC/CC domain is comprised of eight dimensions of the HCAHPS Survey. Scoring in this domain is based on two elements: The HCAHPS Base Score and HCAHPS Consistency Points Score. For additional information on the calculation of the PEC/CC domain score, we refer readers to “A Step-by-Step Guide to Calculating the Patient Experience of Care Domain Score in the Hospital Value-Based Purchasing FY 2013 Actual Percentage Payment Summary Report,” at: http://www.hcahpsonline.org/HospitalVBP.aspx.

We specifically sought public comments on how the new CTM dimension should be included in the scoring methodology that we have adopted for the PEC/CC domain. In accordance with the finalized Hospital VBP Program scoring methodology for other domains, we are considering the “normalization” approach, which would introduce only minor changes to the original scoring formula, as follows.

For purposes of the HCAHPS Base Score, the new CTM dimensions would be calculated in the same manner as the eight existing HCAHPS dimensions. For each of the nine dimensions, Achievement Points (0-10 points) and Improvement Points (0-9 points) would be calculated, the larger of which would be summed across the nine dimensions to create a pre-normalized HCAHPS Base Score (0-90 points, as compared to 0-80 points when only eight dimensions were included). The pre-normalized HCAHPS Base Score would then be multiplied by 8/9 (0.88888) and rounded according to standard rules (values of 0.5 and higher are rounded up, values below 0.5 are rounded down) to create the normalized HCAHPS Base Score. Each of the nine dimensions would be of equal weight, so that, as before, the normalized HCAHPS Base Score would range from 0 to 80 points.

HCAHPS Consistency Points would then be calculated in the same manner as before and would continue to range from 0 to 20 points. The Consistency Points Score would now consider scores across all nine of the PEC/CC domain dimensions, whereas before it considered only the eight dimensions that preceded the CTM measure.

The final element of the scoring formula would be the sum of the HCAHPS Base Score and the HCAHPS Consistency Points Score and would range from 0 to 100 points, as before.
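For illustration only, the normalization arithmetic described above can be sketched as follows. The function and its inputs are hypothetical simplifications; the authoritative calculation is described in the scoring guide referenced above.

```python
# Illustrative sketch of the nine-dimension "normalization" scoring approach described above.
# The inputs are hypothetical; the authoritative calculation is defined in the Hospital VBP
# Program scoring methodology.

def pec_cc_domain_score(achievement_points, improvement_points, consistency_points):
    """achievement_points: nine values (0-10 each); improvement_points: nine values (0-9 each);
    consistency_points: 0-20. Returns the PEC/CC domain score (0-100)."""
    assert len(achievement_points) == 9 and len(improvement_points) == 9
    # For each dimension, take the larger of achievement and improvement points (0-90 total).
    pre_normalized = sum(max(a, i) for a, i in zip(achievement_points, improvement_points))
    # Multiply by 8/9 and round half up so the normalized base score again ranges from 0 to 80.
    normalized_base = int(pre_normalized * 8 / 9 + 0.5)
    # Add the Consistency Points Score (0-20) for a final domain score of 0-100.
    return normalized_base + consistency_points

# Example: 9 achievement points on every dimension and 15 consistency points yields a
# pre-normalized score of 81, a normalized base score of 72, and a domain score of 87.
print(pec_cc_domain_score([9] * 9, [0] * 9, 15))  # 87
```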

We welcomed public comments on this approach to including the CTM-3 dimensions in the PEC/CC domain score.

Comment: Many commenters supported incorporating the HCAHPS Care Transition Measure (CTM-3) into the PEC/CC domain, given the critical importance of the care transition for improving patient outcomes and reducing patient suffering. Other commenters strongly supported the addition and urged CMS to finalize it. Commenters also supported the proposed methodology for scoring and weighting the measure within the domain.

One commenter noted that the measure would add a ninth dimension to the HCAHPS Survey in the PEC/CC domain for FY 2018. The commenter stated that this measure is a significant first step in addressing shared accountability and quality of care during transitions of care and discharges from the health-system setting. The commenter further agreed that the normalization approach should be used for this care transition measure and for calculation of the total performance score.

One commenter commended CMS for considering adopting the CTM-3 items on the HCAHPS Survey, stating that effective management of care transitions is essential to ensuring proper patient recoveries while reducing readmissions and ensuring medication adherence. Another commenter supported our plan to include the CTM-3 items on the HCAHPS Survey in the Hospital VBP Program in future years, noting that providing incentives for hospitals to coordinate patient transitions will aid significantly in decreasing readmissions and potentially mortality among Medicare patients. Other commenters supported adoption of the CTM-3 items on the HCAHPS Survey under the Hospital IQR Program and offered to evaluate their inclusion under the Hospital VBP Program once the items have been publicly reported. Other commenters noted their support because managing safe and effective transitions of care is a critical competency in the health care system.

Response: We appreciate the comments in support of adding the Care Transition Measure to the Hospital VBP Program and the proposed methodology and weighting of this dimension in the PEC/CC domain.

Comment: Several commenters did not support the addition of the three-question care transition measure as a ninth dimension to the HCAHPS Hospital VBP Program scoring before evidence supporting its validity and materiality to the Hospital VBP Program was released. One commenter suggested that CMS exclude HCAHPS scores from the program or adjust provider scores to account for demographic factors that have been shown to impact survey results. One commenter requested additional analysis of the measure results after its first year of implementation.

Response: Should we decide to formally propose the addition of the HCAHPS Care Transition Measure to the Patient Experience of Care domain of the Hospital VBP Program through the rulemaking process, we will release additional information about the validity, reliability and statistical properties of the CTM.

In order to achieve the goal of fair comparisons across all hospitals that participate in HCAHPS, it is necessary to adjust for factors that are not directly related to hospital performance but do affect how patients answer HCAHPS survey items. The HCAHPS patient-mix adjustment is intended to eliminate any advantage or disadvantage in scores that might result from patient characteristics beyond a hospital's control. We do not collect or adjust for patients' socioeconomic status; however, the HCAHPS patient-mix adjustment does include patients' highest level of education, which can be related to socioeconomic status. (HCAHPS On-Line Web site, Mode and Patient-mix Adjustment: http://www.hcahpsonline.org/modeadjustment.aspx.)

Comment: One commenter pointed to an analysis by the Cleveland Clinic that shows that as patients' severity of illness worsens, HCAHPS scores decline in a statistically significant manner. Further, the commenter noted that the same relationship was observed when the researchers examined the relationship between patients' symptoms of depression and responses to HCAHPS: as symptoms of depression worsened, HCAHPS scores declined. The commenter believed this trend also may affect scores for other surveys in the CAHPS family. The commenter encouraged CMS to conduct an analysis that assesses the extent of the issue and identifies potential mechanisms for enhancing how CAHPS scores are adjusted for patient factors.

Response: Since its national implementation in 2006, the HCAHPS Survey has included an item that asks for patients' assessment of their overall health. We use this information in a transparent manner in the standard patient-mix adjustment of HCAHPS scores, as explained on the official HCAHPS On-Line Web site, www.HCAHPSonline.org, in our research documents, in the patient-mix adjustment coefficients that are posted on this Web site, and in published research.

Responding to comments about HCAHPS in previous IPPS/LTCH PPS rulemaking, we added an item to the HCAHPS Survey in January 2013 that asks patients to assess their overall mental or emotional health. We have analyzed the impact of this item and found that including it in the patient-mix adjustment does not add explanatory power to, or otherwise improve, the model in which the “overall health” item already appears. Therefore, we include only the “overall health” item in the HCAHPS patient-mix adjustment, as this adequately adjusts for patient severity.

With respect to a Cleveland Clinic analysis that is said to show a greater than expected impact of severity of illness on HCAHPS scores, we understand that this analysis does not examine associations between patient characteristics and HCAHPS scores after the standard HCAHPS patient-mix adjustment has been applied. The standard HCAHPS patient-mix adjustment would be expected to remove most or all of the association mentioned. We also understand that the Cleveland Clinic analysis is not based on national data. In addition, recent research found that using patients' clinical characteristics in adjustment models had relatively little impact relative to survey questions about patients' health and that adding such measures to the existing HCAHPS case-mix adjustment model would have very little effect.

Cleary, et al. Medical Care. 52: 619-625. 2014.

Comment: One commenter urged CMS to expedite the initiative to include additional patient-centered palliative care measures into the Hospital VBP Program because the HCAHPS Survey is currently the only measure of patient experience, which misses all who die in the hospital or who are too ill to fill out the survey. The commenter noted that these individuals are most vulnerable due to the severity of their illness and deserve to have their and their families' experiences measured.

Response: The survey methodology and question wording at this point cannot accommodate proxy respondents, so HCAHPS cannot measure the experience of care of those who died in the hospital. However, as about 6.6 percent of hospice patients in 2012 died in a hospital setting, the new Hospice Experience of Care Survey, which is specifically designed for proxy respondents, will be able to capture some information about the experience in the hospital setting.

NHPCO's Facts and Figures: Hospice Care in America, 2013 Edition. National Hospice and Palliative Care Organization. Available at http://www.nhpco.org/sites/default/files/public/Statistics_Research/2013_Facts_Figures.pdf.

b. Possible Future Efficiency and Cost Reduction Domain Measure Topics

In the interest of expanding the Efficiency and Cost Reduction domain to include a more robust measure set, including measures that supplement the MSPB measure with more condition- and/or treatment-specific episodes, as well as facilitating alignment with the Physician VM Program, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28122 through 28124), we stated that we are considering proposing to add new episode-based payment measures to the Hospital VBP Program through future rulemaking. Expanding the Efficiency and Cost Reduction domain to include such measures would create incentives for coordination between hospitals and physicians to optimize the care they provide to Medicare beneficiaries and would increase alignment between the Hospital VBP and Physician VM Programs. Any future Hospital VBP Program measures would first be finalized for inclusion in the Hospital IQR Program and included on the Hospital Compare Web site for one year, as required by section 1886(o)(2)(C) of the Act.

As we discussed in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28123), the six episode-based standardized payment measures we are considering are similar in many ways to the NQF-endorsed MSPB measure already included in the Efficiency domain and, like the MSPB measure, Medicare payments included in these episode-based measures would be standardized according to the CMS standardization methodology finalized for the MSPB measure in the FY 2012 IPPS/LTCH PPS final rule (76 FR 51626). In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28123 through 28124), we also discussed notable differences between these new measures under consideration and the MSPB measure.

Most notably, we would only include Medicare payments for services that are clinically related to the health conditions treated during the hospital stay that triggered the episode. We stated that the aim of including these episode-based payment measures in the Hospital VBP Program would be to differentiate hospitals that provide care efficiently (that is, high quality care at a lower cost to Medicare) from those that do not. We stated our belief that risk-adjusted standardized Medicare payments are an appropriate indicator of efficiency because they allow us to compare hospitals without regard to such factors as geography and teaching status. This comparison is particularly important with clinically coherent episodes because it distinguishes the degree to which practice pattern variation influences the cost of care. We believe that creating incentives for appropriately reducing practice pattern variation is an important part of our aims to lower the cost of care appropriately and create better coordinated care for Medicare beneficiaries.

We noted another difference between the episode-based measures we are considering and the MSPB measure, which arises when, during the 30 days following discharge from an index admission, a beneficiary is readmitted for a condition that is clinically related to the index admission and that also triggers a new episode for an episode-based cost measure. Details of which admissions would begin a new episode and which would contribute to a preceding episode may be found at: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/hospital-value-based-purchasing/index.html?redirect=/hospital-value-based-purchasing.

We stated that we are considering three medical and three surgical episodes for potential inclusion in the initial expansion of the Efficiency domain. The medical episodes would address the following conditions: (1) Kidney/urinary tract infection; (2) cellulitis; and (3) gastrointestinal hemorrhage. A medical episode would be triggered by an inpatient claim with a specified MS-DRG. The surgical episodes currently under consideration are: (1) Hip replacement/revision; (2) knee replacement/revision; and (3) lumbar spine fusion/refusion. A surgical episode would be triggered when an inpatient claim has one of the specified MS-DRGs and at least one of the procedure codes specified for that episode. We welcomed public comment on the three medical and three surgical conditions that we are considering as new episode-based measures for initial expansion of the Efficiency domain.
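For illustration only, the triggering distinction described above can be sketched as follows; the MS-DRG and procedure code values are placeholders rather than the codes in the actual measure specifications posted on the CMS Web site.

```python
# Minimal sketch of the episode-trigger logic described above: a medical episode is
# triggered by a specified MS-DRG alone, while a surgical episode requires a specified
# MS-DRG plus at least one specified procedure code. All codes below are placeholders.
MEDICAL_EPISODE_DRGS = {
    "kidney_urinary_tract_infection": {"DRG_A", "DRG_B"},
    "cellulitis": {"DRG_C"},
    "gastrointestinal_hemorrhage": {"DRG_D"},
}
SURGICAL_EPISODE_SPECS = {
    "hip_replacement_revision": ({"DRG_E"}, {"PROC_1", "PROC_2"}),
    "knee_replacement_revision": ({"DRG_F"}, {"PROC_3"}),
    "lumbar_spine_fusion_refusion": ({"DRG_G"}, {"PROC_4"}),
}

def triggered_episodes(claim_ms_drg, claim_procedure_codes):
    """Return the episode names that an inpatient claim would trigger under this sketch."""
    episodes = [name for name, drgs in MEDICAL_EPISODE_DRGS.items() if claim_ms_drg in drgs]
    for name, (drgs, procedure_codes) in SURGICAL_EPISODE_SPECS.items():
        if claim_ms_drg in drgs and set(claim_procedure_codes) & procedure_codes:
            episodes.append(name)
    return episodes

# Example: a claim with the placeholder hip MS-DRG and a matching procedure code
# triggers only the hip replacement/revision episode.
print(triggered_episodes("DRG_E", ["PROC_2"]))  # ['hip_replacement_revision']
```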

Comment: A few commenters expressed support for one or more specific episodes, and some commenters suggested that CMS also consider adding additional measures to the domain in the future. One commenter supported the proposal to adopt a hip/knee replacement/revision measure in the future efficiency domain, as the episode would encourage care coordination. Some of those commenters who supported one or more of these episodes also expressed concerns.

Many commenters did not support inclusion of the episode-based standardized measures into the Hospital VBP Program. One commenter stated that the DRG triggers for urinary tract infection and cellulitis are often unrelated to an index inpatient admission. A few commenters also requested additional information on the measures CMS is considering.

Response: We appreciate the commenter's support of the hip and knee replacement/revision condition-specific measures.

Regarding the comment on the kidney/urinary tract infection and cellulitis episodes, we would like to clarify that these episodes are triggered only by the presence of a specified MS-DRG on an inpatient claim. Thus, the episodes can be initiated only when the kidney/urinary tract infection or cellulitis is the primary reason for the inpatient hospitalization.

With regard to the request for additional information, we note that we provided detailed measure specifications at http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/hospital-value-based-purchasing/index.html?redirect=/hospital-value-based-purchasing, and we reiterate that we would implement any future measures for the Hospital VBP Program by first proposing and finalizing them for inclusion in the Hospital IQR Program through notice and comment rulemaking.

We thank the commenters for the responses and we will consider them as we develop future measures for the Hospital VBP Program.

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28122 through 28123), we noted that there are a number of other types of episodes that could also meet the episode selection criteria we describe below, including those related to heart and lung (for example, heart failure and pneumonia). We stated that we are exploring data related to episodes for these types of conditions under the Physician VM Program. We welcomed comment regarding the applicability of episode-based measures for these or other conditions for future expansion of the Efficiency domain.

Comment: One commenter applauded CMS' consideration of condition-specific episode-based cost measures, and suggested that CMS consider focusing on additional high-impact conditions such as heart failure, stroke, and diabetes. The commenter also suggested that CMS attempt to identify geographic areas and hospitals where volume may be unduly high. Another commenter stated that, in FY 2017, CMS will be reporting Cost per Episode for pneumonia and heart failure through the Hospital IQR Program and was unclear why CMS is using different medical episodes here. Another commenter recommended that CMS consider the development and inclusion of additional measures outside of therapeutic areas already represented in the Hospital VBP Program, including measures relating to diabetes, atrial fibrillation, COPD, and oncology. Several commenters who supported the measures encouraged CMS to develop additional episodes, although these commenters did not identify specific episodes.

Response: We thank the commenters for their support of the six measures and the suggestions for additional high-impact conditions, and we will consider their suggestions in the future. Regarding the comment that pneumonia and heart failure episodes are reported in the Hospital IQR Program but are not among the six conditions proposed for potential inclusion in the Efficiency domain, the six measures were selected for common conditions using the five criteria discussed below. Other measures, such as pneumonia and heart failure, could be considered among the medical episodes for potential inclusion in the future. As stated earlier, we would first propose any future Hospital VBP Program measures for the Hospital IQR Program, through notice and comment rulemaking.

We thank the commenters for the responses and we will consider them as we develop future measures for the Hospital VBP Program.

In selecting the six conditions around which we would develop episode measures for future expansion of the Efficiency domain, we considered the following five criteria: (1) The condition constitutes a significant share of Medicare payments for hospitalized patients during and surrounding the hospital stay; (2) the degree to which clinical experts consulted for this project agree that standardized Medicare payments for services provided during the episode can be linked to the care provided during the hospitalization; (3) episodes of care for the condition are comprised of a substantial proportion of payments for post-acute care, indicating episode payment differences are driven by utilization outside of the MS-DRG payment; (4) episodes of care for the condition reflect high variation in post-discharge payments, enabling differentiation between hospitals; and, (5) the medical condition is managed by general medicine physicians or hospitalists and the surgical conditions are managed by surgical subspecialists, enabling comparison between similar practitioner types within each episode measure.

For analysis purposes, the five selection criteria were applied to 2012 Medicare acute inpatient hospital data in a hierarchical manner, to prioritize the inpatient conditions. After the selection criteria were applied, we narrowed the medical and surgical episodes to those episodes that are less complex, in order to allow CMS and hospitals to gain experience with this new measure type. Full details of the episode selection criteria are available on the CMS Web site at: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/hospital-value-based-purchasing/index.html?redirect=/hospital-value-based-purchasing. We welcomed public comments on the episode selection criteria we utilized.

Comment: Some commenters expressed support for the criteria. One commenter asked who is responsible for defining the episodes of care for cost management purposes.

Response: We thank the commenters for the support of the criteria. We have worked closely with clinicians and contractors experienced in health services research to develop the episode measure selection criteria and to define the episodes of care cost measures.

We thank the commenters for the responses and we will consider them as we develop future measures for the Hospital VBP Program.

Complete episode specifications, including the MS-DRG and ICD-9-CM procedure codes used to identify each of the episodes, details of episode construction methodology, and information on the clinical expert reviewers for this project are available on the CMS Web site at: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/hospital-value-based-purchasing/index.html?redirect=/hospital-value-based-purchasing. We welcomed public comments on these specifications and the construction of the six episode-based payment measures that we are considering.

Comment: A number of commenters, both among those who supported the measures and those who did not, expressed concern that the six cost measures do not assess quality or tie to existing quality measures. One commenter did not support the addition of six episode-based payment measures to the Efficiency domain in addition to the MSPB measure until a sufficient number of appropriate clinical outcome or clinical process measures related to these therapeutic areas are included in the program and have demonstrated high provider performance, and noted that the inclusion of cost measures without relevant quality measures could have the unintended consequence of sacrificing quality of care for the sake of cost reduction.

Response: As we take incremental steps towards providing all stakeholders with comprehensive metrics, we have selected condition-specific cost measures for common conditions with evidence of large variation in payments to encourage higher value care where there is the most opportunity for improvement, the greatest number of patients to benefit from improvements, and the largest sample size to ensure reliability. Regarding the comment that the measures under consideration do not account for quality, we continue to believe that it is beneficial to view a cost measure in light of other quality measures. As we stated in the FY 2012 IPPS/LTCH PPS final rule, for purposes of the Hospital VBP Program, we would weight and combine the Efficiency and Cost Reduction domain with the other domain scores in order to calculate each hospital's TPS. This ensures that any future spending measures would make up only a portion of the TPS and that the remainder would be based on hospitals' performance on the other quality measures (76 FR 51622). We continue to believe that measures of cost, viewed together with measures of quality, enable us to identify hospitals that provide high quality care at a lower cost to Medicare (77 FR 53586).

Comment: Several commenters expressed concern about the potentially small number of episodes, which leads to more random variation. These commenters expressed concern about the reliability of the proposed condition-specific cost measures. One commenter noted that there would be fewer observations for each condition than there would be for an all-condition measure, such as the MSPB measure, and expressed concern that this would result in more random variation without providing clear additional information about the average costliness of the hospitals' care. To ensure reliability, the commenter expressed the belief that it is important that the cost measures used should be as broadly based as possible. Another commenter expressed concern that CMS may not be able to reliably and validly calculate Hospital VBP improvement scores and recommended that CMS focus on achievement scores. This commenter suggested that condition-specific cost measures will split efficiency data into small pools of information that are more prone to random variation and inconclusive results.

Response: As we take incremental steps towards providing all stakeholders with comprehensive metrics, we have selected for potential future inclusion in the Efficiency domain condition-specific cost measures for common conditions with evidence of large variation in payments to encourage higher value care where there is the most opportunity for improvement, the greatest number of patients to benefit from improvements, and the largest sample size to ensure reliability. To further ensure reliability, inclusion of the condition-specific cost measures for individual hospitals would require a minimum number of cases, which would be based on statistical tests of reliability and would be proposed through future rulemaking.

We also note that commenters have previously suggested that we narrow the MSPB measure to condition-specific measures, and we responded in the FY 2012 IPPS/LTCH PPS final rule that we would consider adding condition-specific measures to the Efficiency domain through future rulemaking (76 FR 51623). As we stated in the FY 2015 IPPS/LTCH PPS proposed rule, we believe that these condition-specific, cost-and-outcome-measure groupings would allow patients and payers to make more fully informed comparisons of hospitals' performance. Including condition-specific cost measures would also provide hospitals with actionable feedback that would better assist them in targeting resources for improvements than would an overall cost measure alone.

Comment: A number of commenters expressed their concern that the episode measures, like the MSPB measure, include the cost of services that they perceived to be beyond hospitals' control, including post-acute care and readmissions. Commenters also expressed their concern that including post-acute care may skew measure results, due to including greater effects of patient comorbidities. Some commenters suggested that the measures would be more appropriate for inclusion in the Shared Savings Program or after they are implemented in the Physician Value Modifier (VM). Some commenters also suggested that the measures account for site of service choices made by beneficiaries.

Response: We disagree that Medicare payments for post-discharge services are beyond the influence of hospitals, and we believe that including post-acute care services in the episodes of care is important because it ensures that these high-cost services, for which alternative post-acute care options often vary widely in cost, are included in the overall condition-specific episode costs. Patient comorbidities that contribute to higher post-acute care costs are included in the risk adjustment models to address the concerns raised.

We agree that it is important to align incentives across CMS payment incentive programs. While these measures have not been proposed for inclusion in the Shared Savings Program or the Physician VM at present, they have been included in the Supplemental Quality and Resource Use Reports distributed to groups of 100 or more EPs in the summer of 2014, and we intend to continue to include them in these reports as they are disseminated to more groups of EPs, including solo practitioners, in the future. We would also consider proposing them for inclusion in the Physician VM and Shared Savings Programs through future rulemaking.

As we stated in the FY 2012 IPPS/LTCH PPS final rule (76 FR 51625), we do not believe that site of service adjustments are appropriate for spending measures, because such adjustments would undermine the ability of the measures to meaningfully capture differences in Medicare spending. However, we would consider the potential inclusion of site of service choice as we further examine the measure.

Comment: Some commenters suggested that CMS explore the Bundled Payments for Care Improvement (BPCI) initiative before these six potential measures are implemented. These commenters expressed concern that the measure specifications and episode construction rules were not aligned with the Bundled Payments for Care Improvement initiative, resulting in confusion among hospitals, and suggested that CMS consider this initiative before further pursuing these six episodes.

Response: We considered the BPCI methodology when we developed the episode-based payment measures we discuss in this rule. We believe the episodes included in the Hospital VBP Program should be more specific in their inclusion of clinically-related costs, because these measures would be publicly reported and used to evaluate hospitals and adjust their payments based on performance for specific conditions.

The BPCI approach (model 2) includes the inpatient hospital stay for the anchor MS-DRG and all related care covered under Medicare Part A and Part B within 30, 60, or 90 days following discharge from the acute care hospital. Unrelated services are not included in the BPCI episode. These excluded services can be found at http://innovation.cms.gov/Files/x/BPCI2-4_PartA-B_Exclusion.pdf. In contrast, the 6 condition-based episodes discussed in the proposed rule include all costs from the index admission and only clinically-related costs from Part A and B services occurring immediately before and after the index admission. Service costs may only be included in the condition-based episodes if they meet certain cost thresholds and are billed with select procedures, services, and/or diagnoses. In other words, the BPCI approach is designed to pay for an episode of care, which includes all relevant services for a set period of time. The six condition-based episodes proposed for potential future consideration are designed to support more targeted assessments of hospital performance by using the cost of major, clinically-related services in the post-discharge period as an indicator of a hospital's success in delivering clinically-relevant, high quality, and appropriate services during the index hospital admission.

Comment: Many commenters stated that the 6 condition-based episode measures under consideration did not risk adjust for sociodemographic factors and encouraged CMS to review its risk adjustment models. One commenter noted that lack of proper risk-adjustment for sociodemographic status could result in unintended negative consequences. Some commenters discussed the recent NQF draft report on the subject that suggested that measures take these factors into account.

Response: We refer readers to our earlier discussion of risk adjustment based on socioeconomic status with respect to the MSPB measure which also is relevant for these measures.

Comment: Many commenters stated that inclusion of the 6 measures would mean double counting the services that are already included in the MSPB measure, which is the only measure in the Efficiency domain. Some commenters suggested that if these measures are adopted for inclusion in the Efficiency domain, then they should replace, rather than supplement the MSPB.

Response: We disagree that inclusion of additional condition-specific measures in the Efficiency and Cost Control domain would inappropriately double count payments for episodes attributed to hospitals. Unlike the MSPB measure, the condition-specific cost measures only include costs from services/procedures related to the condition. These condition-specific, cost-and-outcome-measure groupings would allow patients and payers to make more fully informed comparisons of hospitals' performance.

Including condition-specific cost measures would also provide hospitals with actionable feedback that will better equip them to implement targeted improvements than would an overall cost measure alone. Relying on condition-specific measures alone, however, would disregard differences in overall cost. The MSPB-1 measure is reported as a ratio of the payment-standardized, risk-adjusted MSPB amount for each hospital divided by the weighted median MSPB amount across all hospitals. These six clinical episode measures, if adopted in the future, are intended to supplement the information provided by the MSPB measure. We note that, as mentioned above, commenters have previously suggested that we narrow the MSPB measure to condition-specific measures, and we responded in the FY 2012 IPPS/LTCH PPS final rule that we would consider adding condition-specific measures to the Efficiency domain through future rulemaking (76 FR 51623).
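For illustration only, the ratio described above can be sketched as follows. The sketch uses an unweighted median and hypothetical dollar amounts in place of the weighted median and the payment-standardized, risk-adjusted amounts calculated from claims under the measure specification.

```python
import statistics

# Illustrative sketch of the MSPB ratio described above: a hospital's payment-standardized,
# risk-adjusted MSPB amount divided by the median MSPB amount across hospitals. An unweighted
# median and hypothetical amounts are used here for simplicity.
def mspb_ratio(hospital_mspb_amount, all_hospital_mspb_amounts):
    return hospital_mspb_amount / statistics.median(all_hospital_mspb_amounts)

# Example: a hospital at $20,000 against a median of $19,000 has a ratio of about 1.05,
# indicating spending above the median.
print(round(mspb_ratio(20000, [17500, 19000, 21000]), 3))  # 1.053
```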

Comment: Several commenters noted that CMS should follow the MAP process and propose to include these measures in the Hospital IQR Program first, prior to inclusion in the Hospital VBP Program.

Response: Any future Hospital VBP Program measures would first be finalized for inclusion in the Hospital IQR Program and included on the Hospital Compare Web site for one year, as required by section 1886(o)(2)(C)(i) of the Act.

Comment: Several commenters, including those who supported the measure, requested additional information on the six measures. One commenter expressed concern about the proposed cost measures and recommended that the public have additional opportunity to review and comment on this proposal before CMS moves ahead on a hip/knee surgical episode under the Hospital VBP Program.

Response: The six episode cost measures have been designed specifically for the Medicare program using transparent methodology that is described in materials that are publicly available on the CMS hospital value-based purchasing Web site: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/hospital-value-based-purchasing/index.html. There will be additional opportunity to review and comment on this proposal before we would move ahead on any of the six episodes under consideration.

Comment: One commenter stated that readmissions for a condition clinically related to the index admission should not start a new index admission because it would be holding hospitals accountable twice for the care provided.

Response: The methodology of the condition-specific cost measures assigns separate significance to a readmission that is grouped to a related previous hospitalization and to the second episode that the readmission triggers for the same condition. These admissions would not necessarily be attributed to the same hospital, and counting the admission as a readmission within the previous hospitalization's episode while also allowing it to begin a second episode provides an opportunity for both managing hospitals to be evaluated.

We will consider the suggestion that readmissions not trigger new episodes, but we believe that it may be appropriate to begin a new episode in these cases, because Medicare payments made for the care provided during these subsequent hospitalizations represent a significant cost that would otherwise not be captured. We also note that the measures could be calculated similarly to the MSPB measure, where the total cost across episodes could be divided by the number of episodes, so that the amount would represent an average of the episode costs across multiple admissions.
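For illustration only, the averaging mentioned above can be sketched with hypothetical standardized episode payment amounts:

```python
# Simple illustration of the averaging described above: total episode costs divided by the
# number of episodes yields an average episode cost when multiple admissions trigger episodes.
episode_costs = [24000, 31000, 27500]  # hypothetical standardized episode payments
average_episode_cost = sum(episode_costs) / len(episode_costs)
print(average_episode_cost)  # 27500.0
```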

Comment: One commenter suggested that CMS focus on the development of a multi-dimensional patient-reported composite measure of maternity care in the near-term, which could be collected six weeks after birth to measure outcomes and identify common new-onset morbidities during a post-partum visit. Another commenter recommended the adaption of the generic Consumer Assessment of Healthcare Providers and Systems (CAHPS) survey to measure the experience of care of childbearing women and newborns.

Response: Patients admitted for maternity care are eligible for the HCAHPS Survey and comprise a significant portion of patients who report their experience of care. We are considering whether to extend the HCAHPS Survey to encompass the pediatric population; currently the HCAHPS Survey is oriented toward patients 18 years of age and older.

We thank the commenters for the responses and we will consider them as we develop future measures for the Hospital VBP Program.

7. Previously Adopted and Newly Finalized Performance Periods and Baseline Periods for the FY 2017 Hospital VBP Program

a. Background

Section 1886(o)(4) of the Act requires the Secretary to establish a performance period for the Hospital VBP Program for a fiscal year that begins and ends prior to the beginning of such fiscal year. We refer readers to the FY 2014 IPPS/LTCH PPS final rule (78 FR 50689 through 50692) and the CY 2014 OPPS/ASC final rule with comment period (78 FR 75020 through 75021) for the performance periods and baseline periods for the Clinical Care—Process, Patient Experience of Care, Clinical Care—Outcomes, and Efficiency and Cost Reduction domains for the FY 2016 Hospital VBP Program.

As discussed further below, in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50702 through 50704), we adopted new NQS-based quality domains for FY 2017, and in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28124 through 28125), we proposed to adopt performance and baseline periods using those new domains for the FY 2017 Hospital VBP Program.

b. Previously Adopted Baseline and Performance Periods for the FY 2017 Hospital VBP Program

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50692 through 50694 and 50698 through 50699), because of the time needed to process measure data for the three 30-day mortality measures (Clinical Care—Outcomes domain) and the PSI-90 measure (also referred to in previous rulemaking as the AHRQ patient safety PSI-90 composite measure) (Safety domain), and in consideration of our policy goal to collect enough data to generate the most reliable scores possible, we adopted performance periods and performance standards for the 30-day mortality measures for FY 2017, FY 2018, and FY 2019, and for the PSI-90 measure for FY 2017 and FY 2018.

c. Clinical Care—Process Domain Performance Period and Baseline Period for the FY 2017 Hospital VBP Program

In the FY 2014 IPPS/LTCH PPS final rule, we adopted a 12-month performance period for the FY 2016 Clinical Care—Process domain measures of CY 2014 (January 1, 2014 through December 31, 2014). We also adopted a corresponding 12-month baseline period of CY 2012 (January 1, 2012, through December 31, 2012), for purposes of calculating improvement points and performance standards.

Based on our review of FY 2013 and FY 2014 Hospital VBP performance period denominator data, we continue to believe that a 12-month performance period provides us with reliable and sufficient data for scoring Clinical Care—Process domain measures under the Hospital VBP Program. These data are available for public review on our Hospital Compare Web site. Therefore, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28124), we proposed to adopt a 12-month performance period for FY 2017 Clinical Care—Process domain measures (including the proposed PC-01 measure) of CY 2015 (January 1, 2015, through December 31, 2015). We also proposed to adopt a corresponding 12-month baseline period of CY 2013 (January 1, 2013, through December 31, 2013) for purposes of calculating improvement points and calculating performance standards.

We invited public comment on these proposals.

Comment: Many commenters supported CMS' proposed baseline and performance periods for FY 2017 measures in the Safety, Clinical Care—Process, and Efficiency and Cost Reduction domains.

Response: We thank the commenters for their support.

After consideration of the public comments we received, we are finalizing the FY 2017 Clinical Care—Process performance and baseline periods as proposed.

d. PEC/CC Domain Performance Period and Baseline Period for the FY 2017 Hospital VBP Program

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50689), we adopted a 12-month performance period for FY 2016 Patient Experience of Care domain measures of CY 2014, or January 1, 2014 through December 31, 2014, for the FY 2016 Hospital VBP Program. We also adopted a corresponding 12-month baseline period of CY 2012 (January 1, 2012 through December 31, 2012), for purposes of calculating improvement points and calculating performance standards. We continue to believe that a 12-month performance period provides us sufficient HCAHPS data on which to score hospital performance, which is an important goal both for CMS and for stakeholders.

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28124), we proposed to adopt a 12-month performance period for the FY 2017 PEC/CC domain of CY 2015 (January 1, 2015 through December 31, 2015). We also proposed to adopt a corresponding 12-month baseline period of CY 2013 (January 1, 2013 through December 31, 2013) for purposes of calculating improvement points and calculating performance standards.

We invited public comment on these proposals. However, we did not receive specific comments on the PEC/CC domain's performance period for FY 2017. Accordingly, we are finalizing the FY 2017 performance and baseline periods for the PEC/CC domain as proposed.

e. Performance Period and Baseline Period for NHSN Measures in the Safety Domain for the FY 2017 Hospital VBP Program

In the CY 2014 OPPS/ASC final rule with comment period (78 FR 75121), for the three NHSN HAI measures that we have adopted for the FY 2016 Hospital VBP Program (Catheter-Associated Urinary Tract Infection (CAUTI), CLABSI, and Surgical Site Infection (SSI)), we adopted an FY 2016 performance period of CY 2014 (January 1, 2014 through December 31, 2014), with a corresponding baseline period of CY 2012 (January 1, 2012 through December 31, 2012) for purposes of calculating improvement points and calculating performance standards.

We continue to believe that a 12-month performance period provides us with sufficient data on which to score hospital performance on NHSN measures in the Safety domain. We also noted that 12-month performance and baseline periods are consistent with the reporting periods used for these measures under the Hospital IQR Program (78 FR 50689). Therefore, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28124) for the FY 2017 NHSN measures in the Safety domain (including the proposed CLABSI, C. difficile Infection and MRSA Bacteremia measures), we proposed to adopt a performance period of CY 2015 (January 1, 2015 through December 31, 2015), and a corresponding baseline period of CY 2013 (January 1, 2013 through December 31, 2013) for purposes of calculating improvement points and calculating performance standards.

We invited public comment on these proposals.

Comment: One commenter supported the performance and baseline periods for the FY 2017 NHSN measures in the Safety domain but recommended collaboration with NHSN on the limitations of SIR analysis, especially for smaller facilities and those with lower volumes of device use (such as central lines and urinary catheters) or of surgical procedures. The commenter also expressed concern that an SIR may not be calculable for some hospitals, even over a 12-month block of time.

Response: We intend to continue working with CDC to ensure that reliable SIRs are calculated for participating hospitals.
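
For illustration only, the sketch below shows the general form of an SIR calculation: observed infections divided by the number predicted from baseline data, with no SIR reported when the predicted count is too small to support a stable ratio. The function name and the minimum-predicted threshold of 1.0 are illustrative assumptions, not NHSN specifications.

```python
def standardized_infection_ratio(observed, predicted, min_predicted=1.0):
    """Illustrative SIR: observed infections divided by the number predicted
    from baseline data. Returns None when the predicted count falls below the
    assumed reporting threshold, which is the situation low-volume hospitals
    may face."""
    if predicted < min_predicted:
        return None  # too few predicted infections to compute a stable SIR
    return observed / predicted

# Example: 3 observed infections against 4.2 predicted yields an SIR of ~0.71;
# a hospital with only 0.6 predicted infections would not receive an SIR.
print(standardized_infection_ratio(3, 4.2))  # 0.714...
print(standardized_infection_ratio(1, 0.6))  # None
```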

After consideration of the public comment we received, we are finalizing the FY 2017 performance period and baseline periods for the NHSN measures in the Safety domain as proposed.

f. Efficiency and Cost Reduction Domain Performance Period and Baseline Period for the FY 2017 Hospital VBP Program

In the FY 2014 IPPS/LTCH PPS final rule, we adopted a 12-month performance period for the MSPB measure for the FY 2016 Hospital VBP Program of CY 2014 (January 1, 2014, through December 31, 2014), with a corresponding baseline period of CY 2012 (January 1, 2012, through December 31, 2012). These performance and baseline periods enable us to collect sufficient measure data, while allowing time to calculate and incorporate MSPB measure data into the Hospital VBP Program scores in a timely manner.

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28124 through 28125), we proposed to adopt a 12-month performance period for the FY 2017 Efficiency and Cost Reduction domain of CY 2015 (January 1, 2015 through December 31, 2015), with a corresponding baseline period of CY 2013 (January 1, 2013 through December 31, 2013). We noted that this proposed performance and baseline period aligns with the performance and baseline periods for Clinical Care—Process, PEC/CC, and certain Safety measures under the new domain structure.

We invited public comments on these proposals.

Comment: Commenters supported CMS' proposed baseline and performance periods for FY 2017 measures in the Safety, Clinical Care—Process, and Efficiency and Cost Reduction domains.

Response: We thank the commenters for their support.

After consideration of the public comments we received, we are finalizing the FY 2017 performance and baseline periods for the Efficiency and Cost Reduction domain as proposed.

g. Summary of Previously Adopted and Newly Finalized Performance Periods and Baseline Periods for the FY 2017 Hospital VBP Program

The table below summarizes the newly finalized baseline and performance periods for the FY 2017 Hospital VBP Program (with previously adopted baseline and performance periods for the mortality and AHRQ PSI composite (PSI-90) measures noted).

Previously Adopted and Newly Finalized Performance and Baseline Periods for the FY 2017 Hospital VBP Program

Domain | Baseline period | Performance period
Safety:
• PSI-90* | October 1, 2010-June 30, 2012* | October 1, 2013-June 30, 2015.*
• NHSN (CAUTI, CLABSI, SSI, C. difficile Infection, MRSA Bacteremia) | January 1, 2013-December 31, 2013 | January 1, 2015-December 31, 2015.
Clinical Care—Outcomes:
• Mortality* (MORT-30-AMI, MORT-30-HF, MORT-30-PN) | October 1, 2010-June 30, 2012* | October 1, 2013-June 30, 2015.*
Clinical Care—Process:
• (AMI-7a, IMM-2, PC-01) | January 1, 2013-December 31, 2013 | January 1, 2015-December 31, 2015.
Efficiency and Cost Reduction (MSPB-1) | January 1, 2013-December 31, 2013 | January 1, 2015-December 31, 2015.
Patient and Caregiver-Centered Experience of Care/Care Coordination (HCAHPS) | January 1, 2013-December 31, 2013 | January 1, 2015-December 31, 2015.
* Previously adopted performance and baseline periods.

We note that we intend to propose additional baseline and performance periods for the FY 2018 Hospital VBP Program in future rulemaking.

8. Previously Adopted and Newly Finalized Performance Periods and Baseline Periods for Certain Measures for the FY 2019 Hospital VBP Program

a. Previously Adopted and Newly Finalized Performance Period and Baseline Period for the FY 2019 Hospital VBP Program for Clinical Care—Outcome Domain Measures

As described above, we have previously adopted the FY 2019 performance and baseline periods for the three 30-day mortality measures that we have adopted for the former Outcome domain and that we have since placed into the Clinical Care—Outcomes domain under the new domain structure.

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28121 through 28122), we proposed to adopt the THA/TKA measure for the FY 2019 Hospital VBP Program and to place that measure in the Clinical Care—Outcomes domain. THA/TKA is reported to the Hospital IQR Program for 36-month time periods. However, we do not believe that we can feasibly adopt both the measure for the FY 2019 Hospital VBP Program and a 36-month performance period for it. Based on the time needed to complete measure calculations and performance scoring, we believe that we must conclude the performance period for this measure by June 30, 2017. We believe that a 30-month performance period will result in sufficiently reliable quality measure data for purposes of Hospital VBP Program scoring, and our analysis of historic data supported our belief that comparisons between a 36-month baseline period and a 30-month performance period will not result in significant differences in measure scores. Further, adopting this proposed performance period would enable us to include the measure in the FY 2019 Hospital VBP Program, which would ensure that hospitals continue focusing on measures of outcomes under the Hospital VBP Program and that we continue transitioning the Hospital VBP Program from its initial focus on process measures to outcome measures.

We note that we have proposed below to adopt a 36-month performance period for the THA/TKA measure for the FY 2020 Hospital VBP Program. We have examined the correlation between hospitals' performance on the THA/TKA measure for 30-month and 36-month periods, and we believe that the 30-month period meets our standard for moderate reliability of quality measure data during the specified time period. However, as with the 30-day mortality and PSI-90 measures, we are attempting to align performance periods under the Hospital VBP Program with reporting periods under the Hospital IQR Program, while introducing measures covering important clinical topics into the Hospital VBP Program as quickly as possible. We believe that our proposal for a 30-month performance period for this measure for the FY 2019 Hospital VBP Program allows us to bring the measure into the program in FY 2019 and to accomplish that alignment beginning with the FY 2020 Hospital VBP Program.

Therefore, we proposed to adopt an FY 2019 performance period of January 1, 2015 through June 30, 2017 for the THA/TKA measure. Further, we proposed to adopt an FY 2019 baseline period for this measure of July 1, 2010 through June 30, 2013 for purposes of calculating performance standards and awarding improvement points.

We welcomed public comments on these proposals.

We did not receive any specific public comments on these proposals and are finalizing the FY 2019 performance and baseline periods for the THA/TKA measure as proposed.

b. Performance Period and Baseline Period for the PSI-90 Safety Domain Measure for the FY 2019 Hospital VBP Program

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50692 through 50694), we adopted performance periods and baseline periods for the PSI-90 measure for the FY 2017 and FY 2018 Hospital VBP Programs. We adopted this policy in light of the time needed to process measure data and our policy goal to collect enough data to generate the most reliable measure scores possible. We stated our belief that aligning the Hospital VBP Program performance periods with the Hospital IQR Program reporting period duration would allow hospitals to review Hospital Compare measure rates when they are updated and incorporate this information into their quality improvement efforts, rather than having to wait until the Hospital VBP Program provides its scoring reports to hospitals. We stated our further belief that aligning the Hospital IQR Program and the Hospital VBP Program in this manner will minimize the burden on participating hospitals by aligning the time periods during which they must monitor their performance on this measure.

We did not finalize a baseline period and performance period for the AHRQ PSI-90 measure for FY 2019 in that final rule (78 FR 50692 through 50694). We stated that, by declining to finalize the measure's FY 2019 performance and baseline periods in that final rule, we would be able to adopt a more recent baseline period than we initially proposed. We stated that we intended to propose baseline and performance periods for the AHRQ PSI measure for the FY 2019 Hospital VBP Program in future rulemaking.

We continue to believe that we should adopt performance and baseline periods of 24 months for the PSI-90 measure. Therefore, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28126) we proposed to adopt an FY 2019 performance period for the PSI-90 measure of July 1, 2015 through June 30, 2017, with a corresponding 24-month baseline period of July 1, 2011 through June 30, 2013, for purposes of calculating performance standards and awarding improvement points.

We welcomed public comments on these proposals. However, we did not receive any specific public comments on this proposal and are finalizing the FY 2019 performance and baseline periods for the PSI-90 measure as proposed.

c. Summary of Previously Adopted and Newly Finalized Performance Periods and Baseline Periods for Certain Measures for the FY 2019 Hospital VBP Program

The following table summarizes previously adopted and proposed performance and baseline periods for the FY 2019 Hospital VBP Program:

Previously Adopted and Newly Finalized Performance and Baseline Periods for Certain Measures for the FY 2019 Hospital VBP Program

Domain | Baseline period | Performance period
Safety:
• PSI-90 | July 1, 2011-June 30, 2013 | July 1, 2015-June 30, 2017.
Clinical Care—Outcomes:
• Mortality* (MORT-30-AMI, MORT-30-HF, MORT-30-PN) | July 1, 2009-June 30, 2012* | July 1, 2014-June 30, 2017.*
• THA/TKA | July 1, 2010-June 30, 2013 | January 1, 2015-June 30, 2017.
* Previously adopted performance and baseline periods.

9. Performance Period and Baseline Period for the Clinical Care—Outcomes Domain for the FY 2020 Hospital VBP Program

As described above with respect to the mortality measures, in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50692 through 50694), we adopted performance periods and baseline periods for the three 30-day mortality measures for the FY 2017, FY 2018, and FY 2019 Hospital VBP Programs. We adopted this policy in light of the time needed to process measure data and to ensure that we collect enough measure data for reliable performance scoring, as described further above. We continue to believe that we should adopt 36-month performance and baseline periods for the mortality measures whenever the program's scoring timeline can accommodate periods of that length.

We believe that a similar rationale applies to the new THA/TKA measure that we proposed to adopt for the Clinical Care—Outcomes domain for the FY 2019 Hospital VBP Program, and which, under our policy of measure readoption, we generally would readopt for the FY 2020 Hospital VBP Program if finalized. As stated above, we have examined the correlation between hospitals' performance on the THA/TKA measure for 30-month and 36-month periods, and we believe that the 30-month period meets our standard for moderate reliability of quality measure data during the specified time period. However, as with the 30-day mortality and PSI-90 measures, we are attempting to align performance periods under the Hospital VBP Program with reporting periods under the Hospital IQR Program, while introducing measures covering important clinical topics into the program as quickly as possible. We believe that our proposal for a 30-month performance period for this measure for FY 2019 allows us to accomplish that alignment beginning with the FY 2020 Program.

Therefore, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28126 through 28127) we proposed to adopt a 36-month performance period for the measures in the Clinical Care—Outcomes domain in the FY 2020 Hospital VBP Program (including the proposed THA/TKA measure for FY 2020, if that measure is adopted for the FY 2020 Hospital VBP Program) of July 1, 2015 through June 30, 2018, with a corresponding 36-month baseline period of July 1, 2010 through June 30, 2013, for purposes of calculating performance standards and awarding improvement points.

We welcomed public comment on these proposals.

We did not receive any specific public comments on this proposal and are finalizing the FY 2020 performance and baseline periods for the Clinical Care—Outcomes domain as proposed.

The following table summarizes the finalized performance and baseline period for the Clinical Care—Outcomes domain for the FY 2020 Hospital VBP Program:

Performance and Baseline Period for the Clinical Care—Outcomes Domain for the FY 2020 Hospital VBP Program

Domain | Baseline period | Performance period
Clinical Care—Outcomes:
• Mortality (MORT-30-AMI, MORT-30-HF, MORT-30-PN) | July 1, 2010-June 30, 2013 | July 1, 2015-June 30, 2018.
• THA/TKA | July 1, 2010-June 30, 2013 | July 1, 2015-June 30, 2018.

10. Performance Standards for the Hospital VBP Program

a. Background

Section 1886(o)(3)(A) of the Act requires the Secretary to establish performance standards for the measures selected under the Hospital VBP Program for a performance period for the applicable fiscal year. The performance standards must include levels of achievement and improvement, as required by section 1886(o)(3)(B) of the Act, and must be established and announced not later than 60 days before the beginning of the performance period for the fiscal year involved, as required by section 1886(o)(3)(C) of the Act. We refer readers to the Hospital Inpatient VBP Program final rule (76 FR 26511 through 26513) for further discussion of achievement and improvement standards under the Hospital VBP Program.

In addition, when establishing the performance standards, section 1886(o)(3)(D) of the Act requires the Secretary to consider appropriate factors, such as: (1) practical experience with the measures, including whether a significant proportion of hospitals failed to meet the performance standard during previous performance periods; (2) historical performance standards; (3) improvement rates; and (4) the opportunity for continued improvement.
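
For illustration only, the following sketch shows one way an achievement threshold and a benchmark could be derived from a distribution of hospital baseline-period rates, reflecting the general approach in the cited Hospital Inpatient VBP Program final rule as we read it (threshold at the median of hospital performance, benchmark at the mean of the best-performing decile). The function name and sample rates are hypothetical, and the sketch is not the program's operational methodology.

```python
import statistics

def performance_standards(baseline_rates, higher_is_better=True):
    """Illustrative derivation of an achievement threshold (median of hospital
    baseline-period rates) and a benchmark (mean of the best-performing decile)
    for a single measure. Sample logic only."""
    rates = sorted(baseline_rates, reverse=higher_is_better)  # best rates first
    threshold = statistics.median(rates)
    best_decile = rates[:max(1, len(rates) // 10)]
    benchmark = statistics.mean(best_decile)
    return threshold, benchmark

# Hypothetical immunization-style rates for 20 hospitals (higher is better).
rates = [0.91, 0.93, 0.95, 0.96, 0.97, 0.97, 0.98, 0.98, 0.98, 0.99,
         0.99, 0.99, 0.99, 1.00, 1.00, 1.00, 1.00, 1.00, 1.00, 1.00]
print(performance_standards(rates))  # (0.99, 1.0)
```

Passing higher_is_better=False would place the lowest rates first, so that the benchmark becomes the mean of the lowest decile, which mirrors how standards are oriented for measures on which lower values represent better performance.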

b. Performance Standards for the FY 2016 Hospital VBP Program

In the FY 2013 IPPS/LTCH PPS final rule (77 FR 53599 through 53604), we adopted performance standards for FY 2015 and certain FY 2016 Hospital VBP Program measures. We also finalized our policy to update performance periods and performance standards for future Hospital VBP Program years via notice on the CMS Web site or another publicly available Web site.

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50694 through 50698), we revised our regulatory definitions of “achievement threshold” and “benchmark” at 42 CFR 412.160 and adopted performance standards for additional FY 2016 Hospital VBP Program measures. We also adopted an interpretation of “achievement threshold” and “benchmark” under section 412.160 to not include the numerical values that result when the performance standards are calculated. We further adopted a policy under which we may update a measure's performance standards for a fiscal year once if we identify data issues, calculation errors, or other problems that would significantly affect the displayed performance standards. We refer readers to the FY 2014 IPPS/LTCH PPS final rule (78 FR 50695 through 50698) for the complete set of FY 2016 performance standards.

c. Previously Adopted Performance Standards for the FY 2017, FY 2018, and FY 2019 Hospital VBP Programs

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50698 through 50699), we adopted performance standards for the three 30-day mortality measures for the FY 2017, FY 2018, and FY 2019 Hospital VBP Programs and for the PSI-90 measure for the FY 2017 and FY 2018 Hospital VBP Programs. We refer readers to that final rule for those performance standards.

d. Additional Performance Standards for the FY 2017 Hospital VBP Program

In accordance with our finalized methodology for calculating performance standards (discussed more fully in the Hospital Inpatient VBP Program final rule (76 FR 26511 through 26513)), in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28127 through 28128) we proposed to adopt the following additional performance standards for the FY 2017 Hospital VBP Program. We note that the numerical values for the performance standards displayed below represent estimates based on the most recently available data, and we intend to update the numerical values in the FY 2015 IPPS/LTCH PPS final rule. We note further that the MSPB measure's performance standards are based on performance period data; therefore, we are unable to provide numerical equivalents for the standards at this time.

We note further that the performance standards for the NHSN measures (CAUTI, SSI, and the proposed CLABSI, MRSA Bacteremia, and C. difficile Infection measures), the PSI-90 measure, and the MSPB measure are calculated with lower values representing better performance, in contrast to other measures, for which higher values indicate better performance. As discussed further in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50684), the performance standards for SSI are computed separately for each measure stratum. We will award achievement and improvement points to each stratum separately and then compute a weighted average of the points awarded to each stratum, weighted by predicted infections. We note that we misstated the PC-01 measure's benchmark in the proposed rule and have corrected that error in the table below.
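
As an illustration of the stratum-level weighting step described above, the sketch below averages the points awarded to the two SSI strata, weighting each stratum by its predicted infections. The point values and predicted counts are hypothetical.

```python
def weighted_ssi_points(strata):
    """Illustrative weighted average of SSI stratum points, weighted by
    predicted infections. `strata` maps stratum name -> (points, predicted)."""
    total_predicted = sum(pred for _, pred in strata.values())
    return sum(points * pred for points, pred in strata.values()) / total_predicted

# Hypothetical: 7 points on the colon stratum (4.0 predicted infections) and
# 4 points on the abdominal hysterectomy stratum (1.0 predicted infection).
strata = {"Colon": (7, 4.0), "Abdominal Hysterectomy": (4, 1.0)}
print(weighted_ssi_points(strata))  # 6.4
```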

Previously Adopted and Proposed Performance Standards for the FY 2017 Hospital VBP Program: Safety, Clinical Care—Outcomes, Clinical Care—Process, and Efficiency and Cost Reduction Measures

Measure ID | Description | Achievement threshold | Benchmark
Safety Measures
CAUTI | Catheter-Associated Urinary Tract Infection | 0.8371 | 0.0000.
CLABSI | Central Line-Associated Blood Stream Infection | 0.4483 | 0.0000.
C. difficile | Clostridium difficile Infection | 0.7927 | 0.0000.
MRSA Bacteremia | Methicillin-Resistant Staphylococcus aureus Bacteremia | 0.8613 | 0.0000.
PSI-90* | Complication/patient safety for selected indicators (composite) | *0.577321 | *0.397051.
SSI | Surgical Site Infection
• Colon | | 0.7117 | 0.0000.
• Abdominal Hysterectomy | | 0.7509 | 0.0000.
Clinical Care—Outcomes Measures
MORT-30-AMI* | Acute Myocardial Infarction (AMI) 30-day mortality rate | *0.851458 | *0.871669.
MORT-30-HF* | Heart Failure (HF) 30-day mortality rate | *0.881794 | *0.903985.
MORT-30-PN* | Pneumonia (PN) 30-day mortality rate | *0.882986 | *0.908124.
Clinical Care—Process Measures
AMI-7a | Fibrinolytic Therapy Received Within 30 Minutes of Hospital Arrival | 0.954545 | 1.000000.
IMM-2 | Influenza Immunization | 0.995882 | 1.000000.
PC-01 | Elective Delivery Prior to 39 Completed Weeks Gestation | 0.031250 | 0.000000.
Efficiency and Cost Reduction Measure
MSPB-1 | Medicare Spending per Beneficiary | Median Medicare Spending per Beneficiary ratio across all hospitals during the performance period | Mean of the lowest decile Medicare Spending per Beneficiary ratios across all hospitals during the performance period.
* Previously adopted performance standards.

Proposed Performance Standards for the FY 2017 Hospital VBP Program Patient and Caregiver-Centered Experience of Care/Care Coordination Domain

HCAHPS survey dimension | Floor (percent) | Achievement threshold (percent) | Benchmark (percent)
Communication with Nurses | 56.90 | 78.08 | 86.41
Communication with Doctors | 62.03 | 80.43 | 88.71
Responsiveness of Hospital Staff | 36.46 | 64.83 | 79.62
Pain Management | 49.47 | 70.20 | 78.18
Communication about Medicines | 42.89 | 62.82 | 73.15
Hospital Cleanliness & Quietness | 43.46 | 65.26 | 79.06
Discharge Information | 61.86 | 85.59 | 91.04
Overall Rating of Hospital | 35.00 | 69.81 | 84.27

We note that we intend to propose additional performance standards for the FY 2018 Hospital VBP Program in future rulemaking.

We welcomed public comments on these proposed performance standards.

We did not receive any specific public comments on the proposed performance standards. We are therefore finalizing the FY 2017 performance standards as outlined below.

Set out below are the updated numerical values for the performance standards. As with the NHSN measures and the PSI-90 measure, we note that better performance on the PC-01 measure is represented by lower numerical values.

Previously Adopted and Newly Finalized Performance Standards for the FY 2017 Hospital VBP Program: Safety, Clinical Care—Outcomes, Clinical Care—Process, and Efficiency and Cost Reduction Measures

Measure ID | Description | Achievement threshold | Benchmark
Safety Measures
CAUTI | Catheter-Associated Urinary Tract Infection | 0.845 | 0.000.
CLABSI | Central Line-Associated Blood Stream Infection | 0.457 | 0.000.
C. difficile | Clostridium difficile Infection | 0.750 | 0.000.
MRSA Bacteremia | Methicillin-Resistant Staphylococcus aureus Bacteremia | 0.799 | 0.000.
PSI-90* | Complication/patient safety for selected indicators (composite) | *0.577321 | *0.397051.
SSI | Surgical Site Infection
• Colon | | 0.751 | 0.000.
• Abdominal Hysterectomy | | 0.698 | 0.000.
Clinical Care—Outcomes Measures
MORT-30-AMI* | Acute Myocardial Infarction (AMI) 30-day mortality rate | *0.851458 | *0.871669.
MORT-30-HF* | Heart Failure (HF) 30-day mortality rate | *0.881794 | *0.903985.
MORT-30-PN* | Pneumonia (PN) 30-day mortality rate | *0.882986 | *0.908124.
Clinical Care—Process Measures
AMI-7a | Fibrinolytic Therapy Received Within 30 Minutes of Hospital Arrival | 0.954545 | 1.000000.
IMM-2 | Influenza Immunization | 0.951607 | 0.997739.
PC-01 | Elective Delivery Prior to 39 Completed Weeks Gestation | 0.031250 | 0.000000.
Efficiency and Cost Reduction Measure
MSPB-1 | Medicare Spending per Beneficiary | Median Medicare Spending per Beneficiary ratio across all hospitals during the performance period | Mean of the lowest decile Medicare Spending per Beneficiary ratios across all hospitals during the performance period.
* Previously adopted performance standards.

Performance Standards for the FY 2017 Hospital VBP Program Patient and Caregiver-Centered Experience of Care/Care Coordination Domain

HCAHPS survey dimension | Floor (percent) | Achievement threshold (percent) | Benchmark (percent)
Communication with Nurses | 58.14 | 78.19 | 86.61
Communication with Doctors | 63.58 | 80.51 | 88.80
Responsiveness of Hospital Staff | 37.29 | 65.05 | 80.01
Pain Management | 49.53 | 70.28 | 78.33
Communication about Medicines | 41.42 | 62.88 | 73.36
Hospital Cleanliness & Quietness | 44.32 | 65.30 | 79.39
Discharge Information | 64.09 | 85.91 | 91.23
Overall Rating of Hospital | 35.99 | 70.02 | 84.60

e. Performance Standards for the FY 2019 and FY 2020 Hospital VBP Programs

As discussed further above, we have adopted certain Safety and Clinical Care—Outcomes domain measures for future program years in order to ensure that we can adopt performance periods and baseline periods of sufficient length for performance scoring purposes. In the FY 2015 IPPS/LTCH PPS proposed rule, we also proposed to adopt the PSI-90 measure in the Safety domain and the THA/TKA measure in the Clinical Care—Outcomes domain for the FY 2019 Hospital VBP Program. We note that, as described above with respect to the NHSN, PSI-90, and MSPB measures, better performance is represented by lower values for the THA/TKA measure. Therefore, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28128 through 28129) we proposed to adopt the following performance standards for the FY 2019 Hospital VBP Program:

Previously Adopted and Proposed Performance Standards for Certain Safety and Clinical Care—Outcomes Domain Measures for the FY 2019 Hospital VBP Program

Measure ID | Description | Achievement threshold | Benchmark
Safety Measures
PSI-90 | Complication/patient safety for selected indicators (composite) | 0.840421 | 0.589716
Outcomes Measures
MORT-30-AMI* | Acute Myocardial Infarction (AMI) 30-day mortality rate | *0.850671 | *0.873263
MORT-30-HF* | Heart Failure (HF) 30-day mortality rate | *0.883472 | *0.908094
MORT-30-PN* | Pneumonia (PN) 30-day mortality rate | *0.882334 | *0.907906
THA/TKA | Hospital-level Risk-Standardized Complication Rate Following Elective Primary Total Hip Arthroplasty (THA) And/Or Total Knee Arthroplasty (TKA) | 0.032521 | 0.022895
* Previously adopted performance standards.

We welcomed public comments on these proposed performance standards.

We did not receive any specific public comments on this proposal and are finalizing the FY 2019 performance standards as outlined below.

Set out below are the updated numerical values for the FY 2019 performance standards. We note that, as described above with respect to the NHSN, PSI-90, and MSPB measures, better performance is represented by lower values for the THA/TKA measure.

Previously Adopted and Newly Finalized Performance Standards for Certain Safety and Clinical Care—Outcomes Domain Measures for the FY 2019 Hospital VBP Program

Measure ID | Description | Achievement threshold | Benchmark
Safety Measures
PSI-90 | Complication/patient safety for selected indicators (composite) | 0.840335 | 0.589462
Outcomes Measures
MORT-30-AMI | Acute Myocardial Infarction (AMI) 30-day mortality rate | *0.850671 | *0.873263
MORT-30-HF | Heart Failure (HF) 30-day mortality rate | *0.883472 | *0.908094
MORT-30-PN | Pneumonia (PN) 30-day mortality rate | *0.882334 | *0.907906
THA/TKA | Hospital-level Risk-Standardized Complication Rate Following Elective Primary Total Hip Arthroplasty (THA) And/Or Total Knee Arthroplasty (TKA) | 0.032229 | 0.023178
* Previously adopted performance standards.

We also proposed to adopt the following performance standards for the FY 2020 Hospital VBP Program:

Proposed Performance Standards for Clinical Care—Outcomes Domain Measures for the FY 2020 Hospital VBP Program

Measure ID | Description | Achievement threshold | Benchmark
Clinical Care—Outcomes Measures
MORT-30-AMI | Acute Myocardial Infarction (AMI) 30-day mortality rate | 0.853511 | 0.875840
MORT-30-HF | Heart Failure (HF) 30-day mortality rate | 0.881394 | 0.905962
MORT-30-PN | Pneumonia (PN) 30-day mortality rate | 0.882281 | 0.909460
THA/TKA | Hospital-level Risk-Standardized Complication Rate Following Elective Primary Total Hip Arthroplasty (THA) And/Or Total Knee Arthroplasty (TKA) | 0.032521 | 0.022895

We welcomed public comments on these proposed performance standards.

We did not receive any specific public comments on this proposal and are finalizing the FY 2020 performance standards as outlined below.

Set out below are the updated numerical values for the FY 2020 performance standards. We note that, as described above with respect to the NHSN, PSI-90, and MSPB measures, better performance is represented by lower values for the THA/TKA measure.

Performance Standards for Clinical Care—Outcomes Domain Measures for the FY 2020 Hospital VBP Program

Measure ID | Description | Achievement threshold | Benchmark
Clinical Care—Outcomes Measures
MORT-30-AMI | Acute Myocardial Infarction (AMI) 30-day mortality rate | 0.853715 | 0.875869
MORT-30-HF | Heart Failure (HF) 30-day mortality rate | 0.881090 | 0.906068
MORT-30-PN | Pneumonia (PN) 30-day mortality rate | 0.882266 | 0.909532
THA/TKA | Hospital-level Risk-Standardized Complication Rate Following Elective Primary Total Hip Arthroplasty (THA) And/Or Total Knee Arthroplasty (TKA) | 0.032229 | 0.023178

f. Technical Updates Policy for Performance Standards

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50694 through 50698), we revised our regulatory definitions of “achievement threshold” and “benchmark” at 42 CFR 412.160 and adopted performance standards for additional FY 2016 Hospital VBP Program measures. We also adopted an interpretation of “achievement threshold” and “benchmark” under section 412.160 to not include the numerical values that result when the performance standards are calculated. We further adopted a policy under which we may update a measure's performance standards for a fiscal year once if we identify data issues, calculation errors, or other problems that would significantly change the displayed performance standards.

Our historic practice has been to display Hospital VBP Program performance standards' numerical values in rulemaking. We adopted this practice for the convenience of the public. Although we have typically expressed the performance standards for each Hospital VBP measure as a numerical value prior to the start of the performance period for that measure, we do not display numerical values for the MSPB measure because the measure is constructed as a measure of costs attributable to patient care during a specified episode of care during the performance period itself (77 FR 53601). We have stated that with respect to the MSPB measure, we do not believe it is helpful for hospitals to be compared against performance standards constructed from baseline period data given the potential changes in market forces and utilization practices that occur over time.
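
For illustration only, the sketch below computes the two MSPB performance standards described above from a set of hospital MSPB ratios for a performance period: the achievement threshold as the median ratio and the benchmark as the mean of the lowest-decile ratios (lower spending ratios represent better performance). The ratios and the function name are hypothetical.

```python
import statistics

def mspb_standards(mspb_ratios):
    """Illustrative MSPB performance standards from performance-period data:
    achievement threshold = median ratio across hospitals;
    benchmark = mean of the lowest-decile ratios (lower is better)."""
    ratios = sorted(mspb_ratios)
    threshold = statistics.median(ratios)
    lowest_decile = ratios[:max(1, len(ratios) // 10)]
    benchmark = statistics.mean(lowest_decile)
    return threshold, benchmark

# Hypothetical MSPB ratios for 10 hospitals.
ratios = [0.87, 0.91, 0.94, 0.96, 0.98, 1.00, 1.02, 1.05, 1.08, 1.12]
print(mspb_standards(ratios))  # (0.99, 0.87)
```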

Further, during the long interval between the time we first display the performance standards for all measures but the MSPB measure and the time that we calculate the achievement and improvement scores for those measures based on actual hospital performance, one or more of those measures might have been technically updated in a way that inhibits our ability to ensure that we are making appropriate comparisons between the baseline and performance period. For example, the software used to calculate the PSI-90 measure is regularly updated to incorporate coding changes, refinements based on the consensus development process, and refinements to improve specificity and sensitivity. The statistical modeling we use to adjust measure calculations for PSI-90 and HCAHPS also needs to be periodically updated to incorporate coefficient factors that more properly account for patient mix (both measures) and the HCAHPS survey data collection mode (HCAHPS survey). These types of technical updates do not substantively affect the measure rate calculation methodology, but they do sometimes affect our ability to make appropriate comparisons between the baseline and performance period if, for example, the baseline performance standards are tabulated using one version of the software and hospital performance during subsequent performance periods is tabulated with another version. We believe that in order to make the most accurate comparison of hospital performance across time, we should use the most updated version of the measure that is available at the time we calculate that performance because the updated version will produce the most valid measure rates.

Further, as part of its regular maintenance process for NQF-endorsed performance measures, NQF requires measure stewards to submit annual measure maintenance updates and undergo maintenance of endorsement review every 3 years. In the measure maintenance process, the measure steward (owner/developer) is responsible for updating and maintaining the currency and relevance of the measure and will confirm existing or minor specification changes with NQF on an annual basis. NQF solicits information from measure stewards for annual reviews, and it reviews measures for continued endorsement in a specific 3-year cycle.

The NQF's annual or triennial maintenance processes for endorsed measures may result in the NQF requiring updates to the measures in order to maintain endorsement status. We believe that it is important to incorporate nonsubstantive updates required by the NQF, as well as nonsubstantive updates made to other measures, into the measure specifications we have adopted for the Hospital VBP Program so that these measures remain up-to-date and ensure that we make fair comparisons between the performance and baseline periods that we adopt under the program. We also recognize that some updates to measures are substantive in nature and might not be appropriate for adoption without further rulemaking.

With respect to what constitutes substantive versus nonsubstantive changes to measures, we would make this determination on a case-by-case basis. Examples of nonsubstantive changes to measures might include updated diagnosis or procedure codes, medication updates for categories of medications, broadening of age ranges, and exclusions for a measure (such as the addition of a hospice exclusion to the 30-day mortality measures). We believe that nonsubstantive changes may include updates to measures based upon changes to guidelines upon which the measures are based.

Therefore, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28129 through 28130) we proposed to amend the definition of “performance standards” under section 412.160 to enable us to update performance standards' numerical values to incorporate nonsubstantive technical updates that are made to Hospital VBP Program measures between the time that they are adopted for a particular program year and the time that we actually calculate hospital performance on those measures after the performance period for the program year has concluded. Further, we proposed to inform hospitals of these technical updates through postings on our Hospital VBP Program Web site, the QualityNet Web site, other educational outreach efforts, and/or the scoring reports that we provide for each program year. We noted that these proposals, if finalized, may have the effect of superseding the performance standards that we establish prior to the start of the performance period for the affected measures, but we believe them to be necessary to ensure that the performance standards in the Hospital VBP Program's scoring calculations enable the fairest comparisons between performance measured during the baseline period and performance period.

We would continue to use rulemaking to adopt substantive updates to the measures we have adopted for the Hospital VBP Program. Examples of changes that we might consider to be substantive would be those in which the changes are so significant that the measure is no longer the same measure or when a standard of performance assessed by a measure becomes more stringent (that is, changes in acceptable timing of medication, procedure/process, or test administration). We also noted that the NQF process incorporates an opportunity for public comment and engagement in the measure maintenance process.

We also proposed to include in our revised definition of “performance standards” under section 412.160 of our regulations the policy we adopted in the FY 2014 IPPS/LTCH PPS final rule to update the performance standards once if we identify data issues, calculation errors, or other problems that would significantly change the standards (78 FR 50695). We proposed to make this change so that our policies governing updates to the performance standards appear together.

We welcomed public comments on these proposals. We also specifically sought public comments on what we should consider to be substantive changes in measures' performance standards, including whether or not we should consider certain changes in performance standards as a result of technical or nonsubstantive updates to be substantive.

Comment: Several commenters opposed CMS' proposal to adopt technical updates for performance standards, stating that there is no reason we cannot use the public notice and technical corrections process to disseminate changes in performance standards to stakeholders. Some commenters stated that not all stakeholders have access to QualityNet to receive the updates that CMS proposed. One commenter noted that changing performance standards targets with more than annual frequency would undermine hospitals' ability to implement performance improvement efforts. Commenters noted that the Hospital VBP Program is designed to hold hospitals accountable for their performance during a specified time period based on standards that are published before that performance period begins.

One commenter recommended that CMS apply changes in the risk adjustment system only when a new performance standard is published and then use those same updates when performance is measured for the performance period. The commenter reasoned that, if the changes are indeed “nonsubstantive,” as the proposed regulatory text would specify, delaying the application of such updates should not be detrimental to the Hospital VBP Program. The commenter expressed concern that changes could be made that result both in different hospital performance and in a different performance standard, which would eliminate the usefulness of the minimal amount of information currently available to hospitals on these measures.

Response: We disagree with the recommendation to subject all measure changes to notice-and-comment rulemaking. As previously noted in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50776), we believe that the Hospital IQR Program's policy on maintaining the technical specifications of quality measures also is applicable to the Hospital VBP Program. We believe this policy adequately balances our need to incorporate nonsubstantive NQF updates to NQF-endorsed measures in the most expeditious manner possible, while preserving the public's ability to comment on updates that so fundamentally change an endorsed measure that it is no longer the same measure that we originally adopted. We also note that the NQF process incorporates an opportunity for public comment and engagement in the measure maintenance process. These policies regarding what is considered substantive versus nonsubstantive apply to all measures in the Hospital IQR Program and the Hospital VBP Program, and we believe the same standard applies when determining what should be considered substantive changes to performance standards.

We believe that it is of paramount importance that the performance standards we adopt accurately reflect hospitals' actual performance during the baseline period. We view our technical updates policy as a means to ensure that accuracy and to ensure that the program scores hospitals based on performance standards that reflect the actual provision of care in hospitals around the country.

With respect to commenters' concerns that we may update performance standards more than annually, we are aware that updates may have unintended consequences on hospitals' quality improvement efforts. We do not intend to make updates to performance standards except to improve the standards' accuracy and to ensure that the numerical values that we display for hospitals accurately reflect hospitals' performance during the baseline period, as applicable. In addition, with respect to commenters' suggestion that delays to performance standards updates would not be detrimental to the Hospital VBP Program, we disagree. We believe that we must provide hospitals with as much accurate information as is possible so that they may develop and implement quality improvement policies. We do not believe it would be helpful to hospitals for us to delay publishing a technical update on the basis that the update will not significantly affect performance.

We note further that we do not intend to limit any updates made to performance standards using this authority to QualityNet accountholders. We intend to publish any changes made under this policy on the public QualityNet Web site and through our Hospital VBP Program listserv entitled, “Hospital Inpatient Value-Based Purchasing (HVBP) and Improvement,” available under the notifications and discussions link on our home page.

Comment: One commenter suggested that CMS should define in specific terms what should constitute a “substantive” versus a “nonsubstantive” update to the Hospital VBP Program performance standards before adopting the authority to make technical updates. The commenter further stated that CMS should be as transparent with stakeholders as possible about these changes, noting that midstream updates could have profound impacts on hospitals' performance under Hospital VBP Program.

Response: With respect to what constitutes substantive versus nonsubstantive changes, we expect to make this determination on a case-by-case basis. Examples of nonsubstantive changes to measures might include updated diagnosis or procedure codes, medication updates for categories of medications, broadening of age ranges, and exclusions for a measure (such as the addition of a hospice exclusion to the 30-day mortality measures). We believe that nonsubstantive changes may include updates to NQF-endorsed measures based upon changes to guidelines upon which the measures are based. However, as commenters have requested, we intend to be as transparent as possible with stakeholders about any technical updates that we would adopt, including the rationale for any such updates and their effects on finalized performance standards.

We will continue to use rulemaking to adopt substantive updates to the measures we have adopted for the Hospital VBP Program. Examples of changes that we might consider to be substantive would be those in which the changes are so significant that the measure is no longer the same measure, or those in which a standard of performance assessed by a measure becomes more stringent (for example, changes in the acceptable timing of medication, procedure/process, or test administration). Another example of a substantive change would be where the NQF has extended its endorsement of a previously endorsed measure to a new setting, such as extending a measure from the inpatient setting to hospice. These policies regarding what is considered substantive versus nonsubstantive would apply to all measures in the Hospital IQR Program and the Hospital VBP Program.

After consideration of the public comments we received, we are finalizing the technical updates policy for performance standards as proposed. We are also finalizing our proposed revisions to the definition of “performance standards” in section 412.160 of our regulations.

g. Request for Public Comments on International Classification of Diseases, Tenth Revision, Clinical Modification/Procedure Coding System (ICD-10-CM/PCS) Transition

Beginning October 1, 2015, when the ICD-10-CM/PCS codes become the required code set, we will collect non-electronic health record-based quality measure data coded only in ICD-10-CM/PCS. Even though we expect that the endorsement status of the measures we have adopted for the Hospital VBP Program will remain the same, we are concerned that the transition to a new coding system might have unintended consequences on quality measure data denominators, statistical adjustment coefficients, and measure rates. We are concerned about the possible impacts on the Hospital VBP Program, and requested public comments on how we should accommodate the transition.

Specifically, we requested comments on how, if at all, we should adjust performance scoring under the Hospital VBP Program to accommodate quality data coded under ICD-10-CM/PCS, or otherwise ensure fair and accurate comparisons under the Hospital VBP Program once the transition date has passed. For example, we could consider analyzing the effects of the ICD-10-CM/PCS transition on hospitals' measured performances and, if substantive differences result, retrospectively adjust performance standards in order to ensure that they accurately reflect the underlying methodology. We could also consider performing similar adjustments to hospitals' measure rates, measure scores, or TPSs once our analysis is completed. We also might consider scoring hospitals only on achievement if analysis indicates that we are unable to reliably and validly calculate improvement scores when comparing International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) based baseline period data to ICD-10-CM/PCS based performance period data. However, while we intend to analyze the effects of the ICD-10-CM/PCS transition on hospitals' performance, we do not have the necessary data for all hospitals at this time.

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28130), we stated that we intended to take two steps to analyze the potential impact of ICD-10-CM/PCS before receiving ICD-10-CM/PCS-based fall 2015 discharge data in May 2016. First, we stated that we will review measure specifications to qualitatively assess the impact on measure denominators after CMS releases ICD-10-CM/PCS-based measure specifications in the future. Second, we stated that we intend to solicit information on a voluntary basis from no more than 9 hospitals before October 1, 2015 to estimate the impact of ICD-10-CM/PCS on their Hospital VBP measure rates and denominator counts. We intend to use this information to inform both proposed and future Hospital VBP Program policy and measures.

We welcomed public comments on this topic.

Comment: One commenter supported CMS' implementation of ICD-10-CM/PCS on October 1, 2015 without any further delays. The commenter also warned that, while adoption is welcome and overdue, implementation of the new system must be carefully orchestrated to minimize the administrative burden on hospitals. The commenter appreciated CMS' efforts to offer extensive educational opportunities for hospitals and noted that extensive end-to-end testing of both the electronic transaction and the adjudication of the claim by Medicare contractors and State Medicaid agencies will be needed to ensure a smooth transition from ICD-9-CM to ICD-10-CM/PCS.

Response: We thank the commenter for its support. We would like to clarify that HHS has not yet finalized an ICD-10 implementation date through rulemaking. We refer readers to the CMS Web page on ICD-10 (http://cms.hhs.gov/Medicare/Coding/ICD10/index.html) and the Federal Register for current information.

Comment: A few commenters recommended that CMS work with more than nine hospitals, as well as with national hospital associations and other stakeholders interested in volunteering to participate in the ICD-10-CM/PCS transition process, to gain a broader understanding of the coding transition and its impact on CMS' quality reporting and pay-for-performance programs.

Response: We believe an initial limited analysis will enable us to better understand the impact of the ICD-10-CM/PCS transition on hospitals' performance. We intend to continue discussing this topic with stakeholders in the future.

Comment: One commenter supported CMS' suggested strategy for analyzing Hospital VBP Program performance scores to accommodate the ICD-10-CM/PCS transition, but requested that CMS make any adjustment methodology public and continue to score hospitals on both achievement and improvement.

Response: We intend to discuss publicly any adjustments that we would subsequently propose through rulemaking for the Hospital VBP Program.

Comment: Many commenters urged CMS not to adopt achievement-only scoring as a result of the ICD-10-CM/PCS transition, stating that improvement points are a balancing feature of the Hospital VBP Program that provide incentives for progress. Some commenters stated that the Hospital VBP Program statute requires that CMS score hospitals on both achievement and improvement, and suggested that CMS “waive” hospitals' participation in the program until we have adequate data to provide both elements of performance scoring.

Several commenters suggested that CMS remove measures from the program for a year if we cannot calculate reliable improvement scores. Other commenters requested that CMS allow sufficient time to analyze the impact of the ICD-10-CM/PCS transition and address any potential issues before penalizing hospitals in future Hospital VBP Program years. One commenter suggested holding hospitals harmless if CMS cannot accurately accept and calculate quality measures.

Response: We thank the commenters for this feedback and will take it into consideration as we develop our policy on this issue.

Comment: Several commenters expressed concern that transitioning the Hospital VBP Program to the ICD-10-CM/PCS system could significantly alter how measures are scored between the baseline and performance periods. These commenters opined that comparisons between the ICD-9-CM and ICD-10-CM/PCS systems would be unfair, and suggested that CMS score hospitals using ICD-9-CM data and an ICD-10-CM/PCS crosswalk. Commenters suggested that CMS run both the baseline data and the performance data using ICD-9-CM (using crosswalk software) and make the results of the testing publicly available. A few commenters urged CMS to formalize its ICD-10-CM/PCS testing plans to ensure that end-to-end testing begins no later than January 2015 and is made available to all hospitals.

Response: We thank the commenters for this feedback and will take it into consideration as we develop our policy on this issue.

Comment: One commenter recommended that CMS update its quality measures in order to best take advantage of the added granularity offered by ICD-10-CM/PCS. The commenter did not believe that it would be possible to accurately adjust performance standards retrospectively in order to correct the substantive differences between ICD-9-CM and ICD-10-CM/PCS quality data. The commenter suggested that evaluating quality measures solely on achievement would minimize the administrative costs associated with assessing the feasibility, validity, and reliability of comparing quality measures based on dissimilar code sets, and would also allow measure developers to freely update quality measures without the fear of distorting comparisons between baseline and performance period data coded in dissimilar formats.

Commenters agreed that the ICD-10-CM/PCS transition may have an impact on quality measurement based on claims data, and encouraged CMS to analyze those effects rigorously once data are available.

Response: We thank the commenters for this feedback and will take it into consideration as we develop our policy on this issue.

We thank the commenters for these comments and will consider them as we plan for the ICD-10-CM/PCS transition under the Hospital VBP Program.

11. FY 2017 Hospital VBP Program Scoring Methodology

a. General Hospital VBP Program Scoring Methodology

In the Hospital Inpatient VBP Program final rule (76 FR 26514), we adopted a methodology for scoring clinical process of care, patient experience of care, and outcome measures. As noted in that rule, this methodology outlines an approach that we believe is well understood by patient advocates, hospitals, and other stakeholders because it was developed during a lengthy process that involved extensive stakeholder input, and was based on a scoring methodology we presented in a report to Congress. We also noted in that final rule that we had conducted extensive additional research on a number of other important methodology issues to ensure a high level of confidence in the scoring methodology. In addition, we believe that, for reasons of simplicity, transparency, and consistency, it is important to score hospitals using the same general methodology each year, with appropriate modifications to accommodate new domains and measures. We finalized a similar scoring methodology for the MSPB measure in the FY 2012 IPPS/LTCH PPS final rule (76 FR 51654 through 51656).
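
As a rough illustration of the achievement-and-improvement approach referenced above, the sketch below scores a single higher-is-better measure on a 0-to-10 achievement scale and a 0-to-9 improvement scale, relative to the achievement threshold, the benchmark, and the hospital's own baseline rate. It is a simplified reading of the methodology in the cited final rule, not the program's operational code, and the sample values are hypothetical; under that methodology, the measure score used for the domain is generally the greater of the two results.

```python
def achievement_points(rate, threshold, benchmark):
    """Illustrative 0-10 achievement score for a higher-is-better measure."""
    if rate >= benchmark:
        return 10
    if rate < threshold:
        return 0
    return round(9 * (rate - threshold) / (benchmark - threshold) + 0.5)

def improvement_points(rate, baseline_rate, benchmark):
    """Illustrative 0-9 improvement score relative to the hospital's own baseline."""
    if rate >= benchmark:
        return 9
    if rate <= baseline_rate:
        return 0
    return max(0, round(10 * (rate - baseline_rate) / (benchmark - baseline_rate) - 0.5))

# Hypothetical hospital: baseline rate 0.92, performance-period rate 0.97,
# achievement threshold 0.954545, benchmark 1.000000.
print(achievement_points(0.97, 0.954545, 1.0))  # 4
print(improvement_points(0.97, 0.92, 1.0))      # 6
```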

In the FY 2013 IPPS/LTCH PPS final rule (77 FR 53604 through 53605), we finalized our proposal to use these same general scoring methodologies to score hospital performance for the FY 2015 Hospital VBP Program. In that rule, we stated that we believe these scoring methodologies continue to appropriately capture hospital quality as reflected by the finalized quality measure sets. We also noted that readopting the finalized scoring methodology from prior program years represents the simplest and most consistent policy for hospitals and the public. In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50699), we readopted for the FY 2016 Hospital VBP Program the general scoring methodology finalized for the FY 2015 Hospital VBP Program.

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50702 through 50704), we adopted new quality domains based on the NQS for FY 2017 and subsequent years.

We continue to agree with the reasoning for the scoring methodology outlined in the FY 2013 IPPS/LTCH PPS final rule and summarized above. Therefore, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28130 through 28131) we proposed to adopt the general scoring methodology adopted for the FY 2016 Hospital VBP Program for the FY 2017 Hospital VBP Program, with appropriate modifications to accommodate the new quality domains that we have previously adopted. These proposed modifications to our scoring methodology are limited to reclassified quality domains, new placements for measures within those domains, and domain weighting. We discuss below a proposal to revise the finalized domain weighting for FY 2017.

We welcomed public comment on this proposal. We also received a number of general comments on the Hospital VBP Program's scoring methodology.

Comment: Several commenters stated that CMS should consider phasing out improvement scoring, either for selected measures that have been included in the Hospital VBP Program for several years or for the program as a whole, in order to emphasize comparative performance on the measures. Several commenters agreed that phasing out improvement scoring after several years (one commenter suggested 3 years) would emphasize comparative performance on the measures. While some commenters noted that improvement scoring at the outset of the program is very important to encouraging historically poor-performing hospitals to invest in improvement, those commenters believed that hospitals should be compared and paid based on their achievements and not merely for improving on subpar performance after a period of time.

Several other commenters, on the other hand, expressed strong support for pay-for-performance programs that assess multiple aspects of care and recognize hospitals for achievement versus national benchmarks and improvement versus baseline performance. Commenters stated that this incentive structure can provide greater inducement for hospitals to improve performance. Commenters believed this construct is foundational for hospitals to improve performance in a rational and predictable manner.

Response: We thank the commenters for this feedback. We will take it under consideration as we develop Hospital VBP Program policies.

Comment: One commenter recommended that CMS consider comparing “like” hospitals—such as those of similar sizes, teaching status, and case mix—under the Hospital VBP Program in order to avoid inadvertently providing scoring advantages or disadvantages to participating hospitals.

Response: We do not believe the quality measures that we have adopted for the Hospital VBP Program incorporate the necessary data to disaggregate hospitals' performance by size, teaching status, or case mix any further than they already do through risk adjustment. We do intend, however, to assess the feasibility of this suggestion through our program monitoring and evaluation efforts.

After consideration of the public comments we received, we are finalizing our proposal to adopt the general scoring methodology adopted for the FY 2016 Hospital VBP Program for the FY 2017 Hospital VBP Program, with appropriate modifications to accommodate the new quality domains that we have previously adopted. These modifications to our scoring methodology are limited to reclassified quality domains, new placements for measures within those domains, and domain weighting.

b. Domain Weighting for the FY 2017 Hospital VBP Program for Hospitals That Receive a Score on All Domains

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50702 through 50704), we adopted our proposal to align the Hospital VBP Program's quality measurement domains with the NQS's quality priorities, with certain modifications. We adopted this realignment beginning with the FY 2017 Hospital VBP Program. We also adopted the following domains and domain weights for the FY 2017 Hospital VBP Program for hospitals that receive a score in all newly aligned domains.

Previously Adopted Domains and Domain Weights for the FY 2017 Hospital VBP Program for Hospitals Receiving a Score on All Newly Aligned Domains

Domain: Weight
Safety: 15 percent.
Clinical Care: 35 percent.
• Clinical Care—Outcomes: 25 percent.
• Clinical Care—Process: 10 percent.
Efficiency and Cost Reduction: 25 percent.
Patient and Caregiver Centered Experience of Care/Care Coordination: 25 percent.

However, as discussed in more detail above, we are finalizing the proposal from the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28119) to remove six “topped-out” measures from the FY 2017 Clinical Care—Process subdomain. This substantial reduction in the number of measures adopted for the Clinical Care—Process subdomain warrants reconsideration of the finalized domain weighting for FY 2017 that we adopted in the FY 2014 IPPS/LTCH PPS final rule.

As described in more detail above, we are also finalizing our proposal to re-adopt the CLABSI measure and to adopt two new measures (MRSA Bacteremia and C. difficile Infection) for the Safety domain for the FY 2017 Hospital VBP Program and subsequent years, which raises the total number of measures in this domain for FY 2017 to six. Because we proposed to make changes to the number of measures in only two domains (Safety and Clinical Care), we focused our proposed domain weighting changes in the proposed rule on these domains only. Because we continue to believe that hospitals should be provided strong incentives to perform well on measures of patient safety, in view of the new measures we proposed to add to that domain, we proposed to revise the previously finalized domain weighting for the FY 2017 Hospital VBP Program for hospitals receiving a score on all newly aligned domains as follows:

Proposed Revised Domain Weights for the FY 2017 Hospital VBP Program for Hospitals Receiving a Score on All Newly Aligned Domains

Domain: Weight
Safety: 20 percent.
Clinical Care: 30 percent.
• Clinical Care—Outcomes: 25 percent.
• Clinical Care—Process: 5 percent.
Efficiency and Cost Reduction: 25 percent.
Patient and Caregiver Centered Experience of Care/Care Coordination: 25 percent.

We welcomed public comments on the proposed revised domain weights.

Comment: Several commenters expressed broad support for CMS' proposed revision to the domain weighting for FY 2017, agreeing that it appropriately shifts the program's focus to the Safety domain and away from the Clinical Care—Process domain. One commenter commended CMS' efforts to move the delivery system toward value-driven paradigms that reward high-quality and cost-effective health care providers. A few commenters noted that the Safety domain is largely comprised of well-developed HAI outcome measures. One commenter noted that the domain weights largely align with the National Quality Strategy's quality priorities and place an increased focus on patient safety. Some commenters specifically expressed support for maintaining the weight of the Efficiency and Cost Reduction domain at 25 percent.

Response: We thank the commenters for their support.

Comment: Several commenters wanted CMS to maintain the Clinical Care—Process domain weighting at 35 percent, and noted that measures of clinical processes continue to play an important role in improving the quality of care. One commenter suggested that CMS not reduce the weight for Clinical Care—Process measures to 5 percent because the commenter believed that these measures play a vital role in quality improvement and should remain a significant component of the Hospital VBP Program. The commenter also noted that a hospital's level of performance on Clinical Care—Process measures reflects a hospital's overall discipline and commitment to quality improvement that extends beyond just the specific topics being measured.

Other commenters suggested that the final rule should increase the weight for the Clinical Care—Process domain in order to ensure that the Hospital VBP Program's focus is appropriately on improving patient outcomes. A few commenters noted that measuring clinical processes gives hospitals the data they need to improve performance and identify good process measures that are not burdensome to collect. One commenter stated that the Clinical Care—Outcomes and Safety domains already reflect a higher priority than the Clinical Care—Process domain measures. This commenter suggested that process measures may be used to identify gaps that may not be readily apparent from outcome measures. One commenter encouraged CMS to give the Clinical Care—Process measures the greatest weight because of the limitations of the measures in the other domains and because this domain provides hospitals with the most actionable information on quality performance.

One commenter questioned the extent to which measures of clinical process are necessary given the low domain weighting allocated to the Clinical Care—Process domain. The commenter suggested that CMS consider phasing the measures out of the program entirely.

Other commenters suggested additional measures that should be added to the Clinical Care—Process Domain, including one commenter who suggested the Medicare Service Utilization measure be added to the Efficiency and Cost Reduction domain.

Response: Because we proposed to remove six “topped-out” measures from the FY 2017 Clinical Care—Process subdomain, the number of measures adopted for that subdomain will be significantly reduced. For that reason, we reconsidered the finalized domain weighting for FY 2017 that we adopted in the FY 2014 IPPS/LTCH PPS final rule. We continue to believe that hospitals should be provided strong incentives to perform well on measures of patient safety, and we believe the revised domain weighting appropriately reduces the relative weighting allocated to the Clinical Care—Process domain, in accordance with the substantially reduced number of measures adopted under that domain. As we have stated in prior rulemaking, we believe that the Hospital VBP Program should shift from its initial focus on measures of processes to measures of outcomes and efficiency, and we believe that the proposed domain weighting change appropriately continues that policy change.

Comment: One commenter urged that CMS ensure that the IMM-2 measure is afforded sufficient weight in determining hospital value-based payments, such as by including this measure in the Safety domain.

Response: We believe we have placed the IMM-2 measure appropriately within the Clinical Care—Process domain, as it is a chart-abstracted measure. We further believe that we have allocated sufficient domain weighting to the Clinical Care—Process domain, and respond to additional comments on the FY 2017 domain weighting in subsequent sections below.

Comment: One commenter recommended that CMS decrease the weight of the consistency score in the HCAHPS survey to 10 percent and weight the HCAHPS measure total score with the new care transition measures at 90 percent. Alternatively, the commenter suggested that CMS revise the methodology of the consistency score to more accurately measure consistent performance and leave the weight of 20 percent in place. Instead of using the HCAHPS floor values as the minimum range for consistency, the commenter suggested that CMS use the 25th percentile value so that consistency points would only be rewarding hospitals maintaining a reasonable level of performance in each HCAHPS measure.

Response: We continue to believe that the HCAHPS survey is an important and significant component of the Total Performance Score. We further believe that Consistency Points appropriately encourage hospitals to attempt to improve their scores on all dimensions of the HCAHPS survey, and are therefore appropriately allotted 20 points within the domain. While we may reexamine the HCAHPS survey's scoring methodology if we adopt the CTM-3 items in the future, we do not believe that it is appropriate at this time to reduce the Consistency Points component of the PEC/CC domain to 10 percent.

Comment: Many commenters urged CMS to revise the MSPB measure to include both quality and cost outcomes, which means achieving better outcomes at lower total health costs rather than simply and crudely cutting costs. A few commenters stated that basing 25 percent of the TPS on a measure of cost comparison with no quality component will encourage hospitals to further cut costs beyond the incentives of the DRG system, with uncertain checks on corresponding quality impacts. Several commenters stated that because so much of the MSPB measure is outside of the hospital's control, the domain should not factor so heavily into the TPS.

A few commenters urged CMS to consider removing the MSPB measure entirely or dropping the Efficiency and Cost Reduction domain's weighting to 5 percent because the commenters suggested that measures aimed at improving efficiency should be grounded in current best evidence, should evaluate clinical outcomes concurrently with resource use, and should be interpretable based on outcomes achieved with resources expended. Another commenter recommended lowering the weight of the Efficiency domain during the initial implementation of the new episode-based payment measures to provide CMS and hospitals an opportunity to gain experience with these measures. The commenter noted that lowering the weight of the Efficiency domain would provide a period of time for the development and incorporation of more accurate or relevant Efficiency measures into the program.

However, several commenters suggested that CMS increase weighting of the Efficiency and Cost Reduction domain. A few commenters suggested that CMS consider incrementally increasing the Efficiency and Cost Reduction domain's weight to 50 percent as more efficiency measures are developed in the coming years. One commenter suggested that this change should occur in six years.

Response: We believe that assigning the Efficiency and Cost Reduction domain a weight of 25 percent of the Total Performance Score appropriately balances our desire to provide strong incentives for hospitals to consider both the cost and the quality of the care that they provide to Medicare beneficiaries and to all patients. We note that the MSPB measure is still relatively new to the Hospital VBP Program, and represents the incorporation of efficiency metrics for the first time in the program. We view that step as important, and we continue to believe it merits significant domain weighting in order to ensure that hospitals monitor the costs of the care they provide to Medicare beneficiaries during the inpatient hospitalization and are involved in the coordination of beneficiaries' care immediately prior to a hospitalization and post-discharge.

However, we thank the commenters for their thoughts and intend to continue examining domain weighting and will consider revisiting this issue in the future.

Comment: A few commenters wanted to decrease the PEC/CC weight. One commenter stated that anecdotal evidence shows significant variation in HCAHPS survey scores due to differences in acuity level and region. The commenter also noted that a recent study found that patient satisfaction was independent of hospital compliance with surgical processes and with hospitals' safety culture.

One commenter urged CMS to retain the PEC/CC domain's weighting at 25 percent, stating that the patient's experience is a critical component of quality health care. The commenter stated that, if CMS retains the Safety domain, CMS should not increase its allocated domain weighting, and should leave the Clinical Care—Process domain's weighting at 10 percent.

A few commenters suggested adding additional measures to the PEC/CC domain, in order to strengthen those domains.

Response: We disagree with commenters that suggested that we consider lower weighting for the PEC/CC domain. We continue to believe that the patient's experience is an important component of high-quality health care, and we believe that allocating significant domain weighting to the PEC/CC domain reflects that priority appropriately. As described further above, we also believe that the Consistency Points are properly set at 20 points within the domain. We believe the PEC/CC domain's weighting appropriately provides hospitals with strong incentives to improve their patients' experience during acute care hospitalizations.

Comment: A few commenters urged that CMS remove the Safety domain from the Hospital VBP Program and consider the HAC Reduction Program as its Safety domain, redistributing the weight to the other domains. In the alternative, one commenter suggested that CMS leave the Clinical Care—Process domain's weighting at 10 percent.

One commenter suggested that CMS increase the Safety or Clinical Care—Outcomes domain weights.

Response: We consider measures of patient safety to be of critical importance to the Hospital VBP Program, and we believe that their inclusion in the program with significant domain weighting appropriately provides hospitals with substantial incentives to protect their patients during acute care episodes.

Comment: A few commenters suggested that CMS replace the Clinical Care—Outcome measures or develop a plan to improve the measures' reliability. One commenter expressed concern that three mortality measures in the Clinical Care—Outcomes domain do not reliably assess hospital performance and could have negative unintended consequences for certain hospitals.

One commenter urged CMS to modify the domain weights so that more emphasis is placed on achieving outcomes, providing quality clinical processes, and improving patient experience. Commenters expressed support for the goal of improving quality and cost outcomes within the system, which means achieving better outcomes at lower total health costs, rather than simply and crudely cutting costs.

Response: We believe that we have taken appropriate steps to increase the reliability of the 30-day mortality measures that we have placed into the Clinical Care—Outcomes domain by extending the performance periods for those measures. We believe that the measures appropriately receive substantial domain weighting in order to ensure that hospitals focus quality improvement efforts on patients with these harmful conditions. In addition, we believe that our future measure set should evolve to emphasize outcomes, safety, cost and efficiency, population health, and patient experience of care, as noted in the HHS National Quality Strategy. We continue to evaluate measures that assess these critical components of the HHS National Quality Strategy, and as we add more measures in this area, we intend to increase the weight of this domain.

We also believe that safety and the patient experience of care are important in assessing quality. As we note above, because we are adding two new measures to the Safety domain, we are increasing this domain's weight by 5 percentage points (from 15 percent to 20 percent). We believe that this increase appropriately balances the importance of patient safety against the need for excellence in the remaining domains. Likewise, we believe that a 25 percent weight for the Patient and Caregiver Centered Experience of Care/Care Coordination domain appropriately balances the need to address the patient experience with the importance of stressing quality clinical processes, outcomes, efficiency, and safety.

After consideration of the public comments we received, we are finalizing the revised domain weighting for the FY 2017 Hospital VBP Program as proposed.
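To illustrate how the finalized FY 2017 domain weights combine domain scores into a Total Performance Score for a hospital scored on all newly aligned domains, the following sketch is offered for illustrative purposes only. It is a hypothetical Python illustration, not the operational scoring specification; it assumes each domain or subdomain score has already been normalized to a 0 to 100 scale, the dictionary keys are simplified domain names, and the example scores are invented for demonstration.

    # Hypothetical illustration only: combining normalized domain scores
    # (assumed to be on a 0-100 scale) into a Total Performance Score (TPS)
    # using the finalized FY 2017 domain weights.
    FY2017_WEIGHTS = {
        "Safety": 0.20,
        "Clinical Care - Outcomes": 0.25,
        "Clinical Care - Process": 0.05,
        "Efficiency and Cost Reduction": 0.25,
        "Patient and Caregiver Centered Experience of Care/Care Coordination": 0.25,
    }

    def total_performance_score(domain_scores):
        """Weighted sum of normalized domain scores."""
        return sum(FY2017_WEIGHTS[d] * s for d, s in domain_scores.items())

    # Invented example scores for demonstration.
    example_scores = {
        "Safety": 70.0,
        "Clinical Care - Outcomes": 60.0,
        "Clinical Care - Process": 80.0,
        "Efficiency and Cost Reduction": 50.0,
        "Patient and Caregiver Centered Experience of Care/Care Coordination": 65.0,
    }
    print(total_performance_score(example_scores))  # 14 + 15 + 4 + 12.5 + 16.25 = 61.75

Because the weights sum to 1.0, the resulting TPS in this sketch remains on the same 0 to 100 scale as the domain scores.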

c. Domain Weighting for the FY 2017 Hospital VBP Program for Hospitals Receiving Scores on Fewer than Four Domains

In prior program years, we finalized a policy that hospitals must have received domain scores on all finalized domains in order to receive a TPS. However, because the Hospital VBP Program has evolved from its initial two domains to an expanded measure set with additional domains, we considered whether it was appropriate to continue this policy.

Therefore, in the FY 2013 IPPS/LTCH PPS final rule (77 FR 53606 through 53607), we finalized our proposal that, for the FY 2015 Hospital VBP Program and subsequent years, hospitals with sufficient data to receive at least two out of the four domain scores that existed for the FY 2015 Hospital VBP Program (that is, sufficient cases and measures to receive a domain score on at least two domains) will receive a TPS. We also finalized our proposal that, for hospitals with at least two domain scores, TPSs would be reweighted proportionately to the scored domains to ensure that the TPS is still scored out of a possible 100 points and that the relative weights for the scored domains remain equivalent to the weighting which occurs when there are scores in all four domains. We believe that this approach allows us to include relatively more hospitals in the Hospital VBP Program while continuing to focus on reliably scoring hospitals on their quality measure performance.

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50701 through 50702), we continued this approach for the FY 2016 Hospital VBP Program and subsequent fiscal years for purposes of eligibility for the program even though, based on the NQS, we adopted four NQS-based domains for the FY 2017 Hospital VBP Program (78 FR 50702 through 50704), which include the subdivided Clinical Care domain.

In light of the four NQS-based domains we have adopted, we have reconsidered the appropriate minimum number of domains (that is, the number of domains on which hospitals must receive scores) in order to receive a TPS. We are concerned that requiring just two out of the four NQS-based domains in order to receive a TPS may be insufficient to ensure robust quality measurement under the Hospital VBP Program. Further, given the transition to NQS-based domains that we have adopted, we believe an additional independent analysis of appropriate minimum numbers of domains under the new domain structure is appropriate. We commissioned that analysis from our Reports & Analytics contractor for the Hospital VBP Program. The results of that analysis informed our proposal below, and we stated that we intended to post a summary of the reliability and minimum numbers analysis on the CMS Web site during the public comment period. We believe that requiring three out of the four NQS-based domains appropriately balances our desire to be as inclusive as possible with Hospital VBP Program requirements while ensuring that TPSs under the program are sufficiently reliable.

Therefore, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28132) we proposed to require that, for the FY 2017 Hospital VBP Program and subsequent years, hospitals must receive domain scores on at least three quality domains in order to receive a TPS. For purposes of the Clinical Care domain score, we proposed to consider either the Clinical Care—Process or Clinical Care—Outcomes subdomain as one domain in order to meet this proposed requirement. By adopting this policy, we believe we will continue to allow as many hospitals as possible to participate in the program while ensuring that reliable TPSs result. However, we would only reweight hospitals' TPSs once and would therefore not reallocate the Clinical Care—Process and Clinical Care—Outcomes subdomains' weighting within the Clinical Care domain if a hospital does not have sufficient data for one of the subdomains. For example, a hospital receiving domain scores on all domains except the Clinical Care—Process subdomain would not have the 5 percent weighting from the Clinical Care—Process subdomain reallocated entirely to the Clinical Care—Outcomes subdomain. Instead, the 5 percent weighting from the Clinical Care—Process subdomain would be proportionately reallocated across all domains, as illustrated in the sketch that follows.
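The following sketch is a hypothetical Python illustration of the proportional reweighting described above, under the same simplifying assumptions as the scoring sketch earlier in this section (normalized 0 to 100 domain scores, simplified domain names, invented example values); it is not the operational reweighting specification.

    # Hypothetical illustration only: proportional reweighting of the TPS for
    # a hospital lacking sufficient data for the Clinical Care - Process
    # subdomain. The missing 5 percent weight is spread across all scored
    # domains in proportion to their finalized weights.
    FY2017_WEIGHTS = {
        "Safety": 0.20,
        "Clinical Care - Outcomes": 0.25,
        "Clinical Care - Process": 0.05,
        "Efficiency and Cost Reduction": 0.25,
        "Patient and Caregiver Centered Experience of Care/Care Coordination": 0.25,
    }

    def reweighted_tps(domain_scores):
        """Weighted average over the scored domains only, rescaled so the
        scored domains keep their relative weights and the TPS is still
        expressed out of 100 points."""
        scored_weight = sum(FY2017_WEIGHTS[d] for d in domain_scores)
        weighted_sum = sum(FY2017_WEIGHTS[d] * s for d, s in domain_scores.items())
        return weighted_sum / scored_weight

    # Invented example: no Clinical Care - Process subdomain score.
    scores = {
        "Safety": 70.0,
        "Clinical Care - Outcomes": 60.0,
        "Efficiency and Cost Reduction": 50.0,
        "Patient and Caregiver Centered Experience of Care/Care Coordination": 65.0,
    }
    print(round(reweighted_tps(scores), 2))  # 57.75 / 0.95, about 60.79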

We welcomed public comments on this proposal.

Comment: One commenter supported CMS' proposal to require hospitals to have sufficient data on at least three domains in order to receive a Total Performance Score in FY 2017.

Response: We thank the commenter for their support.

Comment: One commenter expressed concern that CMS' proposal would result in lower participation rates in the program. The commenter recommended that CMS retain the minimum number of domain scores at two.

Response: As described above, we are concerned that requiring just two domains to receive a Total Performance Score for FY 2017 may provide an insufficient basis in quality data for robust performance scoring. We believe that the proposed requirement appropriately balances our desire to include as many hospitals as possible in the Hospital VBP Program while ensuring that Total Performance Scores are based on reliable quality data.

After consideration of the public comments we received, we are finalizing the requirement that hospitals have sufficient data on at least three of the four domains for the FY 2017 Program as proposed. We also are finalizing that hospitals with sufficient data on at least three of four domains for FY 2017 will have their Total Performance Scores proportionately reweighted, and for purposes of that reweighting, we will not reallocate the Clinical Care—Process and Clinical Care—Outcomes subdomains' weighting within the Clinical Care domain if a hospital does not have sufficient data for one of the subdomains.

12. Minimum Numbers of Cases and Measures for the FY 2016 and FY 2017 Hospital VBP Program's Quality Domains

a. Previously Adopted Minimum Numbers of Cases and FY 2016 Minimum Numbers of Cases

In the Hospital Inpatient VBP Program final rule (76 FR 26527 through 26531), we adopted minimum numbers of at least 10 cases on at least 4 measures for hospitals to receive a Clinical Process of Care domain score. In the same final rule, we adopted a minimum number of 100 HCAHPS surveys for a hospital to receive a Patient Experience of Care domain score. In the CY 2012 OPPS/ASC final rule with comment period (76 FR 74532 through 74534), we adopted a minimum number of 10 cases for the mortality measures that we adopted for FY 2014. In the FY 2013 IPPS/LTCH PPS final rule (77 FR 53608 through 53609), we adopted a new minimum number of 25 cases for the mortality measures for FY 2015. In the same final rule, we adopted a minimum number of 25 cases for the MSPB measure (77 FR 53609 through 53610), a minimum of three cases for any underlying indicator for the PSI-90 measure based on AHRQ's measure methodology (77 FR 53608 through 53609), and a minimum of one predicted infection for NHSN-based surveillance measures based on CDC's minimum case criteria (77 FR 53608 through 53609). However, we noted that we adopted these case minimums for FY 2015 only, although we intended to adopt them for FY 2015 and subsequent years. We continue to believe that the finalized minimum numbers of cases described above are appropriate and provide sufficiently reliable data for scoring purposes under the Hospital VBP Program. Therefore, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28132), we proposed to adopt the specified case minimums for the FY 2016 Hospital VBP Program and subsequent years.
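For illustrative purposes only, the following sketch shows how a hospital's reported counts might be compared against these finalized case minimums. It is a hypothetical Python illustration, not the operational eligibility logic; the threshold names and example counts are invented for demonstration, and it does not model the separate requirement of at least 10 cases on at least 4 measures for the Clinical Process of Care domain.

    # Hypothetical illustration only: finalized case minimums described above.
    # Not shown: the Clinical Process of Care domain also requires at least
    # 10 cases on at least 4 measures to receive a domain score.
    CASE_MINIMUMS = {
        "mortality_measure_cases": 25,      # each 30-day mortality measure
        "mspb_cases": 25,                   # MSPB measure
        "hcahps_surveys": 100,              # HCAHPS survey measure
        "psi90_indicator_cases": 3,         # any underlying PSI-90 indicator
        "nhsn_predicted_infections": 1,     # CDC minimum of one predicted infection
    }

    def meets_minimum(kind, count):
        """Return True if the reported count meets the finalized minimum."""
        return count >= CASE_MINIMUMS[kind]

    print(meets_minimum("mortality_measure_cases", 30))  # True
    print(meets_minimum("hcahps_surveys", 80))           # False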

We welcomed public comment on this proposal. We noted that we proposed below to specify minimum numbers of measures for the FY 2017 Hospital VBP Program and subsequent years based on the new domain structure.

We did not receive any specific public comments on this proposal. Therefore, we are finalizing this policy as proposed.

b. Minimum Number of Measures—Safety Domain

As described in more detail above, we proposed to adopt six quality measures in the Safety domain for the FY 2017 Hospital VBP Program. Of these measures, five are NHSN-based surveillance measures and one is the PSI-90 measure. After consideration of these measures and of previous independent analyses of the necessary minimum number of measures adopted for the Outcomes domain, whose measures formed the basis for part of the new Safety domain, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28133) we proposed to adopt a minimum number of three measures for the Safety domain for FY 2017 and subsequent years. We believe this proposal balances our desire to be as inclusive as possible with the Hospital VBP Program and the need for reliable quality measurement data on which to base TPSs. We also clarified that we will continue to score hospitals on NHSN measures if, as we discussed with respect to the CLABSI measure (77 FR 53608) and the SSI measure (78 FR 50684), the hospital has met CDC's minimum case criteria of one predicted infection during the applicable period.

We welcomed public comment on this proposal.

We did not receive any specific public comments on this proposal, and therefore are finalizing this policy as proposed.

c. Minimum Number of Measures—Clinical Care Domain

(1) Background

In the FY 2014 IPPS/LTCH PPS final rule, we adopted a new domain structure for the FY 2017 Hospital VBP Program and subsequent years based on the NQS. In that final rule, we adopted a Clinical Care domain that was subdivided into the Clinical Care—Process and Clinical Care—Outcome subdomains. We adopted these subdomains in order to ensure that we place the appropriate domain weighting on measures of clinical processes and measures of clinical outcomes. We believe the same consideration is appropriate for determining minimum numbers of measures for each subdomain, and, based on prior independent analyses conducted of the appropriate minimum numbers for the Clinical Process of Care and Outcomes domains whose measures formed the basis for the new Clinical Care domain, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28133), we proposed separate minimum numbers for each of these subdomains below. As described further above, we also attempted to balance our desire to be as inclusive as possible with the Hospital VBP Program and the need for reliable quality measurement data on which to base TPSs.

(2) Clinical Care—Outcomes Subdomain

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50707), we adopted a minimum number of two measures in the former Outcome domain. We stated our belief that this minimum number is appropriate for the expanded Outcome domain that formed the basis for the Clinical Care—Outcomes subdomain because adding measure scores beyond the minimum number of measures has the effect of enhancing the domain score's reliability.

As noted above, the Clinical Care—Outcomes subdomain now contains the three 30-day mortality measures, and based on previous independent analysis of the appropriate minimum number of measures for the Outcomes domain that formed the basis for the Clinical Care—Outcomes subdomain (available on our Web site at: http://cms.hhs.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/hospital-value-based-purchasing/Downloads/HVBP_Measure_Reliability-.pdf), we continue to believe that a minimum number of two measures within the subdomain appropriately balances scoring reliability with inclusiveness under the program. As noted above, we stated our intent to post a summary of the reliability and minimum numbers analysis on the CMS Web site during the public comment period. Therefore, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28133), we proposed to adopt a minimum number of two measures in the Clinical Care—Outcomes subdomain for FY 2017 and subsequent years.

We welcomed public comment on this proposal.

We did not receive any specific public comments on this proposal, and therefore are finalizing this policy as proposed.

(3) Clinical Care—Process Subdomain

We have reconsidered the finalized minimum number of measures given the significant reduction in Clinical Care—Process measures due to the removal of “topped-out” measures that we proposed in the proposed rule. We are concerned that requiring hospitals to report on all three proposed Clinical Care—Process measures for the FY 2017 Hospital VBP Program, or even requiring two out of three measures, could prevent a significant proportion of participating hospitals from receiving a Clinical Care—Process subdomain score. We are aware that relatively few hospitals report data for the AMI-7a measure, and the proposed PC-01 measure will only include hospitals that provide maternity services. In accordance with our preference for including as many hospitals as possible in the Hospital VBP Program while ensuring the reliability of the domain score, and based on a prior independent analysis of the Clinical Process of Care domain, whose measures formed the basis for the Clinical Care—Process subdomain, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28133) we proposed to require hospitals to report a minimum of one measure in the Clinical Care—Process subdomain for the FY 2017 Hospital VBP Program and subsequent years to receive a subdomain score.

We welcomed public comment on this proposal.

We did not receive any specific public comments on this proposal, and therefore are finalizing this policy as proposed.

d. Minimum Number of Measures—Efficiency and Cost Reduction Domain

Because the MSPB measure remains the only measure within the Efficiency and Cost Reduction domain for FY 2017, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28133) we proposed to require that hospitals receive an MSPB measure score in order to receive an Efficiency and Cost Reduction domain score. If we adopt additional measures for this domain in the future, we will consider whether to revisit this policy.

We welcomed public comments on this proposal.

We did not receive any specific public comments on this proposal, and therefore are finalizing this policy as proposed.

e. Minimum Number of Measures—PEC/CC Domain

As with the MSPB measure adopted for the Efficiency and Cost Reduction domain described further above, we have not adopted additional measures for the PEC/CC domain. Because the HCAHPS survey measure remains the only measure within the PEC/CC domain for FY 2017, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28133), we proposed to require that hospitals receive an HCAHPS survey measure score in order to receive a PEC/CC domain score. If we adopt additional measures for this domain in the future, we will consider whether to revisit this policy.

We welcomed public comments on this proposal.

We did not receive any specific public comments on this proposal, and therefore are finalizing this policy as proposed.

13. Applicability of the Hospital VBP Program to Maryland Hospitals

Section 1886(o)(1)(C) of the Act specifies the hospitals for which the Hospital VBP Program applies. Specifically, the term “hospital” is defined under section 1886(o)(1)(C)(i) of the Act as a “subsection (d) hospital (as defined in section 1886(d)(1)(B) [of the Act]).” Section 1886(o)(1)(C)(ii) of the Act sets forth a list of exclusions to the definition of the term “hospital” with respect to a fiscal year. Section 1886(o)(1)(C)(iv) of the Act states that in the case of a hospital that is paid under section 1814(b)(3) of the Act, the Secretary may exempt the hospital from the Hospital VBP Program if the State submits an annual report to the Secretary describing how a similar program in the State for a participating hospital or hospitals achieves or surpasses the measured results in terms of patient health outcomes and cost savings established under the Hospital VBP Program. We have interpreted the reference to section 1814(b)(3) of the Act to mean those Maryland hospitals that were paid under section 1814(b)(3) of the Act and that, absent the “waiver” provided by section 1814(b)(3) of the Act, would have been paid under the IPPS.

The State of Maryland entered into an agreement with CMS, effective January 1, 2014, to participate in CMS' new Maryland All-Payer Model, a 5-year hospital payment model. This model is being implemented under section 1115A of the Act, as added by section 3021 of the Affordable Care Act, which authorizes the testing of innovative payment and service delivery models, including models that allow States to “test and evaluate systems of all-payer payment reform for the medical care of residents of the State, including dual-eligible individuals.” Section 1115A of the Act authorizes the Secretary to waive such requirements of Titles XI and XVIII of the Act as may be necessary solely for purposes of carrying out section 1115A of the Act with respect to testing models.

Under the agreement with CMS, Maryland will limit per capita total hospital cost growth for all payers, including Medicare. In order to implement the new model, effective January 1, 2014, Maryland elected to no longer have Medicare pay Maryland hospitals in accordance with section 1814(b)(3) of the Act. Maryland also represented that it is no longer in continuous operation of a demonstration project reimbursement system since July 1, 1977, as specified under section 1814(b)(3) of the Act. Because Maryland hospitals are no longer paid under section 1814(b)(3) of the Act, they are no longer subject to those provisions of the Act and related implementing regulations that are specific to hospitals paid under section 1814(b)(3) of the Act, including but not limited to section 1886(o)(1)(C)(iv) of the Act, which provides an exemption for hospitals paid under section 1814(b)(3) of the Act from the application of the Hospital VBP Program if the State which is paid under that section meets certain requirements.

In order to implement the Maryland All-Payer Model, we have waived certain provisions of the Act, and the corresponding implementing regulations, as set forth in the agreement between CMS and Maryland and subject to Maryland's compliance with the terms of the agreement. The effect of Maryland hospitals no longer being paid under section 1814(b)(3) of the Act is that they are not entitled to be exempted from the Hospital VBP Program under section 1886(o)(1)(C)(iv) of the Act and, but for the model, would be included in the Hospital VBP Program. In other words, although the exemption from the Hospital VBP Program no longer applies, Maryland hospitals will not be participating in the Hospital VBP Program because section 1886(o) of the Act and its implementing regulations have been waived for purposes of the model, subject to the terms of the agreement.

Accordingly, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28133 through 28134), we proposed to make conforming revisions to § 412.160, in the definition of “base operating DRG payment amount,” and to § 412.161, which describes the applicability of the Hospital VBP Program. We proposed to delete references in these regulations to hospitals paid under section 1814(b)(3) of the Act because, at this time, there are no hospitals paid under that section.

We welcomed public comment on these proposals. After receiving no specific public comment on these proposals, we are finalizing our proposed regulation text changes to delete references in the regulation text to hospitals paid under section 1814(b)(3) of the Act because no hospitals are paid under that section.

14. Disaster/Extraordinary Circumstance Exception Under the Hospital VBP Program

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50704 through 50706), we adopted a disaster/extraordinary circumstance exception. We refer readers to that final rule for the policy's details.

We note that we are currently in the process of revising the Extraordinary Circumstances/Disaster Extension or Waiver Request form, previously approved under OMB control number 0938-1171.

J. Changes to the Hospital-Acquired Condition (HAC) Reduction Program

1. Background

We refer readers to section V.I.1.a. of the FY 2014 IPPS/LTCH PPS final rule (78 FR 50707 through 50708) for a general overview of the HAC Reduction Program.

2. Statutory Basis for the HAC Reduction Program

Section 3008 of the Affordable Care Act added section 1886(p) to the Act to provide an incentive for certain hospitals to reduce the incidence of HACs. Section 1886(p) of the Act requires the Secretary to make an adjustment to payments to “applicable hospitals” effective beginning on October 1, 2014 and for subsequent program years. Section 1886(p)(1) of the Act sets forth the requirements by which payments to “applicable hospitals” will be adjusted to account for HACs with respect to discharges occurring during FY 2015 or later. For hospitals with HAC scores in the worst performing quartile relative to other applicable hospitals for a given fiscal year, the amount of Medicare payment is reduced to 99 percent of the amount of payment that would otherwise apply to discharges under section 1886(d) or 1814(b)(3) of the Act, as applicable. Section 1886(p)(2)(A) of the Act defines “applicable hospitals” as subsection (d) hospitals that meet certain criteria. Section 1886(p)(2)(B)(i) of the Act defines these criteria and specifies that the payment adjustment would apply to an applicable hospital that ranks in the top quartile (25 percent) of all subsection (d) hospitals, relative to the national average, of conditions acquired during the applicable period, as determined by the Secretary. Section 1886(p)(2)(B)(ii) of the Act requires the Secretary to establish and apply a risk-adjustment methodology in calculating HAC scores for each hospital.

Sections 1886(p)(3) and (p)(4) of the Act define “hospital-acquired conditions” and “applicable period,” respectively. The term “hospital-acquired condition” means “a condition identified in subsection 1886(d)(4)(D)(iv) of the Act and any other condition determined appropriate by the Secretary that an individual acquires during a stay in an applicable hospital, as determined by the Secretary.” The term “applicable period” means, with respect to a fiscal year, a period specified by the Secretary.

Section 1886(p)(5) of the Act requires that, prior to FY 2015 and each subsequent fiscal year, the Secretary provides confidential reports to each applicable hospital with respect to the HAC Reduction Program scores for the applicable period, to give the hospitals an opportunity to review and correct the data. Section 1886(p)(6)(A) of the Act sets forth the reporting requirements by which the Secretary would make information available to the public regarding HACs for each applicable hospital. Section 1886(p)(6)(B) of the Act requires the Secretary to ensure that an applicable hospital has the opportunity to review, and submit corrections for, the information to be made public with respect to the HAC scores of the applicable hospital prior to such information being made public. Section 1886(p)(6)(C) of the Act requires that, once corrected, the HAC scores be posted on the Hospital Compare Web site on the Internet in an easily understandable format.

Section 1886(p)(7) of the Act limits administrative and judicial review of certain determinations made pursuant to section 1886(p) of the Act. These determinations include: what qualifies as an applicable hospital; the specifications of a HAC; the Secretary's determination of the “applicable period”; the provision of confidential reports submitted to the applicable hospital; and the information publicly reported on the Hospital Compare Web site.

3. Implementation of the HAC Reduction Program for FY 2015

a. Overview

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50707 through 50729), we presented the general framework for implementation of the HAC Reduction Program for FY 2015. We included the following provisions for the program: (a) The relevant definitions applicable to the program; (b) the payment adjustment under the program; (c) the measure selection and conditions for the program, including a risk-adjustment and scoring methodology; (d) performance scoring; (e) the process for making hospital-specific performance information available to the public, including the opportunity for a hospital to review the information and submit corrections; and (f) limitation of administrative and judicial review.

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50967), we established the rules governing the payment adjustment under the HAC Reduction Program at Subpart I of 42 CFR Part 412 (§§ 412.170 and 412.172). We also amended existing § 412.150 (the section that describes the basis and scope of Subpart I of Part 412, which contains the regulations governing adjustments to the base operating DRG payment amounts under the IPPS for inpatient operating costs) to incorporate the basis and scope of §§ 412.170 and 412.172 for the HAC Reduction Program.

In accordance with the provisions of section 1886(p) of the Act, in the FY 2014 IPPS/LTCH PPS final rule, we included, under § 412.170, definitions for the terms “hospital-acquired condition,” “applicable hospital,” and “applicable time period” (78 FR 50967). In § 412.170, we defined “hospital-acquired condition” as a condition as described in section 1886(d)(4)(D)(iv) of the Act and any other condition determined appropriate by the Secretary that an individual acquires during a stay in an applicable hospital, as determined by the Secretary. We defined an “applicable hospital” as “a hospital described in section 1886(d)(1)(B) of the Act (including a hospital in Maryland that is paid under section 1814(b)(3) of the Act and that, absent the waiver specified by section 1814(b)(3) of the Act, would have been paid under the hospital inpatient prospective payment system) as long as the hospital meets the criteria specified under § 412.172(e)” (78 FR 50967). We specified that this definition does not include hospitals and hospital units excluded from the IPPS, such as LTCHs, cancer hospitals, children's hospitals, IRFs, IPFs, CAHs, and Puerto Rico hospitals. We defined the “applicable period” as, with respect to a fiscal year, the 2-year period (as specified by the Secretary) from which data are collected in order to calculate the Total HAC Score for the HAC Reduction Program.

Comment: Commenters supported the HAC Reduction Program as a mechanism to identify hospitals that underperform in preventing well-identified, measurable, and preventable adverse events.

Response: We appreciate the commenters' support. We are committed to reducing HACs, which are important markers of quality of care and whose reduction can positively impact patient outcomes and the cost of care.

Comment: Several commenters suggested changing the terminology of “hospital-acquired conditions” to “hospital-acquired complications” to signal more clearly that the intent of the program is to focus on complications that arise from inappropriate delivery of care.

Response: The name of the HAC Reduction Program is specified in section 1886(d) of the Act. We believe that the name of the program reflects Congress' intent in passing this provision of the Affordable Care Act.

b. Payment Adjustment Under the HAC Reduction Program, Including Exemptions

(1) Basic Payment Adjustment

Section 1886(p)(1) of the Act sets forth the requirements by which payments to “applicable hospitals” are to be adjusted for hospitals in the worst performing quartile relative to other applicable hospitals beginning on October 1, 2014. Section 1886(p)(1) of the Act specifies that the amount of payment shall be equal to 99 percent of the amount of payment that would otherwise apply to such discharges under section 1886(d) or 1814(b)(3) of the Act, as applicable. As specified in the statute, this payment adjustment is calculated and made after payment adjustments under sections 1886(o) and 1886(q) of the Act, the Hospital VBP Program and the Hospital Readmissions Reduction Program respectively, are calculated and made. (We note that the Hospital VBP Program is discussed in section IV.I. of the preamble of this final rule and the Hospital Readmissions Reduction Program is discussed in section IV.H. of the preamble of this final rule.) Section 1886(p)(2)(A) of the Act defines “applicable hospitals” as subsection (d) hospitals that meet certain criteria. Section 1886(p)(2)(B)(i) of the Act defines these criteria and specifies that the payment adjustment would apply to an applicable hospital that ranks in the top quartile (25 percent) of all subsection (d) hospitals, relative to the national average of hospitals that report conditions acquired during the applicable period, as determined by the Secretary.

Therefore, in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50967), we specified in § 412.172(b) of the regulations that, for applicable hospitals, beginning with discharges occurring during FY 2015, the amount of payment under § 412.172, or section 1814(b)(3) of the Act, as applicable, for such discharges shall be equal to 99 percent of the amount of payment that would otherwise apply to such discharges under § 412.172, or section 1814(b)(3) of the Act. This amount of payment will be determined after the application of the payment adjustment under the Hospital Readmissions Reduction Program under § 412.154, the adjustment made under the Hospital VBP Program under § 412.162, and section 1814(l)(4) of the Act, but without regard to section 1886(p) of the Act.
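As a greatly simplified sketch of the ordering described above, the following hypothetical Python illustration applies the HAC Reduction Program adjustment (payment equal to 99 percent of the amount that would otherwise apply) only after the Hospital Readmissions Reduction Program and Hospital VBP Program adjustments have been calculated and made. The dollar amount and adjustment factors are invented for demonstration, and the sketch omits the add-on payment components discussed later in this section; it is not the operational payment logic.

    # Greatly simplified, hypothetical illustration of the order of
    # operations: the HAC Reduction Program adjustment is applied after the
    # Hospital Readmissions Reduction Program and Hospital VBP Program
    # adjustments have been calculated and made.
    def payment_after_adjustments(otherwise_applicable_payment,
                                  readmissions_factor, vbp_factor,
                                  in_worst_performing_quartile):
        payment = otherwise_applicable_payment
        payment *= readmissions_factor  # Hospital Readmissions Reduction Program
        payment *= vbp_factor           # Hospital VBP Program
        if in_worst_performing_quartile:
            payment *= 0.99             # HAC Reduction Program: 99 percent
        return payment

    # Invented example values for demonstration.
    print(payment_after_adjustments(10_000.00, 0.997, 1.002, True))  # about 9890.04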

Comment: Many commenters noted that the proposed 1-percent reduction in payment for the top quartile of lower performing hospitals will provide a stronger penalty than the current DRA HAC policy and has the potential to stimulate improvements in safety. The commenters supported CMS' efforts to reduce HACs by paying less to hospitals for instances involving patients contracting HACs during a hospital stay. These commenters noted that quality payment adjustments continue to positively affect provider performance. Commenters further noted that several commercial health plans have implemented similar actions, processes, and guidelines to align their payment policies with CMS to adjust payment for reasonably preventable errors made by hospitals and health care facilities.

Response: We appreciate the commenters' support and agree that the HAC Reduction Program, along with the other CMS quality initiatives set forth under the Affordable Care Act (for example, the Hospital VBP and Hospital Readmissions Reduction Programs), will lead to improvements in patient care, safety and outcomes.

Comment: Some commenters indicated that it was not clear in the FY 2015 IPPS/LTCH PPS proposed rule how the HAC Reduction Program payment adjustment would specifically be applied. The commenters stated that the HAC Reduction Program penalty appears to apply to all hospital payments (for example, outliers, DSH, uncompensated care, and IME) and they questioned why the policy should apply to IME and DSH payments that they asserted are not related to the underlying quality policy the provision enforces.

These commenters urged CMS to use its administrative authority under section 1886(d)(5)(I)(i) of the Act to limit the HAC penalty to the base operating DRG payment only, which they reported would be consistent with Congressional intent and with the Hospital VBP and Hospital Readmissions Reduction Programs. The commenters noted that restricting the penalty to the base operating DRG payment could ensure consistency across the programs and reduce confusion, because under the Hospital VBP and Hospital Readmissions Reduction Programs the payment adjustment applies to the base operating DRG payment, not to the base DRG rate plus the additional add-on payments for outliers, DSH, uncompensated care, and IME.

Response: We did not propose to change the application of the payment adjustment that we finalized in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50711). As we discussed in that rule, the statutory requirements for the HAC Reduction Program payment adjustment differ from those for the Hospital VBP and Hospital Readmissions Reduction Programs. In accordance with section 1886(q)(1) of the Act, the Hospital Readmissions Reduction Program adjustment is applied to the base operating DRG payment amount, which is defined at section 1886(q)(2) of the Act to exclude certain payments under subsection (d). Similarly, in accordance with sections 1886(o)(7)(A) and 1886(o)(7)(B) of the Act, the Hospital VBP Program applies adjustments to the base operating DRG payment amount, which is defined at section 1886(o)(7)(D) of the Act to exclude certain payments under subsection (d).

For the HAC Reduction Program, no such statutory exclusion exists, and section 1886(p)(1) of the Act states that the payment for applicable hospitals “shall be equal to 99 percent of the amount of payment that would otherwise apply.” Therefore, the HAC Reduction Program payment adjustment will be applied after the application of the other program adjustments and will apply to payments including the add-on payments for outliers, DSH, uncompensated care, and IME.

As we have stated previously, our goal for the HAC Reduction Program is to heighten the awareness of HACs and reduce the number of incidences that occur through implementing the adjustments required by section 1886(p) of the Act. We believe that our efforts in using payment adjustments and our measurement authority will encourage hospitals to eliminate the incidence of HACs that could be reasonably prevented by applying evidence-based clinical guidelines. Given this goal, and the statutory language in section 1886(p) of the Act, we do not believe this is an appropriate situation for us to exercise our authority under section 1886(d)(5)(I)(i) of the Act.

(2) Applicability to Maryland Hospitals

Section 1886(p)(2)(C) of the Act specifies that the Secretary may exempt hospitals paid under section 1814(b)(3) of the Act “from the application of this subsection if the State which is paid under such section submits an annual report to the Secretary describing how a similar program in the state for a participating hospital or hospitals achieves or surpasses the measured results in terms of patient health outcomes and cost savings established under this subsection.” Accordingly, a program established by the State of Maryland that could serve to exempt hospitals in the State from the HAC Reduction Program would focus on hospitals operating under the waiver provided by section 1814(b)(3) of the Act, that is, those hospitals that would otherwise have been paid by Medicare under the IPPS, absent this provision. As we stated in section IV.J.3.b of the preamble of this final rule, because hospitals paid under section 1814(b)(3) of the Act are subsection (d) hospitals, unless the Secretary exempts these hospitals from the application of payment adjustments under the HAC Reduction Program under the authority of section 1886(p)(2)(C) of the Act, they are considered to be “applicable hospitals” (subject to the payment adjustments in the HAC Reduction Program) under the HAC Reduction Program.

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50967 through 50968), we established criteria for evaluation to determine whether Maryland would be exempted from the application of the payment adjustments under the HAC Reduction Program for a given fiscal year, under § 412.172(c). Pursuant to our rule, if the State submitted an annual report to the Secretary describing how a similar program to reduce hospital acquired conditions in that State achieves or surpasses the measured results in terms of health outcomes and cost savings for the HAC Reduction Program as applied to hospitals described in section 1886(d)(1)(B) of the Act, the State would be exempt from the HAC Reduction Program. We specified in the regulations that “CMS will establish criteria for evaluation of Maryland's annual report to the Secretary to determine whether Maryland will be exempted from the application of payment adjustments under this program for a given fiscal year.” We also specified that Maryland's annual report to the Secretary and request for exemption from the HAC Reduction Program must be resubmitted and reconsidered annually. We provided that, for FY 2015, Maryland must submit a preliminary report to us by January 15, 2014 and a final report to us by June 1, 2014.

We noted that our criteria to evaluate Maryland's program are for FY 2015, the first year of the payment adjustment under the HAC Reduction Program, and that our evaluation criteria may change through notice and comment rulemaking as this program evolves.

The State of Maryland entered into an agreement with CMS, effective January 1, 2014, to participate in CMS' new Maryland All-Payer Model, a 5-year hospital payment model. This model is being implemented under section 1115A of the Social Security Act (“Act”), as added by section 3021 of the Affordable Care Act, which authorizes the testing of innovative payment and service delivery models, including models that allow states to “test and evaluate systems of all-payer payment reform for the medical care of residents of the State, including dual eligible individuals.” Section 1115A of the Act authorizes the Secretary to waive such requirements of titles XI and XVIII of the Act as may be necessary solely for purposes of carrying out Section 1115A with respect to testing models.

Under the agreement with CMS, Maryland will limit per capita total hospital cost growth for all payers, including Medicare. In order to implement the new model, effective January 1, 2014, Maryland elected to no longer have Medicare reimburse Maryland hospitals in accordance with section 1814(b)(3) of the Act. Maryland also stipulated that it is no longer in continuous operation of a demonstration project reimbursement system since July 1, 1977, as specified under Section 1814(b)(3) of the Act. Because Maryland hospitals are no longer paid under section 1814(b)(3) of the Act, they are no longer subject to those provisions of the Act and related implementing regulations that are specific to section 1814(b)(3) hospitals, including but not limited to section 1886(p)(2)(C) of the Act, which provides exemptions for hospitals paid under section 1814(b)(3) from the application of the HAC Reduction Program.

However, in order to implement the Maryland All-Payer Model, CMS has waived certain provisions of the Act for Maryland hospitals, including section 1886(p), and the corresponding implementing regulations, as set forth in the agreement between CMS and Maryland and subject to Maryland's compliance with the terms of the agreement. In other words, although section 1886(p)(2)(C) of the Act no longer applies to Maryland hospitals, Maryland hospitals will not be participating in the HAC Reduction Program because section 1886(p) of the Act and its implementing regulations have been waived for purposes of the model, subject to the terms of the agreement. Consequently, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28135), we proposed that the Total HAC Scores for Maryland hospitals would not be included when identifying the top quartile of all hospitals with respect to their Total HAC Score during the applicable period.

As a result of changes to the status of Maryland hospitals under 1814(b)(3) of the Act described above, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28136), we proposed conforming changes to these regulations and sought public comment on this proposal. Specifically, we proposed to remove the entire contents of paragraph (c) under § 412.172 and reserve the paragraph (c) designation.

No commenters opposed our proposal to exclude the Total HAC Scores for Maryland hospitals when identifying the top quartile of all hospitals and no commenters opposed CMS' proposed changes to the regulations regarding Maryland hospitals. Therefore, we are finalizing our proposal to exclude the Total HAC Scores for Maryland hospitals when identifying the top quartile of all hospitals and our proposed changes to the regulations regarding Maryland hospitals.

c. Measure Selection and Conditions, Including Risk-Adjustment Scoring Methodology

(1) General Selection of Measures

We did not propose any new measures for the HAC Reduction Program in the FY 2015 IPPS/LTCH PPS proposed rule. Although we are not required under section 1886(p) of the Act to address specific measure scoring methodologies and domain weights regarding the HAC Reduction Program in notice-and-comment rulemaking, as is required for the Hospital VBP Program, we believe that it is important to set forth such scoring methodologies for each individual HAC measure so that the public can understand how the measures adopted in previous rulemaking relate to the performance methodology used to determine the applicable hospitals subject to the payment adjustment under the HAC Reduction Program. Below we set forth the specific measure scoring methodology and domain weights regarding the HAC Reduction Program for FY 2015 as finalized in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50712 through 50719).

Comment: A few commenters thanked CMS for not adding any new measures to the HAC program for FY 2015 and FY 2016. One commenter encouraged CMS to fill measure gaps as soon as possible to ensure that this program provides the greatest possible value for quality improvement and consumer education. Several commenters suggested that CMS identify new measures for the HAC Reduction Program that would address a variety of quality and safety issues relevant to the broadest possible range of hospitals and affect a greater number of patients, as commenters asserted that this approach is more fair and would ensure hospitals are not penalized for the type of patients they treat. In addition, the commenters believed this approach would help improve the ability of the program to identify the real poor performers. One commenter recommended that these new measures should not be entirely claims-based.

Commenters made additional recommendations for future new measures including PSI-4: Death rate among surgical inpatients with serious, treatable complications (NQF #0351), PSI-16: Transfusion reaction count (NQF #0349), surgical site infections (SSIs) following hip and knee arthroplasty and SSIs following high-volume procedures such as caesarean section surgery. One commenter recommended expanding the iatrogenic pneumothorax rate (PSI-6), which currently addresses iatrogenic pneumothorax with venous catheterization, to also include iatrogenic pneumothorax with paracentesis and thoracentesis. One commenter recommended that new measures of infection be developed that incorporate infection rates per thousand discharges in order to inform patients of their likelihood of acquiring an infection at a given hospital.

Response: We did not propose new measures in this rulemaking as we intend to allow time for providers to gain experience with the finalized measures. We are continuously evaluating the program and working to identify new, potentially suitable measures to fill measure gaps. We appreciate the commenters' input for measure selection and will take this feedback into consideration in future rulemaking.

Comment: Many commenters suggested that all HAC Reduction Program measures should be NQF-endorsed and, while recognizing it is not a requirement for the HAC Reduction Program, commenters also recommended that CMS use the formal pre-rulemaking process of the Measure Applications Partnership (MAP) for any measures being considered for the program.

Response: While we note that section 1886(p)(3) of the Act does not require NQF endorsement for a measure to be considered for the HAC Reduction Program, we are aware of the value of the NQF endorsement and MAP processes in facilitating information exchange and agreement among stakeholders. We also note that all of the measures adopted for the HAC Reduction Program went through the pre-rulemaking process and were either recommended for inclusion by the MAP, or represent 1 of the 12 HACs that have been identified by the Secretary and which are referenced in section 1886(p) of the Act for the HAC Reduction Program.

(2) Updates on AHRQ PSI-90, and CDC NHSN CLABSI and CAUTI Measures

For FY 2015, we will keep the AHRQ PSI-90 composite measure (in Domain 1) that we adopted in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50717) because it is currently endorsed by NQF. However, we note that the AHRQ PSI-90 composite measure is currently undergoing NQF maintenance review. The PSI-90 composite measure consists of eight component indicators: PSI-3 Pressure ulcer rate; PSI-6 Iatrogenic pneumothorax rate; PSI-7 Central venous catheter-related blood stream infections rate; PSI-8 Postoperative hip fracture rate; PSI-12 Postoperative Pulmonary Embolism/Deep Vein Thrombosis rate; PSI-13 Postoperative sepsis rate; PSI-14 Wound dehiscence rate; and PSI-15 Accidental puncture & laceration rate. AHRQ is considering the addition of PSI-9 (Perioperative hemorrhage rate), PSI-10 (Perioperative physiologic metabolic derangement rate) and PSI-11 (Post-operative respiratory failure rate) or a combination of these three measures into the PSI-90 composite measure. We consider the inclusion of additional component measures in the PSI-90 composite measure to be a significant change to the PSI-90 composite measure that we finalized in the FY 2014 IPPS/LTCH PPS final rule. If the changes are significant, we will engage in notice-and-comment rulemaking prior to requiring reporting of this revised composite.

Similarly, the CDC NHSN Catheter-Associated Urinary Tract Infection (CAUTI) and Central Line-Associated Blood Stream Infection (CLABSI) measures in Domain 2 that we adopted in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50717) for FY 2015 also are currently undergoing NQF maintenance review. If there are significant changes to these measures, we will engage in notice-and-comment rulemaking prior to requiring reporting of the revised CDC NHSN CLABSI and CAUTI measures. For FY 2015, we will keep CDC's NHSN CAUTI and CLABSI measures in Domain 2 because they are currently endorsed.

Comment: Several commenters supported CMS' commitment to use the notice-and-comment rulemaking process for any HAC measure with significant changes made during the NQF review process.

One commenter specifically recommended that the AHRQ PSI-90 measure and the CDC NHSN CLABSI and CAUTI measures currently undergoing NQF maintenance review only be included for FY 2016 and beyond contingent upon continued NQF endorsement and any updates recommended for continued endorsement.

Response: We appreciate the commenters' support of our rulemaking process. As for the comments regarding NQF endorsement of the measures, we refer readers to our response in section IV.J.3.c. of the preamble of this final rule.

(3) Measure Selection

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50717), we finalized the following measures for selection: (i) the AHRQ PSI-90 composite measure for Domain 1 and the CDC NHSN measures CAUTI and CLABSI for Domain 2 for FY 2015; (ii) addition of the CDC NHSN Surgical Site Infection (SSI) measure for FY 2016; and (iii) addition of the CDC NHSN Methicillin-Resistant Staphylococcus aureus (MRSA) Bacteremia and C. difficile measures for FY 2017. Several of these measures are already part of the Hospital IQR Program and are reported on the Hospital Compare Web site.

Comment: One commenter supported the implementation schedule of quality measures for the program, specifically stating that the AHRQ PSI-90 composite measure and the CDC NHSN CLABSI and CAUTI measures are sufficient starting points for the HAC Reduction Program. A few commenters also supported the addition of the CDC NHSN SSI, MRSA and C. difficile measures, which they believed would address the increasing incidence of these infections in hospital settings. A few commenters supported the adoption of the NHSN SSI measure in Domain 2 for FY 2016.

Response: We thank the commenters for the recognition of the significance of potential patient harms in hospitals as well as for their support of our proposals for the implementation of the HAC Reduction Program. We emphasize that patient safety is our primary objective for the HAC Reduction Program.

Comment: A few commenters supported maintaining claims-based measures such as the PSI-90 composite measure in quality reporting programs because they are the least burdensome, least costly and most widely accessible and available reporting method.

Response: We agree that claims-based measures have the advantages of being minimally burdensome to providers while providing data covering a large proportion of the Medicare population. We consider several factors when selecting measures for quality programs, including but not limited to measurement gap areas, opportunities for quality improvement, and feasibility and burden for implementation. Claims-based measures, including AHRQ PSIs, are collected and widely accepted by States and other health care purchasers for payment purposes.

In addition to the claims-based measure in the FY 2015 HAC Reduction Program, we also adopt chart-abstracted measures as appropriate. There are currently two chart-abstracted measures in the program, and the number of chart-abstracted measures will increase in subsequent years (three in FY 2016 and five in FY 2017). We also are exploring options for new measures, including electronically specified measures, that could be incorporated into the HAC Reduction Program in future years.

Comment: Many commenters believed that coding biases result in unacceptable levels of reliability and validity for the PSI-90 composite measure and thus the measure fails to accurately and meaningfully reflect hospital performance. A few commenters expressed concerns that the PSI measures are not clinically validated against medical records.

Response: We have previously addressed commenters' specific concerns regarding validity and coding issues of the PSI-90 composite measure, and we refer readers to our responses to these comments in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50715). We also note that there are validation studies examining the relationship between billing or claims data and medical records, including the following:

(1) Zrelak PA, Romano PS, Tancredi DJ, Geppert JJ, Utter GH. Validity of the AHRQ Patient Safety Indicator for Postoperative Physiologic and Metabolic Derangement based on a national sample of medical records. Medical Care 2013;51(9):806-11.
(2) Utter GH, Zrelak PA, Baron R, Tancredi DJ, Sadeghi B, Geppert JJ, Romano PS. Detecting postoperative hemorrhage or hematoma from administrative data: The performance of the AHRQ Patient Safety Indicator. Surgery 2013;154(5):1117-25.
(3) Borzecki AM, Cevasco M, Chen Q, Shin M, Itani KM, Rosen AK. How valid is the AHRQ Patient Safety Indicator “postoperative physiologic and metabolic derangement”? J Am Coll Surg 2011;212(6):968-976.
(4) Borzecki AM, Kaafarani H, Cevasco M, Hickson K, Macdonald S, Shin M, Itani KM, Rosen AK. How valid is the AHRQ Patient Safety Indicator “postoperative hemorrhage or hematoma”? J Am Coll Surg 2011;212(6):946-953.

Comment: Many commenters expressed a lack of confidence in the PSI-90 composite measure due to recent discussions at the NQF Patient Safety Standing Committee (“Patient Safety Committee” or “Committee”). Some commenters stated that the Patient Safety Committee did not recommend the measure for endorsement, and other commenters noted that the NQF Patient Safety Committee requested changes to the weighting of the individual components in the composite measure to better reflect their relative importance or preventability. The Committee suggested that one way this reweighting could be achieved is by including three additional component measures (PSI-9 Perioperative Hemorrhage or Hematoma Rate, PSI-10 Postoperative Physiologic and Metabolic Derangement Rate, and PSI-11 Postoperative Respiratory Failure Rate) in the composite. A few commenters expressed support for the potential inclusion of PSI-9, PSI-10, and PSI-11 in the PSI-90 composite measure. However, one commenter did not support the addition of any new components to the composite measure, while a few commenters opposed the inclusion of PSI-9 and PSI-10 in particular because they claimed that these components had a high false-positive rate due to a lack of clarity in the coding criteria.

In the event that the composite measure is not re-endorsed by NQF, some commenters recommended that CMS not consider using individual PSI-90 component measures that may still be endorsed. They also recommended that additional testing for consistency between individual components and the composite scores be undertaken and the results released. Other commenters had concerns that several of the PSI-90 component measures are not NQF-endorsed. Some commenters supported and understood that CMS may need to retain the PSI-90 composite measure, regardless of NQF endorsement status.

Response: We would like to clarify the status of the PSI-90 measure with regard to NQF endorsement. As part of the routine NQF measure maintenance process, the Patient Safety Committee expressed concerns about the weighting of the PSI-90 component measures and requested to see additional measure information related to re-weighting PSI-90 with three additional components (PSI-9, PSI-10, and PSI-11) before deciding whether the measure would be recommended for continued endorsement. AHRQ has submitted the requested data for the NQF Patient Safety Committee's consideration in making its decision regarding continued endorsement of the composite. As we stated earlier, if substantive changes are made to the measure during the NQF review process, we will go through a notice-and-comment rulemaking process.

Regarding the concern about the claimed high false-positive rate of some of the PSI-90 component measures, we conferred with AHRQ, which noted that most of the studies that examine positive predictive values predate the use of Present on Admission (POA) coding that is now integral to the PSIs. Detailed reviews of these studies indicate that most of the false positives were due to events that were POA. POA coding for IPPS hospitals was required by CMS beginning October 1, 2007, with a payment penalty beginning October 1, 2008. Studies that use data prior to 2009 would not have captured POA information. Therefore, we believe that proper coding will address the commenters' concerns.

In addition, AHRQ noted that the NQF convened a group of 12 experts to determine what criteria should be used for evaluating composite performance measurement for NQF endorsement. The Technical Expert Panel provided clear guidance on the relationship between the individual component indicators and the composite in the Composite Performance Measure Evaluation Guidance document (NQF, April 2013). Specifically, individual component measures that are included in the composite performance measure: (1) should be justified based on the clinical evidence; (2) do not need to be NQF endorsed; (3) generally should demonstrate a gap in performance; and (4) may not be sufficiently reliable independently, but contribute to the reliability of the composite performance measure.

AHRQ convened a Composite Measure Workgroup of experts in the field to determine the best weighting strategy. The methodology of the PSI-90 composite measure is detailed in the original technical report by the AHRQ Composite Measure Workgroup: http://qualityindicators.ahrq.gov/Downloads/Modules/PSI/PSI_Composite_Development.pdf. Several alternative approaches were discussed with the AHRQ Composite Measure Workgroup and the first NQF Composite Measure Steering Committee. Factor analysis was considered as one approach and was deemed to have no clear advantages over less complex, more intuitively clear weighting schemes. In brief, numerator weighting that is used in the PSI-90 composite measure was preferred due to its greater simplicity and clarity.
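
To make the numerator-weighting concept concrete, the following is a minimal sketch of a numerator-weighted composite. The component names, rates, and counts are hypothetical, and the sketch omits the risk adjustment, reliability (shrinkage) adjustment, and reference-population scaling that the actual AHRQ PSI-90 specification applies; it is an illustration of the weighting idea only, not the AHRQ algorithm.

```python
# Minimal sketch of a numerator-weighted composite, for illustration only.
# It omits the risk adjustment, reliability (shrinkage) adjustment, and
# reference-population scaling used in the actual AHRQ PSI-90 specification.
# Component names, rates, and counts below are hypothetical.

components = {
    # indicator: (risk-adjusted rate per eligible discharge, national numerator count)
    "PSI-03": (0.0005, 12000),
    "PSI-12": (0.0040, 25000),
    "PSI-15": (0.0020, 18000),
}

total_events = sum(count for _rate, count in components.values())

# Each component's weight is its share of the total national event (numerator)
# count, so more frequent events contribute more to the composite value.
composite = sum(
    rate * (count / total_events) for rate, count in components.values()
)

print(f"Numerator-weighted composite: {composite:.6f}")
```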

Comment: A few commenters stated that many of the AHRQ PSI-90 composite component measures are rare events and do not meet the high-volume requirement for the HAC Reduction Program.

Response: We note that section 1886(d)(4)(D)(iv) of the Act defines a hospital-acquired condition for the HAC Reduction Program as one that is high cost, high volume, or both, or any other condition determined appropriate by the Secretary. We believe the PSI-90 composite measure and its components meet the statutory requirement for inclusion in the program.

Comment: Some commenters asserted that composite measures such as PSI-90 do not provide actionable information to hospitals.

Response: We disagree and note that hospitals have access to their results on the individual PSI-90 component measures and how they compare to the national risk-adjusted rate in their Hospital-Specific Reports, which are issued during the review and corrections period. In addition, the component measure scores are available to hospitals and the public on our Web site at: http://www.medicare.gov/. Therefore, hospitals can use the individual component measure results to identify specific areas for improvement efforts.

Comment: Based on the belief that the PSI-90 composite measure has significant flaws as described above, many commenters recommended identifying alternatives to the PSI-90 composite measure and phasing it out of the HAC Reduction Program as soon as possible. Some commenters suggested that the alternative measure(s) be derived from the NQF portfolio of safety measures.

Response: In the FY 2014 IPPS/LTCH PPS proposed rule (78 FR 27625 through 27626), we explained the rationale for including the PSI-90 composite measure in the HAC Reduction Program. We continue to believe the PSI-90 is an appropriate tool for calculation of HAC scores. Nonetheless, we will continue to explore options for new measures, including electronically specified measures that could be incorporated into the HAC Reduction Program to supplement or replace the PSI-90 composite measure. We also note that the PSI-90 is one of three measures included in the Program for FY 2015. The other two measures are chart-abstracted and we are increasing the number of chart-abstracted measures in subsequent years of the program (three in FY 2016 and five in FY 2017).

Comment: One commenter suggested revisions to four of the PSI-90 composite component measures. For PSI-6, the commenter recommended exclusion of high-frequency outliers, such as iatrogenic pneumothorax in patients who lack intravenous access; acuity; and cases where iatrogenic pneumothorax is secondary to a life-saving procedure. The commenter also recommended that CMS not apply this measure if clinicians have used all available means of avoiding iatrogenic pneumothorax, such as ultrasound guidance. For PSI-7, the commenter recommended exclusions for trauma. For PSI-12, the commenter recommended a clear definition of the inclusion criteria in order to avoid misclassification of providers and subsequent inappropriate penalties. For PSI-14, the commenter requested adding exclusions for trauma cases and patients in shock who require emergency procedures.

Another commenter recommended that PSI-7 be removed from the HAC calculation because it is not as well validated as the NHSN CLABSI measure, because the transition from ICD-9 to ICD-10 coding (which some hospitals have already undergone) could compromise the validity of this component, and because, as the measure currently exists, some vascular catheter infections might be double counted.

Response: AHRQ's Quality Indicator program continually updates and refines measures to provide the best possible quality indicators to the public. We conferred with AHRQ, which welcomed the commenters' suggestions and will examine the feasibility of including these exclusions. All of the AHRQ quality indicators go through a rigorous testing process before changes are made to the indicators. It should be noted that NQF policy and guidance generally have favored risk-adjustment approaches over exclusion of high-risk patients, when possible, to optimize the generalizability and value of quality measures. Suggestions regarding potential PSI measure revisions can be made directly to QIsupport@ahrq.hhs.gov.

With regard to the commenter's specific suggestion for PSI-6—Iatrogenic Pneumothorax Rate, it is impossible to identify patients who lack peripheral intravenous access using ICD-9-CM coded data. However, given exclusions for trauma and respiratory disease, it is assumed that all patients who experienced this event had some type of procedure (such as central venous catheter placement or thoracentesis) that placed them at risk for iatrogenic (hospital-acquired) pneumothorax. For PSI-7—Central Venous Catheter-Related Blood Stream Infection Rate, ICD-10 implementation will take effect no sooner than October 1, 2015 and may be subject to additional delays. AHRQ will conduct extensive testing on the ICD-10 specified measures to ensure events are not double counted.

For PSI-12—Perioperative Pulmonary Embolism or Deep Vein Thrombosis Rate, inclusion criteria are clearly defined and have been narrowed as a result of changes in ICD-9-CM codes and user feedback. For example, the numerator inclusion criteria no longer include upper extremity or thoracic venous thrombosis, due to concern about the uncertain preventability of these events among patients who require long-term use of central venous catheters. The numerator inclusion criteria also no longer include superficial venous thrombosis, due to concern about the uncertain preventability of these events.

Comment: One commenter was concerned that several of the PSI composite component measures in the HAC Reduction Program—including PSI-6, PSI-12 and PSI-15—were finalized for removal from the Hospital IQR Program after FY 2014 in the FY 2013 IPPS/LTCH PPS final rule. The commenter contended that because these measures have been deemed unfit for use in a public reporting application, they are equally unsuitable for use in a payment penalty program.

Response: As we stated in the FY 2013 IPPS/LTCH PPS final rule (77 FR 53507 through 53509), to avoid duplication, we removed PSI-6, PSI-12 and PSI-15 from the Hospital IQR Program because these individual measures are already included in the PSI-90 composite measure, which is currently part of the Hospital IQR Program measure set. The measures were not deemed to be unfit, as characterized by the commenter.

Comment: One commenter described its experience with the AHRQ Quality Indicator Software not allowing its organization to identify specific patient encounters included in the measure components and not always accurately reflecting POA. This commenter recommended that CMS ask AHRQ to update the software outputs to provide accurate case level patient information for patients in the numerator, to update the software to define which ICD code triggers the measure, to include simultaneous SAS and MonAHRQ releases and to include the PSI-90 outputs in the AHRQ process the same way as other measures.

Response: AHRQ informed us that it is constantly improving the AHRQ QI software and welcomes this and other suggestions for improvement. The AHRQ QI software and the MonAHRQ software are under different timelines for release for a variety of external reasons. Additional suggestions for improvements can be made directly to QIsupport@ahrq.hhs.gov.

Comment: One commenter recommended revisions to the CAUTI measure to minimize the potential for the possible unintended consequence of premature urinary catheter removal. The commenter's recommended revisions included adding exclusions for bedridden elderly patients whose urine output cannot be monitored otherwise, those who have had complex pelvic surgery, and those with a history of urinary retention; and inclusion of a data capture point for catheter reinsertion to capture the rate of repeat instrumentation and infection risk for those with early catheter removal.

Response: We refer readers to the FY 2014 IPPS/LTCH PPS final rule (78 FR 50716) for our discussion of the issue of potential unintended consequences of the CAUTI measure. In regard to the addition of a data capture point in the NHSN system, we conferred with CDC, which stated that it weighs each data element required for NHSN surveillance very carefully, considering the burden required to capture and collect the information and the benefits of the data collected. Individuals performing validation of CAUTI data have stated that locating insertion documentation is very difficult, if not impossible, in many cases. For this reason, NHSN does not require documentation of the date of insertion of indwelling urinary catheters. The NHSN UTI data collection form and system do allow for voluntary collection of this information, and NHSN encourages facilities to utilize these capabilities to inform their CAUTI prevention efforts as they deem necessary. However, it is not appropriate to require such documentation by all facilities.

Comment: Several commenters contended that the MRSA and C. difficile measures do not adequately distinguish between community-acquired and hospital-acquired infections and suggested the measures not be included in the HAC Reduction Program for that reason. Another commenter had the same concern and supported the inclusion of the MRSA measure but not the C. difficile measure. A commenter noted that rates of C. difficile are generally higher in patients with surgical procedures (particularly gastrointestinal surgical procedures) than in non-surgical patients and that there are known regional variations in MRSA and C. difficile infection rates. For these reasons, this commenter suggested that process measures focusing on best practices and guidelines for patients who contract MRSA or C. difficile as inpatients would be more appropriate than outcome measures tracking MRSA and C. difficile infection rates. A few commenters recommended that the C. difficile measure be included in the Hospital VBP Program, and not in the HAC Reduction Program.

Response: With respect to some commenters' concerns about MRSA and C. difficile measures, we note that these measures do enable differentiation between community-acquired and health care-associated events based on date of admission and date(s) of specimen collection. Therefore, we do not believe the measures need to be revised. While we appreciate the recommendations for process measures, we note that process measures are not usually risk adjusted and current statute requires risk-adjustment for the HAC Reduction Program. The issue of the same measures being included in multiple programs is addressed below.

Comment: Many commenters urged CMS to eliminate the overlap of measures between the Hospital VBP and HAC Reduction Programs. The commenters understood CMS' desire to align the programs in order to draw more attention to these important patient safety issues and to spur quicker and more meaningful change in patient care. However, the commenters believed that this approach creates multiple operational challenges, results in the potential for double payment penalties, and sends conflicting signals about the true state of hospital performance (a hospital could incur a penalty under the HAC Reduction Program but receive an incentive under the Hospital VBP Program). Commenters overwhelmingly recommended that the HAC Reduction Program measures should only be included in either the HAC Reduction Program or the Hospital VBP program but not in both programs. One commenter recommended that either the HAC Reduction Program or the Hospital VBP program be eliminated completely.

Response: We acknowledge that there is overlap in measures between the Hospital VBP Program and the HAC Reduction Program and refer readers to the FY 2014 IPPS/LTCH PPS final rule (78 FR 50716) for our discussion of the rationale for this overlap. As for elimination of these programs, they are statutory requirements and eliminating them is beyond the scope of the Secretary's authority.

Comment: Several commenters recommended that CMS consider a comprehensive strategy in which measures are placed into pay-for-performance programs using a staged approach: the Hospital IQR Program would be the basis for selection into the pay-for-performance programs; the Hospital VBP Program would be the next step and would include measures covering important safety issues but ones for which it is unclear if effective strategies exist to improve performance; and the HAC Reduction Program would be the final stop and would include measures that have generally good but not topped out performance with a limited performance gap to close and a set of highly effective, proven strategies that are widely implementable. Many commenters also suggested that measures should be publicly reported for at least one year before they are included in the HAC Reduction Program so that any unintended consequences of measurement and reporting can be addressed.

Response: We appreciate the commenters' feedback and will consider these suggestions in future rulemaking.

(4) Measure Risk-Adjustment Methodology

In the FY 2014 IPPS/LTCH PPS final rule, we established that we would use the existing measure-level risk-adjustment that is already part of the risk-adjustment methodology for the individual measures in Domains 1 and 2 in order to fulfill this requirement (78 FR 50719). We codified the use of this methodology under § 412.172(d) of the regulations. The AHRQ PSI-90 composite measure and the CDC NHSN measures selected for the program are risk-adjusted and reliability-adjusted. Links to the measure specification documents can be found in section IV.J.4. of the preamble of this final rule. Specifically, risk factors such as the patient's age, gender, comorbidities, and complications will be considered in the calculation of the measure rates so that hospitals serving a large proportion of sicker patients are not unfairly penalized. We noted that the risk-adjustment methodology for these measures meets current NQF endorsement criteria. We believe that such risk-adjustment is appropriate, pursuant to section 1886(p) of the Act.

We will continue to examine the impact of the additional measures in the program, and propose refinements to the program if necessary. Should changes to the risk-adjustment models for the measures be adopted during NQF endorsement maintenance processes, we will propose adopting these changes as soon as possible through rulemaking.

Comment: Many commenters had concerns about the PSI-90 risk-adjustment methodology. Most commenters believed that inadequate risk-adjustment results in a disproportionate impact on teaching hospitals or hospitals that treat many sick and vulnerable patients, perform a high volume of emergency trauma and burn care, and perform a large number of surgical procedures. Another commenter expressed the opposite concern: that small hospitals might have artificially inflated HAC scores as a result of the risk-adjustment methodology algorithm, which gives hospitals with poor data reliability a low reliability weight, thereby skewing their rates closer to the national mean.

Response: Each of the PSI-90 composite component measures includes detailed risk-adjustment for clinical factors (for example, modified diagnosis-related groups, major diagnostic categories, comorbidities), age, and gender that influence the risk of experiencing a patient safety event during hospitalization. The three risk factors mentioned explicitly above (trauma, burns, and surgical discharges) are accounted for in the PSI risk-adjustment models. For example, acknowledging that some hospitals do more transplants and trauma care than others, the models account for this heterogeneity of risk. AHRQ's Quality Indicator program continually updates and refines measures to provide the best possible quality indicators to the public.

Comment: Several commenters expressed concerns that the HAC Reduction Program does not contain adequate adjustment for socioeconomic status (SES) factors that influence HAC rates. Commenters recommended comparing providers to their peers, adjusting provider penalties based on the SES of the patients served, incorporating a provider's annual improvement into performance calculations, and adopting new measures that better adjust for socioeconomic factors. One commenter specifically recommended complying with the recommendations of the NQF's Expert Panel on Risk-Adjustment for Sociodemographic Factors (Draft Report available at: http://www.qualityforum.org/Risk_Adjustment_SES.aspx).

Response: We appreciate the commenters' suggestions on the importance of addressing socioeconomic status in the HAC Reduction Program and have continued to consider and evaluate these stakeholder concerns. We also note that these concerns were addressed in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50653 through 50654, 50673 through 50674) and again in section IV.H.4. of the preamble of this final rule. While these discussions in section IV.H.4. of the preamble of this final rule are in response to comments regarding the Hospital Readmissions Reduction Program, we have received similar comments with respect to other quality reporting programs, and our responses address considerations that also apply to the HAC Reduction Program.

To the extent that these commenters were requesting that CMS mitigate the HAC Reduction Program payment adjustment despite a hospital being in the top quartile, section 1886(p) of the Act specifies that the amount of payment for such a hospital “shall be equal to 99 percent of the amount of payment that would otherwise apply” and we refer readers to the earlier discussion of the payment adjustment in section IV.J.3.b. of the preamble of this final rule.

(5) Measure Calculations

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50717 through 50719), we established that we will perform measure calculations for the AHRQ PSI-90 composite measure under Domain 1 and the CDC NHSN measures under Domain 2. We stated that measure calculations for the AHRQ PSI-90 composite measure include the use of ICD-9-CM diagnosis and/or procedure codes and, for the principal and secondary diagnoses, a present on admission (POA) indicator value associated with all diagnoses on the claim. As noted in section IV.J.3.b. of the preamble of this final rule, in order to implement the new Maryland All-Payer Model, Maryland elected to no longer have Medicare payment made to Maryland hospitals in accordance with section 1814(b)(3) of the Act, effective January 1, 2014. Although CMS has waived certain provisions of the Act for Maryland hospitals as set forth in the agreement between CMS and Maryland and subject to Maryland's compliance with the terms of the agreement, CMS has not waived the POA indicator reporting requirement. In other words, the changes to the status of Maryland hospitals under section 1814(b)(3) of the Act described above do not in any way change the POA indicator reporting requirement for Maryland hospitals. We also finalized that the same rules used under the Hospital IQR Program would be applied to determine how the AHRQ PSI-90 composite measure and the CDC NHSN measures are calculated.

(6) Applicable Time Period

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50717), we adopted a 2-year applicable period to collect data that would be used to calculate the Total HAC Score for FY 2015. For Domain 1 (AHRQ PSI-90 composite measure), we established a 2-year data period to calculate the measures based on recommendations from AHRQ, the measure developer, as we believed that the 24-month data period would provide hospitals and the general public with the most current data available. The 24-month data period also will allow time to complete the complex calculation process for these measures, to perform comprehensive quality assurance to enhance the accuracy of measure results, and to disseminate confidential reports on hospital-level results to individual hospitals.

As such, for FY 2015, we will use the 24-month period from July 1, 2011 through June 30, 2013 as the applicable time period for the AHRQ PSI-90 composite measure. The claims for all Medicare FFS beneficiaries discharged during this period will be included in the calculation of measure results for FY 2015. This includes claims data from the 2011, 2012, and 2013 Inpatient Standard Analytic Files (SAFs).

The CDC NHSN measures, CAUTI and CLABSI, are currently collected and calculated on a quarterly basis. However, for the purpose of the HAC Reduction Program, we will use 2 years of data to calculate the Domain 2 score. For FY 2015, we will use calendar years 2012 and 2013 for the HAC Reduction Program. As noted above, we codified the definition of “applicable time period” in the FY 2014 IPPS/LTCH PPS final rule at § 412.170.

Comment: One commenter supported use of the 2-year applicable time periods for the collection of Domain 1 and 2 measures for FY 2015. A few commenters suggested aligning the duration of performance periods for Hospital VBP and the HAC Reduction Programs, in particular suggesting using 1 year of data for the CDC NHSN measures. A few additional commenters had concerns that the data are retrospective and therefore do not provide actionable information.

One commenter disagreed with the finalized 2-year data collection period for the CDC measures for CAUTI and CLABSI and requested that CMS reconsider quarterly collection and calculation of these measures for Domain 2. The commenter stated that quarterly data would be more useful for providers in addressing areas in which they would like to improve, and would also allow consumers and purchasers to have timely information regarding areas of care that are meaningful and important to them.

Response: We refer readers to our response to the applicable time period comments in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50717). We understand that hospitals might find quarterly data more useful and, for that purpose, we refer stakeholders to Hospital Compare, which includes quarterly updates of the measures included in the HAC Reduction Program.

d. Criteria for Applicable Hospitals and Performance Scoring Policy

The HAC Reduction Program does not contain specific statutory directives on scoring methods, as found with other programs. Therefore, our main concern when establishing scoring methods for the HAC Reduction Program was to align with existing scoring methodologies in similar hospital programs. Accordingly, in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50721), we finalized a scoring methodology that aligns with the achievement scoring methodology currently used under the Hospital VBP Program (78 FR 27629). We believe aligning the scoring methodologies reduces confusion associated with multiple scoring methodologies. In addition, we note that alignment benefits the hospital stakeholders who have prior experience with the Hospital VBP Program.

In the FY 2014 IPPS/LTCH PPS proposed rule (78 FR 27629), we proposed to implement a methodology for assessing the top quartile of applicable hospitals for HACs based on performance standards, where we would score each hospital based on whether they fall in the top quartile for each applicable measure and where in the top quartile they fall. In addition, we proposed to calculate a Total HAC Score for each hospital by summing the hospital's performance score on each measure within a domain to determine a score for each domain, then multiplying each domain score by a proposed weight (Domain 1—AHRQ Patient Safety Indicators 50 percent, Domain 2—CDC NHSN Measures 50 percent), and adding together the weighted domain scores to determine the Total HAC Score.

We reviewed the public input on the proposed 75th percentile benchmark. Several commenters requested a change to the proposed minimum benchmark for scoring each measure. We agreed with these commenters, and in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50722), we modified our proposal and established that scoring will begin at the minimum value for each measure rather than at the 75th percentile. The methodology finalized in the FY 2014 IPPS/LTCH PPS final rule assesses the top quartile of applicable hospitals for HACs based on the Total HAC Score. The general support for Domain 2 measures, coupled with multiple recommendations (particularly those from MedPAC) to give more weight to Domain 2 measures, led us to conclude that such scoring changes were necessary. Therefore, we finalized different weights for the domains than originally proposed (78 FR 50721).

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50722), we further specified that we will calculate a Total HAC Score for each hospital by using the hospital's performance score on each measure within a domain to determine a score for each domain, then multiplying each domain score by the following weights: Domain 1 (AHRQ PSI-90 composite measure), 35 percent; and Domain 2 (CDC NHSN measures), 65 percent; and then combining the weighted domain scores to determine the Total HAC Score (§ 412.172(e)(3)). We use each hospital's Total HAC Score to determine the top quartile of subsection (d) hospitals (applicable hospitals) that are subject to the payment adjustment beginning with discharges on or after October 1, 2014. With respect to a subsection (d) hospital, we identify the top quartile of all hospitals that are subsection (d) hospitals with respect to their rate of HACs during the applicable period (§ 412.172(e)(1)). We use a Total HAC Score to identify applicable hospitals and identify the 25 percent of hospitals with the highest Total HAC Scores as applicable hospitals (§ 412.172(e)(2)).
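
For illustration only, the following sketch shows how the finalized weights combine Domain 1 and Domain 2 scores into a Total HAC Score and how the top quartile might then be identified. The CMS Certification Numbers and scores are hypothetical, the sketch assumes both domain scores exist for every hospital (the treatment of missing domains is described later in this section), and it is not the program's production algorithm.

```python
import statistics

# Hypothetical Domain 1 and Domain 2 scores (1-10 scale) keyed by a made-up
# CMS Certification Number. Simplified sketch of the finalized FY 2015
# weighting: Domain 1 (AHRQ PSI-90), 35 percent; Domain 2 (CDC NHSN
# measures), 65 percent. Assumes both domain scores exist for every
# hospital; missing-domain handling is covered later in this section.
domain_scores = {
    "010001": (3.0, 5.5),
    "020002": (7.0, 8.0),
    "030003": (1.0, 2.0),
    "040004": (9.0, 6.5),
}

total_hac = {ccn: 0.35 * d1 + 0.65 * d2 for ccn, (d1, d2) in domain_scores.items()}

# Hospitals with a Total HAC Score above the 75th percentile (the top
# quartile) are subject to the 1 percent payment reduction.
cutoff = statistics.quantiles(sorted(total_hac.values()), n=4)[-1]
penalized = sorted(ccn for ccn, score in total_hac.items() if score > cutoff)

print("Hospitals subject to the payment adjustment:", penalized)
```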

We finalized the PSI-90 composite measure for Domain 1. Because hospitals may not have complete data for every AHRQ indicator in the Domain 1 composite measure, we finalized the same methodology used for the Hospital VBP Program to determine the minimum number of indicators with complete data to be included in the calculation of the Domain 1 measure.

In addition, we finalized the following rules to determine the number of AHRQ indicators to be included in the calculation for a hospital's Domain 1 score. In this discussion, “complete data” refers to whether a hospital has enough eligible discharges to calculate a rate for a measure. Complete data for the AHRQ PSI-90 composite measure means the hospital has three or more eligible discharges for at least one component indicator. Specifically—

If a hospital does not have “complete data” for the PSI-90 composite measure, we will not calculate a Domain 1 score for that hospital.

If a hospital has “complete data” for at least one indicator for the AHRQ PSI-90 composite measure, we will calculate a Domain 1 score.

The calculation of the SIR for the CDC measures requires that the facility have ≥ 1 predicted HAI event. The predicted number of events is calculated using the national HAI rate and the denominator counts (that is, the number of device days, procedure days, or patient days, depending on the HAI). In the event that the SIR cannot be calculated for any Domain 2 measure because the facility has < 1 predicted infection for each measure, Domain 1 scores exclusively will be used to calculate a Total HAC Score. In other words, we will exclude from the overall HAC score calculation any measure for which an SIR cannot be calculated for the reason set out above.
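
For illustration, the following sketch applies the SIR eligibility rule just described. The rates, denominator counts, and function names are hypothetical, and the actual NHSN risk models stratify predicted counts by location type and other factors; this is a simplified outline, not the NHSN calculation.

```python
# Sketch of the SIR eligibility rule described above: a measure's SIR is
# calculated only when the predicted number of HAI events is >= 1, and a
# measure with < 1 predicted event is excluded from Domain 2 scoring.
# The rates, denominators, and function names are hypothetical; NHSN's
# actual risk models stratify predicted counts by location type and other
# factors.

def predicted_events(national_rate_per_1000_days, device_days):
    """Predicted HAI events = national rate x denominator (device days)."""
    return national_rate_per_1000_days * device_days / 1000.0

def sir_or_none(observed, national_rate_per_1000_days, device_days):
    predicted = predicted_events(national_rate_per_1000_days, device_days)
    if predicted < 1:
        return None  # too few predicted events; exclude the measure
    return observed / predicted

print(sir_or_none(observed=2, national_rate_per_1000_days=1.5, device_days=2000))  # SIR ~ 0.67
print(sir_or_none(observed=0, national_rate_per_1000_days=1.5, device_days=400))   # None
```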

Because of the differences among the measures for the HAC Reduction Program and the distribution of measure results, simply adding up the measure results to calculate the domain or Total HAC Score would make the scores less meaningful to hospitals and the general public. As a result, as we indicated in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50720 through 50725), points will be assigned to hospitals' performance for each measure. This approach aligns with the Hospital VBP Program for measuring hospital achievement. In particular, the Hospital VBP Program assigns up to 10 points for each measure based on a hospital's performance result for that measure for a given time period. We note that, for the HAC Reduction Program, unlike the Hospital VBP Program where a higher score means better performance, more points on a measure correspond to poorer performance. For the HAC Reduction Program, we finalized use of a slightly different methodology for scoring points, depending on the specific measure (Table C in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50723), which is also included below). Specifically—

  • For the AHRQ Patient Safety for Selected Indicators (PSI-90) composite measure in Domain 1, point assignment will be based on a hospital's score for the composite measure.
  • For the PSI-90 composite measure, 1 to 10 points will be assigned to the hospital.
  • For the CDC NHSN measures in Domain 2, point assignment for each measure will be based on the SIR for that measure.
  • For each SIR, 1 to 10 points will be assigned to the hospital for each measure (CAUTI and CLABSI for FY 2015).
  • The Domain 2 score will consist of the average of points assigned to the SIR (CAUTI and CLABSI for FY 2015).

Table C—Calculation of Domain 1 and 2 Measures for FY 2015

Measure name | Measure result | Scenario | Individual measure score (points)
Domain 1: AHRQ PSI-90 *** | Weighted average of rates of component indicators | Composite value | 1-10
Domain 2: CDC NHSN CAUTI, CLABSI | Standardized Infection Ratio (SIR) | SIR | 1-10 (refer to Figure A)
*** These measure rates are risk-adjusted and reliability-adjusted.

For all measures finalized for the HAC Reduction Program, we will use the following rules to determine the number of points assigned to a measure (78 FR 50723 through 50725). Based on the distribution of PSI-90 rates for all hospitals, we will divide the results into percentile ranges in increments of 10, with the lowest percentile ranges indicating better performance. Hospitals with PSI-90 rates within the lowest tenth percentile will be given 1 point; those with PSI-90 rates within the second lowest percentile range (between the 11th and 20th percentiles) will be given 2 points; and so forth (an illustrative sketch follows Figure A below).

Figure A—Point Assignment for Hospital A's PSI-90 Score

If Hospital A's PSI-90 rate falls into this percentile | Then assign this number of points
1st-10th | 1
11th-20th | 2
21st-30th | 3
31st-40th | 4
41st-50th | 5
51st-60th | 6
61st-70th | 7
71st-80th | 8
81st-90th | 9
91st-100th | 10
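
The following is a minimal sketch of the decile-based point assignment illustrated in Figure A. The PSI-90 rates are hypothetical, and the handling of tied rates that span decile boundaries is clarified later in this section; the sketch is illustrative only.

```python
import math

# Minimal sketch of the decile-based point assignment illustrated in
# Figure A. The PSI-90 rates are hypothetical; the handling of tied rates
# that span decile boundaries is clarified later in this section.

def assign_points(rates):
    """Map each hospital's PSI-90 rate to 1-10 points by percentile decile."""
    ranked = sorted(rates.items(), key=lambda item: item[1])
    n = len(ranked)
    points = {}
    for position, (hospital, _rate) in enumerate(ranked, start=1):
        percentile = 100.0 * position / n
        points[hospital] = min(10, math.ceil(percentile / 10))
    return points

# Lower rates (better performance) receive fewer points.
print(assign_points({"A": 0.80, "B": 1.10, "C": 0.95, "D": 1.40}))
```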

For Domain 2, we will obtain measure results that hospitals submitted to the CDC NHSN for the Hospital IQR Program. The CDC NHSN HAI measures capture adverse events that occurred within intensive care units (ICUs), including pediatric and neonatal units. For the Hospital IQR Program, hospitals that elected to participate in the reporting program (that is, have an active IQR pledge), but do not have an ICU, can apply for ICU waivers so that the hospitals will not be subject to the 2-percent payment reduction for nonsubmission of quality reporting data.

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50723), we noted that in the second quarter of 2012, among the 3,321 IPPS hospitals with active IQR pledges for data submission, 377 (or 10.1 percent) applied for and received an ICU waiver. At the same time, 2,939 (88.5 percent) of the IPPS hospitals did not have an ICU waiver and submitted data for the CDC HAI CLABSI measure, while 4 hospitals (0.1 percent) that had no ICU waiver failed to submit data to the NHSN. For the same quarter, of the 3,321 IPPS hospitals with active IQR pledges, 2,935 (88.4 percent) that did not have an ICU waiver submitted data for the CDC HAI CAUTI measure, whereas 8 hospitals (0.2 percent) did not submit data. Because data availability for the two CDC HAI measures impacts the score for Domain 2 and ultimately the Total HAC Score, we aim to encourage hospitals with an ICU that did not submit data to begin data submission, and to encourage hospitals that have already submitted data to continue data submission for all the CDC HAI measures. To this end, we finalized the following rules (Figure B in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50724), which is included below):

  • If a hospital has an ICU waiver for the CDC HAI measures, we will use only the Domain 1 score to calculate its Total HAC Score.
  • If a hospital does not have an ICU waiver for a CDC HAI measure:

○ If the hospital does not submit data for the CDC HAI measures, we will assign 10 points to that measure for that hospital.

○ If the hospital does submit data for at least one CDC NHSN measure:

If there are “complete data” (that is, enough adverse events to calculate the SIR) for at least one measure, we will use those data to calculate a Domain 2 score and use the hospital's Domain 1 and Domain 2 scores to calculate the Total HAC Score.

If there are not enough adverse events to calculate the SIR for any of the measures, we will use only the hospital's Domain 1 score to calculate its Total HAC Score.

Comment: Several commenters supported the use of a scoring methodology for the HAC Reduction Program that aligns with the achievement methodology of the Hospital VBP Program and agreed that this scoring alignment reduces confusion.

Response: We appreciate the commenters' support.

Comment: A few commenters stated that CMS implemented as reasonable a scoring methodology as was permitted by statute. A few commenters expressed support for the creation of two domains of measures using measures that are risk adjusted at the patient, unit and hospital levels and expressed support for the weighted contributions of Domain 1 and Domain 2 measures to the Total HAC score. Another commenter found the scoring to be very complex and detailed, making it difficult for hospitals to replicate.

Response: We acknowledge that the scoring methodology is complex. The scoring methodology was described in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50719 through 50725) and is clarified later in the preamble to this final rule. During the review and correction period that will occur prior to assessment of the HAC Reduction Program penalty or posting of the data on Hospital Compare, hospitals will be given access to their HAC Reduction Program measure scores, domain scores and total HAC score accompanied by a document that describes how the scores were calculated.

Comment: One commenter questioned whether the HAC Reduction Program scores reflect meaningful differences in quality between hospitals. The commenter specifically stated that the HAC scoring methodology makes distinctions between hospitals whose performance is not statistically different from one another which results in payment adjustments being levied on hospitals whose performance is not statistically different from the national benchmarks. The commenter also believed that there will be inconsistencies between results for the CMS programs using the same measures but different scoring methodologies.

Response: We note that the HAC Reduction Program does not use national benchmarks in the current scoring methodology. We also recognize the possibility of inconsistencies between our programs when measures like the AHRQ PSI-90 composite measure and the CDC NHSN HAI measures are used in multiple programs; we refer readers to the FY 2014 IPPS/LTCH PPS final rule (78 FR 50728), where we addressed this issue. We note that different CMS programs have different purposes, and thus it is not unexpected that programs use different approaches to score hospitals' performance. For example, the Hospital IQR Program, which publicly reports measure performance on Hospital Compare, is intended to provide consumers with the information needed to allow them to make informed decisions about hospital quality when seeking care.

The HAC Reduction Program is intended to motivate hospitals to reduce the incidence of HACs. We will continue to monitor the HAC Reduction Program and take the commenter's concerns under consideration as we strive to improve the program.

Comment: A few commenters supported using the same method of determining if a hospital has enough data to calculate a PSI-90 score in both the Hospital VBP and HAC Reduction Programs and the same inclusion criteria for the CDC NHSN measures as is used in the Hospital IQR Program.

Response: We appreciate the commenters' support. This alignment was described in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50722).

Comment: One commenter requested the posting of more HAC Reduction Program measure thresholds and benchmark data in advance as has been accomplished with the Hospital VBP Program.

Response: We note that the HAC Reduction Program is not required by law to establish measure thresholds and benchmarks, as is the Hospital VBP Program. By statute, the payment adjustment for the HAC Reduction Program is applied to hospitals with Total HAC Scores in the top quartile (that is, above the 75th percentile). Based on the differing statutory approaches, we do not believe that the commenter's requests are applicable to this program.

(1) Clarification of Finalized Measure Result Scoring for FY 2015 and Subsequent Years

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50723), we finalized for the HAC Reduction Program a scoring methodology that divides the measure results into percentiles in increments of 10 and assigns points (1 to 10) according to the percentile range into which the hospital's measure result falls. Our preliminary analysis of the measures showed that multiple hospitals had the same measure results and that, in certain instances, the number of hospitals with the same measure results exceeded the number of hospitals for their appropriate percentile range. Consequently, a few hospitals with the same measure results would fall into the next higher percentile range. In these instances, we will assign the same number of points to all hospitals with the same measure results, and those points will be based on the lowest appropriate percentile range.

For example, if, for the CAUTI measure, 13 percent of hospitals have an SIR of 0, we will assign 1 point to all of those hospitals, even though, arguably, 10 percent of them fall into the first percentile range and the remaining 3 percent fall into the second. Because each percentile range ideally represents 10 percent of hospitals, we will assign 2 points to the remaining 7 percent of hospitals in the second percentile range because their SIRs are greater than 0. We believe this is the most favorable method of scoring measure results for hospitals. We note that randomly assigning some hospitals with the same SIR a higher (that is, less favorable) score would be arbitrary and capricious, which is prohibited by the Administrative Procedure Act.
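
A sketch of the tie-handling rule just described follows. The SIR values and hospital identifiers are hypothetical, and the sketch is illustrative rather than the methodology used to produce actual program scores.

```python
import math

# Sketch of the tie-handling rule described above: every hospital with the
# same measure result receives the points of the lowest (most favorable)
# percentile range into which any hospital with that result falls. The SIR
# values are hypothetical.

def assign_points_with_ties(results):
    ranked = sorted(results.items(), key=lambda item: item[1])
    n = len(ranked)
    first_decile_for_value = {}
    for position, (_hospital, value) in enumerate(ranked, start=1):
        decile = min(10, math.ceil(100.0 * position / n / 10))
        # Remember the lowest decile seen for each distinct measure result.
        first_decile_for_value.setdefault(value, decile)
    return {hospital: first_decile_for_value[value] for hospital, value in results.items()}

# 13 of 100 hospitals have an SIR of 0; all of them receive 1 point, even
# though some would otherwise spill into the second percentile range.
sirs = {f"H{i:02d}": 0.0 for i in range(13)}
sirs.update({f"H{i:02d}": 0.5 + 0.1 * i for i in range(13, 100)})
points = assign_points_with_ties(sirs)
print(points["H00"], points["H12"], points["H13"])  # 1 1 2
```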

Comment: A few commenters applauded CMS for clarifying the process by which measure scores will be assigned in the case of hospitals with tied measure results spanning multiple deciles.

Response: We appreciate the commenters' support for the clarified process and believe it makes clear that we are applying the scoring criteria in a manner that is most equitable to hospitals.

(2) Clarification of FY 2015 Finalized Narrative of Rules to Calculate the Total HAC Score

In the FY 2014 IPPS/LTCH PPS final rule, we finalized a series of rules to determine how to calculate the Domain 2 score, and ultimately the Total HAC Score, when there were waivers for the collection of CDC NHSN HAI measures (78 FR 50723). We also illustrated and finalized these rules in Figure B of that final rule (78 FR 50724). In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28141), we proposed to clarify that the narrative for Figure B should also include “other waivers” that waive hospitals from collecting CDC HAI measure data. The clarified rules that we proposed for the collection of CDC HAI measures are as follows:

  • If a hospital has an ICU waiver or other waiver for the CDC NHSN HAI measures, we will use only the Domain 1 score to calculate its Total HAC Score.
  • If a hospital does not have an ICU waiver or other waiver for the CDC HAI measures:

○ If the hospital does not submit data for the CDC HAI measures, we will assign 10 points to that measure for that hospital.

○ If the hospital does submit data for at least one CDC NHSN measure:

▪ If there are “complete data” (that is, enough adverse events to calculate the SIR) for at least one measure, we will use those data to calculate a Domain 2 score and use the hospital's Domain 1 and Domain 2 scores to calculate the Total HAC Score.

▪ If there are not enough adverse events to calculate the SIR for any of the measures, we will use only the hospital's Domain 1 score to calculate its Total HAC Score.

As discussed earlier, if a hospital has enough data to calculate the PSI-90 composite measure score for Domain 1 and “complete data” for at least one measure in Domain 2, the scores of the two domains will contribute to the Total HAC Score at 35 percent for Domain 1 and 65 percent for Domain 2. However, if a hospital does not have enough data to calculate the PSI-90 composite measure score for Domain 1 but it has “complete data” for at least one measure in Domain 2, its Total HAC Score will depend entirely on its Domain 2 score. Similarly, if a hospital has “complete data” to calculate the PSI-90 composite measure score in Domain 1 but none of the measures in Domain 2, its Total HAC Score will be based entirely on its Domain 1 score. If the hospital does not have “complete data” to calculate the PSI-90 composite measure score for Domain 1 or any of the measures in Domain 2, we will not calculate a Total HAC Score for this hospital.
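
For illustration, the following sketch expresses the domain fallback just described. The domain scores are hypothetical, and the sketch is a simplified outline rather than the calculation used to produce actual Total HAC Scores.

```python
# Sketch of the fallback just described. Domain scores are hypothetical;
# None indicates that a domain could not be scored because the hospital
# lacked "complete data" for it.

def total_hac_score(domain1, domain2):
    if domain1 is not None and domain2 is not None:
        return 0.35 * domain1 + 0.65 * domain2
    if domain2 is not None:
        return domain2  # Total HAC Score rests entirely on Domain 2
    if domain1 is not None:
        return domain1  # Total HAC Score rests entirely on Domain 1
    return None         # no Total HAC Score is calculated

print(total_hac_score(4.0, 6.0))    # 0.35*4.0 + 0.65*6.0 = 5.3
print(total_hac_score(None, 6.0))   # 6.0
print(total_hac_score(4.0, None))   # 4.0
print(total_hac_score(None, None))  # None
```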

Comment: A few commenters were concerned that a hospital without any Domain 2 measure scores would have its Total HAC Score based entirely on Domain 1, which comprises claims-based data. Because this situation could happen when a hospital does not have enough data to reliably calculate an SIR for the CDC NHSN HAI measures, one commenter recommended that CMS collaborate with CDC to determine whether there are analytic approaches besides the SIR that would allow more hospitals to meet the minimum data criteria for reliable measure results for the CDC NHSN HAI measures.

Response: We understand the commenters' concern and point out that the intention of the scoring rules described above for calculating a Total HAC Score is to make use of the available data for each hospital and to encourage hospitals to report HAI data to CDC NHSN, even if they do not have enough data to reliably calculate an SIR for the CDC NHSN HAI measures in Domain 2. In section IV.J.3.c. of the preamble of this final rule, we address stakeholders' concerns about using claims data in general, and the PSI-90 measure in particular, for the HAC Reduction Program.

We conferred with CDC, which indicated that they continuously evaluate the data reported to NHSN and consider the best measures for monitoring and comparative purposes. Currently the SIR is the best measure to allow for risk adjustment and production of a facility-level and/or CCN-level metric that can be used for comparison across similar facility types. This provides the opportunity to most accurately represent a facility's success. If the data are insufficient (for example, too few device days) to produce the SIR, CDC indicated that any calculation produced from such low numbers would be imprecise. CDC continues to review the data and evaluate options for metric development, including situations where facilities have low denominator volume and/or few infections.

After consideration of the public comments we received, we are finalizing the scoring clarifications for the HAC Reduction Program as proposed.

e. Reporting Hospital-Specific Information, Including the Review and Correction of Information

(1) Confidential Reports to Applicable Hospitals

Section 1886(p)(5) of the Act requires the Secretary to provide confidential reports to the applicable hospitals with respect to HACs. To meet the requirements under section 1886(p)(5) of the Act, in the FY 2014 IPPS/LTCH PPS final rule, we finalized the provision of confidential reports for the HAC Reduction Program to include information related to claims-based measure data for the PSI measures, the measure scores, the domain score for each domain, and the Total HAC Score (78 FR 50725). We noted that we use chart-abstracted measures in the HAC Reduction Program, and such information will be contained in the reports hospitals currently receive as part of the Hospital IQR Program and can be reviewed and corrected through the process specified for that program. We stated that we believe that this method would reduce the burden on hospitals, by alleviating the need to correct data present in two different programs.

(2) Availability of Information to the Public

Section 1886(p)(6)(A) of the Act requires the Secretary to “make information available to the public regarding HAC rates of each subsection (d) hospital” under the HAC Reduction Program. Section 1886(p)(6)(C) of the Act requires the Secretary to post the HAC information for each applicable hospital on the Hospital Compare Web site in an easily understood format. Section 1886(p)(6)(B) of the Act also requires the Secretary to “ensure that an applicable hospital has the opportunity to review, and submit corrections for, the HAC information to be made public for each hospital.”

To meet the requirements under section 1886(p)(6)(C) of the Act, in the FY 2014 IPPS/LTCH PPS final rule, we finalized policies that the following information will be made public on the Hospital Compare Web site relating to the HAC Reduction Program: (1) hospital scores with respect to each measure; (2) each hospital's domain specific scores; and (3) the hospital's Total HAC Score (78 FR 50725).

Comment: One commenter supported the public availability of facility-specific data on HACs. The commenter was concerned that these data had previously been available on Hospital Compare but were no longer posted there and urged that CMS repost these data. One commenter recommended that, at a minimum, in spite of the absence of measures for some HACs, CMS should make the raw counts of HACs publicly available on Hospital Compare or https://data.medicare.gov/.

Response: We appreciate the commenter's recognition of the importance of having facility level HAC data available publicly. Although the commenter did not specify which data were being referenced, we interpret this comment to refer to the eight HAC measures that were removed from the Hospital IQR Program (Air Embolism, Blood Incompatibility, CAUTI, Falls and Trauma, Foreign Object Retained After Surgery, Manifestation of Poor Glycemic Control, Pressure Ulcer Stages III or IV, and Vascular Catheter Associated Infections). The rationale for removing these measures from the Hospital IQR Program can be found in the FY 2013 IPPS/LTCH PPS final rule (77 FR 53506 through 53507). The measures included in the HAC Reduction Program (PSI-90 composite, CLABSI and CAUTI) have been available on Hospital Compare since December 2010, January 2012 and January 2013, respectively. The HAC Reduction Program scores will also be publicly available later this year.

(3) Review and Correction of Information

Section 1886(p)(6)(B) of the Act requires the Secretary to ensure that each hospital has the opportunity to review and submit corrections for the information to be made available to the public with respect to each hospital under section 1886(p)(6)(A) of the Act prior to such information being made available to the public.

In the FY 2014 IPPS/LTCH PPS final rule, we codified our regulation regarding the reporting of hospital-specific information at § 412.172(f) (78 FR 50968). CMS will make information available to the public regarding HAC rates of all hospitals described in section 1886(d)(1)(B) of the Act, including hospitals in Maryland previously paid under section 1814(b)(3) of the Act, under the HAC Reduction Program (paragraph (f)). As noted in section IV.J.3.b. of the preamble of this final rule, in order to implement the new Maryland All-Payer Model, Maryland elected to no longer have Medicare pay Maryland hospitals in accordance with section 1814(b)(3) of the Act, effective January 1, 2014.

In summary, we established that CMS will provide each hospital with confidential hospital-specific reports and discharge level information used in the calculation of its Total HAC Score (paragraph (f)(1) of § 412.172). Hospitals will have a period of 30 days after receipt of the information provided under paragraph (f)(1) to review and submit corrections for the HACs measure scores, domain scores, and the Total HAC Score for the fiscal year. The administrative claims data used to calculate a hospital's Total HAC Score for those conditions for a fiscal year will not be subject to review and correction (paragraph (f)(2)). CMS will post the HAC Reduction Program scores for the applicable conditions for a fiscal year for each applicable hospital on the Hospital Compare Web site. We refer readers to the FY 2014 IPPS/LTCH PPS final rule (78 FR 50725 through 50728) for detailed discussions of the above provisions.

CMS provided hospitals with their confidential hospital-specific reports and discharge level information used in the calculation of their Total HAC Score in late July 2014 on the Quality Net Web site. In order to have access to their hospital-specific report, hospitals must register for a Quality Net Secure Portal account. Hospitals have a period of 30 days after the information is posted on Quality Net to review and submit corrections for the calculation of their HACs measure scores, domain scores, and Total HAC Score for the fiscal year.

(4) Preliminary Analysis of the HAC Reduction Program

In order to model estimated payment changes for the FY 2015 IPPS/LTCH PPS proposed rule, we conducted a preliminary analysis of the HAC Reduction Program using currently available historical data as a proxy for the actual data that will be used to determine hospital performance under the program. The results of this preliminary analysis can be found on the CMS Web site at: http://www.cms.gov/Medicare/Medicare-Fee-for-Service-Payment/AcuteInpatientPPS/index.html under the FY 2015 IPPS/LTCH PPS proposed rule Home Page link as Table 17.—FY 2015 Preliminary Analysis of the Hospital-Acquired Condition Reduction Program. We stated in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28142) that when the actual data for the performance periods finalized in the FY 2014 IPPS/LTCH PPS final rule for each measure are available, hospitals will have an opportunity to review and submit corrections as discussed in section IV.J.3.e. of the preamble of the proposed rule and this final rule.

Comment: One commenter objected to CMS making Table 17—FY 2015 Preliminary Analysis of the Hospital-Acquired Condition Reduction Program publicly available via the CMS Web site at http://www.cms.gov/Medicare/Medicare-Fee-for-Service-Payment/AcuteInpatientPPS/index.html under the FY 2015 IPPS/LTCH PPS proposed rule Home Page link. This commenter stated that the data had not yet been reviewed, nor were its sources auditable and in compliance with the requirements of the law. The commenter stated that the Table did not provide insight into how the Composite Score was developed. The commenter acknowledged that there was a methodology included in the proposed rule preamble; however, the commenter also noted that any attempts to recalculate and confirm the scores in the Table with other information available to the public (such as CMS' Hospital Compare Web site) were not possible. Lastly, the commenter stated that the reporting periods used to calculate the Score in Table 17 (both for Domain 1 (Patient Safety) and Domain 2 (CLABSI and CAUTI)) are not those that are set in law.

Response: We acknowledge the commenter's objection and point out that as stated in the FY 2015 IPPS/LTCH PPS proposed rule, we conducted a preliminary analysis of the HAC Reduction Program using currently available historical data as a proxy for the actual data that will be used to determine hospital performance under the program to model estimated payments. In addition, as stated earlier in this section, we established that we will provide each hospital with confidential hospital-specific reports and discharge level information used in the calculation of its Total HAC Score (paragraph (f)(1) of § 412.172). Hospitals will have a period of 30 days after receipt of the information provided under paragraph (f)(1) to review and submit corrections for the HACs measure scores, domain scores, and Total HAC Score for the fiscal year. The administrative claims data used to calculate a hospital's Total HAC Score for those conditions for a fiscal year will not be subject to review and correction (paragraph (f)(2)). CMS will post the HAC Reduction Program scores for the applicable conditions for a fiscal year for each applicable hospital on the Hospital Compare Web site. We refer readers to the FY 2014 IPPS/LTCH PPS final rule (78 FR 50725 through 50728) for detailed discussions of the above provisions.

Providing a preliminary analysis of the HAC Reduction Program using currently available historical data as a proxy for the actual data is consistent with the law. We clearly indicated that these were not the final data. However, because this is the first year of the HAC Reduction Program, we wish to gain some initial experience under the review and correction process discussed in section IV.J.3.e. of the preamble of this final rule and determine to what extent the review and corrections process in this first year changes the preliminary hospital level data we provided in Table 17 of the proposed rule before providing updated hospital level data. Updated hospital level data will be made publicly available following the review and corrections process.

f. Limitation on Administrative and Judicial Review

Section 1886(p)(7) of the Act provides that there will be no administrative or judicial review under section 1869 of the Act, under section 1878 of the Act, or otherwise for any of the following:

  • The criteria describing an applicable hospital under section 1886(p)(2)(A) of the Act.
  • The specification of hospital acquired conditions under section 1886(p)(3) of the Act.
  • The specification of the applicable period under section 1886(p)(4) of the Act.
  • The provision of reports to applicable hospitals under section 1886(p)(5) of the Act.
  • The information made available to the public under section 1886(p)(6) of the Act.

In the FY 2014 IPPS/LTCH PPS final rule, we included these statutory provisions under § 412.172(g) of the regulations (78 FR 50729 and 50968). We note that section 1886(p)(6) of the Act requires the Secretary to make information available to the public regarding HAC scores of each applicable hospital under the HAC Reduction Program. Section 1886(p)(6)(B) of the Act also requires the Secretary to ensure that an applicable hospital has the opportunity to review, and submit corrections for, the information to be made available to the public, prior to that information being made public. We believe that the review and correction process explained above in section IV.J.3.e. of the preamble of this final rule will provide hospitals with the opportunity to correct data prior to its release on the Hospital Compare Web site.

4. Maintenance of Technical Specifications for Quality Measures

Technical specifications of the HAC measures for the Agency for Healthcare Research and Quality (AHRQ) Patient Safety Indicator 90 (PSI-90) in Domain 1 can be found on AHRQ's Web site at: http://qualityindicators.ahrq.gov/Modules/PSI_TechSpec.aspx. Technical specifications for the CDC NHSN's HAI measures in Domain 2 can be found on CDC's NHSN Web site at: http://www.cdc.gov/nhsn/acute-care-hospital/index.html. Both Web sites provide measure updates and other information necessary to guide hospitals participating in the collection of HAC Reduction Program data.

Many of the quality measures used in different Medicare and Medicaid reporting programs are NQF-endorsed. As part of its regular maintenance process for NQF-endorsed performance measures, the NQF requires measure stewards to submit annual measure maintenance updates and undergo maintenance of endorsement review every 3 years. In the measure maintenance process, the measure steward (owner/developer) is responsible for updating and maintaining the currency and relevance of the measure and will confirm existing or minor specification changes with NQF on an annual basis. NQF solicits information from measure stewards for annual reviews, and it reviews measures for continued endorsement in a specific 3-year cycle.

We note that NQF's annual or triennial maintenance processes for endorsed measures may result in the NQF requiring updates to the measures. We believe that it is important to have in place a subregulatory process to incorporate nonsubstantive updates required by the NQF into the measure specifications we have adopted for the HAC Reduction Program, so that these measures remain up-to-date.

For the HAC Reduction Program, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28142), we proposed to follow the finalized processes outlined for addressing changes to adopted measures in the Hospital IQR Program “Maintenance of Technical Specifications for Quality Measures” section found in section IX.A.1.b. of the preamble of this final rule.

We believe this proposal adequately balances our need to incorporate updates to HAC Reduction Program measures in the most expeditious manner possible while preserving the public's ability to comment on updates that so fundamentally change an endorsed measure that it is no longer the same measure that we originally adopted. We invited public comments on this proposal.

Comment: One commenter supported the proposed method of maintaining and updating the technical specifications for the quality measures, including adoption of a subregulatory process for nonsubstantive changes released by measure developers.

Response: We appreciate the commenter's support.

Comment: A few commenters believed that nonsubstantive changes identified during routine measure maintenance processes and during NQF measure maintenance review should all be subject to the annual notice-and-comment rulemaking process.

Response: We disagree with the recommendation to have all measure changes subject to notice-and-comment rulemaking. As previously noted in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50776), we believe that the maintenance of technical specifications for quality measures policy for the Hospital IQR Program is also applicable to the HAC Reduction Program. We believe this policy adequately balances our need to incorporate nonsubstantive NQF updates to NQF-endorsed measures in the most expeditious manner possible, while preserving the public's ability to comment on updates that so fundamentally change an endorsed measure that it is no longer the same measure that we originally adopted. We also note that the NQF process incorporates an opportunity for public comment and engagement in the measure maintenance process. These policies regarding what is considered substantive versus nonsubstantive apply to all measures in the Hospital IQR Program and the HAC Reduction Program.

Comment: One commenter indicated that any changes to a measure developed for adults but now including those less than 18 years of age should be considered nonsubstantive.

Response: We refer the reader to our response to a similar suggestion in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50776). We will make a decision as to whether such changes constitute substantive changes on a case-by-case basis.

After consideration of the public comments we received, we are finalizing the maintenance of technical specifications for quality measures in the HAC Reduction Program as proposed.

5. Extraordinary Circumstances Exceptions/Exemptions

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50711), we indicated that we had received public comments requesting a potential waiver or exemption process for hospitals located in areas that experience disasters or other extraordinary circumstances (EC), even though we did not propose an extraordinary circumstance exceptions/exemptions (ECE) policy for the HAC Reduction Program. We stated in the FY 2014 IPPS/LTCH PPS final rule that we were reviewing this issue and might consider such a proposal in future rulemaking. We also noted that, should we consider such a policy, we intend to focus on several policy and operational considerations in developing a disaster exemption process for the HAC Reduction Program. In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28142), we welcomed public comments on whether an exemption process should be implemented and on the policy and operational considerations for a potential HAC Reduction Program ECE policy.

Comment: Many commenters supported the creation of an extraordinary circumstance exemption process for hospitals that experience a natural disaster. Some commenters recommended that CMS consider adopting several aspects of the Hospital VBP waiver process for the HAC Reduction Program, including allowing hospitals to have 60 days from the occurrence of the extraordinary circumstance to file for an exemption. The commenters believed this would ensure that hospitals do not seek an advantage on their HAC scores long after a disaster period has ended. Other commenters recommended that hospitals be given 90 calendar days from the date of the disaster to request an exemption and that the exemption apply for at least 2 payment years because the HAC Reduction Program currently uses a 2-year performance period.

Response: We appreciate the commenters' support. We will take these recommendations into consideration as we consider whether an exemption process for the HAC Reduction Program should be implemented.

6. Implementation of the HAC Reduction Program for FY 2016

a. Measure Selection and Conditions, Including Risk-Adjustment Scoring Methodology

(1) General Selection of Measures

In the FY 2014 IPPS/LTCH PPS final rule, we finalized measures for FY 2015 and onwards, but only finalized a scoring methodology for FY 2015 for the HAC Reduction Program (78 FR 50712 through 50713). In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28143), we did not propose any additional measures for the HAC Reduction Program for FY 2016. We note that AHRQ's PSI-90 composite measure and CDC's NHSN CLABSI (NQF #0139) and CAUTI (NQF #0138) measures were submitted in January 2014 and December 2013, respectively, as part of the NQF maintenance endorsement process. As noted in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50719), should changes to the risk-adjustment models for the measures be adopted during NQF endorsement maintenance processes, CMS will adopt these changes as soon as possible. Finally, as we stated in the FY 2015 IPPS/LTCH PPS proposed rule, although we are not required under section 1886(p) of the Act to address specific measure scoring methodologies regarding the HAC Reduction Program in notice-and-comment rulemaking, as is required under the Hospital VBP Program, we believe that it is important to set forth such scoring methodologies for each individual HAC measure, in order for the public to understand how the measures discussed and finalized in this FY 2015 IPPS/LTCH PPS final rule relate to the performance methodology used to determine the applicable hospitals subject to the payment adjustment under the HAC Reduction Program.

(2) Measure Selection and Scoring Methodology for FY 2016

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50717), we finalized for FY 2016 and onwards CDC's NHSN Surgical Site Infection (SSI) measure (NQF #0753) and its measure methodology. The SSI and other measure specifications are available at: http://www.qualityforum.org/QPS/QPSTool.aspx. To locate a specific measure, search by the NQF number: (1) for the SSI measure use NQF #0753; (2) for the CLABSI measure use NQF #0139; and (3) for the CAUTI measure use NQF #0138. For SSI updates related to CMS programs and the use of CDC's NHSN measures, we refer readers to the Web site at: http://www.cdc.gov/nhsn/acute-care-hospital/ssi. The SSI measure explanation of SIR in the NHSN e-newsletter is available at: http://www.cdc.gov/nhsn/PDFs/Newsletters/NHSN_NL_OCT_2010SE_final.pdf.

CDC's SSI measure was finalized as a Domain 2 measure in the calculation of the Total HAC Score (78 FR 50717). In the FY 2015 IPPS/LTCH PPS proposed rule, we did not propose to change CDC's measure methodology for the SSI measure.

b. Measure Risk-Adjustment

In the FY 2014 IPPS/LTCH PPS final rule, we finalized the measure risk-adjustment for AHRQ's PSI-90 composite measure for Domain 1 and the risk-adjustment for CDC's NHSN measures for Domain 2 (78 FR 50718 through 50719). In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28143), we did not propose any risk-adjustment changes for any of the measures finalized in the FY 2014 IPPS/LTCH PPS final rule.

c. Measure Calculations

In the FY 2014 IPPS/LTCH PPS final rule, we finalized the measure calculations for AHRQ's PSI-90 composite measure for Domain 1 and the measure calculations for CDC's NHSN measures for Domain 2 (78 FR 50718 through 50719). In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28143), we did not propose any measure calculation changes for any of the measures finalized in the FY 2014 IPPS/LTCH PPS final rule.

d. Applicable Time Period

In the FY 2014 IPPS/LTCH PPS final rule, we finalized and codified policy at § 412.170 that there will be a 2-year applicable time period to collect data used to calculate the Total HAC Score (78 FR 50717).

For the Domain 1 AHRQ PSI-90 composite measure, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28143), we proposed for FY 2016 a 24-month period from July 1, 2012 through June 30, 2014 as the applicable time period. The claims for all Medicare FFS beneficiaries discharged during this period would be included in the calculation of measure results for FY 2016. This includes claims data from the 2012, 2013, and 2014 Inpatient Standard Analytic Files (SAFs).

The Domain 2 CDC NHSN measures (CAUTI, CLABSI, and SSI) are currently collected and calculated on a quarterly basis. However, for the purpose of the HAC Reduction Program, we finalized in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50717) that we will use 2 years of data to calculate the Domain 2 score for FY 2015 for the CAUTI and CLABSI measures. For FY 2016, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28143), we proposed to use calendar years 2013 and 2014 for all three Domain 2 measures in the HAC Reduction Program.

e. Criteria for Applicable Hospitals and Performance Scoring

For FY 2016, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28143), we proposed a change to the scoring methodology for the Total HAC Score. This proposal, which is discussed below, was intended to address the addition of CDC's NHSN SSI measure to Domain 2, which was finalized for implementation in FY 2016.

(1) Finalized Scoring Methodology for Domains 1 and 2 for FY 2015

We finalized a scoring methodology for the Total HAC Score in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50722). This finalized scoring methodology is similar to the achievement scoring methodology currently used under the Hospital VBP Program. With respect to an applicable hospital, we finalized that CMS will identify the top quartile of all hospitals with respect to their Total HAC Score during the applicable period (§ 412.170). In addition, we finalized that the Total HAC Score will be determined by the following three steps: (1) each measure result will be scored as outlined in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50723); (2) domain scores will be determined by the scores assigned to the measures within the domain; and (3) the Total HAC Score will be determined by the sum of the weighted domain scores. For FY 2015, the Total HAC Score is the sum of the Domain 1 score multiplied by 35 percent plus the Domain 2 score multiplied by 65 percent. For further details of the general scoring methodology finalized for the HAC Reduction Program, we refer readers to the FY 2014 IPPS/LTCH PPS final rule (78 FR 50719 through 50725).

(2) Scoring Methodology of Domain 2 and New Weighting of Domains 1 and 2 for FY 2016

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28143), we proposed to adjust the scoring methodology of Domain 2 and the weighting of Domains 1 and 2 beginning in FY 2016 due to the addition of CDC's NHSN SSI measure. We would like to clarify that the scoring methodology for Domain 1 in FY 2016 is unchanged from the scoring methodology for Domain 1 in FY 2015. This methodology is described above under our discussion of Criteria for Applicable Hospitals and Performance Scoring. For the scoring of CDC's NHSN SSI measure, we proposed an identical process of assigning points to the SSI measure results. We note that the SSI measure, reported via CDC's NHSN, is currently specified under the Hospital IQR Program and is restricted to colon procedures (including incision, resection or anastomosis of the large intestine and large-to-small and small-to-large bowel anastomosis) and abdominal hysterectomy procedures, including those performed by laparoscope. The SSI measure assesses SSIs based on the type of surgical procedure (that is, the SSI measure is stratified into infections that occur with colonic procedures and those that occur with abdominal hysterectomy procedures). We also note that patient age and a preoperative health score are risk factors taken into account using the Standardized Infection Ratio (SIR) (78 FR 20625). Use of an SIR is consistent with CDC's NHSN CLABSI and CAUTI measures, which also report SIRs. In order to calculate an SSI measure score for Domain 2, we proposed to calculate an abdominal hysterectomy procedure SSI SIR and a colonic procedure SSI SIR and to pool both SIRs for each hospital. We proposed pooling the abdominal hysterectomy SSI SIR and colonic procedure SSI SIR because this would provide a single SSI SIR, which is consistent with reporting a single SSI SIR as intended by the design of the NQF-endorsed measure (NQF #0753), and would allow a risk-adjusted weighting of the surgical volume among the two procedures. We proposed that a pooled SSI SIR for an applicable hospital is the sum of all observed infections among abdominal hysterectomy and colonic procedures divided by the sum of all predicted infections among abdominal hysterectomy and colonic procedures performed at the applicable hospital. The pooled SSI SIR would be scored in the same manner as all measures finalized for the HAC Reduction Program (refer to Figure A in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50723), which is also included above in this final rule). To determine a Domain 2 score, we proposed taking the average of the three CDC HAI SIR scores. We noted in the FY 2014 IPPS/LTCH PPS final rule that there will be instances in which applicable hospitals may not have data on all four measures, and therefore a set of rules was finalized to determine how to score each domain. We proposed to follow the same finalized rules used to determine scoring of Domains 1 and 2 (FY 2014 IPPS/LTCH PPS final rule (78 FR 50723 through 50725)), and the proposed changes in section IV.J.6.e. of the preamble of the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28144), which are included in section IV.J.6.e. of the preamble of this final rule. We invited public comments on this proposal.

In addition, for FY 2016 we proposed to weight Domain 1 at 25 percent and Domain 2 at 75 percent. We proposed to decrease the weight of Domain 1 from 35 percent to 25 percent for two reasons. First, with the implementation of CDC's SSI measure, we believed the weighting of both domains needed to be adjusted to reflect the addition of a fourth measure; and second, in keeping with public comments from the FY 2014 IPPS/LTCH PPS final rule, MedPAC and others stated that Domain 2 should be weighted more than Domain 1. Finally, we proposed for FY 2016 that the Total HAC Score for applicable hospitals would be the sum of the weighted scores from Domain 1 (weighted at 25 percent) and Domain 2 (weighted at 75 percent). We invited public comments on this proposal.
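For illustration only, the FY 2016 proposals described above (pooling the two SSI SIRs, averaging the three CDC NHSN HAI measure scores for Domain 2, and reweighting the domains at 25 percent and 75 percent) can be sketched as follows. This is a hypothetical simplification, not CMS's or CDC's production methodology; the names and inputs are assumptions and omit the detailed risk adjustment performed by CDC.

```python
# Illustrative sketch of the proposed FY 2016 calculations; not CMS's or CDC's
# production methodology.

def pooled_ssi_sir(observed_colon, predicted_colon, observed_hyst, predicted_hyst):
    """Pool colon and abdominal hysterectomy SSIs: summed observed infections divided
    by summed predicted infections for the hospital."""
    observed = observed_colon + observed_hyst
    predicted = predicted_colon + predicted_hyst
    return observed / predicted if predicted > 0 else None

def domain2_score_fy2016(clabsi_points, cauti_points, ssi_points):
    """Average the 1-10 points assigned to each HAI SIR, skipping measures without
    complete data."""
    scored = [p for p in (clabsi_points, cauti_points, ssi_points) if p is not None]
    return sum(scored) / len(scored) if scored else None

def total_hac_score_fy2016(domain1_score, domain2_score):
    """Proposed FY 2016 weighting: Domain 1 at 25 percent and Domain 2 at 75 percent."""
    return 0.25 * domain1_score + 0.75 * domain2_score
```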

Comment: Several commenters supported the proposed approach of creating a pooled SIR for the SSI measure that includes colon surgeries and abdominal hysterectomy surgeries because this is consistent with how CDC currently reports the measure. A few commenters noted that this approach allows for risk adjusted weighting of the surgical volume between the two procedures. One commenter recommended that CMS collaborate with NHSN leadership and professional organization representing surgeons to develop a profile of surgical procedures that are high volume across the spectrum of acute care hospitals that might be added to the existing procedures in the SSI measure. The commenters suggested that an expansion of the number of procedures may increase the likelihood that the SSI SIR is reliable.

Response: We appreciate the commenters' support for the approach of creating a pooled SIR for the SSI measure. We note that CDC maintains ongoing collaborations with a number of professional surgical organizations and is currently in process of developing additional SSI metrics for higher volume surgical procedures. Once these measures are finalized, we may consider them for future rulemaking.

Comment: A few commenters suggested that CMS and CDC monitor the impact of the consolidated SIR for hospitals that perform a higher volume of hysterectomies. The commenters pointed out that, based on Hospital Compare data, where the SSI rates for the two procedures are reported separately, hysterectomies have a higher infection rate compared to colon surgeries, and fewer hospitals have a reported hysterectomy SIR. The commenters recommended that, if the consolidated SIR adversely impacts hospitals that perform more hysterectomies, the SIR be modified to account for the different mix of services. One commenter recommended that CMS weight each individual SSI metric separately because the commenter believed the combined SIR is a complicated, burdensome composite metric.

Response: We will consider these suggestions in future rulemaking.

Comment: A few commenters were concerned that adding the SSI measure to Domain 2 could lead to an average score that lacks specificity in determining a hospital's true HAI scores. In addition, one commenter stated that adding the MRSA bacteremia and C. difficile measures to the Domain 2 score would further dilute the domain. The commenters suggested assigning each CDC NHSN HAI measure a separate percentage totaling the domain weight, rather than averaging all HAIs in Domain 2. A few commenters stated that, with only two procedures in the SSI measure, it is reasonable to continue equally weighting the measures in Domain 2. However, if more procedures are added to the SSI measure, the commenters recommended that CMS consider providing a higher weight to the SSI measure.

Response: We note that the purpose of the domain scores is to provide a summary of a hospital's performance with regard to patient safety (Domain 1) and HAI (Domain 2) measures. A hospital's performance with regard to the individual measures is available on Hospital Compare and is updated quarterly for hospitals that participate in the Hospital IQR Program. We appreciate the suggestion for weighting the CDC NHSN HAI measures separately and will take this into consideration in future rulemaking.

Comment: One commenter recommended that the weighting of measures in the Total HAC Score correspond to the relative amounts of harm found in the patient population based on what is reported in peer-reviewed literature.

Response: We will take this feedback into consideration as we add more measures to the program and evaluate if changes to the scoring methodology are needed.

Comment: Many commenters supported the CMS proposal to increase the weight given to Domain 2 and decrease the weight given to Domain 1 because Domain 2 includes the chart abstracted NHSN measures which the commenters believed to be more reliable and actionable than the claims-based PSI-90 composite measure in Domain 1. One commenter recommended that CMS continue to decrease the Domain 1 weight in future years. A few commenters believed that the overlap of measures between the Hospital VBP Program and the HAC Reduction Program should be eliminated, but expressed their support for the domain weight change if CMS retained all measures that overlap despite the commenters' objections.

Response: We agree that an increase in the Domain 2 weight is warranted, given that the number of measures in the domain is increasing.

Comment: A few commenters did not support the proposal to change the weight of Domain 1 to 25 percent from 35 percent and Domain 2 to 75 percent from 65 percent. One commenter stated that this approach would promote an overly narrow definition of HACs that places too much emphasis on infections alone and not enough on other patient safety risks. The commenter added that CMS should take a more balanced approach to weighting the existing domains in order to place a high bar for hospitals to avoid both infections and harmful complications that can be prevented, and seek and develop measures for hospital safety problems that have the most prevalence and impact.

Response: We agree that both patient safety events and infections are important components of the HAC Reduction Program. In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28143 through 28144), we explained the rationale for assigning a higher weight to Domain 2. We believe that the AHRQ PSI-90 measure plays a vital role in patient safety and that it continues to be an integral part of the HAC Reduction Program, with a weight of 25 percent of the Total HAC Score.

After consideration of the public comments we received, we are finalizing the scoring methodology of Domain 2 and new weighting of Domains 1 and 2 for FY 2016 as proposed.

f. Rules To Calculate the Total HAC Score for FY 2016

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28141, 28144), and in section IV.J.3.d. of the preamble of this final rule, we discuss our proposal to adopt the “Clarification of FY 2015 Finalized Narrative of Rules to Calculate the Total HAC Score.” We invited public comments on this proposal.

After consideration of the public comments we received, we finalized the proposed clarification of the FY 2015 rules to calculate the Total HAC Score. We received no public comments on this specific proposal for FY 2016; therefore, we are finalizing the clarification for FY 2016 as well.

7. Future Considerations for the Use of Electronically Specified Measures

We believe that collection and reporting of data through health information technology will greatly simplify and streamline reporting for many CMS quality reporting programs. Through electronic reporting, hospitals will be able to leverage EHRs to capture, calculate, and electronically submit quality data to CMS for the Hospital IQR Program. CMS has become aware of some hospitals and health systems that have developed or adopted a methodology to identify and measure all-cause harm through their electronic health record (EHR) systems. Some hospitals and health systems are able to use the results of these electronic measures to address adverse events at the point of care and to track improvement over time. Many of these measures capture a broad range of common hospital-acquired conditions that may not be captured by existing national measures (examples include measures of adverse drug events and hypoglycemia). Given that these measures are captured using clinical data from EHR systems, collection of HAC data will allow CMS to align measures across multiple settings.

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28144), we sought comment as to whether the use of a standardized electronic composite measure of all-cause harm should be used in the HAC Reduction Program in future years in addition to, or in place of, claims-based measures assessing HACs. We welcomed any suggestions of specific all-cause harm electronic measures, including detailed measure specifications. Specifically, we invited public comments on the feasibility and the perceived value of such a measure, and what would be the most appropriate weighting of this measure in the Total HAC Score. In addition, we requested suggestions on the timeframe for which such a standardized electronic composite measure of all-cause harm should be proposed.

We intend for the future direction of electronic quality measure reporting to significantly enhance the tracking of HACs under the HAC Reduction Program. We stated in the FY 2015 IPPS/LTCH PPS proposed rule that we will continue to work with measure stewards and developers to develop new measure concepts, and conduct pilot, reliability and validity testing as part of efforts to promote the adoption of Certified Electronic Health Record Technology in hospitals.

Comment: Many commenters supported leveraging electronic technology to capture, calculate, and submit data. Commenters suggested that electronic measures could ultimately replace claims-based measures and could provide information in a timelier manner. Several commenters cautioned that electronic measures must undergo careful testing, that implementation should occur in a phased manner, and that electronic measures should not be mandated until it is technically feasible for all hospitals to comply. One commenter recommended that an e-measure related to antimicrobial stewardship be considered. One commenter recommended that, beginning in FY 2015, hospitals be given a waiver from complying with existing Domain 1 requirements, provided that they demonstrate transition toward or current use of an approach utilizing electronic measures in a manner supported by the peer-reviewed literature.

Response: We appreciate the commenters' feedback and support for the use of electronic measures in general. We will take the suggestions into account in future rulemaking.

Comment: Many commenters supported the development of an all-cause harm measure derived from electronic health records. Some commenters believed that all-cause harm measures could capture information on never events, adverse drug events, ventilator-associated events, diagnostic errors, hypoglycemia, transfusion reactions, and medication reconciliation (unintentional medication discrepancies per patient (NQF #2456)). Another commenter encouraged innovative approaches and collaboration with organizations, hospitals and the CMS Innovation Center when developing all-cause patient harm measures derived from electronic health records. One commenter recommended an all cause harm measure be incorporated as a third domain.

Several other commenters expressed concern about use of composite measures in general stating that they do not provide actionable data and that inappropriate weighting of measure components may skew results. If a composite measure is used, commenters recommended that data on the component measures and the weighting methodology also be reported.

Response: We thank commenters for their viewpoints on the use of an electronic all-cause harm measure for inclusion in the HAC Reduction Program and will take them into consideration in future rulemaking.

Comment: One commenter requested more insight into what CMS envisions for the measure and how the measure will be reported through the EHR system, in order to provide feedback to CMS.

Response: At this time, we do not have a specific measure in mind but rather are soliciting feedback on the feasibility and perceived value of a standardized electronic composite measure of all-cause harm in the HAC Reduction Program. As we develop a more specific plan we will share that information in future rulemaking.

K. Payments for Indirect and Direct Graduate Medical Education (GME) Costs (§§ 412.105 and 413.75 Through 413.83)

1. Background

Section 1886(h) of the Act, as added by section 9202 of the Consolidated Omnibus Budget Reconciliation Act (COBRA) of 1985 (Pub. L. 99-272) and as currently implemented in the regulations at 42 CFR 413.75 through 413.83, establishes a methodology for determining payments to hospitals for the direct costs of approved graduate medical education (GME) programs. Section 1886(h)(2) of the Act sets forth a methodology for the determination of a hospital-specific base-period per resident amount (PRA) that is calculated by dividing a hospital's allowable direct costs of GME in a base period by its number of full-time equivalent (FTE) residents in the base period. The base period is, for most hospitals, the hospital's cost reporting period beginning in FY 1984 (that is, October 1, 1983 through September 30, 1984). The base year PRA is updated annually for inflation. In general, Medicare direct GME payments are calculated by multiplying the hospital's updated PRA by the weighted number of FTE residents working in all areas of the hospital complex (and at nonprovider sites, when applicable), and the hospital's Medicare share of total inpatient days.
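As a simplified illustration of the general direct GME calculation described above, the payment is, in general, the product of the updated PRA, the weighted FTE resident count, and the hospital's Medicare share of inpatient days. The sketch below uses hypothetical names and numbers and omits the FTE caps, rolling average, and other adjustments discussed later in this section.

```python
# Simplified, illustrative sketch of the general direct GME payment calculation
# described above; it omits the FTE resident caps, the rolling average, and other
# adjustments that apply on the Medicare cost report.
def direct_gme_payment(updated_pra, weighted_fte_count,
                       medicare_inpatient_days, total_inpatient_days):
    medicare_share = medicare_inpatient_days / total_inpatient_days
    return updated_pra * weighted_fte_count * medicare_share

# Hypothetical example: a $100,000 updated PRA, 50 weighted FTE residents, and a
# 40 percent Medicare share yield 100000 * 50 * 0.40 = $2,000,000.
```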

Section 1886(d)(5)(B) of the Act provides for a payment adjustment known as the indirect medical education (IME) adjustment under the hospital inpatient prospective payment system (IPPS) for hospitals that have residents in an approved GME program, in order to account for the higher indirect patient care costs of teaching hospitals relative to nonteaching hospitals. The regulations regarding the calculation of this additional payment are located at 42 CFR 412.105. The hospital's IME adjustment applied to the DRG payments is calculated based on the ratio of the hospital's number of FTE residents training in either the inpatient or outpatient departments of the IPPS hospital to the number of inpatient hospital beds.

The calculation of both direct GME and IME payments is affected by the number of FTE residents that a hospital is allowed to count. Generally, the greater the number of FTE residents a hospital counts, the greater the amount of Medicare direct GME and IME payments the hospital will receive. Therefore, Congress, through the Balanced Budget Act of 1997 (Pub. L. 105-33), established a limit on the number of allopathic and osteopathic residents that a hospital may include in its FTE resident count for direct GME and IME payment purposes. Under section 1886(h)(4)(F) of the Act, for cost reporting periods beginning on or after October 1, 1997, a hospital's unweighted FTE count of residents for purposes of direct GME may not exceed the hospital's unweighted FTE count for direct GME in its most recent cost reporting period ending on or before December 31, 1996. Under section 1886(d)(5)(B)(v) of the Act, a similar limit based on the FTE count for IME during that cost reporting period is applied effective for discharges occurring on or after October 1, 1997. Dental and podiatric residents are not included in this statutorily mandated cap.

The Affordable Care Act made a number of statutory changes relating to the determination of a hospital's FTE resident count for direct GME and IME payment purposes and the manner in which FTE resident limits are calculated and applied to hospitals under certain circumstances. Regulations implementing these changes are discussed in the November 24, 2010 final rule (75 FR 72133) and the FY 2013 IPPS/LTCH PPS final rule (77 FR 53416).

2. Changes in the Effective Date of the FTE Resident Cap, 3-Year Rolling Average, and Intern- and Resident-to-Bed (IRB) Ratio Cap for New Programs in Teaching Hospitals

Section 1886(h)(4)(H)(i) of the Act requires the Secretary to establish rules for calculating the direct GME caps for new teaching hospitals that are training residents in new medical residency training programs established on or after January 1, 1995. Under section 1886(d)(5)(B)(viii) of the Act, such rules also apply to the establishment of a hospital's IME cap on the number of FTE residents training in new programs. We implemented these statutory requirements in rules published in the August 29, 1997 Federal Register (62 FR 46002 through 46008) and in the May 12, 1998 Federal Register (63 FR 26323 through 26325 and 26327 through 26336). Generally, under existing regulations at 42 CFR 413.79(e)(1) (for direct GME) and 42 CFR 412.105(f)(1)(vii) (for IME), if a hospital did not train any allopathic or osteopathic residents in its most recent cost reporting period ending on or before December 31, 1996, and it begins to participate in training residents in a new medical residency training program (allopathic or osteopathic) on or after January 1, 1995, the hospital's unweighted FTE resident cap (which would otherwise be zero) may be adjusted based on the sum of the product of the highest number of FTE residents in any program year during the third year of the first new program's existence, for each new residency training program established during that 3-year period, and the minimum accredited length for each type of program. The number of FTE resident cap slots that a teaching hospital receives for each new program may not exceed the number of accredited slots that are available for each new program. Once a hospital's FTE resident cap is established, no subsequent cap adjustments may be made for new programs, unless the teaching hospital is a rural hospital. A rural hospital's FTE resident caps may be adjusted for participation in subsequent new residency training programs. A hospital that did not train any allopathic or osteopathic residents in its most recent cost reporting period ending on or before December 31, 1996, may only receive a permanent FTE resident cap adjustment for training residents in a truly “new” residency training program; no permanent cap adjustment would be given for training residents associated with an existing program. That is, if a hospital that did not train any allopathic or osteopathic residents in its most recent cost reporting period ending on or before December 31, 1996, serves as a training site for residents in a program that exists or existed previously at another teaching hospital that remains open, that “new” teaching hospital does not receive a “new program” cap adjustment because it is not participating in training residents in a truly “new” program. However, it may be possible for that “new” teaching hospital to receive a temporary cap adjustment if it enters into a Medicare GME affiliation agreement with the existing teaching hospital as specified at § 413.79(f) (for direct GME) and § 412.105(f)(1)(vi) (for IME). (For a detailed discussion of the distinctions between a new medical residency training program and an existing medical residency training program, we refer readers to the August 27, 2009 final rule (74 FR 43908 through 43920). For a detailed discussion regarding participation in Medicare GME affiliation agreements, we refer readers to 74 FR 43574.)

For new programs started prior to October 1, 2012, hospitals that did not yet have an FTE resident cap established had a “3-year window” in which to participate in and “grow” new programs, before the FTE resident caps for IME and direct GME were permanently set for the hospital beginning with the fourth program year of the first new program started. In the FY 2013 IPPS/LTCH PPS final rule (77 FR 53415 through 53425), we revised the regulations at § 413.79(e) to increase the cap-building period for new programs from 3 years to 5 years. That is, for a hospital that did not yet have an FTE resident cap established, the hospital's FTE resident cap is effective beginning with the sixth program year of the first new program's existence. This revised policy is effective for urban hospitals that first begin to participate in training residents in their first new program on or after October 1, 2012, and for rural hospitals that start a new program on or after October 1, 2012. In that final rule, we also finalized a methodology used to calculate a cap adjustment for an individual hospital if residents in a new program rotate to more than one hospital. The methodology is based on the sum of the products of the following three factors: (1) the highest total number of FTE residents trained in any program year, during the fifth year of the first new program's existence, at all of the hospitals to which the residents in that program rotate; (2) the number of years in which residents are expected to complete the program, based on the minimum accredited length for each type of program; and (3) the ratio of the number of FTE residents in the new program that trained at the hospital over the entire 5-year period to the total number of FTE residents that trained at all hospitals over the entire 5-year period. Finally, we made minor revisions to the regulation text at §§ 413.79(e)(2) through (e)(4) for purposes of maintaining consistency throughout § 413.79(e). We refer readers to the FY 2013 IPPS/LTCH PPS final rule (77 FR 53415 through 53425) for further details regarding the methodology for calculating the FTE resident caps.
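The cap methodology summarized above (the sum, across new programs, of the product of the three factors) can be sketched as follows for illustration. This is a hypothetical simplification, not the cost report calculation; the field names are assumptions made for the example.

```python
# Illustrative sketch of the FY 2013 final rule cap methodology described above: for each
# new program, multiply (1) the highest program-year FTE count during the fifth year of
# the first new program's existence across all participating hospitals, (2) the program's
# minimum accredited length, and (3) this hospital's share of the program's FTEs over the
# 5-year period, then sum across the new programs.
def new_program_fte_cap(programs):
    """programs: list of dicts with hypothetical keys 'highest_fifth_year_ftes',
    'min_accredited_length', 'hospital_ftes_5yr', and 'all_hospitals_ftes_5yr'."""
    cap = 0.0
    for p in programs:
        share = p['hospital_ftes_5yr'] / p['all_hospitals_ftes_5yr']
        cap += p['highest_fifth_year_ftes'] * p['min_accredited_length'] * share
    return cap
```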

While the FY 2013 IPPS/LTCH PPS final rule discussed the methodology for calculating the FTE resident caps to be effective beginning with the sixth program year of the first new program's existence, for hospitals that do not yet have FTE resident caps established, that final rule did not discuss when the 3-year rolling average for IME and direct GME or the intern- and resident-to-bed (IRB) ratio cap for IME is effective for FTE residents training in new programs. The regulations regarding the 3-year rolling average and the IRB ratio cap with respect to new medical residency training programs were established in the following Federal Register rules: the FY 1998 IPPS final rule with comment period (62 FR 46002 through 46008); the May 12, 1998 final rule (63 FR 26323 through 26325 and 26327 through 26336); the FY 2000 IPPS final rule (64 FR 41518 through 41523); and the FY 2002 IPPS final rule (66 FR 39878 through 39883). Specifically, the regulations at § 412.105(f)(1)(v) regarding the 3-year rolling average and new medical residency training programs for IME state that if a hospital qualifies for an adjustment to the limit established under paragraph (f)(1)(iv) of the section for new medical residency programs created under paragraph (f)(1)(vii) of the section, the count of residents participating in new medical residency training programs above the number included in the hospital's FTE count for the cost reporting period ending during calendar year 1996 is added after applying the averaging rules in paragraph (f)(1)(v) for a period of years. Residents participating in new medical residency training programs are included in the hospital's FTE count before applying the averaging rules after the period of years has expired. For purposes of this paragraph, for each new program started, the period of years equals the minimum accredited length for each new program. The period of years for each new program begins when the first resident begins training in each new program. In addition, the regulations for the interaction of the IRB ratio cap and new medical residency training programs for IME at § 412.105(a)(1)(ii) state that the exception for new programs described in paragraph (f)(1)(vii) of the section applies to each new program individually for which the full-time equivalent cap may be adjusted based on the period of years equal to the minimum accredited length of each new program.

The regulations at § 413.79(d)(5) regarding the interplay of the 3-year rolling average with new medical residency training programs for direct GME similarly state that if a hospital qualifies for an adjustment to the limit established under paragraph (c)(2) of the section for new medical residency programs created under paragraph (e) of the section, the count of the residents participating in new medical residency training programs above the number included in the hospital's FTE count for the cost reporting period ending during calendar year 1996 is added after applying the averaging rules in paragraph (d), for a period of years. Residents participating in new medical residency training programs are included in the hospital's FTE count before applying the averaging rules after the period of years has expired. For purposes of paragraph (d), for each new program started, the period of years equals the minimum accredited length for each new program. The period of years begins when the first resident begins training in each new program.

Therefore, the FTE resident caps for IME and direct GME are always effective beginning with the start of the sixth program year of the first new program started for urban hospitals that do not yet have FTE resident caps established (§ 413.79(e)(1)(iii)), and for rural hospitals, beginning with the start of the sixth program year of each new individual program started (§ 413.79(e)(3)), regardless of the fact that other new programs may have started after the start of the first new program. However, the timing of when the 3-year rolling average for IME and direct GME and the IRB ratio cap for IME are first applied is dependent upon the minimum accredited length of each new program started within the 5-year window. For example, new teaching Hospital A participates in training residents in new medical residency training programs for the first time beginning on July 1, 2013. On July 1, 2013, Hospital A participates in training residents in a new family medicine program (minimum accredited length is 3 years), on July 1, 2014, it also participates in training residents in a new sports medicine fellowship (minimum accredited length is 1 year), and on July 1, 2015, it also participates in training residents in a new general surgery program (minimum accredited length is 5 years). For the purpose of establishing Hospital A's FTE resident caps, the 5-year growth window for Hospital A closes on June 30, 2018, and the IME and direct GME FTE resident caps for Hospital A are effective on July 1, 2018, the beginning of the sixth program year of the first new program's existence; that is, family medicine. However, the 3-year rolling average and the IRB ratio cap are effective at different points in time. Because the family medicine residency is 3 years in length, FTE residents in the new family medicine program are subject to the 3-year rolling average and the IRB ratio cap beginning on July 1, 2016. Because the sports medicine fellowship is a 1-year program, and it started on July 1, 2014, the number of sports medicine FTE residents must be included in the 3-year rolling average and is subject to the IRB ratio cap effective on July 1, 2015. Lastly, the FTE residents in the new general surgery program would only be subject to the rolling average and the IRB ratio cap effective July 1, 2020. The Medicare cost report worksheets on CMS Form 2552-10 for IME (Worksheet E, Part A) and for direct GME (Worksheet E-4) currently can accommodate reporting of FTE residents separately based on whether those FTE residents are in new medical residency training programs and are not subject to the FTE resident cap (line 16 of Worksheet E, Part A, and line 15 of Worksheet E-4). However, these cost report worksheets are not designed to accommodate reporting of FTE residents that are exempt from the FTE resident cap, but are subject to the rolling average and IRB ratio cap, because the “period of years” equal to the minimum accredited length of each new program started has already expired. The reverse also may occur, as in the example above with the new general surgery program started by Hospital A, where the FTE resident caps are effective July 1, 2018, but the number of FTE residents in the general surgery program would not be subject to the rolling average or the IRB ratio cap until July 1, 2020. 
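The timing rule illustrated by the Hospital A example above (residents in each new program become subject to the 3-year rolling average and the IRB ratio cap once a period of years equal to that program's minimum accredited length has elapsed from the program's start) can be sketched as follows. The code is an illustrative simplification only and uses the hypothetical Hospital A dates from the example.

```python
# Illustrative sketch of the existing timing rule described above: residents in each new
# program become subject to the 3-year rolling average and the IRB ratio cap after a
# "period of years" equal to that program's minimum accredited length has elapsed.
from datetime import date

def rolling_average_effective_date(program_start, min_accredited_length_years):
    return program_start.replace(year=program_start.year + min_accredited_length_years)

hospital_a_programs = {
    "family medicine": (date(2013, 7, 1), 3),
    "sports medicine": (date(2014, 7, 1), 1),
    "general surgery": (date(2015, 7, 1), 5),
}
for name, (start, length) in hospital_a_programs.items():
    print(name, rolling_average_effective_date(start, length))
# family medicine 2016-07-01
# sports medicine 2015-07-01
# general surgery 2020-07-01
```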
Complicating matters further is the fact that, while these policies associated with FTE residents in new medical residency training programs take effect on a program year basis (that is, July 1), many teaching hospitals do not have a fiscal year that begins on July 1. Therefore, under the existing policy, the number of FTE residents needs to be prorated, and special accommodations need to be made to calculate the portion of FTE residents that are subject to the FTE resident cap, the 3-year rolling average, and the IRB ratio cap for the respective portions of the hospital's cost reporting period occurring on and after July 1. Integrating the rolling average, the IRB ratio cap, and the FTE resident caps for residents in new medical residency training programs in an accurate manner on the Medicare cost report has proved challenging to the point where we have had to deal with each instance brought to our attention by the new teaching hospital or by a Medicare contractor on an individual and manual basis (in order to ensure application of a consistent methodology). In fact, the Medicare cost report instructions direct the hospital to do the following: for CMS Form 2552-10, Worksheet E, Part A, line 10—“. . . Contact your contractor for instructions on how to complete this line if you have a new program for which the period of years is less than or more than three years. . . .”; for CMS Form 2552-10, Worksheet E-4, line 6—“. . . Contact your contractor for instructions on how to complete this line if you have a new program for which the period of years is less than or greater than 3 years. . . .”

The MACs, in turn, have been instructed to contact CMS for instructions on how to report the number of FTE residents that are still within the “period of years” of the new program. The “three years” referenced in the Form 2552-10 cost report instructions is based on the 3-year growth window for new medical residency training programs that is in effect for new programs started prior to October 1, 2012, during which new teaching hospitals also may have started new medical residency training programs with different minimum accredited lengths. (We note that while the previous Form 2552-96 cost report did not include the same instructions, CMS did deal with the reporting of the number of FTE residents in new medical residency training programs on an individual basis when requests for assistance were brought to its attention.) However, these instructions also apply for new medical residency training programs started with different minimum accredited lengths on and after October 1, 2012.

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28147), we proposed to simplify and streamline the timing of when FTE residents in new medical residency training programs are subject to the FTE resident cap, the 3-year rolling average, and the IRB ratio cap, both for urban teaching hospitals that have not yet had FTE resident caps established under § 413.79(e)(1) and for rural teaching hospitals that may or may not have FTE resident caps established under § 413.79(e)(3). That is, we proposed that the methodology for calculating the FTE resident caps for hospitals that participate in training residents in new medical residency training programs would continue to be the same methodology instituted in the FY 2013 IPPS/LTCH PPS final rule (77 FR 53415 through 53425) for new medical residency training programs started on or after October 1, 2012, specified at § 413.79(e)(1). However, once the FTE resident caps are calculated, we proposed to change the timing of when the FTE resident caps would be effective, to synchronize the effective dates and the application of the 3-year rolling average and the IRB ratio cap with each applicable hospital's fiscal year begin date. Specifically, we proposed that the FTE resident caps would continue to be calculated as finalized in the FY 2013 IPPS/LTCH PPS final rule—the methodology is based on the sum of the products of the following three factors: (1) the highest total number of FTE residents trained in any program year, during the fifth year of the first new program's existence at all of the hospitals to which the residents in that program rotate; (2) the number of years in which residents are expected to complete the program, based on the minimum accredited length for each type of program; and (3) the ratio of the number of FTE residents in the new program that trained at the hospital over the entire 5-year period to the total number of FTE residents that trained at all hospitals over the entire 5-year period. However, once calculated in this manner, we proposed that, instead of the FTE resident caps being effective beginning with the sixth program year of the first new program started, those FTE resident caps, the 3-year rolling average, and the IRB ratio cap would be effective beginning with the applicable hospital's cost reporting period that precedes the start of the sixth program year of the first new program started. Using the example of Hospital A that we presented earlier, assume Hospital A has a January 1 to December 31 cost reporting year. The first new program started, family medicine, was started on July 1, 2013. A sports medicine fellowship and a general surgery program also were started timely within the 5-year growth window. Hospital A has 5 program years to grow its FTE resident caps, from July 1, 2013 through June 30, 2018. The FTE resident caps would be calculated based on the 5 program years in accordance with the methodology established at § 413.79(e)(1) in the FY 2013 IPPS/LTCH PPS final rule; therefore, the hospital would wait until after June 30, 2018 to obtain the FTE counts to calculate the FTE resident caps. 
However, we proposed that those IME and direct GME FTE resident caps, once calculated after June 30, 2018, instead of being effective on July 1, 2018, would be effective at the beginning of Hospital A's cost reporting period that precedes July 1, 2018; that is, the FTE resident caps for Hospital A would be effective permanently on January 1, 2018, the start of Hospital A's cost reporting period that precedes the start of the sixth program year of the first new program started. The hospital could file its fiscal year end December 31, 2018 cost report including the FTE resident caps applicable to the entire cost reporting period accordingly.
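
The cap calculation methodology itself, which is retained in this final rule, can be illustrated with a minimal sketch in Python. The numbers below are hypothetical and are used only to show how the sum of the products of the three factors described above would be computed; they are not drawn from this rule or from any hospital's data.

    # Illustrative sketch of the three-factor FTE resident cap methodology
    # established in the FY 2013 IPPS/LTCH PPS final rule (hypothetical numbers).
    # For each new program, the contribution to the cap is the product of:
    #   (1) the highest total number of FTE residents trained in any program year
    #       during the fifth year of the first new program's existence, at all of
    #       the hospitals to which the residents in that program rotate;
    #   (2) the number of years in which residents are expected to complete the
    #       program, based on the minimum accredited length; and
    #   (3) the ratio of the FTE residents in the new program that trained at the
    #       hospital over the entire 5-year period to the total FTE residents that
    #       trained at all hospitals over the entire 5-year period.
    programs = [
        # (name, factor 1, factor 2, hospital FTEs over 5 years, all-hospital FTEs over 5 years)
        ("family medicine", 6.0, 3, 25.0, 30.0),
        ("sports medicine", 2.0, 1, 8.0, 8.0),
        ("general surgery", 4.0, 5, 10.0, 20.0),
    ]

    cap = 0.0
    for name, highest_fte, expected_years, hospital_ftes, all_hospital_ftes in programs:
        contribution = highest_fte * expected_years * (hospital_ftes / all_hospital_ftes)
        print(f"{name}: contributes {contribution:.2f} FTEs to the cap")
        cap += contribution

    print(f"Illustrative FTE resident cap: {cap:.2f}")

The same calculation applies regardless of whether the caps take effect on the timeline proposed here or on the modified timeline finalized later in this section; only the effective date differs.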

As noted earlier, we proposed that, for all new medical residency training programs in which the hospital participates during the 5-year growth window, the FTEs in those new programs also would be subject to the 3-year rolling average and the IRB ratio cap simultaneously with the effective date of the FTE resident caps, at the beginning of the applicable hospital's cost reporting period that precedes the beginning of the sixth program year of the first new program started. Again, using the example of Hospital A that we presented earlier, the FTE residents in the family medicine program, the sports medicine fellowship, and the general surgery program would all be subject to the 3-year rolling average and IRB ratio cap beginning on January 1, 2018. With regard to reporting on the Medicare cost report, for Hospital A's fiscal year end dates of December 31, 2013 through and including December 31, 2017, we proposed that the number of FTE residents in the family medicine program, the sports medicine fellowship, and the general surgery program would be reported so as not to be included in the IME rolling average or the IRB ratio cap, and so as not to be included in the direct GME rolling average. (On the CMS Form 2552-10, for Hospital A's fiscal year end dates of December 31, 2013 through and including December 31, 2017, this means that the number of FTE residents in the family medicine program, the sports medicine fellowship, and the general surgery program would be reported on Worksheet E, Part A, line 16, and on Worksheet E-4, line 15). However, on Hospital A's cost report for fiscal year ending December 31, 2018, the number of FTE residents in these three programs would be subject to the FTE resident cap, the 3-year rolling average, and the IRB ratio cap and would be reported accordingly. (On the CMS Form 2552-10, for Hospital A's cost report for fiscal year ending December 31, 2018, this means that none of the FTE residents in these three programs would be reported on Worksheet E, Part A, line 16 for IME, and Worksheet E-4, line 15 for direct GME. Instead, all of the FTE residents would be reported on Worksheet E, Part A, line 10 for IME, and Worksheet E-4, line 6 for direct GME, in order to be subject to the FTE resident cap, the 3-year rolling average, and the IRB ratio cap.) We note that once the 3-year rolling average is effective in that cost reporting period that includes the sixth program year of the first new program started, the number of FTE residents in the new programs also must be reported both as part of the prior year FTE resident counts and the penultimate FTE resident counts, in order to effectuate the 3-year rolling average calculation on the IME Worksheet E, Part A, and the direct GME Worksheet E-4, respectively.

In the example that we presented earlier, Hospital A has a fiscal year that begins on January 1. If Hospital A's fiscal year begin date had instead been October 1, then, while the sixth program year of the first new program started would still be July 1, 2018, the FTE resident caps, the 3-year rolling average, and the IRB ratio cap would be effective on October 1, 2017, the fiscal year begin date that precedes July 1, 2018, the sixth program year. If Hospital A's fiscal year begin date had been July 1, the FTE resident caps, the 3-year rolling average, and the IRB ratio cap would instead be effective on July 1, 2017, the fiscal year begin date that precedes July 1, 2018, the sixth program year.

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28148), we stated that we understood that this proposal, if finalized, would reduce the amount of time that the new medical residency training programs would be exempt from the FTE resident caps. However, even though we proposed to make the effective date of the FTE resident caps earlier than under current policy, because we also proposed that the calculation of the FTE resident caps would still be based on the highest total number of FTE residents trained in any program year, during the fifth year of the first new program's existence at all of the hospitals to which the residents in that program rotate, a new teaching hospital would still have the full 5 program years to grow its program(s), and its FTE resident caps would reflect a full 5 years of growth. Therefore, because, by the fifth program year, a program should, in most typical circumstances, have grown to its full capacity, barring unusual circumstances, the FTE resident caps that would take effect under the proposed policy at the beginning of the fiscal year that precedes the sixth program year should accommodate the FTE resident count training in the fifth and subsequent program years. Therefore, we stated in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28148) that we believe that this proposal to streamline and synchronize the effective dates of the FTE resident caps, the 3-year rolling average, and the IRB ratio cap not only is easier to comprehend and to implement, but also is reasonable and equitable in its effect on the IME and direct GME payments of hospitals establishing FTE resident caps. Specifically, we indicated that if the proposal is finalized, there would no longer be a need for CMS Form 2552-10, Worksheet E, Part A, line 10 and Worksheet E-4, line 6 to instruct hospitals to contact their MACs for instructions on how to complete those lines, as both hospitals and MACs would understand how to report the number of FTE residents in new programs, even when those programs have different accredited lengths. Instead, hospitals and MACs would follow the methodology instituted in the FY 2013 IPPS/LTCH PPS final rule (77 FR 53415 through 53425) to calculate the FTE resident caps for new medical residency training programs started on or after October 1, 2012, and once the FTE resident caps are calculated, hospitals and MACs would implement the FTE resident caps, the 3-year rolling average, and the IRB ratio cap effective beginning with the applicable hospital's cost reporting period that precedes the start of the sixth program year of the first new program started. Under the proposed methodology, FTE residents and FTE resident caps would no longer need to be prorated, and we would no longer need to make special accommodations to calculate the portion of FTE residents that are subject to the FTE resident cap, the 3-year rolling average, and the IRB ratio cap for the respective portions of the hospital's cost reporting period occurring on and after July 1. The existing CMS Form 2552-10 already accommodates the proposed methodology, unlike the complicated process currently in place. Therefore, clarity, efficiency, and payment accuracy would be improved for hospitals, MACs, and CMS.

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28148), we stated that with regard to rural hospitals that, under § 413.79(e)(3) of the regulations, may receive FTE resident cap adjustments at any time for participating in training residents in new programs, we proposed a similar policy, with modifications reflecting the fact that each new program in which the rural hospital participates receives its own 5-year growth window before the rural hospital's FTE resident cap is adjusted based on that new program. That is, we proposed that, for rural hospitals, the FTE resident caps, the 3-year rolling average, and the IRB ratio cap for each new program started would be effective beginning with the applicable hospital's cost reporting period that precedes the start of the sixth program year of each new program started. For example, rural Hospital B has a fiscal year that begins on January 1. It starts a family medicine program on July 1, 2013, and a general surgery program on July 1, 2016. The sixth program year for the family medicine program begins on July 1, 2018. The sixth program year for the general surgery program begins on July 1, 2021. With regard to Medicare cost reporting, during Hospital B's fiscal year end dates of December 31, 2013 through and including December 31, 2017, the number of family medicine FTE residents would be reported so as not to be included in the IME 3-year rolling average or the IRB ratio cap, and so as not to be included in the direct GME 3-year rolling average. (This means that on CMS Form 2552-10, during Hospital B's fiscal year end dates of December 31, 2013 through and including December 31, 2017, the number of family medicine FTE residents would be reported on Worksheet E, Part A, line 16 for IME, and on Worksheet E-4, line 15, for direct GME.) Then, beginning with Hospital B's cost report for fiscal year ending December 31, 2018, the number of FTE residents in only the family medicine program would be subject to the FTE resident caps, the 3-year rolling average, and the IRB ratio cap, and would be reported accordingly. (This means that on CMS Form 2552-10, beginning with Hospital B's cost report ending December 31, 2018, the number of family medicine FTE residents would be reported on Worksheet E, Part A, line 10 for IME, and Worksheet E-4, line 6 for direct GME.) Because the general surgery program started on July 1, 2016, for Hospital B's fiscal year end dates of December 31, 2016 through and including December 31, 2020, the number of general surgery FTE residents would be reported (on Worksheet E, Part A, line 16) so as not to be included in the IME 3-year rolling average or the IRB ratio cap, and (on Worksheet E-4, line 15), so as not to be included in the direct GME 3-year rolling average. Then, beginning with Hospital B's cost report for fiscal year ending December 31, 2021, the number of FTE residents in the general surgery program would be subject to the FTE resident caps, the 3-year rolling average, and the IRB ratio cap, and would be reported accordingly (on Worksheet E, Part A, line 10 for IME, and Worksheet E-4, line 6 for direct GME).
We note that once the 3-year rolling average is effective in that cost reporting period that includes the sixth program year of each new program started, the number of FTE residents in the new programs also must be reported as part of the prior year FTE resident counts, and the penultimate FTE resident counts, in order to effectuate the 3-year rolling average calculation on the IME Worksheet E, Part A, and the direct GME Worksheet E-4, respectively.

We proposed that this policy regarding the effective dates of the FTE residency caps, the 3-year rolling average, and the IRB ratio cap for FTE residents in new medical residency training programs would be consistent with the methodology for calculation of the FTE resident caps as described in the FY 2013 IPPS/LTCH PPS final rule, and implemented in the regulations at §§ 413.79(e)(1) and (e)(3). That is, because the policy providing a 5-year growth period for establishing the FTE resident caps (§§ 413.79(e)(1) and (e)(3)) is effective for new programs started on or after October 1, 2012, this proposal would be effective for urban hospitals that first begin to participate in training residents in their first new medical residency training program, and for rural hospitals, on or after October 1, 2012. We also proposed to revise the regulations for IME and direct GME, respectively, at § 412.105(a)(1)(ii) for the IME IRB ratio cap, at § 412.105(f)(1)(v) for the IME 3-year rolling average, and at § 413.79(d)(5) for the direct GME 3-year rolling average to reflect that the exception from the IRB ratio cap and the 3-year rolling average for new programs applies to each new program individually during the cost reporting periods prior to the beginning of the applicable hospital's cost reporting period that precedes the start of the sixth program year of the first new program started, for hospitals for which the FTE cap may be adjusted in accordance with § 413.79(e)(1), and prior to the beginning of the applicable hospital's cost reporting period that precedes the start of the sixth program year of each individual new program started, for hospitals for which the FTE cap may be adjusted in accordance with § 413.79(e)(3). After the applicable hospital's cost reporting period that precedes the start of the sixth program year of the first new program started for hospitals for which the FTE cap may be adjusted in accordance with § 413.79(e)(1), and after the applicable hospital's cost reporting period that precedes the start of the sixth program year of each individual new program started for hospitals for which the FTE cap may be adjusted in accordance with § 413.79(e)(3), FTE residents participating in new medical residency training programs are included in the hospital's IRB ratio cap and the 3-year rolling average.

Comment: Many commenters supported CMS' proposal to simplify and synchronize the timing of when FTE residents in new medical residency training programs are subject to the FTE resident caps, the 3-year rolling average, and the IRB ratio cap. However, the commenters believed that the specific part of the proposal related to making the FTE resident caps effective beginning with the applicable hospital's cost reporting period that precedes the start of the sixth program year of the first new program started would result in premature application of the FTE resident cap while the hospital would still be within the 5-year cap building window, thereby reducing the number of FTEs for which the new teaching hospital would otherwise be entitled to receive payment. The commenters disputed CMS' suggestion in the proposed rule that the effect on a hospital's payment would be inconsequential or nonexistent “in most typical circumstances.” The commenters provided examples in which they believed CMS' proposal would result in the loss of payment for new teaching hospitals establishing an FTE resident cap. The commenters acknowledged CMS' statement in the proposed rule that a new teaching hospital could experience a payment benefit from the proposed changes related to the synchronized implementation of the 3-year rolling average and the IRB ratio cap. However, the commenters did not believe this “benefit justifies an imposition of the FTE resident cap within the 5-year cap building window.” The commenters urged CMS to finalize an alternative effective date that would be the start of the hospital's cost reporting period that follows the start of the sixth program year of the first new program. The commenters believed this alternative would achieve the simplicity that CMS seeks in its proposal, yet would also permit new teaching hospitals to retain the payments they are “entitled” to receive for at least a full 5 program years under existing regulations.

Response: We appreciate the commenters' support of the proposal, and the commenters' concern that, by proposing that the effective date would be the applicable hospital's cost reporting period that precedes the start of the sixth program year of the first new program started, this earlier application of the FTE resident cap might result in reduced payment because some amount of FTE residents would be in excess of the hospital's newly calculated FTE resident caps. We also agree that the streamlining and simplification that we are seeking would be achieved by revising the proposal to instead take effect, as the commenters suggested, with the beginning of the hospital's cost reporting period that follows the start of the sixth program year of the first new program started. Therefore, in this final rule, we are modifying our proposal as follows, both for urban teaching hospitals that have not yet had FTE resident caps established under § 413.79(e)(1), and for rural teaching hospitals that may or may not have FTE resident caps established under § 413.79(e)(3). That is, the FTE resident caps would continue to be calculated as finalized in the FY 2013 IPPS/LTCH PPS final rule—the methodology is based on the sum of the products of the following three factors: (1) the highest total number of FTE residents trained in any program year, during the fifth year of the first new program's existence at all of the hospitals to which the residents in that program rotate; (2) the number of years in which residents are expected to complete the program, based on the minimum accredited length for each type of program; and (3) the ratio of the number of FTE residents in the new program that trained at the hospital over the entire 5-year period to the total number of FTE residents that trained at all hospitals over the entire 5-year period. However, once calculated in this manner, we are finalizing a policy that, instead of the FTE resident caps being effective beginning with the sixth program year of the first new program started, those FTE resident caps, the 3-year rolling average, and the IRB ratio cap would be effective beginning with the applicable hospital's cost reporting period that coincides with or follows the start of the sixth program year of the first new program started. (We are specifying “that coincides with or follows” the start of the sixth program year of the first new program started, rather than only specifying “that follows” the start of the sixth program year of the first new program started as the commenters suggested, in consideration of hospitals that have a fiscal year begin date of July 1, for whom the cost reporting period that starts after completion of the 5-year cap building window coincides with the beginning of the sixth program year of the first new program started. Under this finalized policy, hospitals with a fiscal year begin date of July 1 would not wait an entire 12 months after completion of their 5-year cap building window for their next cost reporting period to start in order for the FTE resident caps, the 3-year rolling average, and the IRB ratio cap to take effect. Rather, for hospitals with a fiscal year begin date of July 1, the FTE resident caps, the 3-year rolling average, and the IRB ratio cap would be effective beginning with the applicable hospital's cost reporting period that coincides with the start of the sixth program year of the first new program started.)

Using the example of Hospital A that we presented in the proposed rule, assume Hospital A has a January 1 to December 31 cost reporting year. The first new program started, family medicine, was started on July 1, 2013. A sports medicine fellowship and a general surgery program also were started timely within the 5-year growth window. Hospital A has 5 program years to grow its FTE resident caps, from July 1, 2013 through June 30, 2018. The FTE resident caps would be calculated based on the 5 program years in accordance with the methodology established at § 413.79(e)(1) in the FY 2013 IPPS/LTCH PPS final rule. Therefore, the hospital would wait until after June 30, 2018, to obtain the FTE counts to calculate the FTE resident caps. However, those IME and direct GME FTE resident caps, once calculated after June 30, 2018, instead of being effective on July 1, 2018, would be effective at the beginning of Hospital A's cost reporting period that follows July 1, 2018; that is, the FTE resident caps for Hospital A would be effective permanently on January 1, 2019, the start of Hospital A's cost reporting period that follows the start of the sixth program year of the first new program started. The hospital would file its fiscal year end December 31, 2019 cost report including the FTE resident caps applicable to the entire cost reporting period accordingly.

Regarding the application of the 3-year rolling average and the IRB ratio cap, using the example of Hospital A, the FTE residents in the family medicine program, the sports medicine fellowship, and the general surgery program would all be subject to the 3-year rolling average and the IRB ratio cap beginning on January 1, 2019. With regard to reporting on the Medicare cost report, for Hospital A's fiscal year end dates of December 31, 2013 through and including December 31, 2018, the number of FTE residents in the family medicine program, the sports medicine fellowship, and the general surgery program would be reported so as not to be included in the IME rolling average or the IRB ratio cap, and so as not to be included in the direct GME rolling average. (On the CMS Form 2552-10, for Hospital A's fiscal year end dates of December 31, 2013 through and including December 31, 2018, this means that the number of FTE residents in the family medicine program, the sports medicine fellowship, and the general surgery program would be reported on Worksheet E, Part A, line 16, and on Worksheet E-4, line 15.) However, on Hospital A's cost report for fiscal year ending December 31, 2019, the number of FTE residents in these three programs would be subject to the FTE resident caps, the 3-year rolling average, and the IRB ratio cap, and would be reported accordingly. (On the CMS Form 2552-10, for Hospital A's cost report for fiscal year ending December 31, 2019, this means that none of the FTE residents in these three programs would be reported on Worksheet E, Part A, line 16 for IME, and Worksheet E-4, line 15 for direct GME. Instead, all of the FTE residents would be reported on Worksheet E, Part A, line 10 for IME, and Worksheet E-4, line 6 for direct GME, in order to be subject to the FTE resident caps, the 3-year rolling average, and the IRB ratio cap.) We note that once the 3-year rolling average is effective, the number of FTE residents in the new programs also must be reported both as part of the prior year FTE resident counts and the penultimate FTE resident counts, in order to effectuate the 3-year rolling average calculation on the IME Worksheet E, Part A, and the direct GME Worksheet E-4, respectively.
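
The rolling average mechanics referenced above can be summarized with a minimal sketch in Python; the FTE counts are hypothetical, and the sketch is a simplified illustration rather than a reproduction of the Worksheet E, Part A or Worksheet E-4 calculations.

    # Simplified illustration of a 3-year rolling average of FTE resident counts
    # (hypothetical numbers). Once the rolling average applies, the FTE residents
    # in the new programs are included in the current, prior year, and penultimate
    # year counts so that the average fully reflects them.
    penultimate_year_ftes = 40.0  # count from two cost reporting periods ago
    prior_year_ftes = 44.0        # count from the prior cost reporting period
    current_year_ftes = 48.0      # count for the current cost reporting period

    rolling_average = (penultimate_year_ftes + prior_year_ftes + current_year_ftes) / 3
    print(f"3-year rolling average FTE count: {rolling_average:.2f}")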

In the example that we presented earlier, Hospital A has a fiscal year that begins on January 1. If Hospital A's fiscal year begin date had instead been October 1, then, while the sixth program year of the first new program started would still be July 1, 2018, the FTE resident caps, the 3-year rolling average, and the IRB ratio cap would be effective on October 1, 2018, the fiscal year begin date that follows July 1, 2018, the sixth program year. If Hospital A's fiscal year begin date had been July 1, the FTE resident caps, the 3-year rolling average, and the IRB ratio cap would be effective on July 1, 2018, the fiscal year begin date that follows completion of the fifth program year, and coincides with July 1, 2018, the sixth program year.
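
Stated simply, the finalized effective date is the start of the hospital's first cost reporting period that coincides with or follows the start of the sixth program year of the first new program started. The following minimal sketch in Python reproduces that determination for the three fiscal year begin dates discussed for Hospital A; the function name and the assumption of 12-month cost reporting periods are illustrative only.

    from datetime import date

    def caps_effective_date(fy_begin_month: int, fy_begin_day: int,
                            sixth_program_year_start: date) -> date:
        """Return the start of the first cost reporting period that coincides with
        or follows the start of the sixth program year of the first new program
        started (illustrative helper; assumes 12-month cost reporting periods)."""
        candidate = date(sixth_program_year_start.year, fy_begin_month, fy_begin_day)
        if candidate < sixth_program_year_start:
            candidate = date(candidate.year + 1, fy_begin_month, fy_begin_day)
        return candidate

    sixth_program_year = date(2018, 7, 1)  # first new program started July 1, 2013

    print(caps_effective_date(1, 1, sixth_program_year))   # 2019-01-01 (January 1 fiscal year)
    print(caps_effective_date(10, 1, sixth_program_year))  # 2018-10-01 (October 1 fiscal year)
    print(caps_effective_date(7, 1, sixth_program_year))   # 2018-07-01 (July 1 fiscal year; coincides)

For rural hospitals under § 413.79(e)(3), the same determination is made separately for each new program started, using the start of that program's own sixth program year; for rural Hospital B in the example that follows, this yields January 1, 2019 for the family medicine program and January 1, 2022 for the general surgery program.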

With regard to rural hospitals that, under § 413.79(e)(3) of the regulations, may receive FTE resident cap adjustments at any time for participating in training residents in new programs, we are finalizing a similar policy, with modifications reflecting the fact that each new program in which the rural hospital participates receives its own 5-year growth window before the rural hospital's FTE resident cap is adjusted based on that new program. That is, we are finalizing that, for rural hospitals, the FTE resident caps, the 3-year rolling average, and the IRB ratio cap for each new program started would be effective beginning with the applicable hospital's cost reporting period that coincides with or follows the start of the sixth program year of each new program started. For example, rural Hospital B has a fiscal year that begins on January 1. It starts a family medicine program on July 1, 2013, and a general surgery program on July 1, 2016. The sixth program year for the family medicine program begins on July 1, 2018. The sixth program year for the general surgery program begins on July 1, 2021. With regard to Medicare cost reporting, during Hospital B's fiscal year end dates of December 31, 2013 through and including December 31, 2018, the number of family medicine FTE residents would be reported so as not to be included in the IME 3-year rolling average or the IRB ratio cap, and so as not to be included in the direct GME 3-year rolling average. (This means that on CMS Form 2552-10, during Hospital B's fiscal year end dates of December 31, 2013 through and including December 31, 2018, the number of family medicine FTE residents would be reported on Worksheet E, Part A, line 16 for IME, and on Worksheet E-4, line 15, for direct GME.) Beginning with Hospital B's cost report for fiscal year ending December 31, 2019, the number of FTE residents in only the family medicine program would be subject to the FTE resident caps, the 3-year rolling average, and the IRB ratio cap, and would be reported accordingly. (This means that on CMS Form 2552-10, beginning with Hospital B's cost report ending December 31, 2019, the number of family medicine FTE residents would be reported on Worksheet E, Part A, line 10 for IME, and Worksheet E-4, line 6 for direct GME.) Because the general surgery program started on July 1, 2016, for Hospital B's fiscal year end dates of December 31, 2016 through and including December 31, 2021, the number of general surgery FTE residents would be reported on Worksheet E, Part A, line 16 so as not to be included in the IME 3-year rolling average or the IRB ratio cap, and on Worksheet E-4, line 15, so as not to be included in the direct GME 3-year rolling average. Beginning with Hospital B's cost report for fiscal year ending December 31, 2022, the number of FTE residents in the general surgery program would be subject to the FTE resident caps, the 3-year rolling average, and the IRB ratio cap, and would be reported accordingly (on Worksheet E, Part A, line 10 for IME, and Worksheet E-4, line 6 for direct GME).
We note that once the 3-year rolling average is effective, the number of FTE residents in the new programs also must be reported as part of the prior year FTE resident counts, and the penultimate FTE resident counts, in order to effectuate the 3-year rolling average calculation on the IME Worksheet E, Part A, and the direct GME Worksheet E-4, respectively.

After consideration of the public comments we received, we are finalizing our proposal with certain modifications. Specifically, the policy regarding the effective dates of the FTE residency caps, the 3-year rolling average, and the IRB ratio cap for FTE residents in new medical residency training programs will be consistent with the methodology for calculation of the FTE resident caps as described in the FY 2013 IPPS/LTCH PPS final rule, and implemented in the regulations at §§ 413.79(e)(1) and (e)(3). That is, because the policy providing a 5-year growth period for establishing the FTE resident caps (§§ 413.79(e)(1) and (e)(3)) is effective for new programs started on or after October 1, 2012, this policy will be effective for urban hospitals that first begin to participate in training residents in their first new medical residency training program, and for rural hospitals, on or after October 1, 2012. We also are revising the regulations for IME and direct GME, respectively, at § 412.105(a)(1)(ii) for the IME IRB ratio cap, at § 412.105(f)(1)(v) for the IME 3-year rolling average, and at § 413.79(d)(5) for the direct GME 3-year rolling average, to reflect that the exception from the IRB ratio cap and the 3-year rolling average for new programs applies to each new program individually during the cost reporting periods prior to the beginning of the applicable hospital's cost reporting period that coincides with or follows the start of the sixth program year of the first new program started, for hospitals for which the FTE cap may be adjusted in accordance with § 413.79(e)(1), and prior to the beginning of the applicable hospital's cost reporting period that coincides with or follows the start of the sixth program year of each individual new program started, for hospitals for which the FTE cap may be adjusted in accordance with § 413.79(e)(3). Beginning with the applicable hospital's cost reporting period that coincides with or follows the start of the sixth program year of the first new program started for hospitals for which the FTE cap may be adjusted in accordance with § 413.79(e)(1), and beginning with the applicable hospital's cost reporting period that coincides with or follows the start of the sixth program year of each individual new program started for hospitals for which the FTE cap may be adjusted in accordance with § 413.79(e)(3), FTE residents participating in new medical residency training programs are included in the hospital's IRB ratio cap and the 3-year rolling average.

3. Changes to IME and Direct GME Policies as a Result of New OMB Labor Market Area Delineations

a. New Program FTE Resident Cap Adjustment for Rural Hospitals Redesignated as Urban

As stated earlier in this final rule, under existing regulations, a new teaching hospital that starts training residents for the first time on or after October 1, 2012, has 5 years from when it first begins training residents in its first new program to build its FTE resident cap. If the teaching hospital is a rural teaching hospital, it can continue to receive permanent cap adjustments for training residents in new programs after the initial 5-year cap-building period that applies to new teaching hospitals ends. (We refer readers to section IV.K.2. of the preamble of this final rule for a discussion of our proposal and final policy to change the effective dates for when the FTE resident cap, the 3-year rolling average, and the IRB ratio cap are applied to new teaching hospitals and to new programs at rural teaching hospitals.)

In section III.B. of the preamble of this final rule, we discuss the final policies we are implementing as a result of the new OMB labor market area delineations announced in the February 28, 2013 OMB Bulletin No. 13-01. As a result of the new OMB delineations, some teaching hospitals may be redesignated from being located in a rural area to an urban area, thereby losing their ability to increase their FTE resident caps for new programs started after their initial 5-year cap-building period ends. We have been asked whether a rural teaching hospital that already has a cap and is redesignated as urban while it is in the process of establishing another new program(s) can still receive a permanent cap adjustment for that new program(s). We believe that because the hospital had already started training residents in the new program(s) while it was rural, the former rural hospital should be permitted to continue building its new program(s) and receive a permanent FTE resident cap adjustment for that new program(s). Therefore, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28149 through 28150), we proposed to revise the regulations to allow a hospital that was rural as of the time it started training residents in a new program(s) and is redesignated as urban for Medicare payment purposes during its cap-building period for that program(s) to be able to continue building that program(s) for the remainder of the cap-building period and receive a permanent FTE resident cap adjustment for that new program(s). Once the cap-building period for the new program(s) that was started while the hospital was still rural expires, the teaching hospital that has been redesignated as urban would no longer be able to receive any additional permanent cap adjustments. We proposed that the teaching hospital must be actively training residents in the new program while it is still rural, that is, prior to the redesignation taking effect, in order for the hospital to continue receiving a cap adjustment for the new program. For example, if a rural hospital begins training residents in a new internal medicine program on July 1, 2013, and begins training residents in a new general surgery program on July 1, 2014, and the rural hospital is redesignated as urban effective on October 1, 2014, the teaching hospital would be able to continue receiving a cap adjustment for both the new internal medicine program and the new general surgery program after it has been redesignated as urban. However, if the rural hospital is redesignated as urban effective on October 1, 2014, and started training residents in a new internal medicine program on July 1, 2013, but did not start training residents in a new general surgery program while it was still rural, that is, prior to October 1, 2014, the teaching hospital would receive a permanent cap adjustment for the new internal medicine program, but would not receive a cap adjustment for the new general surgery program. We proposed to revise the regulations at § 412.105(f)(1)(iv)(D) for IME and § 413.79(c)(6) for direct GME to implement this proposed change. We proposed that these regulatory revisions be effective for cost reporting periods beginning on or after October 1, 2014.

Comment: Commenters supported the proposal to allow a rural hospital that was training residents in a new program when it was redesignated as urban due to the most recent OMB delineations, to be able to continue with the cap-building period for that new program and receive a permanent cap adjustment for that new program. Commenters stated that rural hospitals develop and build their new programs with the expectation that they will have a 5-year cap building period in which to grow these new programs. Commenters stated that the proposal is fair and equitable and helps address physician shortages in rural areas by promoting residency training in these areas. However, several commenters requested that CMS take the proposal one step further. These commenters stated that if a rural hospital has received a letter of accreditation for a new program prior to the hospital being redesignated as urban, the hospital should be able to receive a permanent cap adjustment for that new program. One commenter stated that there are substantial resources and upfront costs that go into starting a new family medicine program. The commenter noted it may take some time for the program to begin training residents because the hospital must receive an initial letter of accreditation and then the program may have to wait for up to a year until it can participate in the match for residents to begin the following July 1.

Response: We thank the commenters for their support of our proposal. We appreciate that significant resources go into developing a brand-new residency training program and that there may be a lag between when a program is accredited and when residents begin training in that program. We are persuaded by these comments and, therefore, we are finalizing our proposed policy with a modification, such that a rural hospital that has been redesignated as urban can receive a permanent cap adjustment for a new program (after a 5-year cap building period for that new program), if it received a letter of accreditation for the new program, and/or started training residents in the new program, prior to being redesignated as urban. Expanding upon the example that was included in the proposed rule, if a rural hospital is redesignated as urban effective on October 1, 2014, and started training residents in a new internal medicine program on July 1, 2013, but did not start training residents in a new general surgery program while it was still rural, that is, prior to October 1, 2014, but did receive a letter of accreditation for the general surgery program prior to October 1, 2014, the hospital would receive a permanent cap adjustment for the new internal medicine program and would also receive a permanent cap adjustment for the new general surgery program. We are amending the regulations at § 412.105(f)(1)(iv)(D) and § 413.79(c)(6) to implement this policy. Consistent with the effective date of the implementation of the new OMB delineations, we are making this final policy effective October 1, 2014.
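
The finalized eligibility determination can be illustrated with a minimal sketch in Python; the function name, parameter names, and the June 1, 2014 accreditation date are hypothetical, while the redesignation and training dates are taken from the example above.

    from datetime import date
    from typing import Optional

    def eligible_for_new_program_cap_adjustment(redesignation_effective: date,
                                                training_started: Optional[date],
                                                accreditation_letter: Optional[date]) -> bool:
        """Illustrative test of the finalized policy: a hospital redesignated from
        rural to urban may receive a permanent cap adjustment for a new program if
        it received a letter of accreditation for the program and/or started
        training residents in the program prior to the redesignation taking effect."""
        started_while_rural = (training_started is not None
                               and training_started < redesignation_effective)
        accredited_while_rural = (accreditation_letter is not None
                                  and accreditation_letter < redesignation_effective)
        return started_while_rural or accredited_while_rural

    redesignation = date(2014, 10, 1)

    # Internal medicine: training began July 1, 2013, while the hospital was still rural.
    print(eligible_for_new_program_cap_adjustment(redesignation, date(2013, 7, 1), None))  # True

    # General surgery: no training before redesignation, but a letter of accreditation
    # was received while the hospital was still rural (hypothetical date).
    print(eligible_for_new_program_cap_adjustment(redesignation, None, date(2014, 6, 1)))  # True

    # A program with neither training nor accreditation before redesignation would not qualify.
    print(eligible_for_new_program_cap_adjustment(redesignation, None, None))  # False

As under the finalized policy, the cap adjustment itself would still follow the 5-year cap building period for the new program; the sketch addresses only whether the hospital remains eligible for that adjustment after redesignation.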

Comment: Several commenters requested that CMS propose a policy through an interim final rule that would permit hospitals that remain rural referral centers (RRCs), even if they are no longer in a labor market designated as rural, to receive a cap increase for residents training in new programs. The commenters stated that RRCs are high-volume hospitals that treat complex cases, which may be referred to them from significant geographic distances. The commenters stated that RRCs meet important health care needs of rural communities because residency programs in RRCs train physicians who are equipped to deal with rural populations. The commenters requested that CMS specify that grandfathered RRCs are able to increase their caps for new programs so long as during the current Federal fiscal year, they continue to meet all RRC requirements other than being located in a rural area.

Response: Section 1886(h)(4)(H)(i) of the Act states in part, “[i]n promulgating such rules for purposes of subparagraph (F), the Secretary shall give special consideration to facilities that meet the needs of underserved rural areas.” Subparagraph (F) refers to the establishment of a hospital's FTE resident cap. We read this statutory language as providing special consideration only to rural hospitals for purposes of establishing their FTE resident caps, not as requiring that special consideration be provided to hospitals that either are not physically located in rural areas or have not reclassified as rural facilities (for IME payment purposes). Therefore, we are not making any special exceptions specific to RRCs that are no longer in rural areas in this final rule. As we have stated above for other hospitals that lose their rural status due to the new OMB delineations, an RRC that has been redesignated as urban may receive a permanent cap adjustment for a new program (after a 5-year cap building period for that new program), if it received a letter of accreditation for the new program, and/or started training residents in the new program, prior to being redesignated as urban. We note that if the redesignated RRC subsequently reclassifies back to rural, it would be able to receive additional adjustments to its IME FTE resident cap for training residents in new programs.

Comment: One commenter stated that it operates a rural teaching hospital that received the 30-percent cap increase applicable to rural teaching hospitals. The commenter stated that, due to the most recent OMB delineations proposed to be implemented, the hospital will be located in an urban area. The commenter asked whether the 30-percent cap increase would carry over with the hospital's urban status.

Response: The regulations at § 412.105(f)(1)(iv)(D) and § 413.79(c)(6) implemented in this final rule state in part that effective October 1, 2014, if a rural hospital is redesignated as urban due to the most recent OMB standards for delineating statistical areas adopted by CMS, the redesignated urban hospital may retain any existing increases to its FTE resident cap that it had received prior to when the redesignation became effective. Therefore, in the situation the commenter described, the hospital that is redesignated from rural to urban may retain the 30-percent cap increase it received while it was still rural.

After consideration of the public comments we received, we are finalizing the proposed policy with a modification, such that a rural hospital that has been redesignated as urban can receive a permanent cap adjustment for a new program (after a 5-year cap building period for that new program), if it received a letter of accreditation for the new program, and/or started training residents in the new program, prior to being redesignated as urban. The finalized regulations at § 412.105(f)(1)(iv)(D) state the following:

  • A rural hospital redesignated as urban after September 30, 2004, as a result of the most recent census data and implementation of the new labor market area definitions announced by OMB on June 6, 2003, may retain the increases to its full-time equivalent resident cap that it received under paragraphs (f)(1)(iv)(A) and (f)(1)(vii) of the section while it was located in a rural area.
  • Effective October 1, 2014, if a rural hospital is redesignated as urban due to the most recent OMB standards for delineating statistical areas adopted by CMS, the redesignated urban hospital may retain any existing increases to its FTE resident cap that it had received prior to when the redesignation became effective.
  • Effective October 1, 2014, if a rural hospital is redesignated as urban due to the most recent OMB standards for delineating statistical areas adopted by CMS, the redesignated urban hospital may receive an increase to its FTE resident cap for a new program, in accordance with paragraph (e) of the section, if it received a letter of accreditation for the new program and/or started training residents in the new program, prior to the redesignation becoming effective.

The finalized regulations at § 413.79(c)(6) state the following:

  • A rural hospital redesignated as urban after September 30, 2004, as a result of the most recent census data and implementation of the new MSA definitions announced by OMB on June 6, 2003, may retain the increases to its FTE resident cap that it received under paragraphs (c)(2)(i), (e)(1)(iii), and (e)(3) of the section while it was located in a rural area.
  • Effective October 1, 2014, if a rural hospital is redesignated as urban due to the most recent OMB standards for delineating statistical areas adopted by CMS, the redesignated urban hospital may retain any existing increases to its FTE resident cap that it had received prior to when the redesignation became effective.
  • Effective October 1, 2014, if a rural hospital is redesignated as urban due to the most recent OMB standards for delineating statistical areas adopted by CMS, the redesignated urban hospital may receive an increase to its FTE resident cap for a new program, in accordance with paragraph (e) of the section, if it received a letter of accreditation for the new program and/or started training residents in the new program prior to the redesignation becoming effective.

b. Participation of Redesignated Hospital in Rural Training Track

To encourage the training of residents in rural areas, section 407(c) of Public Law 106-113 amended section 1886(h)(4)(H) of the Act to add a provision (subsection (iv)) providing that, in the case of a hospital that is not located in a rural area (an urban hospital) that establishes separately accredited approved medical residency training programs (or rural tracks) in a rural area or has an accredited training program with an integrated rural track, the Secretary shall adjust the urban hospital's cap on the number of FTE residents under subparagraph (F) in an appropriate manner in order to encourage the training of physicians in rural areas. Section 407(c) of Public Law 106-113 was made effective for direct GME payments to hospitals for cost reporting periods beginning on or after April 1, 2000, and for IME payments applicable to discharges occurring on or after April 1, 2000. We refer readers to the August 1, 2000 interim final rule with comment period (65 FR 47033 through 47037) and the FY 2002 IPPS final rule (66 FR 39902 through 39909), in which we implemented section 407(c) of Public Law 106-113.

The regulations at § 413.79(k) specify that, subject to certain criteria, an urban hospital may count the FTE residents in the rural track in addition to those FTE residents subject to its cap up to a “rural track FTE limitation” for that hospital. In the FY 2006 IPPS final rule, we revised the regulations at § 413.79(k) to add a new paragraph (7) to state that if an urban hospital had established a rural track program with a rural hospital and that hospital subsequently becomes urban due to the implementation of the new labor market area definitions announced by OMB on June 6, 2003, the urban hospital may continue to adjust its FTE resident limit for rural track programs established before the implementation of the new labor market area definitions. We also stated that, in order for the urban hospital to receive a cap adjustment for a new rural track program, the urban hospital must establish a rural track program with hospitals that are designated rural based on the most recent geographical location designations adopted by CMS (70 FR 47456; 47489).

As discussed earlier in this section, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28054), we proposed to implement, effective October 1, 2014, the new OMB labor market area delineations announced in the February 28, 2013 OMB Bulletin No. 13-01. As a result of the new delineations, certain areas are redesignated from urban to rural or from rural to urban, which may, in turn, affect GME policies that require the participation of rural teaching hospitals. For example, as noted above, in order for an urban teaching hospital to receive a FTE resident cap adjustment for training residents in a rural track, the residents must rotate for more than one-half of the duration of the program to a rural hospital(s) or rural nonprovider(s) site. We have received a question as to what happens to a rural track when a rural hospital that is participating as the rural site is redesignated as urban, while the rural track for the urban hospital is in the process of being established. That is, what happens to the rural track when the rural hospital is redesignated as urban during the period that is used to establish the urban hospital's rural track FTE limitation, prior to the effective date of the urban hospital's rural track FTE limitation being established?

Existing regulations at § 413.79(k)(7) address the scenario where a rural hospital that is participating as the rural site is redesignated as urban, after the rural track FTE limitation for the urban hospital has already become effective. Specifically, the regulations at § 413.79(k)(7) state that if an urban hospital had established a rural track with a hospital located in a rural area and that rural area subsequently becomes an urban area due to the most recent census data and implementation of new labor market area definitions announced by OMB June 6, 2003, the urban hospital may continue to adjust its FTE resident limit for the rural track programs established prior to the adoption of the new labor market area definitions. Therefore, consistent with the existing regulations at § 413.79(k)(7) and with our proposal to allow rural hospitals redesignated as urban to continue receiving a FTE resident cap adjustment for new programs that started while the redesignated hospital was still rural, we proposed to revise the existing regulations applicable to urban hospitals generally. Specifically, we proposed to address the status of the “original” urban hospital's (throughout this preamble, “original” urban hospital refers to the hospital that is the urban participant in the rural track program) rural track FTE limitation, in the situation where a rural hospital that is participating in the original urban hospital's rural track is located in an area redesignated by OMB as urban during the 3-year period that is used to calculate the “original” urban hospital's rural track FTE limitation. We proposed that, in these situations, the “original” urban hospital's opportunity to receive a rural track FTE limitation would not be negatively impacted by the fact that the rural hospital with which it has partnered to be the rural site for its rural training track is located in an area redesignated by OMB as urban during the 3-year period that is used to calculate the “original” urban hospital's rural track FTE limitation. That is, we proposed that the “original” urban hospital may receive a rural track FTE limitation for that new rural track program.

With regard to the status of the rural hospital that is partnered with the “original” urban hospital to serve as a rural training site for the rural training track program, as mentioned earlier, existing regulations at § 413.79(k)(7) address the scenario where a rural hospital that is participating as the rural site is redesignated as urban, after the rural track FTE limitation for the “original” urban hospital has already become effective. (We note that we proposed to apply the existing policy at § 413.79(k)(7), which applies to redesignations that occurred on June 6, 2003, in a similar manner, to redesignations announced by OMB after June 6, 2003, as well.) In addition, we proposed that once the rural hospital is redesignated as located in an urban area due to the implementation of the new OMB labor market area delineations, regardless of whether that redesignation occurs during the 3-year period that is used to establish the rural track FTE limitation for the “original” urban hospital, or after the 3-year period that is used to establish the rural track FTE limitation for the “original” urban hospital, the redesignated urban hospital can no longer qualify as the rural site and the “original” urban hospital would not be able to count those residents under its rural track FTE limitation if it continues to use the redesignated urban hospital as the rural site for purposes of the rural track. However, because the redesignated urban hospital was rural when residents started training in the rural track, we proposed to provide for a 2-year transition period during which either of the following two conditions must be met in order for the “original” urban hospital to be able to count the residents under its rural track FTE limitation when the 2-year transition period ends: (1) The redesignated newly urban hospital must reclassify back to rural under § 412.103 of the regulations; or (2) the “original” urban hospital must find a new geographically rural site to participate as the rural site for purposes of the rural track. We note that we proposed to apply these two criteria both in the case where the rural hospital is redesignated as urban after the “original” urban hospital already has its rural track FTE limit established, and also in the case where the rural hospital is redesignated as urban during the 3-year period when the rural track program is still growing, prior to the rural track FTE limit being established. This 2-year transition period would begin when new OMB labor market area delineations take effect for Medicare payment purposes and would end exactly 2 years from that date. During this 2-year transition period, we would hold the “original” urban hospital harmless and would pay the “original” urban hospital for the FTE residents in the rural track. At the end of the 2-year transition period, in order for the “original” urban hospital to receive payment for a rural track program under § 413.79(k)(1) or (k)(2), either the redesignated urban hospital must be granted reclassification as rural under § 412.103 or the “original” urban hospital must already be training FTE residents at a geographically rural site. We note that, because the rural reclassification provision of § 412.103 only applies to IPPS hospitals and for purposes of section 1886(d) of the Act, it only applies to IPPS hospitals for IME payment purposes and not for direct GME payment purposes because direct GME is authorized under section 1886(h) of the Act. 
Therefore, if the redesignated hospital reclassifies as rural under § 412.103, the “original” urban hospital would only be able to count FTE residents towards its rural track FTE limitation for IME payment purposes, but not for direct GME payment purposes. In addition, we note that this discussion has centered on the scenario where a rural hospital that is the rural site for purposes of the rural track has been redesignated as urban. Under such a scenario, the redesignated urban hospital does have an option to reclassify as rural. However, as noted above, the reclassification only applies to IPPS hospitals for IME payment purposes. If a nonprovider site is functioning as the rural site under § 413.79(k)(2) for purposes of the rural track and the area where that nonprovider site is located is redesignated as urban, the nonprovider site would not have the option of reclassifying as rural and, therefore, the “original” urban hospital would be required to find a new geographically rural site within the 2-year transition period in order for the “original” urban hospital to receive payment for a rural track program under § 413.79(k)(1) or (k)(2).

The following examples illustrate how the proposed policy would be applied to a rural track in which the rural site is a hospital and the rural hospital has been redesignated as urban:

  • An urban teaching hospital and a rural teaching hospital are participating in training residents in a new rural track program that begins July 1, 2014. Effective October 1, 2014, the rural hospital is redesignated as urban. We proposed that the timeframe for the “original” urban hospital to build the rural track program for purposes of calculating its rural track FTE limitation would continue to be through June 30, 2017. During the time period of October 1, 2014 to September 30, 2016, the redesignated urban hospital would continue participating as a rural hospital and the “original” urban hospital would count FTE residents it is training that are in the rural track for IME and direct GME. However, in order for the “original” urban hospital to continue to get paid for its rural track program after September 30, 2016, by September 30, 2016, the redesignated urban hospital must either reclassify as rural under § 412.103 of the regulations for purposes of IME payment only, or the “original” urban hospital must find a new geographically rural hospital or nonprovider site to train the residents in the rural track for more than one-half of their training. If neither of these conditions is met by September 30, 2016, the “original” urban hospital would not be able to receive payment for that specific program as a rural training track under § 413.79(k)(1) or (k)(2) because it would no longer meet the requirement that more than one-half of the training must be provided in a rural setting.
  • Another scenario could be one in which the rural hospital is redesignated as urban after the 3-year cap-building period for the rural track has passed. For example, the rural track program began July 1, 2007, but effective October 1, 2014, the rural hospital is redesignated as urban. We proposed in this scenario that, by September 30, 2016, either the redesignated urban hospital must reclassify to rural under § 412.103 for purposes of IME payment only, or the “original” urban hospital must find a new geographically rural site that can participate as the rural site for purposes of the rural track. If neither of these conditions is met by September 30, 2016, the “original” urban hospital would not be able to receive payment for that specific program as a rural track under § 413.79(k)(1) or (k)(2) because it would no longer meet the requirement that more than one-half of the training must be provided in a rural setting.

We noted that if the “original” urban hospital was not able to meet one of the two proposed conditions noted earlier in this section by the end of the 2-year transition period, but at some point later is able to meet one of the two proposed conditions, we proposed that the “original” urban hospital would be able to “revive” and use its already established rural track FTE limitation from that point forward. In the instance where the “original” urban hospital's rural track FTE limitation was not set because the hospital was not able to meet one of the two proposed conditions by the end of the 2-year transition period, which fell within the 3-year cap-building timeframe, but at some point later is able to meet one of the two proposed conditions, we proposed that the “original” urban hospital would be able to have a rural track FTE limitation calculated and established based on the highest number of FTE residents in any program year training in the rural track in the third year of the program, even if during the third year of the program, the “original” urban hospital was not in compliance with the two proposed conditions. Consistent with similar policy discussed in the FY 2002 IPPS final rule (66 FR 39905), it would be the responsibility of the hospitals involved to provide the necessary information regarding the rotations of the residents in the third program year to the MAC in order for the calculation to be completed and the rural track FTE limit to be set.
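For illustration only, the following sketch (in Python, with hypothetical figures and a hypothetical helper name) computes the figure described above on which the rural track FTE limitation would be based: the highest number of FTE residents in any program year training in the rural track during the third year of the program. The remaining steps of the limitation calculation, and verification of the residents' rotations, rest with the MAC under the applicable regulations and are not modeled here.

```python
# Illustrative sketch only; the data and function name are hypothetical.
# The preamble states the rural track FTE limitation is based on the highest
# number of FTE residents in any program year training in the rural track
# during the third year of the program.

def highest_program_year_fte(third_year_fte_by_program_year: dict[str, float]) -> float:
    """Highest per-program-year FTE count during the program's third year."""
    return max(third_year_fte_by_program_year.values())

# Hypothetical third-program-year rotation data reported to the MAC:
fte_counts = {"PGY-1": 2.0, "PGY-2": 1.5, "PGY-3": 1.0}
print(highest_program_year_fte(fte_counts))  # 2.0
```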

In summary, we proposed that any time a rural hospital participating in a rural track is in an area redesignated by OMB as urban after residents started training in the rural track and during the 3-year period that is used to calculate the “original” urban hospital's rural track FTE limitation, the “original” urban hospital may receive a cap adjustment for that rural track after the rural hospital has been redesignated as urban. Furthermore, we proposed that, regardless of whether the redesignation of the rural hospital occurs during the 3-year period that is used to calculate the urban hospital's rural track FTE limitation, or after the 3-year period used to calculate the “original” urban hospital's rural track FTE limitation, the redesignated urban hospital can continue to be considered a rural hospital for purposes of the rural track for up to 2 years. However, by the end of those 2 years, either the redesignated urban hospital must reclassify as rural under § 412.103 for purposes of IME payment only (in addition, this reclassification option only applies to IPPS hospitals, not nonprovider sites) or the “original” urban hospital must have found a new site in a geographically rural area that will serve as the rural site for purposes of the rural track in order for the “original” urban hospital to receive payment under § 413.79(k)(1) or (k)(2).

We proposed to revise the regulations at § 413.79(k)(7) to implement these provisions and to establish that these changes would be effective for cost reporting periods beginning on or after October 1, 2014.

Comment: In general, commenters supported the rural track proposals. Some commenters requested that, instead of providing for a 2-year transition period, CMS provide a 3-year transition period for the original urban hospital to find a new rural site. Commenters stated that it usually takes 3 years of financial and operational planning in order to develop a new training site and that it may take more time in rural areas where staffing is limited. Commenters stated that the program would also need time to request approval from the ACGME or the AOA to move its training site.

Response: We appreciate the commenters' support of our proposals related to rural track programs. As a result of commenters' concerns that 2 years is not a sufficient transition period to allow the “original” urban hospital to find another rural hospital to participate as the rural site for purposes of the rural track, we are finalizing a policy providing for an alternative transition period. The transition period will begin effective with the date of the implementation of the new OMB delineations and extend through the end of the second residency training year following the implementation date of the new OMB delineations. For example, if as a result of the OMB delineations implemented effective October 1, 2014, the rural hospital participating in a rural track program is redesignated as urban, the transition period for the “original” urban hospital to find a new rural site or for the redesignated hospital to reclassify back to rural for IME payment purposes, would last from October 1, 2014 through June 30, 2017. In addition, consistent with the effective date of the new OMB delineations, we are making these final policies effective October 1, 2014. We are revising the regulations at § 413.79(k)(7) to implement this change.
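As an illustrative aid only, the following sketch (Python; the function name is hypothetical) computes the end of the finalized transition period, assuming residency training years run from July 1 through June 30, as in the example above. The actual determination is governed by the regulation text at § 413.79(k)(7).

```python
from datetime import date

def transition_period_end(implementation_date: date) -> date:
    """End of the second residency training year following the implementation
    date of the new OMB delineations, assuming training years run July 1
    through June 30 (consistent with the example in the preamble)."""
    # Start year of the training year that contains the implementation date.
    start_year = implementation_date.year if implementation_date.month >= 7 else implementation_date.year - 1
    # That training year ends June 30 of start_year + 1; the second training
    # year following it therefore ends June 30 of start_year + 3.
    return date(start_year + 3, 6, 30)

print(transition_period_end(date(2014, 10, 1)))  # 2017-06-30, matching the example above
```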

The following examples illustrate how the policy finalized in this rule would be applied to an urban hospital that is training residents as part of a rural track program in the case where the rural hospital participating in the rural track program is redesignated as urban.

  • In this scenario, the rural hospital is redesignated as urban during the cap-building period for the urban hospital's rural track FTE limitation. The urban hospital (referred to as the “original” urban hospital) and the rural hospital are participating in training residents in a rural track program that begins July 1, 2014. Effective October 1, 2014, the rural hospital is redesignated as urban. Because urban teaching hospitals have a 3-year cap-building period in which to grow their rural track FTE limitation, the timeframe for the “original” urban hospital to build the rural track program for purposes of calculating its rural track FTE limitation will be July 1, 2014 (when the program begins) through June 30, 2017. In addition, for purposes of meeting the requirement that residents in a rural track program spend more than one-half of their time training at a rural site, the “original” urban hospital will have a transition period that lasts from October 1, 2014 (the implementation date of the new OMB delineations) through June 30, 2017 (the end of the second residency training year following the implementation date of the new OMB delineations, instead of September 30, 2016, as proposed). During the time period of October 1, 2014 through June 30, 2017, the redesignated urban hospital would continue participating as a rural hospital and the “original” urban hospital would count FTE residents it is training that are in the rural track for IME and direct GME. However, in order for the “original” urban hospital to receive a rural track FTE limitation effective July 1, 2017, and to continue to get paid for its rural track program after June 30, 2017, by June 30, 2017, the redesignated urban hospital must either reclassify as rural under § 412.103 of the regulations for purposes of IME payment only, or the “original” urban hospital must find a new geographically rural hospital or nonprovider site to train the residents in the rural track for more than one-half of their training. If neither of these conditions is met by June 30, 2017, the “original” urban hospital would not be able to receive payment for that specific program as a rural training track under § 413.79(k)(1) or (k)(2) because it would no longer meet the requirement that more than one-half of the training be provided in a rural setting. If at some point later, the “original” urban hospital is able to find a new rural site to participate in the rural track program, the “original” urban hospital would be able to receive a rural track FTE limitation based on the training that occurred from July 1, 2014 through June 30, 2017, and be paid for residents training in the rural track.
  • Another scenario could be one in which the rural hospital is redesignated as urban after the 3-year cap-building period for the “original” urban hospital's rural track FTE limitation has passed. For example, the rural track program began July 1, 2007, but effective October 1, 2014, the rural hospital is redesignated as urban. Again, in this example, the “original” urban teaching hospital has a transition period that runs from October 1, 2014 through June 30, 2017 (instead of September 30, 2016, as proposed). In this scenario, by June 30, 2017, either the redesignated urban hospital must reclassify to rural under § 412.103 for purposes of receiving IME payment only, or the “original” urban hospital must find a new geographically rural site that can participate as the rural site for purposes of the rural track. If neither of these conditions is met by June 30, 2017, the “original” urban hospital would not be able to receive payment for that specific program as a rural track under § 413.79(k)(1) or (k)(2) because it would no longer meet the requirement that more than one-half of the training must be provided in a rural setting. If at some point later, the “original” urban hospital is able to find a new rural site to participate in the rural track program, the “original” urban hospital would be able to use its rural track FTE limitation and be paid for residents training in the rural track.

Comment: Several commenters had concerns regarding rural tracks in general and concerns about what they believed would be unintended consequences resulting from the proposed policies. Commenters recommended that changes to OMB delineations be carefully managed with respect to rural track programs. Commenters stated that rural track programs are one of the best ways to expose residents to practicing in rural areas which, in turn, helps to address physician shortages in those areas. Commenters stated that if a rural hospital is the rural site for a rural track program and that rural hospital is subsequently redesignated as urban, it may not want to reclassify back to rural for a variety of reasons. Commenters stated that if the newly redesignated urban hospital does not want to reclassify back to rural and the “original” urban hospital wants to train residents in another rural area, the “original” urban hospital does not have any means by which to grow its cap. Commenters stated that even if the rural track program would be able to find and move to a different rural site, because the program would usually have the same staff and program director, CMS' policies would consider the program to be an existing program rather than a new program, and therefore, a rural hospital that is a new site for purposes of the rural track program would not be able to receive a cap adjustment for training residents in that program. Commenters stated that it is within CMS' purview to address this problem by changing the definition of a “new” program through the authority provided to the Secretary under section 1886(h)(4)(H)(i) of the Act, which states, “[in] promulgating such rules for purposes of subparagraph (F), the Secretary shall give special consideration to facilities that meet the needs of underserved rural areas.” Commenters believed CMS could use this authority to allow rural hospitals that are new rural track participants to receive a cap adjustment for training residents in the existing rural track program.

One commenter expressed concern that CMS, in its rulemaking, has not given enough consideration to promoting physician training in rural areas. The commenter referred to a study by Candice Chen, MD, et al., in Academic Medicine, which “reports that only 4.8% of all graduates of 759 sponsoring institutions practiced in rural areas and 198 of those 759 institutions produced no rural physicians. This percentage compares extremely unfavorably to the 19.3% of the population classified as rural by the 2010 census.” The commenter stated that it expects that hospitals that have been reclassified as urban will still have a focus on training residents to practice in rural areas. The commenter stated that CMS should realize that the training these hospitals provide is more important than the location of these hospitals, and therefore, CMS should give special consideration to residents training in programs at these hospitals by changing its definition of “new” programs. The commenter included comments it previously submitted on the clarification of the definition of new residency training programs in the rule published in the May 22, 2009 Federal Register.

Response: We appreciate the commenters' support of residency training in rural areas, and we may consider their general concerns regarding CMS' policies related to new programs and training in rural areas for future rulemaking. However, because we did not specifically propose any changes to our existing policy regarding what constitutes a “new” versus an “existing” program, we are not addressing those comments at this time. Instead, we wish to focus on several of the commenters' concerns specifically related to our proposals in the FY 2015 IPPS/LTCH PPS proposed rule.

In response to the commenters' concerns that if the “original” urban hospital wishes to establish training in another rural area, there is no way for the “original” urban hospital to grow its cap, we believe that the commenters have misunderstood our proposal. We proposed that if the “original” urban hospital does find a new rural hospital for its existing rural track program, the original urban hospital would be able to apply its existing rural track FTE limitation to the residents that train at its hospital as part of that rural track. In addition, if the “original” urban hospital was not able to receive a rural track FTE limitation because either the redesignated urban hospital did not reclassify back to rural for IME payment purposes during the transition period or the “original” urban hospital was not able to find a new rural site during the transition period, but either of these conditions is met in the future, the “original” urban hospital would receive a rural track FTE limitation at that time, based on the training that occurred during the 3-year cap-building period for the rural track FTE limitation. We also point out that if the “original” urban hospital moves the rural portion of its training to a nonprovider site that is located in a geographically rural area, under existing regulations at 42 CFR 413.79(k)(2), the “original” urban hospital may continue to count the FTE residents training in the rural nonprovider site for more than one-half the duration of the program up to its own existing rural track FTE limitation. In addition, if in the future, the “original” urban hospital would want to develop a rural track program in a different specialty, it would be able to receive a separate rural track FTE limitation for that rural track program in a different specialty.
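A minimal sketch, with hypothetical numbers, of the point made above that the existing rural track FTE limitation operates as a ceiling on the FTE residents the “original” urban hospital may count when the rural training moves to a new rural hospital or rural nonprovider site:

```python
# Illustrative only; figures are hypothetical.
def countable_rural_track_fte(trained_fte: float, rural_track_fte_limitation: float) -> float:
    """FTE residents countable toward the rural track: actual rural track FTEs,
    capped at the hospital's established rural track FTE limitation."""
    return min(trained_fte, rural_track_fte_limitation)

print(countable_rural_track_fte(trained_fte=5.0, rural_track_fte_limitation=4.0))  # 4.0
```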

In terms of any potential cap adjustment for a rural hospital that trains residents as part of the rural track, if the rural track is considered a new program for Medicare payment purposes, and if at the time that the “original” urban hospital moves the program to the new rural hospital, the new rural training track is still within its cap-building period, any rural hospital that trains residents in that new program during the cap-building period for that new program will receive a permanent cap adjustment. Therefore, if the “original” urban hospital is able to find a new rural hospital to participate in the rural track during the cap-building period for the new rural track program, that new rural hospital will, in fact, also be able to receive a cap adjustment and receive direct GME and IME payments for training residents in the new rural track program.

After consideration of the public comments we received, we are finalizing a policy that if a rural hospital is training residents in a rural training track and is in an area redesignated by OMB as urban during the 3-year period that is used to calculate the “original” urban hospital's rural track FTE limitation, the “original” urban hospital may receive a cap adjustment for that rural track after the rural hospital has been redesignated as urban. However, regardless of whether the redesignation of the rural hospital occurs during the 3-year period that is used to calculate the “original” urban hospital's rural track FTE limitation, or even after the 3-year period used to calculate the “original” urban hospital's rural track FTE limitation, the redesignated urban hospital may continue to be considered a rural hospital for purposes of the rural track for the term of a transition period. That transition period begins effective with the date the new OMB delineations are implemented by CMS and lasts through the end of the second residency training year following the implementation date of the new OMB delineations. By the end of the transition period, either the redesignated urban hospital must reclassify as rural under § 412.103 for purposes of IME payment only (in addition, this reclassification option only applies to IPPS hospitals, not nonprovider sites), or the “original” urban hospital must have found a new site in a geographically rural area that will serve as the rural site for purposes of the rural track in order for the “original” urban hospital to receive payment under § 413.79(k)(1) or (k)(2). The finalized regulations at § 413.79(k)(7) state the following:

  • Effective prior to October 1, 2014, if an urban hospital had established a rural track training program under the provisions of paragraph (k) with a hospital located in a rural area and that rural area subsequently becomes an urban area due to the most recent census data and implementation of the new labor market area definitions announced by OMB on June 6, 2003, the urban hospital may continue to adjust its FTE resident limit in accordance with paragraph (k) for the rural track programs established prior to the adoption of such new labor market area definitions. In order to receive an adjustment to its FTE resident cap for a new rural track residency program, the urban hospital must establish a rural track program with hospitals that are designated rural based on the most recent geographical location designations adopted by CMS.
  • Effective October 1, 2014, if an urban hospital started a rural track training program under the provisions of paragraph (k) with a hospital located in a rural area and, during the 3-year period that is used to calculate the urban hospital's rural track FTE limit, that rural area subsequently becomes an urban area due to the most recent OMB standards for delineating statistical areas adopted by CMS and the most recent Census Bureau data, the urban hospital may continue to adjust its FTE resident limit in accordance with paragraph (k) and subject to paragraph (k)(7)(iii) for the rural track programs started prior to the adoption of such new OMB standards for delineating statistical areas.
  • Effective October 1, 2014, if an urban hospital started a rural track training program under the provisions of paragraph (k) with a hospital located in a rural area and that rural area subsequently becomes an urban area due to the most recent OMB standards for delineating statistical areas adopted by CMS and the most recent Census Bureau data, regardless of whether the redesignation of the rural hospital occurs during the 3-year period that is used to calculate the urban hospital's rural track FTE limit, or after the 3-year period used to calculate the urban hospital's rural track FTE limit, the urban hospital may continue to adjust its FTE resident limit in accordance with paragraph (k) based on the rural track programs started prior to the change in the hospital's geographic designation. In order for the urban hospital to receive or use the adjustment to its FTE resident cap for training FTE residents in the rural track residency program that was started prior to the most recent OMB standards for delineating statistical areas adopted by CMS, one of the following two conditions must be met by the end of a period that begins when the most recent OMB standards for delineating statistical areas are adopted by CMS and continues through the end of the second residency training year following the date the most recent OMB delineations are adopted by CMS: the hospital that has been redesignated from rural to urban must reclassify as rural under § 412.103, for purposes of IME only; or the urban hospital must find a new site that is geographically rural consistent with the most recent geographical location delineations adopted by CMS. In order to receive an adjustment to its FTE resident cap for an additional new rural track residency program, the urban hospital must participate in a rural track program with sites that are geographically rural based on the most recent geographical location delineations adopted by CMS.

We also have determined that there is an outdated, incorrect reference included in the definition of “Rural track FTE limitation” under § 413.75(b). The reference included in the definition is “§ 413.79(l)”. The correct reference is “§ 413.79(k)”. Therefore, as we proposed, we are making a technical correction to the definition of “Rural track FTE limitation” so that it means the maximum number of residents (as specified in § 413.79(k)) training in a rural track residency program that an urban hospital may include in its FTE count and that is in addition to the number of FTE residents already included in the hospital's FTE cap.

4. Clarification of Policies on Counting Resident Time in Nonprovider Settings Under Section 5504 of the Affordable Care Act

In the November 24, 2010 final rule with comment period (75 FR 71808, 72134 through 72141, and 72153), we implemented section 5504 of the Affordable Care Act regarding counting resident time in nonprovider settings. We also mentioned the scope of section 5504 of the Affordable Care Act in the FY 2014 IPPS/LTCH PPS proposed rule (78 FR 27638) and final rule (78 FR 50735). Section 5504(a) of the Affordable Care Act made changes to section 1886(h)(4)(E) of the Act to reduce the costs that hospitals must incur for residents training in nonprovider sites in order to count the FTE residents for purposes of Medicare direct GME payments, but did so only on a prospective basis in connection with certain specified cost reporting periods. Notably and more specifically, section 5504(a)(3) of the Affordable Care Act amended the Act effective only for “cost reporting periods beginning on or after July 1, 2010,” for direct GME, to permit hospitals to count the time that a resident trains in activities related to patient care in a nonprovider site in its FTE count if the hospital incurs the costs of the residents' salaries and fringe benefits for the time that the resident spends training in the nonprovider site. Section 5504(b)(2) of the Affordable Care Act made similar changes to section 1886(d)(5)(B)(iv) of the Act for IME payment purposes, with the provision being effective only for discharges occurring on or after July 1, 2010, for IME. In connection with those periods and discharges, if more than one hospital incurs the residency training costs in a nonprovider setting, under certain circumstances, sections 5504(a)(3) and (b)(2) of the Affordable Care Act allow each hospital to count a proportional share of the training time that a resident spends training in that setting, as determined by a written agreement between the hospitals. When Congress enacted section 5504 of the Affordable Care Act, it retained the statutory language which provides that a hospital can only count the time so spent by a resident under an approved medical residency training program in its FTE count if that one single hospital by itself “incurs all, or substantially all, of the costs for the training program in that setting.” In doing so, Congress also revised the statutory language in sections 5504(a)(1) and (b)(1) to explicitly make this longstanding substantive standard and requirement applicable to “cost reporting periods beginning before July 1, 2010” for direct GME, and to “discharges occurring on or after October 1, 1997, and before July 1, 2010,” for IME (sections 1886(d)(5)(B)(iv)(I) and 1886(h)(4)(E)(i) of the Act). Beginning at least as early as 1988, the Secretary consistently noted in the preamble of various rules that the statute only allowed a hospital to count the time that its residents spent training in a nonprovider site in the FTE resident count for direct GME and IME purposes if that single hospital incurred “all or substantially all” of the costs of the training program in that setting. Indeed, in Borgess Medical Center v. Sebelius (966 F.Supp.2d at *6-*7 (D.D.C. 2013)), a court noted that CMS had done so in 1998, 2003, and 2007 preambles of rules.
For a full discussion of the longstanding substantive standard and requirement that a hospital can only count residents training if that one single hospital incurs all or substantially all of the costs for the training, we refer readers to the discussion in the November 24, 2010 final rule with comment period (75 FR 72134 through 72141), the May 11, 2007 final rule (72 FR 26953 and 26969), the August 1, 2003 final rule (68 FR 45439), the July 31, 1998 final rule (63 FR 40954 and 40995), the September 29, 1989 final rule (54 FR 40286 and 40288), and the September 21, 1988 proposed rule (53 FR 36589 and 36591).
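For illustration only, the following sketch (Python; the hospital names, shares, and function name are hypothetical) shows how, for cost reporting periods beginning on or after July 1, 2010 (direct GME) and discharges occurring on or after July 1, 2010 (IME), FTE time spent in a shared nonprovider rotation could be apportioned among hospitals according to the proportions set out in their written agreement. The actual counting rules are those at §§ 413.78(g) and 412.105(f)(1)(ii)(E).

```python
# Illustrative sketch with hypothetical figures; the shares come from the hospitals'
# written agreement, as described in sections 5504(a)(3) and (b)(2) of the
# Affordable Care Act, and are not prescribed by the rule itself.

def allocate_nonprovider_fte(total_fte_time: float, agreed_shares: dict[str, float]) -> dict[str, float]:
    """Split nonprovider-site FTE time among hospitals using the proportions set
    out in their written agreement (shares are expected to sum to 1.0)."""
    if abs(sum(agreed_shares.values()) - 1.0) > 1e-9:
        raise ValueError("Shares in the written agreement should sum to 1.0")
    return {hospital: total_fte_time * share for hospital, share in agreed_shares.items()}

print(allocate_nonprovider_fte(1.0, {"Hospital A": 0.6, "Hospital B": 0.4}))
# {'Hospital A': 0.6, 'Hospital B': 0.4}
```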

Section 5504(c) of the Affordable Care Act specifies that the amendments made by the provisions of sections 5504(a) and (b) “shall not be applied in a manner that requires reopening of any settled hospital cost reports as to which there is not a jurisdictionally proper appeal pending as of the date of the enactment of this Act on the issue of payment for indirect costs of medical education . . . or for direct graduate medical education costs. . . .” The date of enactment of the Affordable Care Act was March 23, 2010.

In the November 24, 2010 final rule with comment period, we revised the regulations at § 412.105(f)(1)(ii)(E) for IME and §§ 413.78(f) and (g) for direct GME to reflect the changes made by section 5504 of the Affordable Care Act. Section 413.78(g) is the implementing regulation that corresponds to the statutory amendments set forth in sections 5504(a)(3) and (b)(2) of the Affordable Care Act. The introductory regulatory language of § 413.78(g) explicitly states that paragraph (g) governs only “cost reporting periods beginning on or after July 1, 2010.” Paragraph (g)(5) of § 413.78 also expressly states that the paragraph is limited to “cost reporting periods beginning on or after July 1, 2010.” Accordingly, we have repeatedly stated, and we believe that the existing regulation makes plain, that paragraph (g) of § 413.78 “is explicitly made applicable only to `cost reporting periods beginning on or after July 1, 2010,' whereas earlier cost reporting periods are governed by other preceding paragraphs of § 413.78” (78 FR 50735). In addition, we also revised the definition of “all or substantially all of the costs for the training program in the nonhospital setting” in the regulations at § 413.75(b) to reflect that both the statute and regulations require that, for cost reporting periods beginning on or after July 1, 2007, and before July 1, 2010, one hospital must by itself incur “all or substantially all of the costs” of the residents training in the nonprovider site in order for the hospital to receive Medicare IME and direct GME payment for that training. Finally, we also revised the IME regulations at § 412.105 to reflect these statutory amendments, by incorporating by reference § 413.78(g).

Despite the fact that sections 5504(a) and (b) of the Affordable Care Act provide clear effective dates with respect to the amendments provided therein to sections 1886(h)(4)(E) and 1886(d)(5)(B)(iv) of the Act, and that the preamble discussion of the implementation of these provisions and further discussion of the statutory amendments in the November 24, 2010 final rule with comment period and in the August 19, 2013 final rule provide further explanation that, specifically, nothing in section 5504(c) overrides those effective dates (75 FR 72136), we have received questions about the applicability of section 5504(c) and the associated regulation text at § 413.78(g)(6). Specifically, questions have been raised with respect to the applicability of sections 5504(c) of the Affordable Care Act and § 413.78(g)(6) of the regulations to periods prior to July 1, 2010, particularly if a hospital had, as of March 23, 2010, appealed an IME or direct GME issue for a settled cost reporting period occurring prior to July 1, 2010. As noted earlier, section 5504(c) of the Affordable Care Act provides that the amendments made by the provisions of sections 5504(a) and (b) “shall not be applied in a manner that requires reopening of any settled hospital cost reports as to which there is not a jurisdictionally proper appeal pending as of . . . [March 23, 2010] on the issue of payment for indirect costs of medical education . . . or for direct graduate medical education costs. . . .”

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28153), we stated that upon revisiting the existing regulation text, we determined that § 413.78(g)(6) was not written in a manner that is as consistent with section 5504(c) of the Affordable Care Act and reflective of our reading of section 5504 and our policy as it could be. Specifically, § 413.78(g)(6) states that the provisions of paragraphs (g)(1)(ii), (g)(2), (g)(3), and (g)(5) of the section cannot be applied in a manner that would require the reopening of settled cost reports, except those cost reports on which there is a jurisdictionally proper appeal pending on direct GME or IME payments as of March 23, 2010. In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28152 through 28154 and 28307), we reiterated our existing interpretation of the statutory amendments made by sections 5504(a), (b), and (c) of the Affordable Care Act and also proposed to clarify the regulation text implementing these provisions by revising the language at § 413.78(g)(6) to read more consistently with our reading of section 5504 and the language in section 5504(c) of the Affordable Care Act and to ensure no further confusion with respect to the applicability of section 5504(c) of the Affordable Care Act and § 413.78(g)(6) of the regulations.

We believe that sections 5504(a) and (b) of the Affordable Care Act contained three primary directives (a fourth, regarding recordkeeping requirements, is tangential to this discussion): (1) under sections 5504(a)(1) and (b)(1) of the Affordable Care Act (sections 1886(h)(4)(E)(i) and 1886(d)(5)(B)(iv)(I) of the Act), for “cost reporting periods beginning before July 1, 2010” for direct GME, and for “discharges occurring on or after October 1, 1997, and before July 1, 2010” for IME, these sections explicitly retained the statutory language that provides that a hospital can only count the time so spent by a resident under an approved medical residency training program in its FTE count if a hospital by itself “incurs all, or substantially all, of the costs for the training program in that setting”; (2) under sections 5504(a)(3) and (b)(2) of the Affordable Care Act (sections 1886(h)(4)(E)(ii) and 1886(d)(5)(B)(iv)(II) of the Act), for “cost reporting periods beginning on or after July 1, 2010” for direct GME, and for “discharges occurring on or after July 1, 2010” for IME, these sections eliminated the “all or substantially all” requirement, instead requiring a hospital to incur the residents' salaries and fringe benefits for the time spent at the nonprovider site; and (3) under sections 5504(a)(3) and (b)(2) of the Affordable Care Act (sections 1886(h)(4)(E)(ii) and 1886(d)(5)(B)(iv)(II) of the Act), for “cost reporting periods beginning on or after July 1, 2010” for direct GME, and for “discharges occurring on or after July 1, 2010” for IME, these sections created a new provision with regard to allowing more than one hospital to share the costs of residents training in a nonprovider setting under certain circumstances, in order for each hospital to count a proportional share of the FTE training time in the nonprovider setting.
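The date-based applicability of these directives can be summarized in a short sketch (illustrative only; the function name is hypothetical, and the sketch omits the October 1, 1997 lower bound noted for IME):

```python
from datetime import date

CUTOFF = date(2010, 7, 1)  # effective date of the new standards in sections 5504(a)(3) and (b)(2)

def applicable_nonprovider_standard(payment_type: str, relevant_date: date) -> str:
    """Sketch of the date-based applicability summarized above. For direct GME the
    relevant date is the start of the cost reporting period; for IME it is the
    date of discharge. Returns a short description of the governing standard."""
    if payment_type not in {"direct GME", "IME"}:
        raise ValueError("payment_type must be 'direct GME' or 'IME'")
    if relevant_date < CUTOFF:
        return ("a single hospital must incur all, or substantially all, of the costs "
                "of the training program in the nonprovider setting")
    return ("the hospital must incur the residents' salaries and fringe benefits for the "
            "nonprovider-site time; proportional sharing permitted by written agreement")

print(applicable_nonprovider_standard("direct GME", date(2010, 6, 30)))
print(applicable_nonprovider_standard("IME", date(2010, 7, 1)))
```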

Separately from sections 5504(a) and (b) of the Affordable Care Act, section 5504(c) of the Affordable Care Act, as mentioned earlier, specifies that the amendments made by the provisions of sections 5504(a) and (b) “shall not be applied in a manner that requires reopening of any settled hospital cost reports as to which there is not a jurisdictionally proper appeal pending as of” March 23, 2010, the date of the enactment of the Affordable Care Act, on the issue of payment for IME and direct GME. When we proposed to implement section 5504(c) in the August 3, 2010 proposed rule (75 FR 46385) and when we implemented section 5504(c) in the November 24, 2010 final rule with comment period (75 FR 72136), we had to consider what new meaning it was adding to sections 5504(a) and (b) of the Affordable Care Act because unlike, for example, section 5505 of the Affordable Care Act which has an effective date prior to enactment of the Affordable Care Act and, therefore, would apply to prior cost reporting periods, section 5504's applicable effective date for the new standards it creates was July 1, 2010, a date that came after enactment of the Affordable Care Act and was fully prospective. As we stated in the November 24, 2010 final rule with comment period (75 FR 72136), “Section 5504 is fully prospective with an explicit effective date of July 1, 2010, for the new standards it creates. Nothing in section 5504(c) overrides that effective date. Section 5504(c) merely notes that the usual discretionary authority of Medicare contractors to reopen cost reports is not changed by the provisions of section 5504; it simply makes clear that Medicare contractors are not required by reason of section 5504 to reopen any settled cost report as to which a provider does not have a jurisdictionally proper appeal pending. It does not require reopening in any circumstance; and the new substantive standard is, in any event, explicitly prospective. We believe if Congress had wanted to require such action or to apply the new standards to cost years or discharges prior to July 1, 2010, it would have done so in far more explicit terms.” We also noted in that rule (75 FR 72139) that “[the] statute does not provide CMS discretion to allow the counting of resident time spent in shared nonprovider site rotations for cost reporting periods beginning prior to July 1, 2010.” We continue to believe that Congress was clear in amending sections 1886(h)(4)(E) and 1886(d)(5)(B)(iv) of the Act to provide for new standards to be applied only prospectively, effective for cost reporting periods beginning on or after, and discharges occurring on or after, July 1, 2010. We also continue to believe that the plain meaning of section 5504(c) of the Affordable Care Act is that the Secretary is not required to reopen a cost report when there is no jurisdictionally proper appeal pending as of March 23, 2010, the date of the enactment of the Affordable Care Act, on the issue of payment for IME and direct GME. Therefore, we believe that section 5504(c) of the Affordable Care Act is merely a confirmation of the Secretary's existing discretionary authority in one particular context, and that sections 5504(a) and (b) of the Affordable Care Act and their effective dates become all the more prominent, and are not affected by section 5504(c).

As noted earlier, we revised the regulations at § 412.105(f)(1)(ii)(E) for IME, and § 413.78(g) for direct GME, to reflect the changes made by section 5504 of the Affordable Care Act in the November 24, 2010 final rule with comment period. We reiterate here that the introductory language of § 413.78(g) explicitly states that paragraph (g) governs only “cost reporting periods beginning on or after July 1, 2010” and paragraph (g)(5) also expressly states that the paragraph is limited to “cost reporting periods beginning on or after July 1, 2010” (78 FR 50735 and 78 FR 27639). As we noted before, we believe that the paragraphs of the regulations which precede paragraph (g), particularly paragraphs (c) through (f), consistent with the statute, make clear that a hospital may only count the time so spent by a resident under an approved medical residency training program in its FTE count, in connection with its pre-July 1, 2010 cost reporting periods and pre-July 1, 2010 patient discharges, if that one single hospital by itself “incurs all, or substantially all, of the costs for the training program in that setting.” Separately, we believe that the new standards set forth in sections 5504(a)(3) and (b)(2) of the Affordable Care Act and implemented by regulation at §§ 413.78(g) and 412.105(f)(1)(ii)(E), allowing cost sharing under certain circumstances do not ever apply to pre-July 1, 2010 cost reporting periods and pre-July 1, 2010 patient discharges. Moreover, we continue to believe the language in paragraph (g)(6) (along with the remainder of paragraph (g)) only applies to cost reporting periods beginning on or after July 1, 2010 and does not apply retroactively to cost reporting periods beginning before July 1, 2010. We had intended that the language under § 413.78(g)(6) do no more than simply paraphrase the language in section 5504(c) of the Affordable Care Act.

Accordingly, we believe that it is apparent that the provisions of sections 5504(a)(3) and (b)(2) of the Affordable Care Act are not to be applied prior to July 1, 2010, irrespective of whether a hospital may have had a jurisdictionally proper appeal pending as of March 23, 2010, on an IME or direct GME issue from a cost reporting period occurring prior to July 1, 2010.

In the FY 2015 IPPS/LTCH PPS proposed rule, we reiterated our existing interpretation of the statutory amendments made by sections 5504(a) and (b) of the Affordable Care Act and also proposed to clarify the regulatory text that implements these provisions by revising § 413.78(g)(6) to be more consistent with the language at section 5504(c) of the Affordable Care Act. We proposed to revise the regulatory language to read as follows: “The provisions of paragraphs (g)(1)(ii), (g)(2), (g)(3), and (g)(5) of this section shall not be applied in a manner that requires reopening of any settled cost reports as to which there is a jurisdictionally proper appeal pending as of March 23, 2010, on direct GME or IME payments. Cost reporting periods beginning before July 1, 2010 are not governed by paragraph (g) of this section.” The IME regulation at § 412.105(f)(1)(ii)(E) includes a reference to § 413.78(g)(6); therefore, no proposed change was needed to this section.

Comment: One commenter supported CMS' proposed changes with regard to implementation of section 5504 of the Affordable Care Act. Other commenters objected to CMS' interpretation that section 5504 is fully prospective with an effective date of July 1, 2010, and asserted that CMS' proposed revision of § 413.78(g)(6) would have a “retroactive effective date.” The commenters asserted that CMS' interpretation is contrary to the plain meaning of the statute because Congress expected that cost reports that were settled prior to 2010 would not be reopened, thereby explicitly adding under section 5504(c) that if the cost report was not settled, and if there was a jurisdictionally proper appeal pending as of the date of the enactment of the Affordable Care Act, the provisions of section 5504 would apply. One commenter noted that an interpretation must “give effect, if possible, to every clause and word of the statute” (United States v. Menasche, 348 U.S. 528, 538-39 (1955)). The commenter therefore believed that subsection (c) would be superfluous if section 5504 were only prospective, particularly considering that Congress had no need to instruct the Secretary not to do something that she would not have done anyway (that is, reopen cost reports without a statutory mandate). The commenters asserted that CMS, in the August 3, 2010 proposed rule (75 FR 46385), initially interpreted section 5504(c) to effectively override the directives in sections 5504(a) and (b) and to require reopening and application of the new standards set forth in section 5504(a)(3) and section 5504(b)(2) to pre-July 1, 2010 cost reporting periods and patient discharges whenever a hospital had a jurisdictionally proper appeal pending on a direct GME or IME issue as of March 23, 2010. The commenters also did not believe it is appropriate for CMS to contend that section 5504 is strictly prospective but, at the same time, propose to clarify an amendment to the regulations at § 413.78(g)(6) “with retroactive effect to 2010.” One commenter argued that a final rule must be a “logical outgrowth” of the proposed rule, and that the final regulation implemented in the November 24, 2010 final rule with comment period was the same as that proposed. The commenter surmised that CMS “likely did not revise the final codified regulation in order to avoid a challenge that the final rule was not the `logical outgrowth' of the proposed rule,” and asserted that CMS' proposed clarification of § 413.78(g)(6) in the FY 2015 IPPS/LTCH PPS proposed rule cannot be applied prior to October 1, 2014. The commenters suggested that the Secretary and CMS reconsider the proposal to change § 413.78(g)(6), and acknowledge that, as promulgated in the November 24, 2010 final rule with comment period, § 413.78(g)(6) required reopening of a hospital cost report for which a jurisdictionally proper appeal was pending regarding GME and/or IME as of the date of enactment of the Affordable Care Act.

Response: We agree with the commenters that some meaning must be attributed to the statutory language at section 5504(c) of the Affordable Care Act that the amendments made by the provisions of sections 5504(a) and (b) “shall not be applied in a manner that requires reopening of any settled hospital cost reports as to which there is not a jurisdictionally proper appeal pending as of the date of the enactment of this Act on the issue of payment for indirect costs of medical education . . . or for direct graduate medical education costs. . . .” Congress knows how to explicitly provide for retroactive application or apply new standards to pending appeals when it so desires. Indeed, the same statute at issue here, the Affordable Care Act, contains numerous sections that, unlike section 5504 of the Affordable Care Act, are either explicitly retroactive or expressly apply new standards to pending appeals. For example, section 5505 of the Affordable Care Act (unlike section 5504) contains explicitly retroactive language. Section 5505(c)(1) of the Affordable Care Act states, “[e]xcept as otherwise provided, the Secretary . . . shall implement the amendments made by this section in a manner so as to apply to cost reporting periods beginning on or after January 1, 1983”; section 5505(c)(2) instructs that a subsection “shall apply to cost reporting periods beginning on or after July 1, 2009”; section 5505(c)(3) instructs that another subsection “shall apply to cost reporting periods beginning on or after October 1, 2001”. Section 5504 has nothing comparable to the express retroactive language which is to be found in section 5505. As another example, section 1556(c) is explicitly retroactive and expressly applies a standard to pending appeals, unlike section 5504 of the Affordable Care Act. Section 1556(c) of the Affordable Care Act states, “[t]he amendments made by this section shall apply with respect to claims filed under [a 1976 statute] after January 1, 2005, that are pending on or after the date of enactment of this Act”. The fact that Congress was explicit when it intended for particular provisions of the Affordable Care Act to apply retroactively and/or to apply to pending proceedings, while section 5504 of the Affordable Care Act contains no such statements, suggests that Congress did not intend for the new substantive standards set forth in sections 5504(a)(3) and (b)(2) of the Affordable Care Act to apply to earlier periods and discharges and/or to pending appeals. Instead, we can presume that Congress acted intentionally and purposely by omitting such language in section 5504 of the Affordable Care Act.
As we explained in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28152 through 28154), when we proposed to implement section 5504(c) in the August 3, 2010 proposed rule (75 FR 46385), and when we implemented section 5504(c) in the November 24, 2010 final rule with comment period (75 FR 72136), we had to consider what new meaning it was adding to sections 5504(a) and (b) of the Affordable Care Act because unlike, for example, section 5505 of the Affordable Care Act, which has an effective date prior to enactment of the Affordable Care Act and, therefore, would apply to prior cost reporting periods, section 5504's applicable effective date for the new standards it creates was July 1, 2010, a date that came after enactment of the Affordable Care Act and was fully prospective (the new standards being that hospitals would be permitted to count the time that a resident trains in activities related to patient care in a nonprovider site in its FTE count if the hospital incurs the costs of the residents' salaries and fringe benefits for the time that the resident spends training in the nonprovider site, and if more than one hospital incurs the residency training costs in a nonprovider setting, under certain circumstances, each hospital could count a proportional share of the training time that a resident spends training in that setting). As we stated in the November 24, 2010 final rule with comment period (75 FR 72136), “Section 5504 is fully prospective with an explicit effective date of July 1, 2010, for the new standards it creates. Nothing in section 5504(c) overrides that effective date. Section 5504(c) merely notes that the usual discretionary authority of Medicare contractors to reopen cost reports is not changed by the provisions of section 5504; it simply makes clear that Medicare contractors [MACs] are not required by reason of section 5504 to reopen any settled cost report as to which a provider does not have a jurisdictionally proper appeal pending. It does not require reopening in any circumstance; and the new substantive standard is, in any event, explicitly prospective. We believe if Congress had wanted to require such action or to apply the new standards to cost years or discharges prior to July 1, 2010, it would have done so in far more explicit terms.”

Therefore, we believe we were clear in the November 24, 2010 final rule with comment period that we did not interpret section 5504(c) to override the clear directives in sections 5504(a) and (b) concerning the substantive standards that would apply to pre- and post-July 1, 2010 cost reporting periods and discharges. We rejected the notion there that section 5504(c) requires reopening and application of the new, more generous standard (which sections 5504(a)(3) and (b)(2) created and expressly made “effective” only for cost reporting periods beginning and discharges occurring “on or after July 1, 2010”) to earlier periods and discharges whenever a hospital had a jurisdictionally proper appeal pending on direct or indirect GME as of the Affordable Care Act's enactment. Since that time, we have maintained our position that the new, more generous standard set forth in sections 5504(a)(3) and (b)(2) only applies to cost reporting periods beginning, and discharges occurring, “on or after July 1, 2010.” We have at the same time noted that Congress chose in sections 5504(a) and (b) to explicitly and pointedly retain the longstanding statutory substantive standard (that requires a single hospital to incur “all, or substantially all” of the costs of the nonprovider residency training before it may receive Medicare direct GME and IME payment for that training), and make it applicable to pre-July 1, 2010 cost reporting periods and discharges, while creating a new, more generous standard which it directed would apply to later periods and discharges. It is Congress who decided that the July 1, 2010 date would be significant, and we are honoring the choice Congress made. Therefore, we disagree with the commenters that it is inappropriate for CMS to propose to clarify an amendment to the regulations at § 413.78(g)(6) “with retroactive effect to 2010.” Moreover, we have consistently expressed our position that the new substantive standards which sections 5504(a)(3) and (b)(2) added to the Medicare statute apply only to cost reporting periods beginning, and discharges occurring, on or after July 1, 2010 (75 FR 46385 and 75 FR 72136). Accordingly, our proposed clarification of § 413.78(g)(6) reiterating our existing interpretation of the statutory amendments made by sections 5504(a) and (b) of the Affordable Care Act is appropriate.

Commenters argued that CMS' statements in the August 3, 2010 proposed rule initially interpreted section 5504(c) to mean that section 5504 could be applied retroactively to hospitals that indeed had a jurisdictionally proper appeal pending on a direct GME or IME issue as of March 23, 2010. However, the commenters misapprehended the position we took in the August 3, 2010 proposed rule. While it is true that the proposed rule defined the meaning of the term “pending, jurisdictionally proper appeal” that appears in section 5504(c) of the Affordable Care Act (75 FR 46385), it did not state that reopening was required when a hospital had such an appeal pending as of the date of enactment or in other circumstances. In addition, it never stated that the new standard set forth in sections 5504(a) and (b) could ever apply to a cost reporting period beginning prior to July 1, 2010 for direct GME purposes, or to a discharge occurring before July 1, 2010 for IME purposes. Quite to the contrary, the proposed rule noted that “[f]or direct GME payments, [section 5504] is effective for cost reporting periods beginning on or after July 1, 2010; for IME payments, the provision is effective for discharges occurring on or after July 1, 2010” (75 FR 46385; see similar language at 75 FR 46386), and advised that: “We are proposing to revise our regulation at § 413.75(b) accordingly to conform to these new statutory requirements [in section 5504 of the Affordable Care Act]. Specifically, we are proposing to revise the existing definition of “all or substantially all of the costs for the training program in the nonhospital setting” to be effective for cost reporting periods beginning on or after July 1, 2007, and before July 1, 2010. We also are proposing to add a new § 413.78(g) that details how hospitals should count residents that train in nonhospital sites for cost reporting periods beginning on or after July 1, 2010.”

Therefore, the August 3, 2010 proposed rule recognized that section 5504 required pre-July 1, 2010 cost reporting periods and discharges to be subject to the longstanding requirement that a single hospital incur all or substantially all of the costs of residents training in a nonprovider site, not the new, more generous standard set forth in section 5504(a) and (b) of the Affordable Care Act. As noted, in the November 24, 2010 final rule, in response to comments, the Secretary only made it even more explicit that she did not read section 5504(c) to require her to retroactively apply the new substantive standard in sections 5504(a) and (b) to pre-July 1, 2010 cost reporting periods and discharges (75 FR 72136 and 72153).

At least one court has held that our reading of section 5504 and the implementing regulation is reasonable and has rejected many of the arguments that the commenters made. The Eastern District of Michigan has recognized that “while section 5504(c) [of the Affordable Care Act] establishes that if there was not a pending appeal concerning a final cost report when the Affordable Care Act was enacted, that cost report will not be reopened, section 5504(c) does not establish that if there was a pending appeal concerning a final cost report when the Affordable Care Act was enacted, that cost report must be reopened; on this point the statute is silent,” and “Congress expressly indicated in the statute itself what standards apply to what cost periods” in sections 5504(a) and (b) of the Affordable Care Act (Covenant Medical Center v. Sebelius, No. 12-12901, 2014 WL 340247, at *8-*10 (E.D. Mich. Jan. 30, 2014)). The district court also noted that our reading of section 5504 gives effect to every clause and word of the provision as it honors the effective dates and standards prescribed in sections 5504(a) and (b). The court further noted that the current version of 42 CFR 413.78(g)(6) is “almost identical to section 5504(c)” and held that CMS' “interpretation of § 5504(c) is not undermined by her identical conclusion regarding section 413.78(g)(6)” (Covenant Medical Center v. Sebelius, No. 12-12901, 2014 WL 340247, at *11-12 (E.D. Mich. Jan. 30, 2014)). Therefore, we disagree with the commenter that surmised that, in the November 24, 2010 final rule with comment period, CMS “likely did not revise the final codified regulation in order to avoid a challenge that the final rule was not the `logical outgrowth' of the proposed rule.” Rather, as the court found, the current version of 42 CFR 413.78(g)(6) is “almost identical to section 5504(c),” and our interpretation of section 5504(c) is not undermined by our identical conclusion regarding § 413.78(g)(6). We had intended that the language under § 413.78(g)(6) do no more than simply paraphrase the language in section 5504(c) of the Affordable Care Act. Accordingly, we did not believe that it was necessary to revise the final regulation in the November 24, 2010 final rule with comment period. Nevertheless, as stated in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28153), because we have received questions about the applicability of section 5504(c) and the associated regulation text at § 413.78(g)(6), we took the opportunity to revisit the regulations and clarify them so that they would be even more consistent with the language at section 5504(c).

Comment: One commenter asked that CMS clarify that section 5504 “filled a gap in the law” regarding funding of residency training occurring in a nonprovider setting “by establishing for the first time the definitive law regarding Medicare payment for medical education to hospitals jointly funding training in a nonprovider setting,” and that section 5504 applies to hospitals with jurisdictionally proper appeals regarding that issue that were pending as of the date of the enactment of the Affordable Care Act.

Response: We do not agree with the commenter that section 5504 “filled a gap in the law” regarding more than one hospital incurring the costs of training residents in a nonprovider setting. Beginning at least as early as 1988, the Secretary has consistently noted in the preamble of various rules that the statute only allowed a hospital to count the time that its residents spent training in a nonprovider site in the FTE resident count for direct GME and IME purposes if that single hospital incurred “all or substantially all” of the costs of the training program in that setting. Indeed, in Borgess Medical Center v. Sebelius (966 F.Supp.2d 1 at *6-*7 (D.D.C. 2013)), a court noted that CMS had done so in the preambles of rules issued in 1998, 2003, and 2007. For a full discussion of the longstanding substantive standard and requirement that a hospital can only count residents training in a nonprovider setting if that single hospital incurs all or substantially all of the costs for the training, we refer readers to the discussion in the November 24, 2010 final rule with comment period (75 FR 72134 through 72141), the May 11, 2007 final rule (72 FR 26953 and 26969), the August 1, 2003 final rule (68 FR 45439), the July 31, 1998 final rule (63 FR 40954 and 40995), the September 29, 1989 final rule (54 FR 40286 and 40288), and the September 21, 1988 proposed rule (53 FR 36589 and 36591). We continue to believe that Congress was clear in amending sections 1886(h)(4)(E) and 1886(d)(5)(B)(iv) of the Act to provide for new standards to be applied only prospectively, effective for cost reporting periods beginning on or after, and discharges occurring on or after, July 1, 2010. Moreover, we continue to believe the language in paragraph (g)(6) of § 413.78 (along with the remainder of paragraph (g)) only applies to cost reporting periods beginning on or after July 1, 2010, and does not apply retroactively to cost reporting periods beginning before July 1, 2010. We believe that the new standards set forth in sections 5504(a)(3) and (b)(2) of the Affordable Care Act and implemented by regulation at §§ 413.78(g) and 412.105(f)(1)(ii)(E), allowing cost sharing under certain circumstances, do not ever apply to pre-July 1, 2010 cost reporting periods and pre-July 1, 2010 patient discharges. We had intended that the language under § 413.78(g)(6) do no more than simply paraphrase the language in section 5504(c) of the Affordable Care Act.

Accordingly, after consideration of the comments we received, we are not making any changes to our proposed clarification to the regulatory language at § 413.78(g)(6). The regulatory language at § 413.78(g)(6) states that the provisions of paragraphs (g)(1)(ii), (g)(2), (g)(3), and (g)(5) of the section shall not be applied in a manner that requires reopening of any settled cost reports as to which there is not a jurisdictionally proper appeal pending as of March 23, 2010, on direct GME or IME payments. Cost reporting periods beginning before July 1, 2010 are not governed by paragraph (g) of the section. The IME regulations at § 412.105(f)(1)(ii)(E) include a reference to § 413.78(g)(6); therefore, no change is needed to this section of the IME regulations.

5. Changes to the Review and Award Process for Resident Slots Under Section 5506 of the Affordable Care Act

In the past, if a teaching hospital closed, its direct GME and IME FTE resident cap slots would be “lost” because those cap slots are associated with a specific hospital's Medicare provider agreement, which would be retired upon the hospital's closure. Under existing regulations at § 413.79(h) for direct GME and § 412.105(f)(1)(ix) for IME, a hospital that is training FTE residents at or in excess of its FTE resident caps and takes in residents displaced by the closure of another teaching hospital may receive a temporary increase to its FTE resident caps so that it may receive direct GME and IME payment associated with those displaced FTE residents. However, those temporary FTE resident caps are tied to those specific displaced FTE residents, and the temporary caps expire when those displaced residents complete their training program.

Section 5506 of the Affordable Care Act amended section 1886(h)(4)(H) of the Act to add a new clause (vi) that instructs the Secretary to establish a process by regulation under which, in the event a teaching hospital closes, the Secretary will permanently increase the FTE resident caps for hospitals that meet certain criteria up to the number of the closed hospital's FTE resident caps. The Secretary is directed to ensure that the aggregate number of FTE resident cap slots distributed shall be equal to the aggregate number of slots in the closed hospital's direct GME and IME FTE resident caps, respectively. For a detailed discussion of the regulations implementing section 5506 of the Affordable Care Act, we refer readers to the November 24, 2010 final rule with comment period (75 FR 72212 through 72238) and the FY 2013 IPPS/LTCH PPS final rule (77 FR 53434 through 53448).

a. Effective Date of Slots Awarded Under Section 5506 of the Affordable Care Act

In distributing slots permanently under the provisions of section 5506 of the Affordable Care Act, section 5506(d) provides that “the Secretary shall give consideration to the effect of the amendments made by this section on any temporary adjustment to a hospital's FTE cap under § 413.79(h) . . . (as in effect on the date of enactment of this Act) in order to ensure that there is no duplication of FTE slots . . .” In consideration of this statutory language, in the FY 2013 IPPS/LTCH PPS final rule (77 FR 53437), we stated that in distributing slots permanently under section 5506, we would be cognizant of the number of FTE residents for whom a temporary FTE cap adjustment was provided under existing regulations at § 413.79(h), and when those residents will complete their training, at which point the temporary slots associated with those displaced residents would then be available for permanent redistribution. Therefore, in initially developing ranking criteria and application materials that we would use to award available slots, we considered how to interpret this statutory language at section 5506(d) of the Affordable Care Act within the context of our existing GME regulations and section 5506's amendment to section 1886(h) of the Act generally.

In the November 24, 2010 final rule with comment period and the FY 2013 IPPS/LTCH PPS final rule (75 FR 72216 and 77 FR 53436, respectively), we discussed the various ranking criteria that we would use for hospitals applying for slots from closed hospitals. Currently, if after distributing the slots from a closed hospital to increase the FTE caps for applying hospitals that fall within Ranking Criteria One, Two, and Three, there are still excess slots available and any of those excess slots are associated with displaced residents for whom temporary cap adjustments under § 413.79(h) are in place, any slots awarded to hospitals that fall within Ranking Criteria Four through Eight are permanently assigned only once the displaced residents have completed their training and the temporary cap adjustments associated with those residents have expired. That is, in applying the requirement for “no duplication of FTE slots” set forth in section 5506(d), we currently consider all temporary cap adjustments received by hospitals on a national basis and not specifically the hospital that is applying for cap slots under section 5506, when deciding the effective date for slots permanently awarded to hospitals applying under Ranking Criteria Four through Eight. Specifically, in the November 24, 2010 final rule with comment period, we stated that we believe the “no duplication of FTE slots” requirement applies across all hospitals. Therefore, although a hospital may not have received a temporary cap adjustment under § 413.79(h), other hospitals may have taken in residents and received temporary cap adjustments for the same program, and we believed that the appropriate policy was to delay the slots associated with that program from being permanently distributed until it is known that any and all temporary cap adjustments for those slots have expired (75 FR 72227). Applying this policy to an example, if Hospital A is training displaced residents and is receiving a temporary cap adjustment under § 413.79(h) for training those residents and Hospital B, which is not receiving a temporary cap adjustment for training any displaced residents, has applied under Ranking Criterion Five to expand its internal medicine program, as explained in the November 24, 2010 final rule with comment period, we would only award permanent slots under section 5506 to Hospital B on a flow basis; that is, effective after each displaced resident completes his/her training and, therefore, the temporary cap adjustment associated with that resident expires at Hospital A.

However, the policy of applying the “no duplication of FTE slots” requirement at section 5506(d) of the Affordable Care Act to all hospitals rather than simply to each specific hospital that is applying for slots has thus far proven to be a very complex process due to the number of displaced residents and the timing of multiple graduation dates that must be tracked and considered when awarding slots on a permanent basis. We believe this practice has delayed the awarding of slots and is also unnecessarily burdensome for hospitals applying under Ranking Criteria Four through Eight that are not receiving any cap adjustments for training displaced residents under § 413.79(h). We believe the current policy that we apply for “no duplication of FTE slots” is unnecessarily burdensome for these hospitals because, instead of receiving their permanent slots under section 5506 as soon as possible, the hospitals may receive their section 5506 awards with staggered effective dates due to the graduation dates of displaced FTE residents training at other hospitals that did receive temporary adjustments under § 413.79(h). While we believe that awarding permanent slots to a hospital that is simultaneously receiving a temporary cap adjustment for training displaced FTE residents under § 413.79(h) would clearly be a duplication of FTE slots and contrary to the statutory directive, we believe there is flexibility in interpreting this statutory language and that the statute does not require such a policy to be applied to hospitals that are not receiving temporary cap adjustments under § 413.79(h). Furthermore, in considering the specific statutory language regarding “no duplication of FTE slots,” section 5506(d) in part provides that “The Secretary of Health and Human Services shall give consideration to the effect of the amendments made by this section on any temporary adjustment to a hospital's FTE cap under section 413.79(h) of title 42, Code of Federal Regulations (as in effect on the date of enactment of this Act) in order to ensure that there is no duplication of FTE slots.” Because this language refers to “a hospital,” we believe the statute provides us with the flexibility to apply the “no duplication of FTE slots” requirement on a hospital-specific basis, considering separately whether each hospital did or did not receive a temporary cap adjustment under § 413.79(h), rather than on a national all-hospital basis. Bearing in mind the statutory language and our experience to date in awarding slots as well as the unnecessary burden placed on hospitals that are receiving section 5506 slots but are not receiving temporary cap adjustments under § 413.79(h), we stated in the FY 2015 IPPS/LTCH PPS proposed rule our belief that it was appropriate to propose a policy that would provide for a more efficient and faster method for awarding slots to hospitals applying under Ranking Criteria Four through Eight. Therefore, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28155), we proposed that, effective for section 5506 application rounds announced on or after October 1, 2014, for purposes of applying the requirement for “no duplication of FTE slots,” we would only require that there be no duplication of FTE slots on a hospital-specific basis. That is, in determining the effective date for slots awarded permanently under section 5506, we would only be concerned with whether the hospital that is applying for slots is also receiving a temporary cap adjustment under § 413.79(h) for training displaced residents.
When awarding slots to the applying hospital, we would not be concerned whether any other hospital is receiving a temporary cap adjustment for training displaced residents under § 413.79(h). For example, if Hospital A is receiving a temporary cap adjustment under § 413.79(h) for training displaced residents in its general surgery program but is applying under Ranking Criterion Five to start a pediatrics program and Hospital B is not receiving a temporary cap adjustment for training displaced residents and is applying under Ranking Criterion Eight to expand a cardiology program, in awarding section 5506 slots, we would only allow Hospital A to receive a permanent adjustment to its FTE cap for training residents in its pediatrics program once its temporary adjustments for the displaced residents training in the general surgery program have expired. We would not consider displaced residents when awarding section 5506 slots to Hospital B.

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28156), we stated that, in conjunction with our proposal to interpret the “no duplication of FTE slots” requirement to apply on a hospital-specific basis to hospitals that are receiving temporary cap adjustments under § 413.79(h), we proposed to amend the effective dates of section 5506 slots received under Ranking Criteria Four through Eight for those hospitals that are not receiving temporary cap adjustments under § 413.79(h). (We refer readers to section IV.K.5.c. of the preamble of this final rule where we discuss our proposal and final policy to amend Ranking Criteria Seven and Eight.) Existing policy requires that slots awarded under Ranking Criteria Four through Eight for expanding an existing residency training program or starting a new residency training program are effective the later of when a hospital can demonstrate to the MAC that the slots associated with a new program or program expansion are actually filled and, therefore, are needed as of a particular date (usually July 1, possibly retroactive), or the July 1 after displaced residents complete their training. If a hospital is awarded slots under Ranking Criterion Eight for cap relief, slots are effective the date of CMS' award announcement, or the July 1 after displaced residents complete their training, whichever is later. However, because we proposed an alternative approach to interpreting section 5506(d) that would permit us to apply the “no duplication of FTE slots” requirement on a hospital-specific basis, we proposed to change the effective date for slots received under Ranking Criteria Four through Eight so that if a hospital is not receiving a temporary cap adjustment under § 413.79(h), the slots awarded under section 5506 would be effective when the hospital can demonstrate to its MAC that the slots needed for a new program or program expansion are actually filled and, therefore, are needed as of a particular date (usually July 1, possibly retroactive). If a hospital is awarded slots under Ranking Criteria Four through Eight and is receiving a temporary cap adjustment to train displaced residents under § 413.79(h), the existing policy would apply such that the slots are awarded on a permanent basis, the later of when a hospital can demonstrate to the MAC that the slots associated with a new program or program expansion are actually filled and, therefore, are needed as of a particular date (usually July 1, possibly retroactive), or the July 1 after an equivalent amount of displaced FTE residents complete their training. In the proposed rule (79 FR 28156), we presented a hypothetical example: assume that a teaching hospital closes and that another hospital takes in two displaced FTE residents for which it receives a temporary cap adjustment under § 413.79(h). One resident is graduating on June 30, 2016, and the second resident is graduating on June 30, 2018. Assume that when the section 5506 Round is announced, the hospital also applies for two slots to expand an internal medicine program under Ranking Criterion Five. In January 2017, CMS awards two permanent slots to the hospital under Ranking Criterion Five. For the program year starting July 1, 2017, the hospital successfully demonstrates to the MAC that it filled the two additional internal medicine positions. Because one displaced FTE resident already graduated on June 30, 2016, the MAC may approve one slot on a permanent basis effective July 1, 2017.
However, the hospital would have to wait until July 1, 2018, to receive from the MAC the permanent slot for the second displaced internal medicine resident because the second displaced FTE resident is not graduating until June 30, 2018.

Comment: Several commenters supported the proposal and agreed that applying the “no duplication of slots” policy on a national level, as opposed to a hospital-specific level, results in a very complex and unnecessarily burdensome review process which further delays the permanent distribution of slots from a closed hospital.

Response: We appreciate the commenters' support.

Comment: Two commenters opposed the proposal because they believed it added an unnecessary restriction to the effective dates of permanent section 5506 awards received under Ranking Criteria Four through Eight for hospitals that have temporary cap adjustments under § 413.79(h) and are training displaced residents from the closed hospital. The commenters noted that the proposal would require a hospital that has a temporary cap adjustment and is training a displaced resident from the closed hospital and is awarded slots under Ranking Criteria Four through Eight to wait until the displaced resident graduates in order to receive the permanent cap slot. On the other hand, if a hospital does not have a temporary cap adjustment and is awarded slots under Ranking Criteria Four through Eight, those slots would be effective when the hospital can demonstrate to its MAC that the slots needed for a new program or program expansion are actually filled without consideration of any temporary cap adjustment at another hospital. The commenters asserted that only Ranking Criteria One and Three are specifically tied to the training of displaced residents, and if a hospital applies under Ranking Criteria Four through Eight, it is, in fact, acknowledging that it does not qualify under Ranking Criterion One or Three and therefore should not be subject to limitations on the effective date of its award related to a temporary cap adjustment associated with a displaced resident. The commenters suggested that the revised effective date of slots awarded under Ranking Criteria Four through Eight apply to all hospitals and all awarded slots, regardless of whether the hospitals received a temporary cap adjustment under § 413.79(h), and that the “no duplication of slots” policy should not apply when section 5506 slots are being awarded for a completely different program or purpose than the program for which the hospital was awarded a temporary cap adjustment.

Response: We appreciate the commenters' concerns and suggestions regarding our application of the “no duplication of slots” policy as it applies to the effective dates for Ranking Criteria Four through Eight. However, we continue to believe that allowing a hospital to receive a permanent cap slot under section 5506 while at the same time receiving a temporary cap adjustment under § 413.79(h) would be contrary to the statutory directive of the “no duplication of slots” policy because as long as the displaced resident is still training and the hospital has a temporary cap adjustment for that resident and is receiving IME and direct GME payments, that slot is still in use by the hospital. Section 5506(d) in part provides that “The Secretary of Health and Human Services shall give consideration to the effect of the amendments made by this section on any temporary adjustment to a hospital's FTE cap under section 413.79(h) of title 42, Code of Federal Regulations (as in effect on the date of enactment of this Act) in order to ensure that there is no duplication of FTE slots” (emphasis added). Thus, we disagree with the commenters' overly broad interpretation of the statutory language, and continue to believe that the statute does not allow for duplication of slots within a hospital overall, even when those slots are awarded for completely different programs or purposes. In addition, prior to our proposal, our existing policy regarding effective dates for slots awarded under Ranking Criteria Four through Eight has been that where a temporary cap adjustment was in effect for displaced residents from a closed hospital, the effective dates for awards under Ranking Criteria Four through Eight are tied to the graduation dates of the displaced residents because as long as a hospital was awarded a temporary cap adjustment for a particular displaced resident, the slot associated with that resident is not yet available, regardless of the ranking criteria or the program or purpose for which the permanent section 5506 FTE cap slot was awarded. We believe that our proposed policy strikes the necessary balance between avoiding unnecessary complexity in the review of section 5506 applications and maintaining a policy that conforms to the statutory requirement for “no duplication of slots” under section 5506.

Consistent with policy implemented in subregulatory guidance in Change Request 7746, Transmittal 1171 (issued January 31, 2013; http://www.cms.gov/Regulations-and-Guidance/Guidance/Transmittals/downloads/R1171OTN.pdf) where we stated that slots awarded under a given round may only replace temporary FTE cap adjustments associated with residents displaced from that same round, we would like to clarify that our proposed application of the “no duplication of slots” policy would only apply for temporary cap adjustments and permanent section 5506 FTE cap slots associated with the same closed hospital (§ 413.79(h)). In addition, we note that, as we stated in the proposed rule (79 FR 28156), if a hospital is awarded slots under Ranking Criteria Four through Eight and is receiving a temporary cap adjustment to train displaced residents under § 413.79(h), the existing policy would apply such that the slots are awarded on a permanent basis, the later of when a hospital can demonstrate to the MAC that the slots associated with a new program or program expansion are actually filled and, therefore, are needed as of a particular date (usually July 1, possibly retroactive), or the July 1 after an equivalent amount of displaced FTE residents complete their training. That is, so long as a hospital continues to receive a temporary cap adjustment under § 413.79(h) for residents displaced from a specific closed hospital, that hospital's section 5506 award under Ranking Criteria Four through Eight associated with that specific closed hospital would also not be fully effective. Stating it simply, if a hospital has a temporary FTE cap adjustment of three FTEs due to the closure of Hospital Z, and the hospital is also awarded three permanent FTE cap slots under the section 5506 Round associated with Hospital Z, this hospital's permanent FTE cap adjustment of three would not take full effect until all three displaced FTEs from Hospital Z graduate, when the hospital's temporary FTE cap would go down to zero (§ 413.79(h)). When determining the effective dates of section 5506 FTE cap slots awarded under Ranking Criteria Four through Eight for a given Round of section 5506 from a given closed hospital, the hospital receiving the section 5506 slots would consider (1) whether it has a temporary cap adjustment associated with residents displaced from the closed hospital associated with that Round of section 5506, and (2) the difference (if any) between its section 5506 FTE cap slot award from that closed hospital, and the temporary cap adjustment associated with the same closed hospital. If a hospital is receiving a temporary cap adjustment for training displaced residents and its section 5506 award is less than or equal to the temporary cap adjustment, the section 5506 slots would become effective the later of when the hospital can demonstrate to the MAC that the slots associated with a new program or program expansion are actually filled and, therefore, are needed, or the July 1 after displaced residents complete their training. If a hospital is receiving a temporary cap adjustment for training displaced residents and its section 5506 award is greater than the temporary cap adjustment, the number of slots by which the section 5506 award exceeds the temporary cap adjustment would be available for use when the hospital can demonstrate to its MAC that the slots associated with the new program or program expansion are filled and, therefore, are needed as of a particular date (usually July 1, possibly retroactive). 
The effective dates for those slots in excess of the hospital's temporary cap adjustment in a given round would not hinge on whether a displaced resident has completed his/her training and, therefore, the temporary cap adjustment associated with that resident expires because there would be no duplication of slots for that hospital with respect to the slots awarded in excess of the hospital's temporary cap adjustment. However, the portion of the hospital's section 5506 award that is equal to or less than its temporary cap adjustment for displaced residents associated with the closed hospital from the same round would be subject to the “no duplication of FTE slots” requirement, and those section 5506 slot awards would become available only as an equivalent amount of temporary cap adjustment expires. The following examples illustrate the interplay between section 5506 slots awarded and temporary cap adjustments under § 413.79(h) associated with the same closed hospital:

Example 1:

Hospital A takes in two displaced FTE residents from a closed teaching hospital for which the hospital is receiving a temporary cap adjustment of 2.0 FTEs under § 413.79(h). One resident is graduating on June 30, 2016, and the second resident is graduating on June 30, 2018. When the section 5506 Round is announced, Hospital A also applies for two slots to expand an internal medicine program under Ranking Criterion Five. In January 2017, CMS awards two permanent slots to the hospital under Ranking Criterion Five. Hospital A would consider (1) whether it has a temporary cap adjustment associated with residents displaced from the closed hospital associated with that Round of section 5506 (yes, 2.0 FTEs), and (2) the difference (if any) between its section 5506 FTE cap slot award from that closed hospital, and the temporary cap adjustment associated with the same closed hospital (2.0 temporary cap minus 2.0 section 5506 award = 0, no difference). Because Hospital A's section 5506 award is (less than or) equal to the temporary cap adjustment, the section 5506 slots would become effective on a flow basis; that is, effective after each displaced resident completes his/her training, and as the temporary cap adjustment associated with that resident expires. For the program year starting July 1, 2017, Hospital A successfully demonstrates to the MAC that it filled the two additional internal medicine positions. Because one displaced FTE resident already graduated on June 30, 2016, the MAC may approve one slot on a permanent basis effective July 1, 2017. However, Hospital A would have to wait until July 1, 2018, to receive from the MAC the permanent slot for the second displaced internal medicine resident because the second displaced FTE resident is not graduating until June 30, 2018.

Example 2:

Hospital B takes in two displaced FTE residents from a closed teaching hospital for which Hospital B is receiving a temporary cap adjustment of 2.0 FTEs under § 413.79(h). One resident is graduating on June 30, 2018, and the second resident is graduating on June 30, 2019. When the section 5506 Round is announced, Hospital B applies for five slots to expand a geriatrics program under Ranking Criterion Four. In January 2017, CMS awards five permanent slots to Hospital B under Ranking Criterion Four. Hospital B would consider (1) whether it has a temporary cap adjustment associated with residents displaced from the closed hospital associated with that Round of section 5506 (yes, 2.0 FTEs), and (2) the difference (if any) between its section 5506 FTE cap slot award from that closed hospital, and the temporary cap adjustment associated with the same closed hospital (2.0 temporary cap minus 5.0 section 5506 award = 3, absolute value). Because Hospital B's section 5506 award is greater than the temporary cap adjustment, the number of slots by which the section 5506 award exceeds the temporary cap adjustment would be available for use when the hospital can demonstrate to its MAC that the slots associated with the new program or program expansion are filled and, therefore, are needed. For the program year starting July 1, 2017, Hospital B successfully demonstrates to the MAC that it filled all five additional geriatrics positions. Even though the displaced residents have not yet graduated, the MAC may approve three slots on a permanent basis effective July 1, 2017, because Hospital B's section 5506 award exceeds its temporary cap adjustment and Hospital B can use up to three of its five slots while the displaced residents are still training. However, Hospital B would have to wait until July 1, 2018, to receive from the MAC the fourth slot for the geriatrics program because the first displaced FTE resident is not graduating until June 30, 2018, and would then have to wait until July 1, 2019, to receive from the MAC the fifth slot for the geriatrics program because the second displaced resident is not graduating until June 30, 2019.

Example 3:

Hospital C does not take in any displaced residents and does not receive a temporary cap adjustment under § 413.79(h). When the section 5506 Round is announced, Hospital C applies for five slots to expand a geriatrics program under Ranking Criterion Four. In January 2017, CMS awards five permanent slots to Hospital C under Ranking Criterion Four. For the program year starting July 1, 2017, Hospital C successfully demonstrates to the MAC that it filled all five additional geriatrics positions. Because Hospital C did not receive a temporary cap adjustment, there would be no need to consider displaced residents at other hospitals when awarding permanent slots and determining effective dates under section 5506 for Hospital C. Therefore, Hospital C could receive a permanent adjustment of five FTEs to its cap for training residents in its geriatrics program effective July 1, 2017.
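The effective-date arithmetic walked through in Examples 1 through 3 can be restated compactly. The following is only an illustrative sketch in Python; the function name, its parameters, and the assumption that graduations fall on a June 30 are hypothetical conveniences for illustration and are not part of the rule, the application process, or any CMS system.

# Illustrative sketch only; not part of the rule or of any CMS system. It mirrors the
# hospital-specific "no duplication of FTE slots" computation for slots awarded under
# Ranking Criteria Four through Eight, as walked through in Examples 1 through 3.
from datetime import date, timedelta

def section_5506_effective_dates(award, temporary_cap_fte, displaced_graduations, fill_date):
    """Return (effective_date, slots) pairs for a hypothetical section 5506 award.

    award                 -- permanent FTE slots awarded under Ranking Criteria Four through Eight
    temporary_cap_fte     -- temporary cap adjustment under 42 CFR 413.79(h) from the same
                             closed hospital (zero if none)
    displaced_graduations -- graduation dates of the displaced residents covered by that
                             adjustment (assumed here to fall on a June 30)
    fill_date             -- date the hospital demonstrates to its MAC that the new or expanded
                             program slots are filled and needed (usually a July 1)
    """
    schedule = []

    # Slots in excess of the temporary cap adjustment raise no duplication concern for this
    # hospital, so they are available once the MAC accepts that the slots are filled.
    excess = max(0, award - temporary_cap_fte)
    if excess:
        schedule.append((fill_date, excess))

    # The remainder becomes effective on a flow basis: one slot on the July 1 after each
    # displaced resident graduates, and never earlier than the demonstrated fill date.
    remainder = award - excess
    for graduation in sorted(displaced_graduations)[:int(remainder)]:
        july_1_after_graduation = graduation + timedelta(days=1)  # June 30 -> July 1
        schedule.append((max(fill_date, july_1_after_graduation), 1))

    return schedule

# Example 2 above: a 5.0 FTE award, a 2.0 FTE temporary cap adjustment, displaced residents
# graduating June 30, 2018 and June 30, 2019, and positions shown filled for July 1, 2017.
print(section_5506_effective_dates(
    award=5,
    temporary_cap_fte=2,
    displaced_graduations=[date(2018, 6, 30), date(2019, 6, 30)],
    fill_date=date(2017, 7, 1),
))
# [(datetime.date(2017, 7, 1), 3), (datetime.date(2018, 7, 1), 1), (datetime.date(2019, 7, 1), 1)]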

With regard to a hospital that is training displaced FTE residents, has a temporary cap adjustment under § 413.79(h), and also applies both under Ranking Criteria One or Three, and under Ranking Criteria Four through Eight, the current policy with regard to the effective date of slots awarded under Ranking Criteria One and Three would still apply, and would not impact the policy described above for Ranking Criteria Four through Eight. That is, as stated in the FY 2013 IPPS/LTCH PPS final rule (77 FR 53443), slots awarded under Ranking Criteria One or Three would continue to become permanent (or effective) on a flow basis as displaced FTEs finish their training programs. If a hospital has a temporary cap adjustment under § 413.79(h) and is awarded slots under Ranking Criteria One or Three for training those displaced residents, then as the displaced FTE residents graduate, an equivalent amount of permanent section 5506 slots can become effective under Ranking Criterion One or Three (thereby avoiding duplication of FTE cap slots). If the amount of section 5506 slots awarded under Ranking Criterion One or Three is equal to the amount of the temporary cap adjustment, there would be no concern of duplication of FTE slots with respect to a hospital's other awards under Ranking Criteria Four through Eight because “no duplication” would already be addressed with regard to slots awarded within Ranking Criterion One or Three. Accordingly, other slots that the hospital is awarded under Ranking Criteria Four through Eight would not depend on whether a displaced resident has completed his/her training and, therefore, would be made available for use when the hospital can demonstrate to its MAC that the slots associated with the new program or program expansion are filled and, therefore, are needed. The following example illustrates this policy:

Example 4:

Hospital D takes in two displaced FTE residents from a closed teaching hospital for which Hospital D is receiving a temporary cap adjustment of 2.0 FTEs under § 413.79(h). One resident is graduating on June 30, 2018, and the second resident is graduating on June 30, 2019. When the section 5506 Round is announced, Hospital D applies for two slots under Ranking Criterion Three, and also applies for five slots to expand a geriatrics program under Ranking Criterion Four. In January 2017, CMS awards to Hospital D two permanent slots under Ranking Criterion Three, and five permanent slots under Ranking Criterion Four. With regard to the effective dates of the slots awarded under Ranking Criteria Four through Eight, Hospital D would consider (1) whether it has a temporary cap adjustment associated with residents displaced from the closed hospital associated with that Round of section 5506 (yes, 2.0 FTEs), and (2) the difference (if any) between its section 5506 FTE cap slot award from that closed hospital, and the temporary cap adjustment associated with the same closed hospital (2.0 temporary cap minus 7.0 section 5506 award = 5, absolute value). Because Hospital D's section 5506 award is greater than the temporary cap adjustment, the number of slots by which the section 5506 award exceeds the temporary cap adjustment (5 slots) would be available for use when Hospital D can demonstrate to its MAC that the slots associated with the new program or program expansion are filled and, therefore, are needed. For the program year starting July 1, 2017, Hospital D successfully demonstrates to the MAC that it filled all five additional geriatrics positions. Even though the displaced residents have not yet graduated, the MAC may approve all five slots on a permanent basis effective July 1, 2017, because Hospital D's section 5506 award exceeds its temporary cap adjustment by five slots and the amount of section 5506 slots awarded under Ranking Criterion Three is equal to the amount of the temporary cap adjustment. Therefore, “no duplication” is already addressed with regard to slots awarded within Ranking Criterion Three. On July 1, 2018, after the first displaced FTE resident has graduated, if Hospital D can demonstrate to the MAC that it filled a slot to replace the displaced resident under Ranking Criterion Three, Hospital D may receive from the MAC one permanent slot awarded under Ranking Criterion Three effective on that date. Similarly, on July 1, 2019, when the second displaced resident has graduated and Hospital D's temporary cap adjustment goes down to zero, if Hospital D recruits an additional resident to replace that second displaced resident, Hospital D may receive from the MAC its final permanent slot awarded under Ranking Criterion Three effective on that date.

However, if a hospital's number of permanent slots awarded under section 5506 Ranking Criterion One or Three is less than its temporary cap adjustment, and the hospital is also awarded slots under Ranking Criteria Four through Eight, the amount of the section 5506 slots awarded under Ranking Criteria Four through Eight that is equal to the remaining portion of the temporary cap adjustment would become effective the later of when the hospital can demonstrate to the MAC that the slots associated with a new program or program expansion are actually filled and, therefore, are needed, or the July 1 after the appropriate amount of displaced residents complete their training.
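Example 4 and the partial-coverage case described in the preceding paragraph follow the same arithmetic, with the temporary cap adjustment first offset by any award under Ranking Criterion One or Three. The short sketch below is again only a hypothetical illustration (the variable names are invented, and the partial-coverage values are assumed rather than taken from the rule); it is not part of the rule or of the application process.

from datetime import date

# Example 4 above: the 2.0 FTE Ranking Criterion Three award fully offsets the 2.0 FTE
# temporary cap adjustment, so the 5.0 FTE Ranking Criterion Four award carries no
# duplication concern and is effective when the positions are shown to be filled.
criterion_one_or_three_award = 2.0
temporary_cap_adjustment = 2.0
criteria_four_through_eight_award = 5.0
fill_date = date(2017, 7, 1)

uncovered_adjustment = max(0.0, temporary_cap_adjustment - criterion_one_or_three_award)
held_back = min(criteria_four_through_eight_award, uncovered_adjustment)
print(criteria_four_through_eight_award - held_back, "FTEs effective", fill_date)  # 5.0 FTEs, 2017-07-01

# Hypothetical partial-coverage variant: if only 1.0 FTE had been awarded under Ranking
# Criterion Three, 1.0 FTE of the Criterion Four award would be held back until the July 1
# after the uncovered displaced resident graduates (June 30, 2019, as in Example 4).
uncovered_adjustment = max(0.0, temporary_cap_adjustment - 1.0)           # 1.0 FTE uncovered
held_back = min(criteria_four_through_eight_award, uncovered_adjustment)  # 1.0 FTE held back
print(criteria_four_through_eight_award - held_back, "FTEs effective", fill_date)
print(held_back, "FTE effective", max(fill_date, date(2019, 7, 1)))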

After consideration of the public comments we received, we are finalizing, as proposed, the policy that, effective for section 5506 application rounds announced on or after October 1, 2014, the statutory provision at section 5506(d), which requires the Secretary when awarding slots under section 5506 to consider any temporary cap adjustment to a hospital's FTE cap under § 413.79(h) to ensure no duplication of FTE slots, will be interpreted in a manner such that the requirement for “no duplication of FTE slots” is applied on a hospital-specific basis rather than across all hospitals receiving temporary cap adjustments under § 413.79(h). Consistent with this change, we are finalizing our proposal to amend the effective date for slots received under Ranking Criteria Four through Eight so that if a hospital is not receiving a temporary cap adjustment under § 413.79(h), the slots awarded under section 5506 would be effective when the hospital can demonstrate to its MAC that the slots needed for a new program or program expansion are actually filled and, therefore, are needed as of a particular date (usually July 1, possibly retroactive). However, if a hospital is receiving a temporary cap adjustment under § 413.79(h), we would consider the number of displaced residents in determining the effective date of the slots awarded under section 5506 such that as long as a hospital continues to receive a temporary cap adjustment under § 413.79(h) for residents displaced from a specific closed hospital, that hospital's section 5506 award under Ranking Criteria Four through Eight associated with that specific closed hospital would also not be fully effective. When determining the effective date of section 5506 FTE cap slots awarded under Ranking Criteria Four through Eight for a given round of section 5506 from a given closed hospital, the hospital receiving the section 5506 slots would consider (1) whether it has a temporary cap adjustment associated with residents displaced from the closed hospital associated with that round of section 5506, and (2) the difference (if any) between its section 5506 FTE cap slot award from that closed hospital, and the temporary cap adjustment associated with the same closed hospital. If a hospital is receiving a temporary cap adjustment for training displaced residents and its section 5506 award is less than or equal to the temporary cap adjustment, the section 5506 slots would become effective the later of when the hospital can demonstrate to the MAC that the slots associated with a new program or program expansion are actually filled and, therefore, are needed, or the July 1 after displaced residents complete their training. If a hospital is receiving a temporary cap adjustment for training displaced residents, and its section 5506 award is greater than the temporary cap adjustment, the number of slots by which the section 5506 award exceeds the temporary cap adjustment would be available for use when the hospital can demonstrate to its MAC that the slots associated with the new program or program expansion are filled and, therefore, are needed as of a particular date (usually July 1, possibly retroactive). The effective dates for those slots in excess of the hospital's temporary cap adjustment in a given round would not depend on whether a displaced resident has completed his/her training.
However, the portion of the hospital's section 5506 award that is equal to or less than its temporary cap adjustment for displaced residents associated with the closed hospital from the same round would continue to be subject to the “no duplication of FTE slots” requirement, and the section 5506 slots would become available only as an equivalent amount of temporary cap adjustment expires.

We did not propose any changes to the effective date for slots awarded under Ranking Criterion One, Ranking Criterion Two, or Ranking Criterion Three. Consistent with existing policy, if a hospital is applying under Ranking Criterion One or Ranking Criterion Three and is not receiving a temporary cap adjustment for training displaced residents under § 413.79(h), the effective date of the section 5506 slots is the date of the hospital closure. If a hospital is applying under Ranking Criterion One or Ranking Criterion Three and is receiving a temporary cap adjustment for training displaced residents under § 413.79(h), the effective date of the section 5506 slots is after the displaced resident(s) graduate. If a hospital is receiving a temporary cap adjustment for training displaced residents under § 413.79(h), and is applying under Ranking Criterion One or Ranking Criterion Three and is also separately applying under Ranking Criterion Four or subsequent Ranking Criteria, for slots awarded under Ranking Criteria One or Three, the effective date of the section 5506 slots is after the displaced resident(s) graduate. For slots awarded under Ranking Criterion Four or subsequent Ranking Criteria, the slots are awarded the later of when a hospital can demonstrate to the MAC that the slots associated with a new program or program expansion are actually filled and, therefore, are needed as of a particular date (usually July 1, possibly retroactive), or the July 1 after an equivalent amount of displaced FTE residents at the hospital complete their training. Therefore, for such a hospital, the effective dates of slots awarded under Ranking Criteria One/Three and Ranking Criteria Four through Eight might coincide. Also, consistent with existing policy, if a hospital is applying under Ranking Criterion Two, the effective date of the permanent award of section 5506 slots is the date of the hospital closure. We discuss these existing policies in greater detail in the FY 2013 IPPS/LTCH PPS final rule (77 FR 53437 through 53445).

b. Removal of Seamless Requirement

Under current policy, if a hospital is applying under Ranking Criterion One or Three, the hospital must show that it is seamlessly replacing displaced FTE residents with new FTE residents once the displaced residents graduate (75 FR 72219 and 72221 through 72222). We have stated that in instances where a hospital seamlessly operates an entire program or part of a program from the closed hospital (or takes over an entire program prior to the hospital's closure), such a hospital is demonstrating a strong commitment to maintain GME programs in the community for the long term and should be awarded slots under higher ranking criteria (75 FR 72216). Therefore, we required that, in order to receive slots under Ranking Criteria One and Three, the applying hospital must demonstrate that upon graduation of the displaced FTE residents that it is training, the slots held by those displaced FTEs are seamlessly replaced with new FTE residents (75 FR 72219 and 72221 through 72222). In the FY 2013 IPPS/LTCH PPS final rule (77 FR 53441), in response to concerns associated with the seamless requirement and the timeline used by the National Resident Matching Program and other resident match services, we revised the seamless requirement. We stated that, in the instance where a teaching hospital closed after December 31 of an academic year, in order for a hospital to qualify under Ranking Criterion One or Three for cap slots associated with displaced FTE residents who will graduate June 30 of the academic year in which the applying hospital took in the displaced FTE residents, the applying hospital must be able to demonstrate that it will fill slots vacated by displaced FTE residents by July 1 of the second academic year following the hospital closure. However, in the instance where a teaching hospital closed before December 31 of an academic year, in order for a hospital to qualify under Ranking Criterion One or Three for cap slots associated with displaced FTE residents who will graduate June 30 of the academic year in which the applying hospital took in the displaced FTE residents, the applying hospital must be able to demonstrate that it will seamlessly fill slots vacated by displaced FTE residents by that July 1; that is, the day immediately after the June 30 that the displaced FTE residents graduate (77 FR 53441 through 53442). We also revised the CMS Application Form to instruct a hospital applying under Ranking Criterion One or Three to list the names and graduation dates of specific displaced residents who, upon their graduation, have been or will be seamlessly replaced by new residents (77 FR 53446). Because Ranking Criteria One and Three fall under Demonstrated Likelihood Criterion 2 (the hospital is taking over all or part of an existing residency program from the closed hospital, or expanding an existing residency training program), the requirement to include a list with the names and graduation dates of specific displaced residents who have been or will be seamlessly replaced was added under Demonstrated Likelihood Criterion 2 on the CMS Application Form.

In addition to the match deadlines associated with the National Resident Matching Program and match deadlines associated with matching into osteopathic programs, we have recently been made aware of other match deadlines associated with certain fellowship programs. From the experience we have had so far in reviewing section 5506 applications, where we have observed the complexity of tracking various match deadlines as well as the intersection between these deadlines and when the section 5506 awards are announced by CMS, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28158), we proposed to remove the seamless requirement for slots awarded under Ranking Criteria One and Three effective for section 5506 application rounds announced on or after October 1, 2014. We did not propose to make any other changes to Ranking Criterion One or Three; that is, the hospital must still be training displaced residents and must either take over or have taken over an entire program from the closed hospital and continue operating that program in the same manner in which it was operated by the closed hospital, or the hospital must take over part of a closed hospital's program and permanently expand its own program as a result of training displaced residents. Hospitals would continue to be required to submit supporting documentation when applying under Ranking Criterion One or Three that indicates that they have made a commitment to take over the closed hospital's program or that they have made the commitment to permanently expand their own residency training program resulting from taking over part of a closed hospital's program.

In determining the effective date of slots awarded under Ranking Criterion One or Three where the hospital has been training residents that were displaced by the closed hospital and receiving a temporary cap adjustment under § 413.79(h), the hospital would work with its MAC to determine when it could be permanently awarded the slots based on the graduation dates of the displaced residents it is training. Consistent with our proposal, we proposed to remove the following requirement under Demonstrated Likelihood Criterion 2 on the CMS Application Form: “Hospitals applying for slots under option (a) which correlates to Ranking Criterion 1 or (b) which correlates to Ranking Criterion 3 must list the names and graduation dates of specific displaced residents who, upon their graduation, have been or will be seamlessly replaced by new residents. The list may be added as an attachment to this application.” We proposed to replace this requirement with the following requirement under Demonstrated Likelihood Criteria 1 and 2: “Please indicate Y or N: As of the time of submitting this application, are you receiving a temporary cap adjustment for IME and/or direct GME under 42 CFR 413.79(h) for residents displaced by the closure of the hospital subject to this Round of section 5506? (Y/N)” so that we are aware which hospitals are receiving temporary cap adjustments for training displaced residents under § 413.79(h), and when we award slots, we would know which hospitals to instruct to work with their MACs to determine when the slots could be permanently awarded to them based on the graduation dates of the displaced residents they are training.

In summary, we proposed to remove the seamless requirement currently included as part of Ranking Criterion One or Three. We also proposed to remove from the CMS Application Form, the following requirement: “Hospitals applying for slots under option a) which correlates to Ranking Criterion 1 or b) which correlates to Ranking Criterion 3 must list the names and graduation dates of specific displaced residents who, upon their graduation, have been or will be seamlessly replaced by new residents. This list may be added as an attachment to this application.”

Comment: Commenters supported the proposal to remove the seamless requirement for slots awarded under Ranking Criteria One and Three effective for section 5506 application rounds announced on or after October 1, 2014. One commenter stated that, in addition to complicating the CMS review process of section 5506 applications, the seamless requirement created an administrative burden for hospitals applying under Ranking Criteria One and Three. Another commenter stated it supported removing the seamless requirement because it has become very complicated and burdensome for hospitals that legitimately plan to continue training residents in a program once the displaced residents training in that program graduate. However, commenters requested that CMS “provide clear and consistent guidance” to explain the type of documentation that would meet the requirement that a hospital has made a commitment to take over the closed hospital's program or has made the commitment to permanently expand their own residency training program resulting from taking over part of a closed hospital's program.

Response: We thank the commenters for their support of our proposed policy to remove the seamless requirement under Ranking Criteria One and Three. We are finalizing the policy as proposed. Effective for section 5506 rounds announced on or after October 1, 2014, we are removing the seamless requirement previously included as part of Ranking Criteria One and Three. We are removing from the CMS Application Form the following language: “Hospitals applying for slots under option a) which correlates to Ranking Criterion 1 or b) which correlates to Ranking Criterion 3 must list the names and graduation dates of specific displaced residents who, upon their graduation, have been or will be seamlessly replaced by new residents. This list may be added as an attachment to this application.” We are adding to the CMS Application Form the following language under Demonstrated Likelihood Criteria 1 and 2: “Please indicate Y or N: As of the time of submitting this application, are you receiving a temporary cap adjustment for IME and/or direct GME under 42 CFR 413.79(h) for residents displaced by the closure of the hospital subject to this Round of section 5506? (Y/N).”

In response to the commenters' request that CMS “provide clear and consistent guidance” on the type of documentation that would meet the requirement under Ranking Criterion One or Three, we note that applicants should submit documentation as part of their application that indicates a commitment to take over the closed hospital's program or to permanently expand their own residency training program as a result of taking over part of a closed hospital's program. We believe that the documentation that the hospital submits to demonstrate the likelihood that it would fill the requested slots under Demonstrated Likelihood Criterion 2 is sufficient. Demonstrated Likelihood Criterion 2 is for taking over all or part of an existing residency program from the closed hospital, or expanding an existing residency program. Applicants should refer to the description of documentation included on the CMS Application Form under “Demonstrated Likelihood Criterion 2: Taking Over All or Part of an Existing Residency Program from the Closed Hospital, or Expanding an Existing Residency Program,” for examples of acceptable documentation. For example, if a hospital is applying under Ranking Criterion Three because it is permanently expanding its surgery program as a result of training residents displaced from a closed hospital's surgery program and it has submitted documentation to the accrediting body requesting approval of additional positions, or it has already received approval from the accrediting body for the expansion, such documentation would meet the requirement that a hospital applying under Ranking Criterion Three has made the commitment to permanently expand its own surgery program as a result of training displaced residents.

c. Revisions to Ranking Criteria One, Seven, and Eight for Applications under Section 5506

In the November 24, 2010 final rule with comment period (75 FR 72223), we finalized the Ranking Criteria within each of the first three statutory priority categories (that is, same or contiguous CBSAs, same State, and same region) to be used to rank applications for assignment of slots under section 5506 of the Affordable Care Act. For each application, we assigned slots based on the Ranking Criteria, with Ranking Criterion One being the highest ranking and Ranking Criterion Seven being the lowest. For a detailed discussion of the ranking categories, we refer readers to the November 24, 2010 final rule with comment period (75 FR 72212 through 72240).

After reviewing applications submitted during the first section 5506 application process (those applications that were due to CMS on April 1, 2011), we observed that the overwhelming majority of applications fell under Ranking Criterion Seven; that is, the applying hospital seeks the slots for purposes that do not fit into any of Ranking Criterion One through Ranking Criterion Six. These included applications from hospitals that applied for FTE cap slots both for primary care and/or general surgery and for nonprimary care specialties, as well as applications for general cap relief. The sheer number of applications we received under Ranking Criterion Seven indicated a need to further prioritize among the applicants that would have qualified under Ranking Criterion Seven. Therefore, in the FY 2013 IPPS/LTCH PPS final rule (77 FR 53434 through 53437), we finalized changes to the Ranking Criteria, replacing Ranking Criterion Seven with two separate Ranking Criteria (Ranking Criterion Seven and Ranking Criterion Eight) resulting in a total of eight Ranking Criteria. Under the Ranking Criteria, as modified by the FY 2013 IPPS/LTCH PPS final rule, a hospital that is applying both for the purpose of establishing or expanding primary care or general surgery programs, and in addition is requesting slots for the purpose of establishing or expanding nonprimary care or nongeneral surgery programs and/or for cap relief, must submit an application requesting additional FTE slots for its primary care or general surgery programs under Ranking Criterion Seven. The hospital's request for additional FTE slots to establish or expand a nonprimary care or nongeneral surgery program and/or for additional FTE slots for cap relief would then be made under Ranking Criterion Eight. Prior to this change, if a hospital applied for additional FTE slots to establish or expand both a primary care or general surgery program and a nonprimary care or nongeneral surgery program, and/or for additional FTE slots for cap relief, all of its applications (with the exception of Ranking Criteria One through Three) would fall under Ranking Criterion Seven. For a complete list of the Ranking Criteria, we refer readers to section IV.K.5.a. of the preamble of this final rule, which discusses the background for preservation of resident cap positions from closed hospitals under section 5506 of the Affordable Care Act.

After reviewing applications and making awards under several more rounds of section 5506 applications, we have observed that, as hospital closings continue to occur, there has been a significant increase in the time between a hospital's closure and the announcement of section 5506 awards by CMS. We believe that this delay is partly due to the administratively burdensome task of processing, reviewing, and responding to such a large number of applications for each hospital closure, or each round of section 5506 awards. When implementing section 5506 in the November 24, 2010 final rule with comment period (75 FR 72212 through 72249), we initially envisioned the reviewing of applications and awarding of section 5506 FTE slots as being a more streamlined and expedient process. However, as a practical matter, we have found that the process has been much more resource- and time-intensive than we had originally anticipated. This is partly due to the time and resources needed to properly apply the process established by CMS in reviewing section 5506 applications and awarding FTE cap slots. Since the initial implementation of section 5506, we have attempted to be responsive to these unexpected delays by refining the ranking criteria to make the review process less administratively burdensome. However, these changes did not streamline the process to the desired extent. Furthermore, we have observed that, while many of the applications submitted to CMS request FTE slots for purposes of general cap relief, we have more often than not awarded no slots at all for cap relief. This is due in large part to the limited number of slots available (many of the closed teaching hospitals did not have large FTE resident caps) and an overwhelming demand for those slots from applicants who apply for FTE slots for reasons other than cap relief. Since we finalized the modified Ranking Criterion Seven and added Ranking Criterion Eight in the FY 2013 IPPS/LTCH PPS final rule, and as of the issuance of the FY 2015 IPPS/LTCH PPS proposed rule, we had announced three new rounds of section 5506 applications due to the closures of six hospitals. We received a total of 424 applications from hospitals seeking cap relief. Of those 424 applications, only 6 were ultimately awarded FTE slots, which is only 1.42 percent of the total cap relief applications. We believe that the ratio of cap relief awardees to cap relief applications does not warrant the administrative burden and the delay in announcements of section 5506 awards that result from the large number of cap relief applications submitted to CMS that are invariably denied. Therefore, in an effort to streamline the review process and to facilitate publishing section 5506 awards in a more timely manner, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28159 through 28160), we proposed to modify Ranking Criterion Eight so that it would apply only to hospitals seeking FTE slots to establish or expand a nonprimary care or nongeneral surgery program. Ranking Criterion Eight would no longer be applicable to hospitals seeking FTE cap slots for cap relief. Our proposal to eliminate section 5506 awards of FTE slots for cap relief is consistent with current policy goals to increase training in primary care and general surgery.
By proposing to eliminate awarding of FTE slots for residents that are already being trained by a hospital, there will be more FTE resident slots available to award to other hospitals seeking to establish or expand a primary care or general surgery program under Ranking Criteria Four through Seven.
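For reference, the 1.42 percent figure cited above follows directly from the counts of cap relief applications and awards:

\[
\frac{6}{424} \approx 0.0142 = 1.42\ \text{percent.}
\]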

Accordingly, we proposed to revise Ranking Criterion Eight so that it reads as follows:

Proposed Ranking Criterion Eight: The program does not meet Ranking Criteria 1 through 7, and the applying hospital will use additional slots to establish or expand a nonprimary care or a nongeneral surgery program. In light of the modifications we proposed to Ranking Criterion Eight, we believe it is also necessary to modify the language of proposed Ranking Criterion Seven to specify the types of applications that would properly be made under this Ranking Criterion; that is, we proposed to remove the reference to cap relief from Ranking Criterion Seven so that it read as follows:

Proposed Ranking Criterion Seven: The applying hospital will use additional slots to establish or expand a primary care or general surgery program, but the program does not meet Ranking Criterion 5 or 6 because the hospital is also separately applying under Ranking Criterion 8 for slots to establish or expand a nonprimary care or nongeneral surgery program.

Comment: One commenter supported CMS' proposal to eliminate awarding of FTE slots for cap relief because doing so would increase the chance that a rural hospital located near very few teaching hospitals in the same or contiguous CBSA that are able to apply under Level Priority Category One would be awarded slots from a closed teaching hospital in the same State or region. Another commenter supported the proposal because it would make more FTE resident slots available to award to other hospitals seeking to establish or expand a primary care or general surgery program.

Many commenters opposed CMS' proposal to eliminate awarding of FTE slots for cap relief. They asserted that hospitals are, in fact, being awarded slots under Ranking Criterion Eight for cap relief, albeit sparingly, and therefore CMS should not remove hospitals' one and only opportunity to receive funding for training residents above their caps. Several commenters offered suggestions and alternate ways to modify Ranking Criterion Eight in order to ease CMS' administrative burden. One commenter noted that under section 5506, the only requirement that Congress mandated was that hospitals need to demonstrate the likelihood of filling the slots within 3 years, and that hospitals applying for cap relief meet this requirement.

Response: We appreciate the commenters' support and the numerous comments and suggestions regarding the awarding of FTE slots under section 5506 for cap relief. One of the objectives behind our proposal to eliminate awarding of slots for cap relief was to find a way to reduce the number of applications submitted to CMS, most of which are not approved for awards due to the limited number of slots available for redistribution. By eliminating the possibility of applying for cap relief, the volume of applications that CMS would receive, process, and review would be reduced, allowing CMS to award slots in a more timely fashion.

While we appreciate that hospitals are training residents above their caps and that being awarded section 5506 slots for general cap relief would be a welcome opportunity to receive some funding for these positions, we believe that general cap relief is inconsistent with the intent of section 5506 and incompatible with its underlying principles. We continue to believe that Congress intended that section 5506 be used to maintain the level of residents training in the area after the closure of a hospital by awarding permanent FTE cap slots to the hospitals that take in and continue to train displaced residents from the closed hospital. In addition, the regulations promulgated under section 5506 are consistent with current policy goals to focus on increasing training in primary care and general surgery. By eliminating cap relief for residents that are already being trained by a hospital, more slots would be available to award to other hospitals in the same State as the closed hospital that are seeking to establish or expand a primary care or general surgery program. Moreover, we believe awarding slots for cap relief is contrary to the historical premise of Medicare GME payments, as it allows hospitals to shift costs borne by other means to the Medicare Trust Fund. Furthermore, we continue to believe that Congress did not intend for section 5506 awards to be used to pay hospitals for residents that they were already training, possibly even before the closure of the hospital whose slots are being redistributed.

For the reasons mentioned above, coupled with our efforts to streamline the review process and facilitate publishing section 5506 awards in a more timely manner, we are finalizing our proposal to modify Ranking Criterion Eight so that it would apply only to hospitals seeking FTE slots to establish or expand a nonprimary care or nongeneral surgery program, and would no longer be applicable to hospitals seeking cap slots for cap relief. In light of the modifications to Ranking Criterion Eight, we also are finalizing our proposed change to Ranking Criterion Seven to correctly specify the types of applications that would properly be made under that Ranking Criterion by removing the reference to cap relief from Ranking Criterion Seven.

Accordingly, we are finalizing Ranking Criterion Seven and Ranking Criterion Eight as follows:

Ranking Criterion Seven: The applying hospital will use additional slots to establish or expand a primary care or general surgery program, but the program does not meet Ranking Criterion 5 or 6 because the hospital is also separately applying under Ranking Criterion 8 for slots to establish or expand a nonprimary care or nongeneral surgery program.

Ranking Criterion Eight: The program does not meet Ranking Criteria 1 through 7, and the applying hospital will use additional slots to establish or expand a nonprimary care or a nongeneral surgery program.

We are making changes to the Section 5506 Application Form to remove language associated with cap relief, including removal of the existing Demonstrated Likelihood Criterion 3, which was for cap relief.

Separately, we also proposed a change related to Ranking Criterion One. Current Ranking Criterion One is for an applying hospital that assumed an entire program or programs from the hospital that closed. We proposed to revise Ranking Criterion One to provide priority to hospitals in one additional scenario. Section 5503 of the Affordable Care Act amended section 1886(h) of the Act by adding new paragraph (8), which provided for the permanent reduction and distribution of residency slots. Section 1886(h)(8)(A)(ii) of the Act provides specific exceptions to the application of the reduction at section 1886(h)(8)(A)(i) of the Act, and expressly states: “Exceptions—This subparagraph shall not apply to (I) a hospital located in a rural area (as defined in subsection (d)(2)(D)(ii)) with fewer than 250 acute care inpatient beds.” The November 24, 2010 final rule with comment period (75 FR 72147) describes the agency's interpretation of this statutory provision. As of the time that the proposed rule was posted on the CMS Web site, we were aware of one instance in which CMS erroneously reduced a hospital's FTE resident cap contrary to this statutory exception. We proposed to amend Ranking Criterion One under section 5506 to provide priority to a hospital that had FTE resident cap slots erroneously removed under section 5503 contrary to the statutory exception at section 1886(h)(8)(A)(ii)(I) of the Act. We proposed to revise Ranking Criterion One as follows:

Ranking Criterion One. The applying hospital is requesting the increase in its FTE resident cap(s) because it is assuming (or assumed) an entire program (or programs) from the hospital that closed, and the applying hospital is continuing to operate the program(s) exactly as it had been operated by the hospital that closed (that is, same residents, possibly the same program director, and possibly the same (or many of the same) teaching staff). The applying hospital's FTE resident caps were erroneously reduced by CMS under section 1886(h)(8)(A)(i) of the Act, contrary to the statutory exception at section 1886(h)(8)(A)(ii)(I) of the Act, and CMS Central Office was made aware of the error prior to posting of the FY 2015 IPPS proposed rule on the CMS Web site.

Comment: One commenter asked that CMS clarify that this modification to Ranking Criterion One does not override the statutory priority of the categories included in the text of section 5506. The commenter suggested that CMS clarify this by indicating that the applying hospitals located within or contiguous to the same CBSA as the closed hospital would be eligible to receive cap slots, regardless of their ranking criteria before an applying hospital that meets the new second clause included within Ranking Criterion One but is not located within the same or contiguous CBSA as the closed hospital.

Response: We are clarifying, as the commenter requested, that the applying hospitals located within or contiguous to the same CBSA as the closed hospital would be eligible to receive cap slots, regardless of their ranking criteria, before an applying hospital that meets the new second clause included within Ranking Criterion One but is not located within the same or contiguous CBSA as the closed hospital.

Comment: One commenter expressed concern that the proposed change to Ranking Criterion One does not ensure that a hospital that is located more than 70 miles from any other medical education program, and whose FTE resident caps were erroneously reduced by CMS under section 1886(h)(8)(A)(i) of the Act, can regain its lost slots when a teaching hospital closes in another part of its State. The commenter noted that CMS must follow the statutory categories in distributing slots under section 5506, and that, generally, the number of slots requested under the first priority category (same or contiguous CBSA as the closed hospital) far exceeds the number of slots available from the closed hospital, leaving no slots available for hospitals in the second or other priority category levels. The commenter cautioned that unless CMS takes steps to ensure that slots are awarded not only to hospitals in the first priority category, but also to hospitals in the second (same state) or third (same region) priority categories, the proposed change to Ranking Criterion One will not help a hospital that is located more than 70 miles from the nearest medical education program. The commenter stated that “CMS has several options in the FY 2015 IPPS final rule to ensure that hospitals located in the same State, and not just the same or contiguous CBSA as the closed hospital, have an opportunity to add new resident slots under section 5506.” The commenter made the following recommendations for CMS to finalize:

(1) In addition to finalizing the proposal to eliminate cap relief from Ranking Criterion Eight, CMS could further revise Ranking Criteria Seven and Eight so that even fewer hospitals located in the same or contiguous CBSA can satisfy either criterion. CMS could further narrow its Demonstrated Likelihood Criteria to achieve the same result.

(2) CMS could construe the language at section 1886(h)(4)(H)(vi)(II) of the Act to require the agency to follow the statutory priority categories, but to do so in a manner that at least some slots are awarded to hospitals within each of the first three priority categories, such as making a large proportion of slots available for the first priority category, and then successively smaller proportions of the slots available for the second and third priority categories.

(3) CMS could balance the competing statutory importance expressed within the statutory priority categories with the need to maintain and grow primary care residency programs in rural and underserved areas and maintain an adequate distribution of physicians, in general. CMS could conclude that one way to recognize this balance is to ensure that a hospital that had less than 250 beds and that was located in a rural area and had its FTE resident cap erroneously reduced by CMS would be awarded some of those slots after another teaching hospital in its State closes, even if the closed hospital is not located in the same or contiguous CBSA as such a hospital.

(4) CMS could conclude that section 1886(h)(4)(H)(vi)(II) of the Act did not contemplate the exact scenario where a hospital's FTE resident caps were erroneously reduced by CMS under section 1886(h)(8)(A)(i) of the Act, and that the hospital's remote location means it almost certainly will never be in the first priority level category.

Response: We regret that the commenter believes that CMS' proposed revision to Ranking Criterion One is not sufficient to rectify the scenario where a hospital's FTE resident caps were erroneously reduced by CMS under section 1886(h)(8)(A)(i) of the Act. We do not agree with the commenter's options because each of the options that the commenter recommended would have an impact on other hospitals and stakeholders with an interest in how CMS implements section 5506. That is, the commenter's suggestions could potentially reduce the number of slots available to other stakeholders. Moreover, accepting any such suggested option would require notice-and-comment rulemaking on each recommendation. We continue to believe that it is appropriate to provide priority to a hospital that had FTE resident cap slots erroneously removed under section 5503 contrary to the statutory exception at section 1886(h)(8)(A)(ii)(I) of the Act, and for which CMS Central Office was made aware of the error prior to posting of the FY 2015 IPPS/LTCH PPS proposed rule on the CMS Web site. Therefore, we are finalizing this policy, as proposed, in this final rule.

Comment: One commenter stated that the proposed language revising Ranking Criterion One could lead one to believe that a hospital must satisfy both conditions to qualify under this criterion. To clarify that this is not the case, the commenter recommended that CMS modify the language within Ranking Criterion One by adding an “or” as follows:

Ranking Criterion One. The applying hospital is requesting the increase in its FTE resident cap(s) because it is assuming (or assumed) an entire program (or programs) from the hospital that closed, and the applying hospital is continuing to operate the program(s) exactly as it had been operated by the hospital that closed (that is, same residents, possibly the same program director, and possibly the same (or many of the same) teaching staff); or, the applying hospital's FTE resident caps were erroneously reduced by CMS under section 1886(h)(8)(A)(i) of the Act, contrary to the statutory exception at section 1886(h)(8)(A)(ii)(I) of the Act, and CMS Central Office was made aware of the error prior to posting of the FY 2015 IPPS/LTCH PPS proposed rule on the CMS Web site.

Response: We agree with the commenter that the conditions in the revised Ranking Criterion One are separate and distinct, and a hospital applying for slots under Ranking Criterion One would need to satisfy only one of the requirements, not both. Therefore, we are adopting the commenter's suggestion of adding “or” between the two conditions, and we are modifying the language of Ranking Criterion One in the CMS Application Form as well.

After consideration of the public comments we received, we are finalizing the following change to the text of Ranking Criterion One:

Ranking Criterion One. The applying hospital is requesting the increase in its FTE resident cap(s) because it is assuming (or assumed) an entire program (or programs) from the hospital that closed, and the applying hospital is continuing to operate the program(s) exactly as it had been operated by the hospital that closed (that is, same residents, possibly the same program director, and possibly the same (or many of the same) teaching staff); OR, the applying hospital's FTE resident caps were erroneously reduced by CMS under section 1886(h)(8)(A)(i) of the Act, contrary to the statutory exception at section 1886(h)(8)(A)(ii)(I) of the Act, and CMS Central Office was made aware of the error prior to posting of the FY 2015 IPPS proposed rule on the CMS Web site.

d. Clarification to Ranking Criterion Two Regarding Emergency Medicare GME Affiliation Agreements

Ranking Criterion Two gives preference to applying hospitals that received slots under the terms of a Medicare GME affiliation agreement from the closed hospital. Under section 1886(h)(4)(H)(ii) of the Act, hospitals may form a Medicare GME affiliated group and elect to aggregate their respective FTE resident caps and apply them on an aggregate basis. The regulations at 42 CFR 413.75(b) and 413.79(f) implemented this statutory provision, providing specific rules for sharing FTE resident cap slots among members of the Medicare GME affiliated group, one such rule being that member hospitals must have a “shared rotational arrangement.” A “shared rotational arrangement” is defined at 42 CFR 413.75(b) as a residency training program under which a resident(s) participates in training at two or more hospitals in that program. Specifically, Ranking Criterion Two states the following:

Ranking Criterion Two. The applying hospital was listed as a participant of a Medicare GME affiliated group on the most recent Medicare GME affiliation agreement of which the closed hospital was a member before the hospital closed, and under the terms of that Medicare GME affiliation agreement, the applying hospital received slots from the hospital that closed, and the applying hospital will use the additional slots to continue to train at least the number of FTE residents it had trained under the terms of the Medicare GME affiliation agreement. If the most recent Medicare GME affiliation agreement of which the closed hospital was a member before the hospital closed was with a hospital that itself has closed or is closing, preference would be given to an applying hospital that was listed as a participant in the next most recent Medicare GME affiliation agreement (but not one which was entered into more than 5 years prior to the hospital's closure) of which the first closed hospital was a member before the hospital closed, and that applying hospital received slots from the closed hospital under the terms of that affiliation agreement.

A question has been raised as to whether hospitals that were members of an emergency Medicare GME affiliation agreement with the closed hospital prior to its closure may be considered under Ranking Criterion Two as well. The regulations at 42 CFR 413.79(f)(7) govern emergency Medicare GME affiliation agreements, which are applicable in the instance where a statutory section 1135 waiver is invoked. In this situation, due to emergency conditions, the “home” hospital is unable to continue to train its residents. Therefore, under the terms of the emergency Medicare GME affiliation agreement, the “home” hospital may agree to temporarily transfer FTE resident cap slots to “host” hospitals that would train the displaced residents during the emergency period.

In the November 24, 2010 final rule with comment period (75 FR 72216), we stated that “section 1886(h)(4)(H)(vi) of the Act, as added by section 5506(a) of the Affordable Care Act, directs the Secretary to give preference to hospitals that are members of the same affiliated group as the hospital that closed. We believe that, generally, if the applying hospital was affiliated to receive slots from the hospital that closed, then the applying hospital was relying on that number of FTE resident slots that it received in order to maintain its fair share of the cross-training of the residents in the jointly operated programs. In the absence of those slots received from the closed hospital, the applying hospital may not be able to continue training that number of FTE residents, and those same residents would not only be displaced from the closed hospital, but might essentially become `displaced' from the affiliated hospitals in which they were used to doing a portion of their training. Accordingly, we proposed this ranking criterion to allow hospitals that were affiliated with the closed hospitals to at least maintain their fair share of the training of the residents in the programs that they had jointly operated with the closed hospital.”

In determining whether Ranking Criterion Two may encompass emergency Medicare GME affiliation agreements, we considered the key differences and similarities between regular Medicare GME affiliation agreements and emergency Medicare GME affiliation agreements. Regarding the differences, in the case of emergency affiliations, there may not have been historical cross-training or jointly operated programs between the applicant hospital and the hospital that closed. Furthermore, after the natural disaster that precipitates the section 1135 waiver, the “home” hospital would be in no condition to train its share of residents, which is why the “shared rotational arrangement” requirements at 42 CFR 413.79(f)(2) for regular Medicare GME affiliation agreements are waived for emergency Medicare GME affiliation agreements. However, it is often true with emergency affiliations that a hospital agrees to take over the training of the residents of the hospital in need, “receiving” FTE cap slots and residents from the “home” hospital, thereby creating the training relationship. If, following the disaster that triggers the section 1135 waiver, the “home” hospital actually closes, the “host” hospital that accepted the residents might even continue to train its share of the residents in the program after the closure. Therefore, emergency affiliation agreements are similar to regular affiliation agreements in that the “host” hospital received FTE cap slots from the “home” hospital to train the “home” hospital's residents. Further, in the event that the “home” hospital closes, triggering a round of section 5506 applications, the “host” hospital also would need those FTE cap slots in order to continue training the share of the program for which it had taken responsibility under the emergency Medicare GME affiliation agreement before the “home” hospital closed.

As we stated in the November 24, 2010 final rule with comment period (75 FR 72219 through 72220), “we believe the intent of section 5506 is to promote continuity and limit disruption in residency training. In that light, we believe it is logical to give preference to a hospital that received slots under the terms of the Medicare GME affiliation agreement so that the hospital could continue to train at least the number of FTE residents it had trained under the terms of the Medicare GME affiliation agreement, avoiding the displacement of even more residents. . . .” We further stated that we “. . . are only giving preference to hospitals that received slots from the closed hospital under the terms of the Medicare GME affiliation agreement, so that the hospital could continue to train at least the number of FTE residents it had trained under the terms of the Medicare GME affiliation agreement. . . .” Finally, we stated “that the hospital or hospitals that were most recently affiliated with and received slots from the closed hospital would have the most immediate need for those slots.”

While the circumstances may vary, we believe that “host” hospitals under emergency Medicare GME affiliation agreements could fulfill much of the same role as hospitals that received slots from the hospital that closed under regular Medicare GME affiliation agreements. That is, continuity of training would be encouraged and disruption would be mitigated, to the extent that the “host” hospital could document to CMS that it would continue to “train at least the number of FTE residents it had trained under the terms of the” emergency Medicare GME affiliation agreement, and in doing so, would demonstrate it has the “most immediate need for those slots” as compared to another hospital. Given these similarities between regular Medicare GME affiliation agreements and emergency Medicare GME affiliation agreements, we believe that the existing Ranking Criterion Two may be read to already encompass emergency Medicare GME affiliation agreements. Accordingly, we are clarifying the existing Ranking Criterion Two to include emergency Medicare GME affiliation agreements, to read as follows:

Ranking Criterion Two. The applying hospital was listed as a participant of a Medicare GME affiliated group on the most recent Medicare GME affiliation agreement or emergency Medicare GME affiliation agreement of which the closed hospital was a member before the hospital closed, and under the terms of that Medicare GME affiliation agreement or emergency Medicare GME affiliation agreement, the applying hospital received slots from the hospital that closed, and the applying hospital will use the additional slots to continue to train at least the number of FTE residents it had trained under the terms of the Medicare GME affiliation agreement, or emergency Medicare GME affiliation agreement. If the most recent Medicare GME affiliation agreement or emergency Medicare GME affiliation agreement of which the closed hospital was a member before the hospital closed was with a hospital that itself has closed or is closing, preference would be given to an applying hospital that was listed as a participant in the next most recent Medicare GME affiliation agreement or emergency Medicare GME affiliation agreement (but not one which was entered into more than 5 years prior to the hospital's closure) of which the first closed hospital was a member before the hospital closed, and that applying hospital received slots from the closed hospital under the terms of that affiliation agreement.

We are making these changes to Ranking Criterion Two in the Section 5506 Application Form.

Comment: Commenters supported CMS' clarification that the existing Ranking Criterion Two includes emergency Medicare GME affiliation agreements.

Response: We thank the commenters for their support. The revised description of Ranking Criterion Two on the CMS Application Form refers to both Medicare GME affiliation agreements and emergency Medicare GME affiliation agreements.

The following list includes the final ranking criteria along with the final effective dates.

  • Ranking Criterion One: The applying hospital is requesting the increase in its FTE resident cap(s) because it is assuming (or assumed) an entire program (or programs) from the hospital that closed, and the applying hospital is continuing to operate the program(s) exactly as it had been operated by the hospital that closed (that is, same residents, possibly the same program director, and possibly the same (or many of the same) teaching staff); OR, the applying hospital's FTE resident caps were erroneously reduced by CMS under section 1886(h)(8)(A)(i) of the Act, contrary to the statutory exception at section 1886(h)(8)(A)(ii)(I) of the Act, and CMS Central Office was made aware of the error prior to posting of the FY 2015 IPPS/LTCH PPS proposed rule on the CMS Web site. (This language reflects the finalized modification of Ranking Criterion One. We refer readers to section IV.K.5.c. of the preamble of this final rule where we discuss this proposed modification.)
  • Effective Date: If the hospital is receiving a temporary cap adjustment, slots are effective the day after the graduation date(s) of actual displaced resident(s). If the hospital is not receiving a temporary cap adjustment, slots are effective with the date of the hospital closure.
  • Clarified Ranking Criterion Two: The applying hospital was listed as a participant of a Medicare GME affiliated group on the most recent Medicare GME affiliation agreement or emergency Medicare GME affiliation agreement of which the closed hospital was a member before the hospital closed, and under the terms of that Medicare GME affiliation agreement or emergency Medicare GME affiliation agreement, the applying hospital received slots from the hospital that closed, and the applying hospital will use the additional slots to continue to train at least the number of FTE residents it had trained under the terms of the Medicare GME affiliation agreement, or emergency Medicare GME affiliation agreement. If the most recent Medicare GME affiliation agreement or emergency Medicare GME affiliation agreement of which the closed hospital was a member before the hospital closed was with a hospital that itself has closed or is closing, preference would be given to an applying hospital that was listed as a participant in the next most recent Medicare GME affiliation agreement or emergency Medicare GME affiliation agreement (but not one which was entered into more than 5 years prior to the hospital's closure) of which the first closed hospital was a member before the hospital closed, and that applying hospital received slots from the closed hospital under the terms of that affiliation agreement.

(This language reflects our clarification in the proposed rule and this final rule regarding inclusion of emergency Medicare GME affiliation agreements in Ranking Criterion Two. We refer readers to section IV.K.5.d. of the preamble of this final rule where we discuss this clarification.)

  • Effective Date: Slots are effective with the date of the hospital closure.
  • Ranking Criterion Three: The applying hospital took in residents displaced by the closure of the hospital, but is not assuming an entire program or programs, and will use the additional slots to continue training residents in the same programs as the displaced residents, even after those displaced residents complete their training (that is, the applying hospital is permanently expanding its own existing programs).
  • Effective Date: If the hospital is receiving a temporary cap adjustment, slots are effective the day after the graduation date(s) of actual displaced resident(s). If the hospital is not receiving a temporary cap adjustment, slots are effective with the date of the hospital closure.
  • Ranking Criterion Four: The program does not meet Ranking Criteria 1, 2, or 3, and the applying hospital will use additional slots to establish a new or expand an existing geriatrics residency program.
  • Ranking Criterion Five: The program does not meet Ranking Criteria 1 through 4, the applying hospital is located in a HPSA, and will use all the additional slots to establish or expand a primary care or general surgery residency program.
  • Ranking Criterion Six: The program does not meet Ranking Criteria 1 through 5, and the applying hospital is not located in a HPSA, and will use all the additional slots to establish or expand a primary care or general surgery residency program.
  • Ranking Criterion Seven: The applying hospital will use additional slots to establish or expand a primary care or general surgery program, but the program does not meet Ranking Criterion 5 or 6 because the hospital is also separately applying under Ranking Criterion 8 for slots to establish or expand a nonprimary care or nongeneral surgery program.

(This language reflects our proposal in the proposed rule to revise Ranking Criteria Seven and Eight. We refer readers to section IV.K.5.c. of the preamble of this final rule where we discuss our proposals and final policies to amend Ranking Criteria Seven and Eight.)

  • Ranking Criterion Eight: The program does not meet Ranking Criteria 1 through 7, and the applying hospital will use additional slots to establish or expand a nonprimary care or a nongeneral surgery program.

(This language reflects our proposal in the proposed rule to revise Ranking Criterion Eight. We refer readers to section IV.K.5.c. of the preamble of this final rule where we discuss our proposals and final policies to amend Ranking Criterion Eight.)

Effective Date Policy for Ranking Criterion Four through Ranking Criterion Eight: If the hospital is receiving a temporary cap adjustment for training displaced residents and its section 5506 award is less than or equal to the temporary cap adjustment, the section 5506 slots would become effective the later of when the hospital can demonstrate to the MAC that the slots associated with a new program or program expansion are actually filled, and therefore, are needed, or the July 1 after displaced residents complete their training. If the hospital is receiving a temporary cap adjustment for training displaced residents and its section 5506 award is greater than the temporary cap adjustment, the number of slots by which the section 5506 award exceeds the temporary cap adjustment would be available for use when the hospital can demonstrate to its MAC that the slots associated with the new program or program expansion are filled and, therefore, are needed as of a particular date (usually July 1, possibly retroactive). If the hospital is not receiving a temporary cap adjustment, slots would become effective when the hospital can demonstrate to the MAC that the slots needed for a new program or program expansion are actually filled and, therefore, are needed as of a particular date (usually July 1, possibly retroactive).
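For illustration only, the effective date policy described above for Ranking Criteria Four through Eight can be summarized as a simple decision rule. The following is a minimal sketch, assuming simplified inputs; the function and parameter names are hypothetical, and the actual effective date is determined based on the documentation the hospital provides to its MAC.

```python
def effective_date_rule_criteria_4_through_8(has_temporary_cap_adjustment: bool,
                                              award_slots: float,
                                              temporary_adjustment_slots: float) -> str:
    """Summarize the effective date policy for section 5506 slots awarded under
    Ranking Criteria Four through Eight, as described in the preamble above."""
    if not has_temporary_cap_adjustment:
        # The hospital is not receiving a temporary cap adjustment for displaced residents.
        return ("Slots become effective when the hospital demonstrates to the MAC that the "
                "slots for the new program or program expansion are actually filled and needed "
                "as of a particular date (usually July 1, possibly retroactive).")
    if award_slots <= temporary_adjustment_slots:
        # The section 5506 award is no larger than the temporary cap adjustment.
        return ("Slots become effective at the later of (a) the date the hospital demonstrates "
                "to the MAC that the slots for the new program or program expansion are filled "
                "and needed, or (b) the July 1 after the displaced residents complete training.")
    # The section 5506 award exceeds the temporary cap adjustment.
    return ("The slots by which the award exceeds the temporary cap adjustment become available "
            "when the hospital demonstrates to its MAC that the slots for the new program or "
            "program expansion are filled and needed as of a particular date "
            "(usually July 1, possibly retroactive).")
```

Slots awarded under Ranking Criteria One through Three follow the separate effective date rules listed with those criteria above.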

At the end of this GME section, we are including a revised Section 5506 Application Form that reflects all of the final changes discussed above.

Out of Scope GME Comments

We received several comments that were not related to the GME proposals in the FY 2015 IPPS/LTCH PPS proposed rule. Some commenters urged CMS to be more transparent and provide data on the effects of the section 5503 and the section 5506 redistributions. One commenter asked that CMS consider changing the calculation of the FTE cap for new teaching hospitals so that it is based on the final 1-year period of the 5-year growth window, as opposed to the entire 5 years. Another commenter stated that policies to redirect funding from specialty to primary care do not take into consideration the serious consequences of a potential shortage of specialty physicians, and that Medicare GME should fully fund the entire length of training required for initial board certification for neurosurgery, which is 6 to 7 years. Several commenters urged CMS to publish a clear statement that neither a hospital's PRA nor its cap-building window is triggered by the presence of a small number of residents performing brief rotations at the hospital. Another commenter asserted that second-year pharmacy residencies should receive Medicare pass-through reimbursement.

We appreciate these comments. However, because we did not propose any changes related to these issues in the proposed rule, we consider these comments to be outside the scope of the proposed rule and are not addressing these comments at this time.

6. Clarification and Policy Change Applicable to Direct GME Payments to Federally Qualified Health Centers (FQHCs) and Rural Health Clinics (RHCs) for Training Residents in Approved Programs

Under section 1886(k) of the Act, and as implemented in the regulations at 42 CFR 405.2468(f), federally qualified health centers (FQHCs) and rural health clinics (RHCs) may receive payment for the costs of direct GME for training residents in an approved program under certain circumstances. Specifically, the regulations at § 405.2468(f)(1) state that, effective for that portion of cost reporting periods occurring on or after January 1, 1999, if an RHC or an FQHC incurs “all or substantially all” of the costs for the training program in the nonhospital setting as defined in § 413.75(b), the RHC or FQHC may receive direct graduate medical education payment for those residents. We refer readers to the July 31, 1998 final rule (63 FR 40986) for a detailed discussion of this longstanding policy. As noted earlier, the regulatory text of § 405.2468(f)(1) incorporates the definition of “all or substantially all of the costs for the training program in a nonhospital setting” set forth at § 413.75(b), which is one of a number of definitions applicable generally to hospital direct GME payments and to the regulations at §§ 413.76 through 413.83. Section 413.75(b) is based on the statutory provision at section 1886(h)(4)(E) of the Act, which establishes the requirements that hospitals must meet in order to receive direct GME payment for residents training in nonprovider settings.

The statutory use of the phrase “all or substantially all of the costs for the training program in that setting” is located in section 1886(h)(4)(E) of the Act, as added by section 9314 of the Omnibus Budget Reconciliation Act of 1986 (Pub. L. 99-509) (OBRA `86). For a detailed discussion of the implementation of section 9314 of OBRA `86, we refer readers to the September 29, 1989 final rule (54 FR 40292). Section 1886(h)(4)(E) of the Act, as added by OBRA '86, established the requirements that hospitals must meet in order to receive direct GME payment for residents training in nonprovider settings. However, section 5504(a) of the Affordable Care Act made changes to section 1886(h)(4)(E) of the Act to reduce the costs that hospitals must incur for residents training in nonprovider sites in order to count the FTE residents for purposes of direct GME payments. In making these changes to section 1886(h)(4)(E) of the Act, section 5504(a) of the Affordable Care Act amended the Act prospectively, effective with “cost reporting periods beginning on or after July 1, 2010” for direct GME, by removing the phrase “all or substantially all of the costs for the training program in that setting” and instead permitting hospitals to count the time that residents train in activities related to patient care in a nonprovider site if the hospital incurs the costs of the residents' salaries and fringe benefits for the time that the resident spends training in the nonprovider site. In effect, this amendment reduced the costs that hospitals must incur for residents training in nonprovider settings.

Based on this statutory amendment, in the November 24, 2010 final rule with comment period (75 FR 72134), we revised the regulations at § 412.105(f)(1)(ii)(E) for IME and §§ 413.78(f) and (g) for direct GME to reflect the changes made by section 5504(a) of the Affordable Care Act. In addition, we revised the regulatory definition of “all or substantially all of the costs for the training program in the nonhospital setting” in order to implement the statutory amendment and apply the effective date as set forth in the statute to cost reporting periods beginning on or after July 1, 2010. Specifically, the regulations at § 413.75(b), which define “all or substantially all of the costs for the training program in the nonhospital setting” were revised to state:

  • Effective on or after January 1, 1999 and for cost reporting periods beginning before July 1, 2007, the residents' salaries and fringe benefits (including travel and lodging where applicable) and the portion of the cost of teaching physicians' salaries and fringe benefits attributable to direct graduate medical education (GME); and
  • Effective for cost reporting periods beginning on or after July 1, 2007 and before July 1, 2010, at least 90 percent of the total of the costs of the residents' salaries and fringe benefits (including travel and lodging where applicable) and the portion of the cost of teaching physicians' salaries attributable to nonpatient care direct GME activities.

Ultimately, with regard to the costs that hospitals must incur for residents training in nonprovider sites in order to count the FTE residents for purposes of direct GME payments, the phrase “all or substantially all of the costs for the training program in the nonhospital setting” no longer applies, effective for cost reporting periods beginning on and after July 1, 2010.
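For illustration only, the transition in the cost-incurrence standard described above can be summarized by the start date of the cost reporting period. The following is a minimal sketch, assuming a simplified date lookup; the function name and the return strings are paraphrases for illustration and are not regulatory text.

```python
from datetime import date

def nonprovider_setting_cost_standard(cost_reporting_period_start: date) -> str:
    """Return a paraphrase of the cost-incurrence standard for counting residents
    training in nonprovider settings, keyed to the start of the cost reporting
    period, as summarized in the preamble above."""
    if cost_reporting_period_start >= date(2010, 7, 1):
        # Section 5504(a) of the Affordable Care Act: "all or substantially all" no longer applies.
        return ("The hospital (or RHC or FQHC) incurs the residents' salaries and fringe "
                "benefits for the time the residents spend training in the nonprovider setting.")
    if cost_reporting_period_start >= date(2007, 7, 1):
        return ("At least 90 percent of the total of the residents' salaries and fringe benefits "
                "(including travel and lodging where applicable) and the portion of teaching "
                "physicians' salaries attributable to nonpatient care direct GME activities.")
    # Effective on or after January 1, 1999 and for cost reporting periods beginning before July 1, 2007.
    return ("The residents' salaries and fringe benefits (including travel and lodging where "
            "applicable) and the portion of the cost of teaching physicians' salaries and fringe "
            "benefits attributable to direct GME.")
```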

In the November 24, 2010 final rule with comment period (75 FR 72134), we amended the regulations applicable to direct GME payments to hospitals at §§ 413.75(b) and 413.78(g) to reflect the changes made by section 5504(a) of the Affordable Care Act. However, at that time, we inadvertently did not make conforming changes to the regulations at § 405.2468(f)(1) to clarify the requirements that FQHCs and RHCs must meet in order to receive direct GME payment for training residents in their facilities. Therefore, in compliance with our longstanding policy that FQHCs and RHCs must meet the same requirements applicable to teaching hospitals for direct GME payments with respect to training residents in nonprovider settings, as we did in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28164), we are providing clarification that, based on statutory amendments discussed earlier, the applicable policy cross-referenced in § 405.2468(f)(1) has changed for cost reporting periods beginning on or after July 1, 2010. In addition, to ensure statutory and regulatory consistency, we proposed to revise the regulations at § 405.2468(f)(1) to add a sentence at the end of the paragraph that stated that in connection with cost reporting periods for which “all or substantially all of the costs for the training program in the nonhospital setting” is not defined in § 413.75(b), if an RHC or an FQHC incurs the salaries and fringe benefits (including travel and lodging where applicable) of residents training at the RHC or FQHC, the RHC or FQHC may receive direct graduate medical education payment for those residents.

We did not receive any public comments regarding our proposed clarification and policy change applicable to direct GME payments to FQHCs and RHCs for training residents in approved programs. Therefore, we are finalizing this policy as proposed.

L. Rural Community Hospital Demonstration Program

1. Background

Section 410A(a) of Public Law 108-173 required the Secretary to establish a demonstration program to test the feasibility and advisability of establishing “rural community” hospitals to furnish covered inpatient hospital services to Medicare beneficiaries. The demonstration pays rural community hospitals under a reasonable cost-based methodology for Medicare payment purposes for covered inpatient hospital services furnished to Medicare beneficiaries. A rural community hospital, as defined in section 410A(f)(1), is a hospital that—

  • Is located in a rural area (as defined in section 1886(d)(2)(D) of the Act) or is treated as being located in a rural area under section 1886(d)(8)(E) of the Act;
  • Has fewer than 51 beds (excluding beds in a distinct part psychiatric or rehabilitation unit) as reported in its most recent cost report;
  • Provides 24-hour emergency care services; and
  • Is not designated or eligible for designation as a CAH under section 1820 of the Act.

Section 410A(a)(4) of Public Law 108-173 specified that the Secretary was to select for participation no more than 15 rural community hospitals in rural areas of States that the Secretary identified as having low population densities. Using 2002 data from the U.S. Census Bureau, we identified the 10 States with the lowest population density in which rural community hospitals were to be located in order to participate in the demonstration: Alaska, Idaho, Montana, Nebraska, Nevada, New Mexico, North Dakota, South Dakota, Utah, and Wyoming. (Source: U.S. Census Bureau, Statistical Abstract of the United States: 2003).

CMS originally solicited applicants for the demonstration in May 2004; 13 hospitals began participation with cost reporting periods beginning on or after October 1, 2004. In 2005, 4 of these 13 hospitals withdrew from the program and converted to CAH status. This left nine hospitals participating at that time. In 2008, we announced a solicitation for up to six additional hospitals to participate in the demonstration program. Four additional hospitals were selected to participate under this solicitation. These four additional hospitals began being paid under the demonstration payment methodology with their first cost reporting periods beginning on or after July 1, 2008. At that time, 13 hospitals were participating in the demonstration.

Five hospitals (3 of the hospitals were among the 13 hospitals that were original participants in the demonstration program and 2 of the hospitals were among the 4 hospitals that began the demonstration program in 2008) withdrew from the demonstration program during CYs 2009 and 2010. (Three of these hospitals indicated that they would be paid more for Medicare inpatient hospital services under the rebasing option allowed under the SCH methodology provided for under section 122 of the Medicare Improvements for Patients and Providers Act of 2008 (Pub. L. 110-275). One hospital restructured to become a CAH, and one hospital closed.) In CY 2011, one hospital that was among the original set of hospitals that participated in the demonstration withdrew from the demonstration. These actions left seven of the originally participating hospitals (that is, hospitals that were selected to participate in either 2004 or 2008) participating in the demonstration program as of June 1, 2011.

Sections 3123 and 10313 of the Affordable Care Act (Pub. L. 111-148) amended section 410A of Public Law 108-173, which established the rural community hospital demonstration program. Sections 3123 and 10313 of the Affordable Care Act changed the rural community hospital demonstration program in several ways. First, the Secretary is required to conduct the demonstration program for an additional 5-year period that begins on the date immediately following the last day of the initial 5-year period. Further, the Affordable Care Act requires, in the case of a rural community hospital that is participating in the demonstration program as of the last day of the initial 5-year period, the Secretary to provide for the continued participation of such rural hospital in the demonstration program during the 5-year extension, unless the hospital makes an election, in such form and manner as the Secretary may specify, to discontinue participation (section 410A(g)(4)(A) of Pub. L. 108-173, as added by section 3123(a) of the Affordable Care Act and further amended by section 10313 of such Act).

In addition, the Affordable Care Act provides that, during the 5-year extension period, the Secretary shall expand the number of States with low population densities determined by the Secretary to 20 (section 410A(g)(2) of Public Law 108-173, as added by section 3123(a) and amended by section 10313 of the Affordable Care Act). Further, the Secretary is required to use the same criteria and data that the Secretary used to determine the States under section 410A(a)(2) of Public Law 108-173 for purposes of the initial 5-year period. The Affordable Care Act also allows not more than 30 rural community hospitals in such States to participate in the demonstration program during the 5-year extension period (section 410A(g)(3) of Public Law 108-173, as added by section 3123(a) of the Affordable Care Act and as further amended by section 10313 of such Act).

We published a solicitation for applications for additional participants in the rural community hospital demonstration program in the Federal Register on August 30, 2010 (75 FR 52960). Applications were due on October 14, 2010. The 20 States with the lowest population density that were eligible for the demonstration program are: Alaska, Arizona, Arkansas, Colorado, Idaho, Iowa, Kansas, Maine, Minnesota, Mississippi, Montana, Nebraska, Nevada, New Mexico, North Dakota, Oklahoma, Oregon, South Dakota, Utah, and Wyoming (Source: U.S. Census Bureau, Statistical Abstract of the United States: 2003). We approved 19 new hospitals for participation in the demonstration program. We determined that each of these new hospitals would begin participating in the demonstration with its first cost reporting period beginning on or after April 1, 2011.

Three of these 19 hospitals declined participation prior to the start of the cost reporting periods for which they would have begun the demonstration. In addition to the 7 hospitals that were selected in either 2004 or 2008, the new selection led to a total of 23 hospitals in the demonstration. During CY 2013, one additional hospital among the set selected in 2011 withdrew from the demonstration, similarly citing a relative financial advantage to returning to the customary SCH payment methodology, which left 22 hospitals participating in the demonstration.

In addition, section 410A(c)(2) of Public Law 108-173 required that, “[i]n conducting the demonstration program under this section, the Secretary shall ensure that the aggregate payments made by the Secretary do not exceed the amount which the Secretary would have paid if the demonstration program under this section was not implemented.” This requirement is commonly referred to as “budget neutrality.” Generally, when we implement a demonstration program on a budget neutral basis, the demonstration program is budget neutral in its own terms; in other words, the aggregate payments to the participating hospitals do not exceed the amount that would be paid to those same hospitals in the absence of the demonstration program. Typically, this form of budget neutrality is viable when, by changing payments or aligning incentives to improve overall efficiency, or both, a demonstration program may reduce the use of some services or eliminate the need for others, resulting in reduced expenditures for the demonstration program's participants. These reduced expenditures offset increased payments elsewhere under the demonstration program, thus ensuring that the demonstration program as a whole is budget neutral or yields savings. However, the small scale of this demonstration program, in conjunction with the payment methodology, makes it extremely unlikely that this demonstration program could be viable under the usual form of budget neutrality.

Specifically, cost-based payments to participating small rural hospitals are likely to increase Medicare outlays without producing any offsetting reduction in Medicare expenditures elsewhere. Therefore, a rural community hospital's participation in this demonstration program is unlikely to yield benefits to the participant if budget neutrality were to be implemented by reducing other payments for these same hospitals.

In the past 10 IPPS final rules, spanning the period for which the demonstration program has been implemented, we have adjusted the national inpatient PPS rates by an amount sufficient to account for the added costs of this demonstration program, thus applying budget neutrality across the payment system as a whole rather than merely across the participants in the demonstration program. As we discussed in the FYs 2005 through 2014 IPPS final rules (69 FR 49183; 70 FR 47462; 71 FR 48100; 72 FR 47392; 73 FR 48670; 74 FR 43922; 75 FR 50343; 76 FR 51698; 77 FR 53449; and 78 FR 50740, respectively), we believe that the language of the statutory budget neutrality requirements permits the agency to implement the budget neutrality provision in this manner. In light of the statute's budget neutrality requirement, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28166 through 28167), we proposed to continue to use the methodology we finalized in FY 2013 to calculate a budget neutrality adjustment factor for the FY 2015 national IPPS rates.

In general terms, in each of these previous years, we used available cost reports for the participating hospitals to derive an estimate of the additional costs attributable for the demonstration. Prior to FY 2013, we used finalized, or settled, cost reports, as available, and “as submitted” cost reports for hospitals for which finalized cost reports were not available. Annual market basket percentage increase amounts provided by the CMS Office of the Actuary reflecting the growth in the prices of inputs for inpatient hospitals were applied to these cost amounts. In the FY 2013 IPPS/LTCH PPS final rule (77 FR 53452), we used “as submitted” cost reports (for cost reporting periods ending in CY 2010) for each hospital participating in the demonstration in estimating the costs of the demonstration. In addition, in FY 2013, we incorporated different update factors (the market basket percentage increase and the applicable percentage increase, as applicable, to several years of data as opposed to solely using the market basket percentage increase) for the calculation of the budget neutrality offset amount. Finally, in each of the previous years, an annual update factor provided by the CMS Office of the Actuary reflecting growth in the volume of inpatient operating services was also applied. For the budget neutrality calculations in the IPPS final rules for FYs 2005 through 2011, the annual volume adjustment applied was 2 percent; for the IPPS final rules for FYs 2012, 2013, and 2014, it was 3 percent. For a detailed discussion of our budget neutrality offset calculations, we refer readers to the IPPS final rule applicable to the fiscal year involved.

In general, for FYs 2005 through 2009, we based the budget neutrality offset estimate on the estimated cost of the demonstration in an earlier given year. For these periods, we derived that estimated cost by subtracting the estimated amount that would otherwise be paid without the demonstration in an earlier given year from the estimated amount for the same year that would be paid under the demonstration under the reasonable cost-based methodology authorized by section 410A of Public Law 108-173. (We note that section 410A of Public Law 108-173 was later amended by the Affordable Care Act.) The reasonable cost-based methodology authorized by section 410A of Public Law 108-173, as amended, is hereafter referred to as the “reasonable cost methodology.” (We ascertained the estimated amount that would be paid in an earlier given year under the reasonable cost methodology and the estimated amount that would otherwise be paid without the demonstration in an earlier given year from “as submitted” cost reports that were submitted by the hospitals prior to the inception of the demonstration.) We then updated the estimated cost described above to the current year by multiplying it by the market basket percentage increases applicable to the years involved and the applicable annual volume adjustment. For the FY 2010 IPPS/RY 2010 LTCH PPS final rule, data from finalized cost reports reflecting the participating hospitals' experience under the demonstration were available. Specifically, the finalized cost reports for the first 2 years of the demonstration, that is, cost reports for cost reporting years beginning in FYs 2005 and 2006 (CYs 2004, 2005, and 2006) were available. These data showed that the actual costs of the demonstration for these years exceeded the amounts originally estimated in the respective final rules for the budget neutrality adjustment. In the FY 2010 IPPS/RY 2010 LTCH PPS final rule, we included in the budget neutrality offset amount an amount in addition to the estimate of the demonstration costs in that fiscal year. This additional amount was based on the amount that the costs of the demonstration for FYs 2005 and 2006 exceeded the budget neutrality offset amounts finalized in the IPPS rules applicable for those years.
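In schematic terms only, the update described above for FYs 2005 through 2009 can be summarized as follows, using notation introduced here solely for illustration:

\[
\text{Offset}_{\text{FY}} \approx \left( C^{\text{demo}} - C^{\text{IPPS}} \right) \times \prod_{t} \left( 1 + m_{t} \right) \times \left( 1 + v \right)^{n},
\]

where \(C^{\text{demo}}\) is the estimated amount that would be paid for the earlier base year under the reasonable cost methodology, \(C^{\text{IPPS}}\) is the estimated amount that would otherwise be paid for that year without the demonstration, \(m_{t}\) is the market basket percentage increase for each intervening year \(t\), \(v\) is the annual volume adjustment (2 percent in the FYs 2005 through 2011 final rules and 3 percent in the FYs 2012 through 2014 final rules), and \(n\) is the number of years between the base year and the upcoming fiscal year. This is a simplified sketch of the described update, not the precise calculation set out in the final rule for each fiscal year.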

Following the FY 2010 IPPS/RY 2010 LTCH PPS final rule, we continued to propose and use a methodology for calculating the budget neutrality offset amount that accounted for both the estimated demonstration costs in the upcoming fiscal year and the amount by which the actual demonstration costs for an earlier, given year (which would be known once finalized cost reports became available for that year) exceeded the budget neutrality offset amount finalized in the corresponding year's IPPS final rule. However, we noted in the FYs 2011, 2012, and 2013 IPPS final rules that, because of a delay in the cost report settlement process affecting IPPS hospitals on a larger scale than the demonstration alone, finalized cost reports were not available for demonstration hospitals' cost reporting periods beginning in FY 2007 and later, and we were therefore unable to finalize this component of the budget neutrality offset amount (that is, the amount by which the actual demonstration costs in a given year exceeded the budget neutrality offset amount finalized in the corresponding year's IPPS final rule).

In the FY 2013 IPPS/LTCH PPS final rule (77 FR 53449 through 53453), we adopted changes to the methodology for calculating the budget neutrality offset amount in an effort to further improve and refine it. We noted that the revised methodology varied, in part, from that finalized in the FY 2012 IPPS/LTCH PPS final rule (76 FR 51698 through 51705). Specifically, in adopting refinements to the methodology, our objective was to simplify the calculation so that it included as few steps as possible. In addition, we incorporated different update factors (the market basket percentage increase and the applicable percentage increase, as applicable, to several years of data as opposed to solely using the market basket percentage increase) for the calculation of the budget neutrality offset amount. We stated that we believed this approach would maximize the precision of our calculation because it would more closely replicate payments made with and without the demonstration. We refer readers to the FY 2013 IPPS/LTCH PPS final rule (77 FR 53449 through 53453) for a detailed discussion of the methodology we used for FY 2013. We noted that, although we were making changes to certain aspects of the budget neutrality offset amount calculation for FY 2013, several core components of the methodology would remain unchanged. For example, we continued to include in the budget neutrality offset amount methodology the estimate of the demonstration costs for the upcoming fiscal year and the amount by which the actual demonstration costs corresponding to an earlier year (which would be determined once we have finalized cost reports for that year) exceeded the budget neutrality offset amount finalized in the corresponding year's IPPS final rule. However, finalized cost reports for the hospitals participating in the demonstration were not available for FYs 2007, 2008, 2009, and 2010 at the time of development of the FY 2013 IPPS/LTCH PPS final rule. Therefore, we were unable to finalize this component of the budget neutrality offset calculation. We stated in the final rule that we expected settled cost reports for all of the demonstration hospitals that participated in the applicable fiscal year (FYs 2007, 2008, 2009, and 2010) to be available prior to the FY 2014 IPPS/LTCH PPS proposed rule.

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50739 through 50744), we determined that the final budget neutrality offset amount to be applied to the FY 2014 IPPS rates was $52,589,741. This amount comprised two distinct components: (1) the final resulting difference between the estimated reasonable cost amount to be paid under the demonstration to the 22 participating hospitals in FY 2014 for covered inpatient hospital services and the estimated amount that would otherwise be paid to such hospitals in FY 2014 without the demonstration (this amount was $46,549,861); and (2) the amount by which the actual costs of the demonstration for FY 2007, as shown in the finalized cost reports for the hospitals that participated in the demonstration during FY 2007, exceeded the budget neutrality offset amount that was finalized in the FY 2007 IPPS final rule (this amount, $6,039,880, was derived from finalized cost reports for cost reporting periods beginning in FY 2007 for the 9 hospitals that participated in the demonstration during that year).

2. FY 2015 Budget Neutrality Offset Amount

For the reasons discussed in the FY 2013 IPPS/LTCH PPS final rule (77 FR 53449 through 53453), we proposed in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28167) to continue to use the methodology finalized in the FY 2013 IPPS/LTCH PPS final rule to calculate a budget neutrality adjustment factor to be applied to the FY 2015 national IPPS payment rates. As we stated in the FY 2013 IPPS/LTCH PPS final rule (77 FR 53451), we revised our methodology in that final rule to further improve and refine the calculation of the budget neutrality offset amount and to simplify the methodology so that it includes only a few steps. Consistent with the methodology finalized in the FY 2013 IPPS/LTCH PPS final rule, the methodology proposed for calculating the estimated FY 2015 demonstration cost for the participating hospitals was as follows:

Step 1: For each of the participating hospitals, we proposed to identify the general reasonable cost amount calculated under the reasonable cost methodology for covered inpatient hospital services (as indicated on the “as submitted” cost report for the hospital's cost reporting period ending in CY 2012). The general reasonable cost amount calculated under the reasonable cost methodology is hereafter referred to as the “reasonable cost amount.” As we explained in the FY 2013 IPPS/LTCH PPS final rule (77 FR 53451), we believe that a way to streamline our methodology for calculating the budget neutrality offset amount would be to use cost reports with the same status and from the same time period for all hospitals participating in the demonstration. Because “as submitted” cost reports ending in CY 2012 are the most recent available cost reports, we believe they would be an accurate predictor of the costs of the demonstration in FY 2015, as they give us a recent picture of the participating hospitals' costs.

Because section 410A of Public Law 108-173 stipulates that swing-bed services are to be included among the covered inpatient hospital services for which the demonstration payment methodology applies, we proposed to include the cost of these services, as reported on the cost reports for the hospitals that provide swing-bed services, within the general total estimated FY 2012 reasonable cost amount for covered inpatient hospital services under the demonstration. As indicated above, we proposed to use “as submitted” cost reports for the hospital's cost reporting period ending in CY 2012 for this calculation.

We proposed to sum the two above-referenced amounts to calculate the general total estimated FY 2012 reasonable cost amount for covered inpatient hospital services for all participating hospitals.

We proposed to multiply this sum (that is, the general total estimated FY 2012 reasonable cost amount for covered inpatient hospital services for all participating hospitals) by the FY 2013, FY 2014, and FY 2015 IPPS market basket percentage increases, which are formulated by the CMS Office of the Actuary. In this final rule, we are using the current estimate of the FY 2015 IPPS market basket percentage increase provided by the CMS Office of the Actuary as specified in section IV.B.1. of the preamble of this final rule. We then multiply the product of the general total estimated FY 2012 reasonable cost amount for all participating hospitals and the market basket percentage increases applicable to the years involved by a 3-percent annual volume adjustment for FYs 2013 through 2015—the result is the general total estimated FY 2015 reasonable cost amount for covered inpatient hospital services for all participating hospitals.

We proposed to apply the IPPS market basket percentage increases applicable for FYs 2013 through 2015 to the FY 2012 reasonable cost amount described above to model the estimated FY 2015 reasonable cost amount under the demonstration. We proposed to use the IPPS market basket percentage increases because we believe that these update factors appropriately indicate the trend of increase in inpatient hospital operating costs under the reasonable cost methodology for the years involved. The 3-percent annual volume adjustment was provided by the CMS Office of the Actuary and is used because it reflects the tendency of hospitals' inpatient caseloads to increase. Because inpatient caseloads for small hospitals may fluctuate, we are incorporating into the estimate of demonstration costs a factor that allows for a potential increase in inpatient hospital services.

Step 2: For each of the participating hospitals, we proposed to identify the general estimated amount that would otherwise be paid in FY 2012 under applicable Medicare payment methodologies for covered inpatient hospital services (as indicated on the “as submitted” cost report for cost reporting periods ending in CY 2012) if the demonstration was not implemented. Similarly, as in Step 1, for the hospitals that provide swing-bed services, we proposed to identify the estimated amount that generally would otherwise be paid for these services (as indicated on the “as submitted” cost report for cost reporting periods ending in CY 2012) and include it in the total FY 2012 general estimated amount that would otherwise be paid for covered inpatient hospital services without the demonstration. We proposed to sum these two amounts in order to calculate the estimated FY 2012 total payments that generally would otherwise be paid for covered inpatient hospital services for all participating hospitals without the demonstration.

We proposed to multiply the above amount (that is, the estimated FY 2012 total payments that generally would otherwise be paid for covered inpatient hospital services for all participating hospitals without the demonstration) by the FYs 2013 through 2015 IPPS applicable percentage increases. For the proposed rule, the estimate of the FY 2015 applicable percentage increase was specified in section IV.B. of the preamble. This methodology differs from Step 1, in which we proposed to apply the market basket percentage increases to the sum of the hospitals' general total FY 2012 estimated reasonable cost amount for covered inpatient hospital services. We believe that the IPPS applicable percentage increases are appropriate factors to update the estimated amounts that generally would otherwise be paid without the demonstration. This is because IPPS payments would constitute the majority of payments that would otherwise be made without the demonstration and the applicable percentage increase is the factor used under the IPPS to update the inpatient hospital payment rates. Hospitals participating in the demonstration would be participating under the IPPS payment methodology if they were not in the demonstration. Then we proposed to multiply the product of the estimated FY 2012 total payments that generally would otherwise be made without the demonstration and the applicable IPPS percentage increases for the years involved by a 3-percent annual volume adjustment for FYs 2013 through 2015. The result represents the general total estimated FY 2015 costs that would otherwise be paid without the demonstration for covered inpatient hospital services to the participating hospitals.

Step 3: We proposed to subtract the amount derived in Step 2 (representing the sum of estimated amounts that generally would otherwise be paid to the participating hospitals for covered inpatient hospital services for FY 2015 if the demonstration were not implemented) from the amount derived in Step 1 (representing the sum of the estimated reasonable cost amount that generally would be paid under the demonstration to all participating hospitals for covered inpatient hospital services for FY 2015). We proposed that the resulting difference would be one component of the estimated amount for which an adjustment to the national IPPS rates would be calculated (as further discussed below).
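To illustrate the sequence of operations in Steps 1 through 3, the following sketch restates the calculation in Python. It is provided only as an illustration; the dollar amounts and update factors shown are hypothetical placeholders, not the reasonable cost amounts, market basket percentage increases, or applicable percentage increases used in this final rule.

# Illustrative sketch of the three-step calculation described above.
# All input values below are hypothetical placeholders; the actual amounts
# and update factors are those identified in this final rule.

fy2012_reasonable_cost_total = 100_000_000.00   # Step 1 base: CY 2012 "as submitted" reasonable cost amounts (including swing-bed costs)
fy2012_payments_without_demo = 90_000_000.00    # Step 2 base: estimated FY 2012 payments without the demonstration (including swing-bed services)

market_basket_increases = [0.026, 0.025, 0.029]   # hypothetical FY 2013-FY 2015 market basket percentage increases (Step 1)
applicable_pct_increases = [0.017, 0.016, 0.022]  # hypothetical FY 2013-FY 2015 applicable percentage increases (Step 2)
annual_volume_adjustment = 0.03                   # 3-percent annual volume adjustment for FYs 2013 through 2015

def update_to_fy2015(base_amount, yearly_update_factors, volume_adjustment):
    """Apply each year's update factor together with the annual volume adjustment."""
    amount = base_amount
    for factor in yearly_update_factors:
        amount *= (1 + factor) * (1 + volume_adjustment)
    return amount

# Step 1: estimated FY 2015 reasonable cost amount under the demonstration.
step_1 = update_to_fy2015(fy2012_reasonable_cost_total, market_basket_increases, annual_volume_adjustment)

# Step 2: estimated FY 2015 amount that would otherwise be paid without the demonstration.
step_2 = update_to_fy2015(fy2012_payments_without_demo, applicable_pct_increases, annual_volume_adjustment)

# Step 3: the difference is one component of the budget neutrality offset amount.
estimated_fy2015_demonstration_cost = step_1 - step_2

Under this sequence, the annual volume adjustment compounds year by year along with each year's update factor, consistent with the description of Steps 1 and 2 above.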

For the proposed rule, the resulting difference was $53,673,008. This estimated amount is based on the specific assumptions identified regarding the data sources used, that is, the most recently available “as submitted” cost reports.

We did not receive any public comments on our proposed budget neutrality offset methodology, as discussed above. Therefore, we are finalizing the budget neutrality offset methodology as proposed in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28165 through 28168).

In the FY 2015 IPPS/LTCH PPS proposed rule, we noted that if updated data became available prior to the FY 2015 IPPS/LTCH PPS final rule, we would use them to the extent appropriate to estimate the costs of the demonstration program in FY 2015. Therefore, we noted that the estimated budget neutrality offset amount might change in the final rule, depending on the availability of updated data. In this final rule, we have used the market basket update and applicable percentage increase that have been finalized for FY 2015. Using these updated data, the difference between the total estimated FY 2015 reasonable cost amount to be paid under the demonstration to the 22 participating hospitals for covered inpatient hospital services and the total estimated amount that would otherwise be paid to the participating hospitals in FY 2015 without the demonstration is $54,177,144.

In addition, similar to previous years, we proposed to include in the budget neutrality offset amount the amount by which the actual demonstration costs corresponding to an earlier given year (which would be determined once we had finalized cost reports for that year) exceeded the budget neutrality offset amount finalized in the corresponding year's IPPS final rule. In the FY 2015 IPPS/LTCH PPS proposed rule, we calculated the amount by which the actual costs of the demonstration in FY 2008 (that is, the costs of the demonstration for the 10 hospitals that participated in FY 2008, as shown in these hospitals' finalized cost reports for the cost report period beginning in that fiscal year), exceeded the budget neutrality offset amount that was finalized in the FY 2008 IPPS final rule. The amount calculated for the FY 2015 IPPS/LTCH PPS proposed rule, $10,389,771, remains unchanged for this final rule. We did not receive any public comments on this aspect of the proposed budget neutrality offset methodology, and therefore, are finalizing this aspect of the methodology as proposed. We continue to examine the cost report data for FY 2009, and to work with the MACs that service the hospitals participating in the demonstration to obtain finalized cost reports for FYs 2010, 2011, and 2012. We note that if settled cost reports for all of the demonstration hospitals that participated in an applicable year (FYs 2009, 2010, 2011, or 2012) are available prior to the FY 2016 IPPS/LTCH PPS proposed rule, we intend to adjust the budget neutrality offset amount for FY 2016 for any amounts by which the final settled costs of the demonstration for the year (FYs 2009, 2010, 2011, or 2012) differ from the budget neutrality offset amount applicable to such year as finalized in the respective year's IPPS final rule.

Therefore, the total budget neutrality offset amount that we are applying to the FY 2015 IPPS rates is $64,566,915. This is the sum of two separate components: (1) the difference between the total estimated FY 2015 reasonable cost amount to be paid under the demonstration to the 22 participating hospitals for covered inpatient hospital services and the total estimated amount that would otherwise be paid to the participating hospitals in FY 2015 without the demonstration ($54,177,144); and (2) the amount by which the actual costs of the demonstration for FY 2008 (as shown in the finalized cost reports for cost reporting periods beginning in FY 2008 for the hospitals that participated in the demonstration during FY 2008) exceeded the budget neutrality offset amount that was finalized in the FY 2008 IPPS final rule ($10,389,771). In this final rule, we are adjusting the national IPPS rates by this total amount ($64,566,915). We discuss the final payment rate adjustment that is required to ensure the budget neutrality of the demonstration program for FY 2015 (the budget neutrality adjustment factor) in section II. of the Addendum to this final rule.
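As a simple arithmetic check (not part of the methodology itself), the two components stated above sum to the total budget neutrality offset amount; the variable names in the short sketch below are illustrative only.

fy2015_estimated_demonstration_cost = 54_177_144   # component (1), as stated above
fy2008_excess_over_finalized_offset = 10_389_771   # component (2), as stated above
total_budget_neutrality_offset = fy2015_estimated_demonstration_cost + fy2008_excess_over_finalized_offset
assert total_budget_neutrality_offset == 64_566_915  # total applied to the FY 2015 IPPS rates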

M. Requirement for Transparency of Hospital Charges Under the Affordable Care Act

1. Overview

Hospitals determine their charges for items and services provided to patients and are responsible for those charges. While Medicare does not pay billed charges, hospital reported charges are used in determining Medicare's national payment rates (for example, billed charges are adjusted to cost to determine how much to pay for one type of case relative to another). Although the Medicare payment amount for a discharge under the IPPS or a service furnished under the OPPS is not based directly on the hospital's charges for the individual services provided, we believe that hospital charges nevertheless remain an important component of our healthcare system. For example, hospital charges are often billed, in full, to uninsured patients who cannot benefit from discounts negotiated by insurance companies. Hospital charges also vary significantly by hospital, making it challenging for patients to compare the cost of similar services across hospitals.

In 2013, we released data that demonstrated significant variation across the country and within communities in what hospitals charge for a number of common inpatient and outpatient services. These data also showed that hospital charges for services furnished in both the inpatient setting and the outpatient setting were, in general, significantly higher than the amount paid by Medicare under the IPPS or the OPPS. The data that we released are posted on the Web site at: https://www.cms.gov/Research-Statistics-Data-and-Systems/Statistics-Trends-and-Reports/Medicare-Provider-Charge-Data/index.html. Our intent in releasing these data was to enable the public to examine the relationship between the amounts charged by individual hospitals for comparable services and Medicare's payment for that inpatient or outpatient care. We believe that providing charge data comparisons helps bring both transparency and accountability to hospital pricing, and we are continuing to pursue opportunities to report on hospital charging practices.

2. Transparency Requirement Under the Affordable Care Act

The Affordable Care Act contains a provision that is consistent with our effort to improve the transparency of hospital charges. As a result of the Affordable Care Act, section 2718(e) of the Public Health Service Act requires that “[e]ach hospital operating within the United States shall for each year establish (and update) and make public (in accordance with guidelines developed by the Secretary) a list of the hospital's standard charges for items and services provided by the hospital, including for diagnosis-related groups established under section 1886(d)(4) of the Social Security Act.”

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28169), we reminded hospitals of their obligation to comply with the provisions of section 2718(e) of the Public Health Service Act. We appreciate the widespread public support we received for including the reminder in the proposed rule. We reiterate that our guidelines for implementing section 2718(e) of the Public Health Service Act are that hospitals make public either a list of their standard charges (whether the chargemaster itself or another form of their choice) or their policies for allowing the public to view a list of those charges in response to an inquiry. MedPAC suggested that hospitals be required to post the list on the Internet, and while we agree that this would be one approach that would satisfy the guidelines, we believe hospitals are in the best position to determine the exact manner and method by which to make the list public in accordance with the guidelines.

We encourage hospitals to communicate their charges in a consumer-friendly manner to help patients understand what their potential financial liability might be for services they obtain at the hospital, and to enable patients to compare charges for similar services across hospitals. We expect that hospitals will update the information at least annually, or more often as appropriate, to reflect current charges.

We are confident that hospital compliance with this statutory transparency requirement will improve the public accessibility of charge information. As hospitals continue to make data publicly available in compliance with section 2718(e) of the Public Health Service Act, we also will continue to review and post relevant charge data in a consumer-friendly way, as we have previously done by posting on the CMS Web site the following hospital and physician charge information: May and June 2013 hospital charge data releases; 2013 physician data requests for information; and the April 2014 physician data releases and data provided on geographic variation in payments and payments per beneficiary.

N. Medicare Payment for Short Inpatient Hospital Stays

As discussed in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28169), some members of the hospital community have expressed support for the general concept of an alternative payment methodology under the Medicare program for short inpatient hospital stays. We sought public comments on such a payment methodology, specifically how it might be designed. We outlined some specific questions and considerations that we identified as critical for developing such a methodology. We noted that this list of questions and considerations was not exhaustive, and we welcomed additional questions, suggestions, and input from stakeholders.

  • Defining short or low cost inpatient hospital stays:

One issue would be how to define a short inpatient hospital stay for the purpose of determining the appropriate Medicare payment. For instance, would a short inpatient hospital stay be one where the average length of stay for the MS-DRG is short, or would it be an atypically short or low-cost case relative to other cases within the same MS-DRG? There are significant differences in mean lengths of stay among MS-DRGs. For example, many frequently billed MS-DRGs have historically had mean lengths of stay of approximately 2 days, such as MS-DRG 313 (Chest Pain). Other MS-DRGs such as MS-DRG 871 (Septicemia or Severe Sepsis without Mechanical Ventilation 96+ hours with MCC) have had longer lengths of stay.

If we adopted a policy that paid less for atypically low-cost or short-stay cases relative to the average case in the same MS-DRG, we believe such a policy would be more likely to affect an MS-DRG like MS-DRG 871, which has a longer average length of stay or higher average cost associated with the typical patient. Such a policy would be less likely to apply to MS-DRG 313 because the typical case is already low cost or short stay.

  • Determining appropriate payment for short inpatient hospital stays:

Another issue would be how to determine the appropriate payment once a short stay has been identified. Some have suggested a per diem-based payment amount, perhaps modeled on the existing transfer payment policy. Again, such a policy is far more likely to affect payment for an atypically short-stay or low-cost case in an MS-DRG with a longer average length of stay. For short-stay cases in an MS-DRG where the average length of stay for the MS-DRG is short, this methodology would be unlikely to affect payment because the full IPPS payment would be made within 1 or 2 days.

For these types of short-stay cases, one relevant issue to address may be that payment for the same case will be very different under the OPPS and the IPPS depending upon whether the patient has been formally admitted to the hospital as an inpatient, pursuant to a physician order. Under what circumstances should the IPPS payment amount be limited to the OPPS payment amount and under what circumstances might it be appropriate for the payment amount to be higher? If it were appropriate for the payment amount to be higher, how would the amount of the additional payment be determined?

In the proposed rule, we welcomed input on these and other issues related to an alternative payment methodology under the Medicare program for short inpatient hospital stays.

Comment: Many commenters indicated that any short-stay policy should adhere to certain general principles, specifically citing some or all of the following: a short-stay policy should provide more appropriate and adequate payment for medically necessary inpatient services that span less than 2 midnights—payment should be higher than the outpatient PPS rate for the service, but should not exceed the full IPPS payment; a short-stay policy should not apply to those procedures on the “inpatient only” list; a short-stay policy should be budget neutral; hospitals should be eligible for all add-on payments they would otherwise receive (for example, DSH and IME), either in full or on a pro rata basis; beneficiaries requiring short inpatient hospital stays paid under a short-stay policy should be considered inpatients and cost-sharing obligations should be calculated under Medicare Part A; a short-stay policy should be developed in a way that would not increase administrative burden for hospitals, physicians, or other medical providers; and CMS should provide clear and consistent guidance and allow adequate time for hospitals to implement the short-stay policy prior to its effective date.

Other commenters indicated that CMS could or should consider approaches such as a per diem approach modeled after the existing transfer policy, creating separate MS-DRG weights for short-stay cases and nonshort-stay cases, or allowing the full MS-DRG payment on an interim basis while the issue is studied further.

Some commenters also stated that the MS-DRG system is predicated on the understanding that there will be a diversity of treatment patterns and individual patient circumstances for any given clinical condition, and that this diversity balances out—high-intensity cases are balanced by low-intensity cases. These commenters contended that creating a new category of “short stays” and paying for them differentially undermines the MS-DRG system.

Many commenters stated that additional research and collaboration were needed before a formal short-stay policy proposal could be made. MedPAC indicated that it intended to explore alternative short-stay policies in its upcoming work cycle.

Almost all commenters provided their comments on Medicare payment for short hospital stays in the context of broader comments on the current 2-midnight policy.

Response: We thank commenters for the many comments submitted on this issue, and we will take these into account in any potential future rulemaking. Although there was no consensus among the commenters, we look forward to continuing to actively work with stakeholders to address the complex question of how to further improve payment policy for short inpatient hospital stays.

O. Suggested Exceptions to the 2-Midnight Benchmark

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50943 through 50954), we modified and clarified CMS' longstanding policy on how Medicare contractors review inpatient hospital and CAH admissions for payment purposes. Under that final rule, we established a 2-midnight benchmark for determining the appropriateness of an inpatient hospital admission versus treatment on an outpatient basis. We provided in regulations at § 412.3(e)(1) that, in addition to services designated as inpatient only, surgical procedures, diagnostic tests, and other treatments are generally appropriate for inpatient hospital admission and payment under Medicare Part A when the physician (1) expects the beneficiary to require a medically necessary hospital stay that crosses at least 2 midnights and (2) admits the beneficiary to the hospital based upon that expectation. In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50944), we stated that the medical judgment of the physician and the physician's order for inpatient admission should be based on the expectation of care surpassing 2 midnights, with both the expectation of time and the determination of the underlying need for medical care at the hospital supported by complex medical factors such as history and comorbidities, the severity of signs and symptoms, current medical needs, and the risk of an adverse event. We also indicated that, in accordance with longstanding policy, factors that may result in an inconvenience to a beneficiary or family would not justify an inpatient hospital admission. The factors that lead a physician to admit a particular beneficiary based on the physician's clinical expectation are significant clinical considerations and must be clearly and completely documented in the medical record. Medicare review contractors consider complex medical factors that support a reasonable expectation of the needed duration of the stay relative to the 2-midnight benchmark. The FY 2014 policy responded to both hospital calls for more guidance about when an inpatient admission and Part A payment are appropriate, and beneficiaries' concerns about increasingly long stays as outpatients due to hospital uncertainties about payment.

In the FY 2014 IPPS/LTCH PPS final rule, at § 412.3(e)(2), we recognized that if an unforeseen circumstance, such as a beneficiary's death or transfer, results in a shorter beneficiary stay than the physician's expectation of at least 2 midnights, the patient may be considered to be appropriately treated on an inpatient basis and hospital inpatient payment may be made under Medicare Part A. We also clarified, in both the final rule and subsequent subregulatory guidance, that the unforeseen circumstances specified at § 412.3(e)(2) are not all-inclusive and could include additional circumstances such as unexpected clinical improvement, election of hospice care, or departure against medical advice.

The FY 2014 IPPS/LTCH PPS final rule also indicated that there are exceptions to the 2-midnight benchmark. In other words, there will be cases in which an admitting practitioner expects the beneficiary's length of stay to last less than 2 midnights and yet inpatient admission would still be appropriate. For example, we specified that procedures on the OPPS inpatient only list are always appropriately inpatient, regardless of the actual time expected at the hospital, so long as the procedure is medically necessary and performed pursuant to a physician order and formal admission.

In addition to procedures contained on the OPPS inpatient only list, we noted in the FY 2014 IPPS/LTCH PPS final rule that there may be other rare and unusual circumstances in which a hospital stay expected to last less than 2 midnights would nonetheless be appropriate for inpatient hospital admission and Part A payment. We indicated that we would explore other potential exceptions to the generally applicable benchmark and would detail any such rare and unusual circumstances in subregulatory guidance. As part of this process, throughout the year, we have accepted and considered suggestions from stakeholders on this topic.

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28170), we described the process for submitting suggestions regarding potential additional exceptions to the 2-midnight benchmark. Such suggestions may be sent to CMS via written correspondence or via email to SuggestedExceptions@cms.hhs.gov. As noted in the proposed rule, CMS will notify providers of any additional guidance regarding 2-midnight exceptions through subregulatory means, such as postings on the CMS Web site or manual instruction.

Although the FY 2015 IPPS/LTCH PPS proposed rule did not include any proposed regulatory changes relating to the 2-midnight benchmark, we nonetheless received a number of public comments regarding the current regulation. Commenters opined on the usefulness of the 2-midnight benchmark for making inpatient admission decisions and provided suggestions for improving the policy. During the summer and fall of 2014, CMS plans to evaluate the results of the “probe & educate” process (a process by which MACs are reviewing a prepayment, provider-specific probe sample of inpatient Part A claims for appropriateness of inpatient admission under the revised 2-midnight benchmark and providing provider-specific education, as necessary, to correct improper payments) and issue additional subregulatory guidance to our claim review contractors, if necessary, to ensure consistency in application of the 2-midnight policy. We will consider all suggestions as we develop this subregulatory guidance. We also will continue to maintain open communication with stakeholders to ensure that the inpatient classification and payment policies provide a uniform process for beneficiary treatment and claim submission.

P. Finalization of Interim Final Rule With Comment Period on the Extension of the Payment Adjustment for Low-Volume Hospitals and the Medicare-Dependent, Small Rural Hospital (MDH) Program for FY 2014 Discharges Through March 31, 2014

1. Background

In the interim final rule with comment period (IFC) that appeared in the Federal Register on March 18, 2014 (79 FR 15022) (hereinafter referred to as the March 2014 IFC), we implemented the extension of temporary changes to the payment adjustment for low-volume hospitals and the MDH program under the IPPS for FY 2014 discharges through March 31, 2014, in accordance with sections 1105 and 1106, respectively, of the Pathway for SGR Reform Act of 2013 (Pub. L. 113-67) enacted on December 26, 2013. In this final rule, we are providing a brief summary of the provisions of that IFC, responding to the public comments we received, and stating our final policy.

Section 1105 of the Pathway for SGR Reform Act extended changes to the payment adjustment for low-volume hospitals for an additional 6 months, through March 31, 2014, of FY 2014. Section 1106 of the Pathway for SGR Reform Act extended the MDH program for an additional 6 months, through March 31, 2014, of FY 2014. (As discussed previously in sections IV.D. and IV.G. of the preamble of this final rule, the provisions of the PAMA, enacted on April 1, 2014, further extended changes to the payment adjustment for low-volume hospitals and the MDH program for an additional year, through March 31, 2015.)

2. Summary of the Provisions of the Interim Final Rule With Comment Period

a. Extension of the Payment Adjustment for Low-Volume Hospitals

(1) Background

Section 1886(d)(12) of the Act provides for an additional payment to each qualifying low-volume hospital under the IPPS beginning in FY 2005. The regulations describing the payment adjustment for low-volume hospitals are at 42 CFR 412.101.

Sections 3125 and 10314 of the Affordable Care Act provided for a temporary change in the low-volume hospital payment policy for FYs 2011 and 2012. Section 605 of the American Taxpayer Relief Act of 2012 (ATRA) extended, for FY 2013, the temporary changes in the low-volume hospital payment policy provided for in FYs 2011 and 2012 by the Affordable Care Act. Prior to the enactment of the Pathway for SGR Reform Act, for FY 2014 (and subsequent years), the low-volume hospital qualifying criteria and payment adjustment returned to the statutory requirements under section 1886(d)(12) of the Act that were in effect prior to the amendments made by the Affordable Care Act and the ATRA. (As previously noted, the provisions of the PAMA, enacted on April 1, 2014, further extended changes to the payment adjustment for low-volume hospitals and the MDH program for an additional year, through March 31, 2015. The extension of the temporary changes to the low-volume hospital payment adjustment for FY 2014 discharges occurring on or after April 1, 2014 through September 30, 2014 was announced in a notice that appeared in the Federal Register on June 17, 2014 (79 FR 34444). The extension of the temporary changes to the low-volume hospital payment adjustment for FY 2015 discharges occurring on or after October 1, 2014 through March 31, 2015, is discussed in section IV.D. of the preamble of this final rule.)

The Affordable Care Act expanded the definition of low-volume hospital and modified the methodology for determining the payment adjustment for hospitals meeting that definition for FYs 2011 and 2012. In general, the amendments made by the Affordable Care Act modified the qualifying criteria for low-volume hospitals such that a hospital qualifies as a low-volume hospital if it is more than 15 road miles from another subsection (d) hospital and has less than 1,600 Medicare discharges during the fiscal year. In addition, the amendments made by the Affordable Care Act provide that the low-volume hospital payment adjustment (that is, the percentage increase) is determined “using a continuous linear sliding scale” that ranges from 25 percent for low-volume hospitals with 200 or fewer Medicare discharges in the fiscal year to 0 percent for low-volume hospitals with greater than 1,600 Medicare discharges. For additional information on the implementation of the temporary changes in the low-volume hospital payment policy provided by the Affordable Care Act, we refer readers to the FY 2011 IPPS/LTCH PPS final rule (75 FR 50238 through 50275) and the FY 2012 IPPS/LTCH PPS final rule (76 FR 51677 through 51680).
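For illustration, the continuous linear sliding scale described above can be sketched as the following function, which interpolates between a 25 percent adjustment at 200 or fewer Medicare discharges and a 0 percent adjustment at 1,600 Medicare discharges. The function and its name are illustrative assumptions only and are not the regulatory formula; the controlling provisions are those at 42 CFR 412.101.

def low_volume_adjustment(medicare_discharges):
    """Illustrative linear sliding scale: 25 percent at 200 or fewer Medicare
    discharges, declining linearly to 0 percent at 1,600 discharges."""
    if medicare_discharges <= 200:
        return 0.25
    if medicare_discharges >= 1600:
        return 0.0
    return 0.25 * (1600 - medicare_discharges) / 1400

# Hypothetical example: a hospital with 900 Medicare discharges.
print(round(low_volume_adjustment(900), 4))  # 0.125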

Section 605 of the ATRA extended the temporary changes in the low-volume hospital payment policy provided for in FYs 2011 and 2012 by the Affordable Care Act for FY 2013, that is, for discharges occurring before October 1, 2013. For additional information on the extension of the temporary changes in the low-volume hospital payment policy for FY 2013 as provided by the ATRA, we refer readers to the notice that appeared in the Federal Register on March 7, 2013 (78 FR 14689 through 14694). Additional information on the expiration of the temporary changes in the low-volume hospital payment policy for FYs 2011 through 2013 provided for by the Affordable Care Act and the ATRA can be found in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50610 through 50613).

(2) Summary of the Implementation of the Extension of the Low-Volume Hospital Payment Adjustment for FY 2014 (through March 31, 2014)

Section 1105 of the Pathway for SGR Reform Act extended the changes made by the Affordable Care Act and extended by the ATRA by amending sections 1886(d)(12)(B), (C)(i), and (D) of the Act. In the March 2014 IFC (79 FR 15023 through 15025), we amended the regulations text at 42 CFR 412.101 to make conforming changes to the qualifying criteria and the payment adjustment for low-volume hospitals according to the amendments made by section 1105 of the Pathway for SGR Reform Act as discussed in that rule.

To implement the extension of the temporary change in the low-volume hospital payment policy through the first half of FY 2014 (that is, for discharges occurring through March 31, 2014) provided for by the Pathway for SGR Reform Act, we updated the discharge data source used to identify qualifying low-volume hospitals and calculate the payment adjustment (percentage increase) for FY 2014 discharges occurring before April 1, 2014. This approach was consistent with the existing regulations at § 412.101(b)(2)(ii) and with our implementation of the changes in FYs 2011 and 2012 and the extension of those changes in FY 2013. Specifically, for FY 2014 discharges occurring before April 1, 2014, consistent with our historical policy, we established that qualifying low-volume hospitals and their payment adjustment are determined using Medicare discharge data from the March 2013 update of the FY 2012 MedPAR file, as these data were the most recent data available at the time of the development of the FY 2014 payment rates and factors established in the FY 2014 IPPS/LTCH PPS final rule. Table 14 of the March 2014 IFC (which is available only through the Internet on the CMS Web site at http://www.cms.hhs.gov/AcuteInpatientPPS/01_overview.asp) lists the “subsection (d)” hospitals with fewer than 1,600 Medicare discharges based on the March 2013 update of the FY 2012 MedPAR files and their FY 2014 low-volume payment adjustment (if eligible). However, that list of hospitals with fewer than 1,600 Medicare discharges in Table 14 does not reflect whether or not the hospital meets the distance criterion for FY 2014 discharges occurring before April 1, 2014.

We explained in the March 2014 IFC (79 FR 15024 through 15025) that in order to receive a low-volume hospital payment adjustment under § 412.101, in accordance with our previously established procedure, a hospital must notify and provide documentation to its MAC that it meets the distance criterion. We explained that the MAC would refer to the hospital's Medicare discharge data determined by CMS (as provided in Table 14) to determine whether or not the hospital meets the discharge criterion, and the amount of the payment adjustment for FY 2014 discharges occurring before April 1, 2014, once it is determined that the distance criterion has been met.

Consistent with our previously established procedure, we implemented a procedure for a hospital to request low-volume hospital status for FY 2014 discharges occurring before April 1, 2014. Specifically, we established that in order for the applicable low-volume percentage increase to be applied to payments for its discharges beginning on or after October 1, 2013 (that is, the beginning of FY 2014), a hospital must make its request for low-volume hospital status in writing and this request must be received by its MAC no later than March 31, 2014. We also stated that a hospital that qualified for the low-volume payment adjustment in FY 2013 may continue to receive a low-volume payment adjustment for FY 2014 discharges occurring before April 1, 2014 without reapplying if it continues to meet the Medicare discharge criterion based on the March 2013 update of the FY 2012 MedPAR data (shown in Table 14 of the March 2014 IFC), and the distance criterion. However, the hospital must send written verification that was received by its MAC no later than March 31, 2014, that it continued to be more than 15 miles from any other “subsection (d)” hospital. We noted that this procedure is similar to the policy we established when we implemented the extension of the temporary changes to the low-volume hospital payment adjustment for FY 2013 provided by the ATRA, as well as the procedure for a hospital to request low-volume hospital status for FYs 2011 and 2012 under the provisions of the Affordable Care Act.

b. Extension of the MDH Program

Section 1106 of the Pathway for SGR Reform Act of 2013 provided for a 6-month extension of the MDH program, effective from October 1, 2013 to March 31, 2014. Specifically, section 1106 of the Pathway for SGR Reform Act amended sections 1886(d)(5)(G)(i) and 1886(d)(5)(G)(ii)(II) of the Act by striking “October 1, 2013” and inserting “April 1, 2014”. Section 1106 of the Pathway for SGR Reform Act also made conforming amendments to sections 1886(b)(3)(D)(i) and 1886(b)(3)(D)(iv) of the Act.

In the March 2014 IFC (79 FR 15025 through 15027), we stated that, in general, as a result of the extension of the MDH program under the Pathway for SGR Reform Act, a provider that was classified as an MDH as of the September 30, 2013 expiration of the MDH program would be reinstated as an MDH effective October 1, 2013 through March 31, 2014, subject to the requirements of the regulations at § 412.108, with no need to reapply for MDH classification. In that same IFC, we amended the regulations at § 412.108(a)(1) and (c)(2)(iii) to reflect the statutory extension of the MDH program through March 31, 2014, as provided for by section 1106 of the Pathway for SGR Reform Act. We also discussed that, while generally hospitals that previously qualified for MDH status would be reinstated as an MDH retroactively to October 1, 2013, there were two situations where the effective date of MDH status may not have been retroactive to October 1, 2013 (that is, MDHs that were classified as SCHs on or after October 1, 2013, and MDHs that requested a cancellation of their rural classification under § 412.103(b)). We provided examples of various scenarios that illustrate how and when MDH status under section 1106 of the Pathway for SGR Reform Act would be determined for hospitals that were MDHs as of the September 30, 2013 expiration of the MDH program, subject to the timing considerations described in that IFC.

c. Summary of Public Comments, Responses, and Statements of Final Policies

We received approximately four timely pieces of correspondence in response to the March 2014 IFC.

Comment: Commenters generally supported CMS' implementation of the extension of the temporary changes to the payment adjustment for low-volume hospitals and the MDH program through March 31, 2014. However, they expressed concern that the March 31, 2014 deadline for hospitals to submit a written request for low-volume hospital status to the MAC did not allow a sufficient and reasonable period of time for hospitals to submit the documentation necessary to qualify for the low-volume payment adjustment during the 6-month extension. Therefore, the commenters urged CMS to extend this deadline to allow hospitals a minimum of 30 days to submit the documentation necessary to qualify for the low-volume payment adjustment for FY 2014 discharges through March 31, 2014.

Response: We appreciate the commenters' general support for our implementation of the extension of the temporary changes to the payment adjustment for low-volume hospitals and the MDH program through March 31, 2014. While we understand the commenters' concern regarding the time available for hospitals to request low-volume hospital status for FY 2014 discharges occurring before April 1, 2014, we note that, at this time, we are not aware of any hospitals that were unable to meet the March 31, 2014 deadline for hospitals to request the low-volume hospital payment adjustment for FY 2014 discharges occurring before April 1, 2014. Furthermore, as we stated in the March 2014 IFC, a hospital that qualified for the low-volume payment adjustment in FY 2013 did not need to reapply for FY 2014 if it continues to meet the applicable discharge and the distance criteria (that is, such a hospital did not have to resubmit a low-volume hospital request with supporting documentation to demonstrate that it continues to meet the distance criterion). Rather, such a hospital was only required to send written verification to its MAC that it continues to meet the distance criterion (that is, that it continues to be more than 15 miles from any other “subsection (d)” hospital) by the March 31, 2014 notification deadline. As in prior years, a short letter to the MAC stating that the hospital continues to meet the low-volume hospital distance criterion as documented in a prior low-volume hospital status request would be considered sufficient for this verification requirement.

For hospitals newly eligible for the low-volume hospital payment adjustment, in the March 2014 IFC, we included guidance, consistent with our previously established procedure, to help them focus their efforts in preparing these requests. Specifically, we stated that the use of a Web-based mapping tool, such as MapQuest, as part of documenting that the hospital meets the distance criterion for low-volume hospitals, is acceptable for the low-volume hospital request, along with providing other relevant information such as the name and street address of the nearest hospitals, location on a map, and distance from the hospital requesting low-volume hospital status. We also stated that the MAC may follow up with the hospital to obtain additional necessary information to determine whether or not the hospital meets the low-volume hospital status distance criterion.

Given the limited nature of the information required to satisfy the request and notification requirement, and the opportunity to provide additional information if needed, we believe that the March 31, 2014 deadline allowed for sufficient and reasonable time for hospitals to submit their requests. In addition, as we noted in the March 2014 IFC, the process for requesting and obtaining the low-volume hospital payment adjustment for FY 2014 discharges occurring before April 1, 2014 was similar to the policy we established when we implemented the extension of the temporary changes to the low-volume hospital payment adjustment for FY 2013 provided by the ATRA. For the reasons stated above, we are not adopting the commenters' request to allow hospitals a minimum of 30 days to submit the documentation necessary to qualify for the low-volume payment adjustment for FY 2014 discharges occurring before April 1, 2014.

Comment: One commenter opposed using Medicare discharge data from the March 2013 update of the FY 2012 MedPAR file (as listed in Table 14 of the March 2014 IFC) to assess the discharge criterion for low-volume hospital eligibility (that is, to determine if the hospital had fewer than 1,600 Medicare discharges) and to determine the amount of the payment adjustment for FY 2014 discharges occurring before April 1, 2014. The commenter believed that certain scenarios were not accounted for by using historical Medicare discharge data in the MedPAR file to prospectively determine low-volume hospital eligibility and payment. For example, a hospital that became an IPPS hospital (either as a newly participating hospital or through conversion from another provider type, such as a CAH) would not be included in the historical MedPAR discharge data; similarly, a hospital that did not meet the discharge criterion based on the historical Medicare discharge data in the MedPAR file might now have fewer than 1,600 Medicare discharges in the current year. The commenter requested that CMS modify its established policy of using historical MedPAR discharge data to determine if a hospital meets the discharge criterion to allow for scenarios such as the ones described above, and noted that CMS could develop a settlement procedure on the Medicare cost report for hospitals that did not have fewer than 1,600 Medicare discharges in the historical Medicare discharge data in the MedPAR file but have fewer than 1,600 Medicare discharges in the payment year.

Response: As explained in the March 2014 IFC (79 FR 15024), under the existing regulations at § 412.101(b)(2)(ii), for FYs 2011, 2012, and 2013, a hospital's Medicare discharges from the most recently available MedPAR data, as determined by CMS, are used to determine if the hospital meets the discharge criterion to receive the low-volume payment adjustment in the current year. When the low-volume hospital payment adjustment was initially implemented in FY 2005, we established a policy of using historical discharge data to determine if the hospital meets the discharge criterion to receive the low-volume payment adjustment in the current year. Prior to the temporary changes to the low-volume hospital payment adjustment policy under the amendments made by the Affordable Care Act, discharges from a prior cost reporting period were used to determine if the hospital qualified for the low-volume payment adjustment in the current year. We adopted the use of historical Medicare discharge data from the MedPAR files when we implemented the amendments made by the Affordable Care Act because MedPAR data are the most recent available data that provide the number of discharges for individuals that are entitled to or enrolled for Medicare Part A, as required by statute (75 FR 50241). The most recent Medicare discharge data are generally available in the MedPAR files before the corresponding Medicare discharges from the cost report data are available due to the established timeframes for completion and submission of the Medicare cost report. (We note that the MedPAR file contains only Medicare discharge information, and does not contain discharge information for non-Medicare patients. Therefore, hospital cost report data are the best available data source for total discharges under the discharge criterion in § 412.101(b)(2)(i).)

As we discussed when we initially implemented the low-volume hospital payment adjustment in the FY 2005 IPPS final rule (69 FR 49100 through 49101), if the determination of whether hospitals qualify for low-volume payment adjustments and the computation of the payment adjustment amount are based on the number of discharges in the current fiscal year, neither CMS nor the hospital will know with certainty whether a hospital qualifies for the adjustment, or what the amount of the adjustment would be, until after the end of the payment year (probably not until the time of final cost report settlement for the year). In such circumstances, CMS could be faced with the prospect of recouping large overpayments in some cases or reimbursing for large underpayments in others, and hospitals would face similar uncertainties. On the other hand, if these determinations are based on discharge counts from a prior fiscal year, hospitals will know in advance whether they will be receiving a payment adjustment and what the size of the adjustment will be, which provides fiscal stability by allowing both hospitals and CMS to plan accordingly. Therefore, we established that the count of discharges, for purposes both of meeting the qualifying definition of a low-volume hospital and determining the amount of the low-volume hospital payment adjustment, is based on the number of discharges occurring during the cost reporting period for the most recent submitted cost report. In that same final rule, we also recognized that this policy may temporarily disadvantage certain hospitals, such as the situations mentioned by the commenter. However, we believe that the fiscal stability provided under a policy based on historical data offsets any temporary disadvantage hospitals in such situations may experience until their historical data are used to meet the low-volume hospital payment adjustment discharge criterion in a future year, and for these reasons we believe a settlement process on the Medicare cost report is not needed. Therefore, we are not adopting the commenter's suggestion to modify our established policy of using historical MedPAR discharge data to determine if a hospital meets the low-volume hospital discharge criterion or to determine the amount of the low-volume hospital payment adjustment for FY 2014 discharges occurring before April 1, 2014.

After consideration of the public comments we received, we are finalizing all of the provisions set forth in the March 2014 IFC without modification. We note that the revisions to the low-volume hospital payment adjustment regulations at § 412.101 and the MDH program regulations at § 412.108 under the March 2014 IFC are superseded by the final conforming changes to these same regulatory provisions to reflect the subsequent extension of the changes to the qualifying criteria and the payment adjustment methodology for low-volume hospitals and the MDH program through March 31, 2015 under the PAMA. We refer readers to sections IV.D. and IV.G. of the preamble of this final rule, respectively, for more information on these final conforming changes.

For information on the estimated change in payments to IPPS hospitals in FY 2014 as a result of the implementation of sections 1105 and 1106 of the Pathway for SGR Reform Act, we refer readers to the regulatory impact section of the March 2014 IFC (79 FR 15028 through 15030).

Q. Finalization of Interim Final Rule With Comment Period Relating to Changes to Certain Cost Reporting Procedures for Medicare Disproportionate Share Hospital (DSH) Uncompensated Care Payments

1. Background

Section 3133 of the Patient Protection and Affordable Care Act, as amended by section 10316 of the same Act and section 1104 of the Health Care and Education Reconciliation Act (Pub. L. 111-152), added a new section 1886(r) to the Social Security Act (the Act) that modified the methodology for computing the Medicare disproportionate share hospital (DSH) payment adjustment beginning in FY 2014. We implemented section 1886(r) of the Act in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50620 through 50647). For a detailed discussion of the background on the reduction in DSH payments under section 1886(d)(5)(F) of the Act and the uncompensated care payment under section 1886(r) of the Act, we refer readers to section IV.F.3.a. of the preamble of this final rule.

Following the publication of the FY 2014 IPPS/LTCH PPS final rule, we issued an interim final rule with comment period (CMS-1599-IFC) in which we revised certain policies and processes described in the FY 2014 IPPS/LTCH PPS final rule. The interim final rule with comment period appeared in the Federal Register on October 3, 2013 (78 FR 61191 through 61197). In the interim final rule with comment period, we revised certain operational considerations for hospitals with Medicare cost reporting periods that span more than one Federal fiscal year and also made changes to the data that will be used in the uncompensated care payment calculation in order to ensure that data from Indian Health Service (IHS) hospitals are included in Factor 1 and Factor 3 of that calculation. We found that there was good cause to waive prior notice and comment and the delay in effective date with respect to the revisions discussed in the interim final rule with comment period (78 FR 61195 through 61196). Accordingly, the provisions of the interim final rule with comment period went into effect on October 1, 2013.

We received 12 timely pieces of correspondence in response to the interim final rule with comment period. Below we summarize the provisions of the interim final rule with comment period and the public comments we received, present our responses, and finalize the policies that were originally implemented in the interim final rule with comment period.

2. Summary of Provisions of the Interim Final Rule With Comment Period, Public Comments Received, Responses, and Finalized Policy

a. Operational Considerations for Hospitals With Medicare Cost Reporting Periods That Span More Than One Federal Fiscal Year

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50645), we finalized “a process to distribute interim uncompensated care payments under the IPPS on a per-discharge basis through our claims processing system, with a reconciliation of the hospitals' [uncompensated care] payments at cost report settlement to ensure that hospitals receive no more than the estimated amount included in this final rule”. We described that process as follows (78 FR 50646):

“[A]t cost report settlement, the . . . fiscal intermediary/MAC will issue a notice of program reimbursement that includes a determination concerning whether each hospital is eligible for empirically justified Medicare DSH payments and, therefore, eligible for uncompensated care payments in FY 2014 and each subsequent year. In the case where a hospital received interim payments for its empirically justified Medicare DSH payments and uncompensated care payments for FY 2014 or a subsequent year on the basis of estimates prior to the payment year, but is determined to be ineligible for the empirically justified Medicare DSH payment at cost report settlement, the hospital would no longer be eligible for either payment and CMS would recoup those monies. For a hospital that did not receive interim payments for its empirically justified Medicare DSH payments and uncompensated care payments for FY 2014 or a subsequent year, but at cost report settlement is determined to be eligible for DSH payments, the uncompensated care payment for such a hospital is calculated based on the Factor 3 value determined prospectively for that fiscal year. . . . The reconciliations at cost report settlement would be based on the values for Factor 1, Factor 2, and Factor 3 that we have finalized prospectively for a Federal fiscal year.”

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50646), we provided an example in which a DSH eligible hospital has a cost reporting period of January 1, 2014 through December 31, 2014. We stated that this hospital would receive interim payments for its uncompensated care payments beginning on October 1, 2013. For cost reporting purposes, we stated that the uncompensated care payments for federal FY 2014 would be assigned to cost reporting periods beginning on or after October 1, 2013, and would be reconciled on those cost reports. Thus, in the example of the hospital with a cost reporting period beginning on January 1, 2014, if the hospital remained eligible for empirically justified DSH payments at cost report settlement, it would receive its full FY 2014 uncompensated care payment on its cost report for the cost reporting period beginning on January 1, 2014. Although we acknowledged that it is possible to align interim and final payments for the uncompensated care payment with an individual hospital's cost reporting periods, we believed it would be administratively efficient and practical to pay the uncompensated care payment on the basis of the Federal fiscal year because that is how it is determined, and to reconcile that amount in the cost reporting period that begins in the respective Federal fiscal year. We stated in the final rule (78 FR 50647) that we believed this methodology would not delay the full payment of FY 2014 payments to hospitals with cost reporting periods that begin after October 1, 2013.

However, as we prepared to implement the FY 2014 IPPS/LTCH PPS final rule, several difficulties with this approach that we had not previously considered came to our attention. We initially proposed to make interim uncompensated care payments on a biweekly basis but, in response to comments, finalized a different process that makes interim uncompensated care payments on a per discharge basis. In addition to proposing and finalizing a process for making interim uncompensated care payments, we also proposed and finalized a reconciliation process under which the uncompensated care payment for a given fiscal year would be reconciled on the cost report for the cost reporting period beginning in that fiscal year (that is, for FY 2014, the cost report for the cost reporting period beginning in FY 2014). We proposed and finalized this approach because we believed it would be administratively efficient and practical. As indicated previously and in the FY 2014 IPPS/LTCH PPS final rule, we initially believed that this policy would neither delay nor substantially affect the disbursement of final uncompensated care payments; however, after the final rule was issued, we came to doubt these conclusions.

In the interim final rule with comment period, we stated that we had come to believe that the policy we adopted in the FY 2014 IPPS/LTCH PPS final rule was inconsistent with longstanding cost reporting requirements. As a general rule, payments for discharges are reported in the cost reporting period in which they occur, and all payments made for discharges during a cost reporting period are reconciled on the cost report for that period (PRM-I, Section 2805 and 42 CFR 412.1(a)). We did not specifically address or propose to change the cost reporting rules in either the FY 2014 IPPS/LTCH PPS proposed or final rules. However, for hospitals with cost reporting periods that were not concurrent with the Federal fiscal year, the policy adopted in the FY 2014 IPPS/LTCH PPS final rule departed from these cost reporting requirements by reconciling interim uncompensated care payments made for discharges occurring during the hospital's 2013 cost reporting period on the hospital's 2014 cost report. Under ordinary cost reporting requirements, those payments (having been made during the hospital's 2013 cost reporting period) would have to be treated as an overpayment on the hospital's 2013 cost report and therefore recouped. However, as finalized in the FY 2014 IPPS/LTCH PPS final rule, if the hospital was found to be eligible for DSH payments for its cost reporting period that began during FY 2014, we would pay the hospital its full FY 2014 uncompensated care payment during the settlement of the hospital's 2014 cost report (that is, we would repay the previously recouped uncompensated care payments when we reconciled the hospital's 2014 cost report). We stated that these administrative issues would effectively delay uncompensated care payments, frustrate our policy of making uncompensated care payments promptly, and would likely lead to serious cash flow difficulties for some hospitals. In summary, we did not believe the policy we finalized in the FY 2014 IPPS/LTCH PPS final rule of reconciling uncompensated care payments for hospitals with cost reporting periods that begin after October 1, 2013 would work as intended for the large majority of IPPS hospitals that have cost reporting periods that are not concurrent with the Federal fiscal year.

To effectuate a revised process, in the interim final rule with comment period, we sought to align final payments for the uncompensated care payment with each individual hospital's cost reporting periods and to reconcile interim uncompensated care payment amounts on the hospital's cost report for the proportion of the cost reporting period that overlaps a Federal fiscal year and in which the interim payments were made or should have been made. Therefore, the final uncompensated care payment amounts that would be included on a cost report spanning 2 Federal fiscal years would be the pro rata share of the uncompensated care payment associated with each Federal fiscal year. This pro rata share would be determined based on the proportion of the applicable Federal fiscal year that is included in that cost reporting period. We considered the same example from the FY 2014 IPPS/LTCH PPS final rule, where a hospital is estimated to be eligible for the empirically justified DSH payment and also an uncompensated care payment in FY 2014 and has a cost reporting period of January 1, 2014 through December 31, 2014. Under the revised process we adopted in the interim final rule with comment period, in that example, that hospital would still begin to receive interim payments for its uncompensated care on October 1, 2013. However, instead of having the entire FY 2014 payment reconciled on its cost report for the cost reporting period beginning on January 1, 2014 (which ends on December 31, 2014, and would therefore require the hospital to pay back monies received for the portion of its cost reporting period beginning on January 1, 2013, that occurs in Federal fiscal year 2014), we would reconcile the interim FY 2014 uncompensated care payments received for discharges from October 1, 2013 through December 31, 2013 on the hospital's cost report for the cost reporting period beginning on January 1, 2013 against a pro rata share of its FY 2014 uncompensated care payment. If this hospital were eligible for DSH on its cost report for the cost reporting period ending on December 31, 2013, it would receive a pro rata share of its FY 2014 uncompensated care payment. This pro rata share would be approximately three-twelfths (that is, the period of time from October 1, 2013 through December 31, 2013, divided by the period of time from January 1, 2013 through December 31, 2013) of the hospital's FY 2014 uncompensated care payment. If the hospital's subsequent cost reporting period is January 1, 2014 through December 31, 2014, we also would reconcile the interim FY 2014 uncompensated care payments received for discharges from January 1, 2014 through September 30, 2014 on the hospital's cost report for the cost reporting period beginning on January 1, 2014 against a pro rata share of its FY 2014 uncompensated care payment. We also would reconcile the interim FY 2015 uncompensated care payments received for discharges from October 1, 2014 through December 31, 2014 (that is, discharges occurring in FY 2015 during that hospital's cost reporting period) on the hospital's cost report for the cost reporting period beginning on January 1, 2014 against a pro rata share of its FY 2015 uncompensated care payment. 
Accordingly, for the hospital in this example, if it remained eligible for Medicare DSH on its cost report for the cost reporting period beginning on January 1, 2014, it would receive the sum of two pro rata shares of uncompensated care payments, one pro rata share equal to approximately nine-twelfths (that is, the period of time from January 1, 2014 through September 30, 2014 divided by the period of time from January 1, 2014 through December 31, 2014) of the hospital's FY 2014 uncompensated care payment and one pro rata share equal to approximately three-twelfths (that is, the period of time from October 1, 2014 through December 31, 2014 divided by the period of time from January 1, 2014 through December 31, 2014) of the hospital's FY 2015 uncompensated care payment.
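
For illustration only, the pro rata reconciliation described in this example can be restated as a minimal Python sketch. The uncompensated care payment amounts are hypothetical placeholders, and the nine-twelfths and three-twelfths fractions are the approximate month-based shares given in the example above.

# Hypothetical full-year uncompensated care payment amounts, for illustration only.
ucp_fy2014 = 1_200_000
ucp_fy2015 = 1_300_000

# Approximate month-based fractions from the example in the text: a calendar-year 2014
# cost reporting period overlaps FY 2014 for nine months (January through September 2014)
# and FY 2015 for three months (October through December 2014).
share_fy2014 = 9 / 12
share_fy2015 = 3 / 12

# Amount reconciled on the cost report for the period beginning January 1, 2014,
# assuming the hospital remains DSH eligible for that period.
reconciled = share_fy2014 * ucp_fy2014 + share_fy2015 * ucp_fy2015
print(reconciled)  # 1,225,000.0 under these assumed amounts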

Under the interim final rule with comment period, and in accordance with the policies we finalized in the FY 2014 IPPS/LTCH PPS final rule regarding eligibility for the uncompensated care payment, hospitals with cost reporting periods that span more than one Federal fiscal year will be eligible for the respective pro rata shares of their uncompensated care payment if they were eligible for DSH in that cost reporting period. If they were ineligible for DSH in that cost reporting period, they would be ineligible to receive the respective pro rata share of the uncompensated care payment for the respective Federal fiscal year (or years). We stated that we believed this approach remained fundamentally consistent with the policy we finalized in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50622) where we stated that “our final determination on the hospital's eligibility for uncompensated care payments would be based on the hospital's actual DSH status on the cost report for that payment year.” However, it avoided the cost reporting difficulties that would have arisen from the reconciliation process originally adopted in the final rule.

Comment: Several commenters supported the modifications to align uncompensated care payments based on the Federal fiscal year, instead of based on a hospital's cost reporting period. Commenters supported the change in policy such that the final uncompensated care payment amounts that would be included on a hospital's cost report that spans 2 Federal fiscal years will be the pro rata share of the uncompensated care payment associated with each Federal fiscal year.

Response: We appreciate the commenters' support.

Comment: One commenter recommended that when CMS reconciles uncompensated care payments on a pro rata basis based on the portion of a hospital's cost reporting period that falls in the Federal fiscal year, CMS prorate on a calendar month basis as opposed to calendar day basis for administrative simplicity.

Response: We appreciate the commenter's recommendation. Under the policy finalized in the interim final rule with comment period, we determine a pro rata share based on the proportion of the applicable Federal fiscal year that is included in that cost reporting period. We intend to establish the pro rata share on a calendar day basis, as opposed to a calendar month basis. We believe we can account more accurately for the uncompensated care payment amounts when we reconcile on a calendar day basis, because we can easily obtain the number of days in a hospital's cost reporting period from the hospital's Medicare Hospital Cost Report. Therefore, this process will not be administratively burdensome. Furthermore, we disagree that it would be administratively easier or simpler to prorate on a monthly basis, particularly in cases where a hospital's cost report may end in the middle of a month.
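
As an illustration of the calendar day proration described in this response, the following Python sketch counts the days of a Federal fiscal year that fall within a hypothetical cost reporting period that ends mid-month. The dates and payment amount are assumptions chosen for illustration; for a full 12-month cost reporting period, dividing by the length of the Federal fiscal year is essentially equivalent to the division by the length of the cost reporting period shown in the example earlier in this section.

from datetime import date

def days_inclusive(start, end):
    # Inclusive day count for a date range.
    return (end - start).days + 1

def overlap_days(a_start, a_end, b_start, b_end):
    # Inclusive day count of the intersection of two date ranges (zero if disjoint).
    start, end = max(a_start, b_start), min(a_end, b_end)
    return max((end - start).days + 1, 0)

# Hypothetical cost reporting period ending mid-month: July 15, 2013 through July 14, 2014.
crp = (date(2013, 7, 15), date(2014, 7, 14))
fy2014 = (date(2013, 10, 1), date(2014, 9, 30))

# Proportion of FY 2014 that is included in this cost reporting period, on a calendar day basis.
fy2014_share = overlap_days(*crp, *fy2014) / days_inclusive(*fy2014)

ucp_fy2014 = 1_200_000  # hypothetical FY 2014 uncompensated care payment amount
print(round(fy2014_share, 4), round(fy2014_share * ucp_fy2014, 2))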

b. Treatment of Indian Health Service Hospitals

In the FY 2014 IPPS/LTCH PPS final rule, we discussed the hospitals that are eligible to receive the uncompensated care payments under section 1886(r)(2) of the Act. Specifically, we stated (78 FR 50622) that the “new payment methodology under subsection (r) applies to ‘subsection (d) hospitals’ that would otherwise receive a ‘disproportionate hospital share payment . . . made under subsection (d)(5)(F).’ ” Therefore, eligibility for empirically justified Medicare DSH payments is unchanged under this new provision. Consistent with the law, hospitals must receive empirically justified Medicare DSH payments in FY 2014 or a subsequent year to receive an additional Medicare uncompensated care payment for that year.

In the FY 2014 IPPS/LTCH PPS final rule, we finalized our methodology for calculating the new uncompensated care payments. As we discussed in that final rule, section 1886(r)(2) of the Act provides that for each eligible hospital in FY 2014 and subsequent years, the new uncompensated care payment is the product of three factors. Factor 1 of that methodology is the “difference between our estimates of: (1) the amount that would have been paid in Medicare DSH payments for FY 2014 and subsequent years, in the absence of the new payment provision; and (2) the amount of empirically justified Medicare DSH payments that are made for FY 2014 and subsequent years, which takes into account the requirement to pay 25 percent of what would have otherwise been paid under section 1886(d)(5)(F) of the Act. In other words, this factor represents our estimate of 75 percent (100 percent minus 25 percent) of our estimate of Medicare DSH payments that would otherwise be made, in the absence of section 1886(r) of the Act, for FY 2014 and subsequent years” (78 FR 50627).

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50630), we finalized our proposal to use the most recently available estimates, as calculated by the CMS Office of the Actuary, to determine both the aggregate amount of empirically justified DSH payments under section 1886(r)(1) of the Act and the aggregate amount of payments that would otherwise have been made under section 1886(d)(5)(F) of the Act. In order to calculate these estimates, the Office of the Actuary used the March 2013 update of the Medicare Hospital Cost Report Information System (HCRIS) and the proposed rule's IPPS Impact file. The estimate excluded Maryland hospitals, SCHs paid under their hospital-specific rate, and hospitals in the Rural Community Hospital Demonstration Program, as these hospitals do not receive a Medicare DSH payment. The CMS Office of the Actuary's final estimate for Medicare DSH payments for FY 2014 without regard to the application of section 1886(r)(1) of the Act, was approximately $12.772 billion. The estimate for empirically justified Medicare DSH payments for FY 2014, with the application of section 1886(r)(1) of the Act, was approximately $3.193 billion. Factor 1 is the difference of these two estimates by our Office of the Actuary; therefore, in the FY 2014 IPPS/LTCH PPS final rule, we calculated Factor 1 to be approximately $9.579 billion.

IHS hospitals are subsection (d) hospitals that can receive empirically justified Medicare DSH payments under section 1886(r)(1) of the Act if they meet the eligibility requirements under subsection (d)(5)(F). Therefore, eligible IHS hospitals also receive the new uncompensated care payment under subsection (r)(2). However, following the issuance of the FY 2014 IPPS/LTCH PPS final rule, it came to our attention that, although IHS hospitals can receive Medicare DSH payments, they submit Medicare hospital cost reports to CMS that are not uploaded in the HCRIS database. Therefore, their Medicare DSH payments were not included in the estimates by our Office of the Actuary that were used to calculate Factor 1. We stated in the interim final rule with comment period that because IHS hospitals are eligible to receive Medicare DSH payments and the new uncompensated care payments, we believe it is inappropriate to exclude the Medicare DSH payments to IHS hospitals from the estimates used to calculate Factor 1. In addition, we acknowledged that we did not intend to finalize a policy that specifically excludes DSH payments to IHS hospitals from our estimate of Medicare DSH payments for purposes of calculating Factor 1 in the calculation of the uncompensated care payment.

Therefore, in the interim final rule with comment period, we revised the policy originally adopted in the FY 2014 IPPS/LTCH PPS final rule in order to change the data that will be considered in calculating Factor 1 for FY 2014 and subsequent years. Specifically, in addition to the March 2013 update of HCRIS, we will also consider cost report data provided by IHS hospitals to CMS as of March 2013. We also will recalculate Factor 1, to reflect the Office of the Actuary's estimate of Medicare DSH payments to IHS hospitals, based on these cost report data. With the inclusion of the Medicare DSH payments to IHS hospitals, our Office of the Actuary's revised estimate of Medicare DSH payments for FY 2014 without regard to the application of 1886(r)(1) of the Act was approximately $12.791 billion (this revised estimate also includes the correction for Factor 1 made in the correcting document for the FY 2014 IPPS/LTCH PPS final rule that also appeared in the Federal Register on October 3, 2013 (78 FR 61198)). The CMS Office of the Actuary's revised estimate of empirically justified Medicare DSH payments for FY 2014, with the application of section 1886(r)(1) of the Act, was approximately $3.198 billion (this revised estimate also includes the correction for Factor 1 made in the correcting document for the FY 2014 IPPS/LTCH PPS final rule (78 FR 61198)). Factor 1 is the difference of these two estimates of our Office of the Actuary; therefore, in the interim final rule with comment period, we recalculated Factor 1 to be approximately $9.593 billion (this revised estimate also includes the correction for Factor 1 made in the correcting document for the FY 2014 IPPS/LTCH PPS final rule (78 FR 61198)). We noted that, based on the recalculation of Factor 1, the amount available for uncompensated care payments for FY 2014 would be approximately $9.046 billion (our determination of Factor 2 as finalized in the FY 2014 IPPS/LTCH PPS final rule of 0.943 times our revised Factor 1 estimate of $9.593 billion).
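
The arithmetic behind the revised Factor 1 and the resulting amount available for FY 2014 uncompensated care payments can be restated as a short sketch (Python; the dollar figures are the rounded estimates quoted above, expressed in billions):

# Office of the Actuary estimates quoted above, in billions of dollars (rounded as published).
dsh_without_reduction = 12.791   # estimated FY 2014 DSH payments absent section 1886(r)(1)
empirically_justified = 3.198    # estimated FY 2014 empirically justified DSH payments (25 percent)

factor_1 = dsh_without_reduction - empirically_justified   # approximately 9.593
factor_2 = 0.943                                           # as finalized in the FY 2014 final rule

amount_available = factor_1 * factor_2                     # approximately 9.046
print(round(factor_1, 3), round(amount_available, 3))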

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50634 through 50643), we discussed the methodology used to calculate Factor 3 in the calculation of the uncompensated care payment. Under the final policy adopted in that final rule, for FY 2014 we determined a DSH hospital's Factor 3 as the sum of its Medicaid days and SSI days (numerator) relative to the total number of Medicaid days and SSI days for all DSH hospitals (denominator). We determined a hospital's SSI days based on the most recent SSI fraction. As we stated in the FY 2014 IPPS/LTCH PPS final rule, the most recent SSI fractions available for making this determination for FY 2014 were the FY 2011 SSI fractions. The FY 2011 SSI fractions for each subsection (d) hospital were published on the CMS Web site on June 27, 2013. In addition, under the final policy adopted in the FY 2014 IPPS/LTCH PPS final rule, we determine a hospital's Medicaid days based on the Medicaid days reported on the 2011, or if not available, the 2010 Medicare Hospital Cost Report, using the March 2013 update of HCRIS.

Because the cost reports submitted by IHS hospitals are not uploaded into HCRIS, we did not include their Medicaid days in our calculation of Factor 3. Specifically, Medicaid days for IHS hospitals were excluded from the numerator of Factor 3 for those IHS hospitals and from the denominator of Factor 3 for all hospitals. As a result, in the interim final rule with comment period, we indicated that we believed that the Factor 3 that was calculated for each IHS hospital under the policies adopted in the 2014 IPPS/LTCH PPS final rule, based only on FY 2011 SSI days, significantly understated the actual amount of uncompensated care furnished by these hospitals. The uncompensated care payment amounts calculated for these hospitals were also significantly lower than they would have been had these days been included. We were concerned that, under the policy originally adopted in the FY 2014 IPPS/LTCH PPS final rule, IHS hospitals that serve a significant low income population would be subject to the 75-percent reduction to their Medicare DSH payments under section 1886(r)(1) of the Act but would receive reduced uncompensated care payments under section 1886(r)(2) of the Act due to their cost reports not being included in the HCRIS database. Given that we intended to base our estimate of the uncompensated care provided by IHS hospitals, in part, on the care they provide to Medicaid patients, we believed it was appropriate to make a change to the data that are considered in determining Factor 3 of the new uncompensated care payment to allow the Medicaid days for IHS hospitals to be included. This change would also help to ensure that eligible IHS hospitals receive an uncompensated care payment that does not significantly understate the amount of uncompensated care they provide. Accordingly, in the interim final rule with comment period, we revised the policy adopted in the FY 2014 IPPS/LTCH PPS final rule to permit us to consider cost report data submitted to CMS as of March 2013 only by IHS hospitals in addition to data reflected in the March 2013 update of HCRIS, in calculating Factor 3 of the uncompensated care payment. The Medicaid days for IHS hospitals that are reflected in the cost report data would be included in the numerator of the Factor 3 calculation for IHS hospitals and would be included in the denominator of Factor 3 for all hospitals eligible to receive the uncompensated care payment.
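
A minimal sketch of the Factor 3 computation described above follows (Python). The hospitals and day counts are entirely hypothetical; the sketch simply shows how adding an IHS hospital's Medicaid days changes both that hospital's numerator and the denominator used for all hospitals.

# Hypothetical Medicaid and SSI day counts for three DSH-eligible hospitals; "IHS-A"
# stands in for an IHS hospital whose cost report data are added under this policy.
days = {
    "Hospital A": {"medicaid": 10_000, "ssi": 2_500},
    "Hospital B": {"medicaid": 6_000, "ssi": 1_200},
    "IHS-A": {"medicaid": 3_000, "ssi": 400},
}

# Denominator: total Medicaid plus SSI days for all hospitals eligible for the payment.
total_days = sum(h["medicaid"] + h["ssi"] for h in days.values())

# Factor 3 for each hospital: its Medicaid plus SSI days relative to the total.
factor_3 = {name: (h["medicaid"] + h["ssi"]) / total_days for name, h in days.items()}

# Each hospital's uncompensated care payment is its Factor 3 multiplied by the amount
# available for uncompensated care payments (Factor 1 times Factor 2), here the FY 2014
# figure noted earlier in this section.
amount_available = 9.046e9
print(round(factor_3["IHS-A"], 4), round(factor_3["IHS-A"] * amount_available, 2))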

Comment: Several commenters supported the change in policy to incorporate hospital cost report data for IHS hospitals that was not included in the HCRIS database in the calculation of Factor 1 and Factor 3. Commenters agreed that it was inappropriate to exclude cost report data for IHS hospitals from the calculation of Factor 1 and supported the inclusion of cost report data for these hospitals in the calculation of Factor 1, which represents the Secretary's estimate of 75 percent of Medicare DSH payments in FY 2014. In addition, commenters supported using IHS cost report data to identify Medicaid days to incorporate into the calculation of Factor 3 for these IHS hospitals. One commenter sought clarification of the definition of an IHS hospital in order to clarify what category of hospitals are subject to the policies finalized in the interim final rule with comment period. The commenters sought confirmation that an IHS hospital includes “any hospital operated by an Indian Tribe or Tribal health program carrying out IHS programs under the Indian Self-Determination and Education Assistance Act (ISDEAA).” In other words, the commenters sought clarification that IHS hospitals include facilities that are either owned or leased by IHS or are deemed by CMS to be IHS facilities because they are operated by an Indian Tribe or Tribal organization under the ISDEAA. The commenters also sought clarification that CMS will treat cost reports from all such qualifying hospitals in the same way that it treats IHS directly operated hospitals in determining the amount of uncompensated care payments.

Response: We appreciate the commenters' support of our policy change. An IHS hospital is defined under section 1880 of the Act as a “hospital or skilled nursing facility of the Indian Health Service, whether operated by such Service or by an Indian tribe or tribal organization (as those terms are defined in section 4 of the Indian Health Care Improvement Act) . . . .” Therefore, with regard to the policies in the interim final rule with comment period related to IHS hospitals, we do not make a distinction with respect to whether the hospital is owned or operated by the Indian Health Service or by an Indian tribe or tribal organization. Cost report data submitted both by hospitals operated by the IHS and by hospitals operated by an Indian tribe or tribal organization are excluded from HCRIS. Therefore, the policies described in the interim final rule with comment period regarding the use of cost report data submitted by IHS hospitals in order to calculate Factor 1 and Factor 3 apply to all IHS hospitals whether they are owned or operated by the Indian Health Service or by an Indian tribe or tribal organization.

Comment: Several commenters raised concerns that CMS did not adequately address mergers in the calculation of the uncompensated care payment. The commenters disagreed with CMS' treatment of mergers under the policy finalized in the FY 2014 IPPS/LTCH PPS final rule that if one DSH-eligible hospital merges with another DSH-eligible hospital, only the data associated with the surviving hospital is used to calculate the hospital's share of uncompensated care payments. Commenters asserted that the policy on mergers understates uncompensated care payments for merged providers and does not accurately reflect the merged hospital's uncompensated care costs.

Response: We thank the commenters for these comments. However, we consider these comments to be out of the scope of the policies addressed in the interim final rule with comment period. We discuss our policies regarding the treatment of mergers in the calculation of the Factor 3 in section IV.F. of the preamble of this FY 2015 IPPS/LTCH PPS final rule.

Comment: One commenter suggested that, for hospitals that had low-income insured days calculated using a cost report for a cost reporting period that was less than 12 months, CMS should use low-income insured days based on an older cost reporting period that was 12 months.

Response: We thank the commenter for this comment. However, we consider this comment to be out of the scope of the policies addressed in the interim final rule with comment period. We discuss our methodology to calculate Factor 3, including our treatment of short cost reporting periods, in section IV.F. of the preamble of this FY 2015 IPPS/LTCH PPS final rule.

Comment: One commenter addressed the calculation of the interim per claim uncompensated care payment amounts that are paid to hospitals projected to be eligible to receive DSH payments in a Federal fiscal year; the per claim amount is based on a hospital's total uncompensated care amount divided by the hospital's average number of claims from the most recent 3 years of data. Specifically, the commenter requested that CMS use the claims published in the case-mix files to calculate the 3-year average because, in the commenter's view, CMS' data source understates the average number of claims potentially resulting in an overpayment on a per claim basis. The commenter also recommended that CMS use a growth factor to account for new enrollees that may increase the number of claims in the calculation of the three year average number of claims. The commenter also noted the wide variation in per claim amounts from approximately $9 to $167,000 and requested that CMS place a cap on the per claim amount to minimize swings in cash flow at cost report settlement and because it did not make sense that Medicare or MA plans pay such a high amount.

Response: We thank the commenter for this comment. However, we consider the issues raised in the comment to be out of the scope of the policies addressed in the interim final rule with comment period. We received a similar comment on the FY 2015 IPPS/LTCH PPS proposed rule and address the comment in section IV.F. of the preamble of this FY 2015 IPPS/LTCH PPS final rule.

After consideration of the public comments we received, we are finalizing all of the provisions set forth in the interim final rule with comment period without modification. These policies allow final uncompensated care payments to be paid on the same schedule as Medicare DSH payments, so that both the uncompensated care payment and Medicare DSH payments will be paid and reconciled on a hospital's cost report based on the applicable Federal fiscal year(s), and allow information included in the cost reports submitted by IHS hospitals to be used in the calculation of Factor 1 and Factor 3. For information on the estimated change in payments to hospitals in FY 2014 as a result of the provisions set forth in the interim final rule with comment period, we refer readers to the regulatory impact statement in the interim final rule with comment period (78 FR 61197). We note that the impact of our decision to finalize the interim final rule with comment period is included in the regulatory impact statement in Appendix A of this FY 2015 IPPS/LTCH PPS final rule as part of the discussion of the estimated change in payments to hospitals in FY 2015 as a result of the policies regarding Medicare DSH payments and uncompensated care payments that we are adopting in this final rule.

V. Changes to the IPPS for Capital-Related Costs

A. Overview

Section 1886(g) of the Act requires the Secretary to pay for the capital-related costs of inpatient acute hospital services “in accordance with a prospective payment system established by the Secretary.” Under the statute, the Secretary has broad authority in establishing and implementing the IPPS for acute care hospital inpatient capital-related costs. The IPPS for capital-related costs was initially implemented in the Federal fiscal year (FY) 1992 IPPS final rule (56 FR 43358), in which we established a 10-year transition period to change the payment methodology for Medicare hospital inpatient capital-related costs from a reasonable cost-based methodology to a prospective methodology (based fully on the Federal rate).

FY 2001 was the last year of the 10-year transition period established to phase in the IPPS for hospital inpatient capital-related costs. For cost reporting periods beginning in FY 2002, capital IPPS payments are based solely on the Federal rate for almost all acute care hospitals (other than hospitals receiving certain exception payments and certain new hospitals). (We refer readers to the FY 2002 IPPS final rule (66 FR 39910 through 39914) for additional information on the methodology used to determine capital IPPS payments to hospitals both during and after the transition period.)

The basic methodology for determining capital prospective payments using the Federal rate is set forth in § 412.312 of the regulations. For the purpose of calculating capital payments for each discharge, the standard Federal rate is adjusted as follows:

(Standard Federal Rate) × (DRG Weight) × (Geographic Adjustment Factor (GAF)) × (COLA for hospitals located in Alaska and Hawaii) × (1 + Capital DSH Adjustment Factor + Capital IME Adjustment Factor, if applicable).

In addition, under § 412.312(c), hospitals also may receive outlier payments under the capital IPPS for extraordinarily high-cost cases that qualify under the thresholds established for each fiscal year.
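
For illustration, the per discharge calculation in the formula above can be sketched as follows (Python). Every input value is a hypothetical placeholder and is not drawn from the FY 2015 rates.

def capital_ipps_payment(standard_federal_rate, drg_weight, gaf,
                         cola=1.0, capital_dsh=0.0, capital_ime=0.0):
    # Per discharge capital payment following the formula above: rate x DRG weight x GAF
    # x COLA x (1 + capital DSH adjustment factor + capital IME adjustment factor).
    return (standard_federal_rate * drg_weight * gaf * cola
            * (1.0 + capital_dsh + capital_ime))

# Hypothetical inputs for illustration only.
payment = capital_ipps_payment(
    standard_federal_rate=430.00,  # placeholder; not the actual capital Federal rate
    drg_weight=1.50,
    gaf=1.02,
    cola=1.0,                      # COLA applies only to hospitals in Alaska and Hawaii
    capital_dsh=0.03,
    capital_ime=0.02,
)
print(round(payment, 2))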

B. Additional Provisions

1. Exception Payments

The regulations at § 412.348 provide for certain exception payments under the capital IPPS. The regular exception payments provided under §§ 412.348(b) through (e) were available only during the 10-year transition period. For a certain period after the transition period, eligible hospitals may have received additional payments under the special exceptions provisions at § 412.348(g). However, FY 2012 was the final year hospitals could receive special exceptions payments. For additional details regarding these exceptions policies, we refer readers to the FY 2012 IPPS/LTCH PPS final rule (76 FR 51725).

Under § 412.348(f), a hospital may request an additional payment if the hospital incurs unanticipated capital expenditures in excess of $5 million due to extraordinary circumstances beyond the hospital's control. Additional information on the exception payment for extraordinary circumstances in § 412.348(f) can be found in the FY 2005 IPPS final rule (69 FR 49185 and 49186).

2. New Hospitals

Under the capital IPPS, § 412.300(b) of the regulations defines a new hospital as a hospital that has operated (under previous or current ownership) for less than 2 years and lists examples of hospitals that are not considered new hospitals. In accordance with § 412.304(c)(2), under the capital IPPS a new hospital is paid 85 percent of its allowable Medicare inpatient hospital capital-related costs through its first 2 years of operation, unless the new hospital elects to receive full prospective payment based on 100 percent of the Federal rate. We refer readers to the FY 2012 IPPS/LTCH PPS final rule (76 FR 51725) for additional information on payments to new hospitals under the capital IPPS.

3. Hospitals Located in Puerto Rico

Section 412.374 of the regulations provides for the use of a blended payment amount for prospective payments for capital-related costs to hospitals located in Puerto Rico. Accordingly, under the capital IPPS, we compute a separate payment rate specific to Puerto Rico hospitals using the same methodology used to compute the national Federal rate for capital-related costs. In general, hospitals located in Puerto Rico are paid a blend of the applicable capital IPPS Puerto Rico rate and the applicable capital IPPS Federal rate. Capital IPPS payments to hospitals located in Puerto Rico are computed based on a blend of 25 percent of the capital IPPS Puerto Rico rate and 75 percent of the capital IPPS Federal rate. For additional details on capital IPPS payments to hospitals located in Puerto Rico, we refer readers to the FY 2012 IPPS/LTCH PPS final rule (76 FR 51725).

C. Annual Update for FY 2015

The annual update to the capital PPS Federal and Puerto Rico-specific rates, as provided for at § 412.308(c), for FY 2015 is discussed in section III. of the Addendum to this final rule.

We note that, in section II.D. of the preamble of this final rule, we present a discussion of the MS-DRG documentation and coding adjustment, including previously finalized policies and historical adjustments, as well as the recoupment adjustment to the standardized amounts under section 1886(d) of the Act that we are finalizing for FY 2015 in accordance with the amendments made to section 7(b)(1)(B) of Public Law 110-90 by section 631 of the ATRA. As we discussed in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28171), because section 631 of the ATRA requires CMS to make a recoupment adjustment only to the operating IPPS standardized amount, we are not making a similar adjustment to the national or Puerto Rico capital IPPS rates (or to the operating IPPS hospital-specific rates or Puerto Rico-specific standardized amount). This approach is consistent with our historical approach regarding the application of the recoupment adjustment authorized by section 7(b)(1)(B) of Public Law 110-90.

In section II.D.7. of the preamble of this final rule, we also note our discussion in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50747) of the possibility of applying an additional prospective adjustment to account for the cumulative MS-DRG documentation and coding effect through FY 2010. In that same final rule (78 FR 50515 through 50517 and 50747), we stated that if we were to apply an additional prospective adjustment for the cumulative MS-DRG documentation and coding effect through FY 2010, we believe the most appropriate additional adjustment is −0.55 percent. We did not apply an additional prospective adjustment in FY 2014 for the cumulative MS-DRG documentation and coding effect through FY 2010, consistent with the approach taken for the operating IPPS standardized amount (and hospital-specific rates). We continue to believe that if we were to apply an additional prospective adjustment for the cumulative MS-DRG documentation and coding effect through FY 2010, the most appropriate additional adjustment is −0.55 percent. However, we did not propose such an adjustment to the capital Federal rate in FY 2015, consistent with the approach taken for the operating IPPS standardized amount (and hospital-specific rates) as discussed in section II.D.7. of the preamble of this final rule. We will consider whether such an adjustment to the capital IPPS Federal rate is appropriate in future years' rulemaking.

VI. Changes for Hospitals Excluded From the IPPS

A. Rate-of-Increase in Payments to Excluded Hospitals for FY 2015

Certain hospitals excluded from a prospective payment system, including children's hospitals, 11 cancer hospitals, and hospitals located outside the 50 States, the District of Columbia, and Puerto Rico (that is, hospitals located in the U.S. Virgin Islands, Guam, the Northern Mariana Islands, and American Samoa) receive payment for inpatient hospital services they furnish on the basis of reasonable costs, subject to a rate-of-increase ceiling. A per discharge limit (the target amount as defined in § 413.40(a) of the regulations) is set for each hospital based on the hospital's own cost experience in its base year, and updated annually by a rate-of-increase percentage. For each cost reporting period, the updated target amount is multiplied by total Medicare discharges during that period and applies as an aggregate upper limit (the ceiling as defined in § 413.40(a)) of Medicare reimbursement for total inpatient operating costs for a hospital's cost reporting period. In accordance with § 403.752(a) of the regulations, RNHCIs also are subject to the rate-of-increase limits established under § 413.40 of the regulations discussed above.

As explained in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50747), beginning with FY 2006, we have used the percentage increase in the IPPS operating market basket to update the target amounts for children's hospitals, cancer hospitals, and RNHCIs. Consistent with §§ 412.23(g), 413.40(a)(2)(ii)(A), and 413.40(c)(3)(viii), we also have used the percentage increase in the IPPS operating market basket to update the target amounts for short-term acute care hospitals located in the U.S. Virgin Islands, Guam, the Northern Mariana Islands, and American Samoa. For the reasons explained in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50747), we proposed in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28171) to continue to use the percentage increase in the IPPS operating market basket to update the target amounts for children's hospitals, cancer hospitals, RNHCIs, and short-term acute care hospitals located in the U.S. Virgin Islands, Guam, the Northern Mariana Islands, and American Samoa for FY 2015 and subsequent fiscal years.

In addition, because we have revised and rebased the IPPS operating market basket to a FY 2010 base year, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28172) we proposed to continue to use the percentage increase in the FY 2010-based IPPS operating market basket to update these target amounts for FY 2015 and subsequent fiscal years. (We refer readers to the FY 2014 IPPS/LTCH PPS final rule (78 FR 50596 through 50603) for a further discussion of the revision and rebasing of the IPPS operating market basket to a FY 2010 base year.)

We did not receive any public comments on these proposals. Therefore, we are finalizing our proposals as discussed above and in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28171 through 28172). Accordingly, for FY 2015, the rate-of-increase percentage to be applied to the target amount for these children's hospitals, cancer hospitals, RNHCIs, and short-term acute care hospitals located in the U.S. Virgin Islands, Guam, the Northern Mariana Islands, and American Samoa is the FY 2015 percentage increase in the FY 2010-based IPPS operating market basket.

For the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28172), based on IHS Global Insight, Inc.'s 2014 first quarter forecast, we estimated that the FY 2010-based IPPS operating market basket update for FY 2015 was 2.7 percent (that is, the estimate of the market basket rate-of-increase). We indicated in the proposed rule that if more recent data became available for the final rule, we would use them to calculate the IPPS operating market basket update for FY 2015. For this FY 2015 IPPS/LTCH PPS final rule, based on IHS Global Insight, Inc.'s 2014 second quarter forecast (which is the most recent data available), we calculated the FY 2010-based IPPS operating market basket update for FY 2015 to be 2.9 percent. Therefore, the FY 2015 rate-of-increase percentage that is applied to the FY 2014 target amounts in order to calculate the final FY 2015 target amounts for children's hospitals, cancer hospitals, RNHCIs, and short-term acute care hospitals located in the U.S. Virgin Islands, Guam, the Northern Mariana Islands, and American Samoa is 2.9 percent, in accordance with the applicable regulations at 42 CFR 413.40.
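
The mechanics described in this section, updating the per discharge target amount by the applicable rate-of-increase percentage and applying the result as an aggregate ceiling, can be sketched as follows (Python). The target amount and discharge count are hypothetical; the 2.9 percent update is the figure finalized above.

# Hypothetical FY 2014 target amount (per discharge) for an excluded hospital.
fy2014_target_amount = 9_500.00

# FY 2015 rate-of-increase percentage finalized above: the 2.9 percent increase in the
# FY 2010-based IPPS operating market basket.
market_basket_update = 0.029
fy2015_target_amount = fy2014_target_amount * (1 + market_basket_update)

# The ceiling is the updated target amount multiplied by total Medicare discharges in the
# cost reporting period; Medicare reimbursement for inpatient operating costs is subject
# to this aggregate upper limit.
medicare_discharges = 1_200  # hypothetical
ceiling = fy2015_target_amount * medicare_discharges
print(round(fy2015_target_amount, 2), round(ceiling, 2))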

B. Report on Adjustment (Exceptions) Payments

Section 4419(b) of Public Law 105-33 requires the Secretary to publish annually in the Federal Register a report describing the total amount of adjustment payments made to excluded hospitals and hospital units by reason of section 1886(b)(4) of the Act during the previous fiscal year.

The process of requesting, adjudicating, and awarding an adjustment payment is likely to occur over a 2-year period or longer. First, generally, an excluded hospital must file its cost report for a fiscal year in accordance with § 413.24(f)(2). The MAC reviews the cost report and issues a notice of program reimbursement (NPR). Once the hospital receives the NPR, if its operating costs are in excess of the ceiling, the hospital may file a request for an adjustment payment. After the MAC receives the hospital's request in accordance with applicable regulations, the MAC or CMS, depending on the type of adjustment requested, reviews the request and determines whether an adjustment payment is warranted. This determination is sometimes not made until more than 180 days after the date the request is filed because applications are sometimes incomplete and additional information must be requested before the application is complete. However, to provide interested parties with the most recent data available, we are publishing data on adjustment payments that were processed by the MAC or CMS during FY 2013.

The table below includes the most recent data available from the MACs and CMS on adjustment payments that were adjudicated during FY 2013. As indicated above, the adjustments made during FY 2013 pertain only to cost reporting periods ending in years prior to FY 2012. Total adjustment payments made to excluded hospitals during FY 2013 were $1,829,578. For each class of hospitals, the table depicts, in the aggregate, the number of adjustment requests adjudicated, the excess operating costs over the ceiling, and the amount of the adjustment payments.

Class of hospital                                       Number   Excess cost over ceiling   Adjustment payments
Children's                                                   4                 $2,032,227            $1,182,011
Cancer                                                       0                        N/A                   N/A
Religious Nonmedical Health Care Institution (RNHCI)         3                 $1,056,142              $647,567
Total                                                                                                 $1,829,578

C. Updates to the Reasonable Compensation Equivalent (RCE) Limits on Compensation for Physician Services Provided in Providers (§ 415.70)

1. Background

Under section 1848 of the Act and 42 CFR Parts 414 and 415, medical or surgical services furnished by physicians to individual Medicare beneficiaries generally are billed and paid under Medicare Part B on a fee-for-service basis under the Medicare Physician Fee Schedule (MPFS). As required by section 1887(a)(2)(B) of the Act, the amount of allowable compensation for services furnished by physicians to providers that are paid by Medicare on a reasonable cost basis is subject to reasonable compensation equivalent (RCE) limits. Under these limits, Medicare recognizes as reasonable, for purposes of payment to the provider, the lower of the actual cost of the services furnished by the physician to the provider (that is, any form of compensation to the physician) or an RCE. The allowable compensation costs for physicians' services to a provider are described in § 415.55 of the regulations. Under § 415.60(a) of the regulations, for purposes of applying the RCE limits, physician compensation costs means monetary payments, fringe benefits, deferred compensation, and any other items of value (excluding office space and billing and collection services) that a provider or other organization furnishes a physician in return for the physician's services to the provider.

On March 2, 1983, we published a final rule in the Federal Register that codified regulations to implement section 1887(a)(2)(B) of the Act (currently at 42 CFR 415.70) and established the first set of RCE limits (48 FR 8902). In accordance with § 415.70(a)(2), RCE limits do not apply to the costs of physician compensation attributable to furnishing inpatient hospital services for which payment is made under the IPPS or to the costs of physician compensation attributable to approved GME programs that are payable under §§ 413.75 through 413.83 of the regulations. In addition, under § 415.70(a)(3), compensation that a physician receives for activities that may not be paid for under either Medicare Part A or Part B is not considered in applying these RCE limits. Furthermore, in accordance with § 413.70, RCE limits are not used in determining the reasonable costs that CAHs incur in compensating physicians for services furnished to the CAH.

The RCE limits apply equally to all physicians' services to providers that are payable on a reasonable cost basis under Medicare. If a physician receives any compensation from one or more providers for his or her services to the provider (that is, those services that benefit patients generally), payment to those providers for the costs of such compensation is subject to the RCE limits. The RCE limits are not applied to payment for services that are identifiable medical or surgical services to individual patients and paid under the MPFS, even if the physician agrees to accept compensation (for example, from a hospital) for those services. Payments to teaching hospitals that have elected cost reimbursement for their physicians' direct medical and surgical services in accordance with section 1861(b)(7) of the Act are subject to the RCE limits (68 FR 45458).

2. Overview of the Current RCE Limits

a. Application of the RCE Limits

Currently, we use the RCE limits to compute Medicare payments when a physician is compensated by a provider that is subject to the RCE limits. We also use these limits when the physician is compensated by any other provider-related organization for physician administrative, supervisory, and other services to the provider under Medicare. In applying the RCE limits, we compute the Medicare payments using information submitted on the cost report, and ensure that each compensated physician is assigned to the most appropriate specialty category. The current physician specialty categories for RCE limits are General/Family Practice, Internal Medicine, Surgery, Pediatrics, OB/GYN, Radiology, Psychiatry, Anesthesiology, Pathology, and Total. If there is no specific specialty category (for example, for an emergency room physician), we use the “Total” category, for which the RCE limits are calculated based on mean annual income data for all physicians.

If the physician's contractual compensation covers all duties, activities, and services furnished to the provider and, under a reassignment, all physicians' services furnished to individual patients of the provider, and the physician is employed by the provider full time, we use the RCE limit for the appropriate specialty, adjusted by the physician's allocation agreement (which reflects the percentage of total time spent performing services furnished to the provider) to arrive at the Medicare program's share of the provider's allowable physician compensation costs (§ 415.60). In the absence of an allocation agreement, we would assume that 100 percent of the compensation paid to the physician by the provider is related to physicians' services for which payment is made under the MPFS and that there are no allowable physician compensation costs to the provider (§ 415.60(f)(2)).

If a physician's compensation from the provider represents payment only for services that benefit patients generally (that is, the physician bills for all services furnished to individual patients), we use the appropriate specialty RCE limit. If a physician is employed by a provider to furnish services of general benefit to patients on other than a full-time basis, the RCE limit will be adjusted to reflect the hours the physician actually worked, as reported on the provider's cost report, relative to a full work year of 2,080 hours.
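
The two adjustments described above, the allocation agreement percentage for a full-time physician and the hours-based proration for a physician who is not full time, can be sketched as follows (Python). The RCE limit, allocation percentage, and hours worked are hypothetical values.

FULL_WORK_YEAR_HOURS = 2_080

# Hypothetical RCE limit for the physician's specialty category.
rce_limit = 250_000.00

# Full-time physician with an allocation agreement: the limit is adjusted by the
# percentage of total time spent furnishing services to the provider.
provider_allocation_pct = 0.40
allowable_full_time = rce_limit * provider_allocation_pct

# Physician furnishing services of general benefit on other than a full-time basis:
# the limit is prorated by hours actually worked relative to a 2,080-hour work year.
hours_worked = 1_040  # as reported on the provider's cost report (hypothetical)
allowable_part_time = rce_limit * (hours_worked / FULL_WORK_YEAR_HOURS)

print(allowable_full_time, allowable_part_time)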

b. Exceptions to the RCE Limits

Some providers such as small or rural hospitals may be unable to recruit or maintain an adequate number of physicians at a compensation level within the prescribed RCE limits. In accordance with section 1887(a)(2)(C) of the Act and § 415.70(e) of the regulations, if a provider can demonstrate to the MAC its inability to recruit or maintain physicians at a compensation level allowable under the RCE limits (as documented, for example, by unsuccessful advertising through national medical or health care publications), the MAC may grant the provider an exception to the RCE limits established under these rules. Such exceptions would allow the provider to be paid based on costs for compensation higher than the RCE limit.

c. Methodology for Establishing the RCE Limits

In the March 2, 1983 final rule with comment period (48 FR 8902), we published the initial RCE limits, along with the methodology used to calculate those limits, that were applicable to cost reporting periods beginning during CYs 1982 and 1983. As part of that same rule, we established regulations that outline our general authority to develop, publish, and apply RCE limits (currently at § 415.70). Section 415.70(b) of the regulations specifies that we establish the methodology for determining annual RCE limits, considering, to the extent possible, average physician incomes by specialty and type of location, using the best available data.

The methodology for establishing the initial RCE limits was based on the analysis contained in an internal working paper, “A Methodology for Determination of Reasonable FTE Compensation for Hospital-Based Physicians.” (Copies of this working paper are available on the CMS Web site at: http://www.cms.gov/Medicare-Fee-for-Service-Payment/AcuteInpatientPPS/index.html.) As outlined in this working paper, our methodology for establishing the initial reasonable levels of compensation includes the following five steps (for additional discussion of this methodology, we refer readers to the March 2, 1983 final rule with comment period (48 FR 8902)):

Cantwell, James R. and Sobaski, William J., A Methodology for Determination of Reasonable FTE Compensation for Hospital-Based Physicians, Working Paper No. OR-32, revised December 1982.

Step 1: We estimated the national average (mean) income for all physicians using 1979 physician net incomes from the American Medical Association (AMA) Periodic Survey of Physicians (PSP), published by the AMA in its Profile of Medical Practices, 1981.

Step 2: We projected physicians' 1979 base net income levels to the appropriate future year, using the Consumer Price Index for All Urban Consumers (CPI-U) to account for changes in net income levels occurring after the period for which we had data, and using forecasts of the CPI-U for future years.

Step 3: We determined the relationship between average net income for all physicians (estimated in the first step above) and net income of certain categories of specialist physicians that are commonly compensated by providers for services that generally benefit Medicare beneficiaries, resulting in separate specialty adjusters for nine physician specialties as well as the adjuster for the “Total” category.

Step 4: We also adjusted each of these specialty (including the “Total”) adjusters for differences in costs between types of geographic locations using Standard Metropolitan Statistical Areas (SMSAs) as defined by the Office of Management and Budget (OMB).

Step 5: Using the AMA PSP data, we calculated the average hours practiced per year for each specialty and location adjuster combination, which we then related to a standard full-time equivalent (FTE) work year of 2,080 hours. We used these ratios to weight the specialty-location adjusters from the previous step.
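
Schematically, and only as a rough reading of the five steps above, the limits amount to scaling a projected national mean income by a specialty adjuster, a location adjuster, and an hours-based weight. The following Python sketch uses placeholder values that are not taken from the AMA survey data, and it assumes for simplicity that each adjuster enters multiplicatively.

# Placeholder inputs; none of these values are actual 1979 AMA survey figures.
national_mean_income_1979 = 80_000.00  # Step 1: national mean net income, all physicians
cpi_u_projection_factor = 1.35         # Step 2: projection to the target year via the CPI-U
specialty_adjuster = 1.20              # Step 3: specialty income relative to all physicians
location_adjuster = 1.05               # Step 4: cost differences by SMSA category
hours_weight = 2_200 / 2_080           # Step 5: average hours practiced relative to 2,080 hours

rce_limit = (national_mean_income_1979
             * cpi_u_projection_factor
             * specialty_adjuster
             * location_adjuster
             * hours_weight)
print(round(rce_limit, 2))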

This same methodology was used to update the RCE limits published in a notice in the Federal Register on May 5, 1997 (62 FR 24483). These updated RCE limits were effective for cost reporting periods beginning on or after May 5, 1997.

For RCE limits established prior to January 1, 1998, we used the CPI-U to update the RCE limits. In a final rule with comment period published in the Federal Register on October 31, 1997 (62 FR 59075), we finalized a policy to use the Medicare Economic Index (MEI) to update the RCE limits (rather than the CPI-U), effective for cost reporting periods beginning on or after January 1, 1998. We adopted the MEI as the applicable update factor in order to achieve a measure of consistency in the methodologies used to determine payments to physicians for direct medical and surgical services furnished to individual patients and reasonable compensation levels for services that are of general benefit to a provider's patients. However, we did not update the RCE limits at that time.

In the FY 2004 IPPS final rule published in the Federal Register on August 1, 2003 (68 FR 45458), we published updated RCE limits that were effective for cost reporting periods beginning on or after January 1, 2004. We updated the RCE limits using the CPI-U to adjust the data to 1997, and the MEI to adjust the data from 1998 to 2004. In addition, we continued to adjust the RCE limits to account for differences in salary levels by location, as well as by specialty. For the location adjustment, we continued to base the geographical classifications of the providers on Metropolitan Statistical Areas (MSAs) (in the 1980s, the OMB changed the name used to describe metropolitan areas from SMSAs to MSAs, but the definition of MSAs differed only slightly from that of the previously used SMSAs).

3. Changes to the RCE Limits

In accordance with § 415.70(b), when establishing the methodology to determine the RCE limits, we consider, to the extent possible, the average physician incomes by specialty and type of location using the best available data. Since the initial RCE limits were developed, we have adjusted the RCE data to account for specialty and location (as discussed earlier in this section). In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28173), we proposed to use the most recent MEI data to update the RCE limits and to replace the RCE limits that have been in effect since January 1, 2004. We believed that doing so would enhance the accuracy of the RCE limits. In addition, for the reasons discussed below, we proposed to eliminate the location adjustment to the RCE data, while continuing to adjust the RCE limits by specialty. We did not propose changes to any of the other existing policies with respect to the application of and exceptions to the RCE limits.

We invited public comments on our proposals to update the RCE limits and to eliminate the location adjustment for the RCE limits for cost reporting periods beginning on or after January 1, 2015. In addition, we invited public comments on our proposal to revise § 415.70(b) of the regulations to eliminate consideration of the type of location as part of the methodology to establish RCE limits for cost reporting periods beginning on or after January 1, 2015.

Comment: One commenter expressed appreciation that CMS is updating the RCE limits and suggested that CMS update the RCEs on an annual basis. The commenter also requested that the proposed RCEs be effective for cost reporting periods beginning on or after January 1, 2014, instead of cost reporting periods beginning on or after January 1, 2015.

Response: We will continue to review the RCE limits on a regular basis by applying the most recent economic index data and publish updates as necessary. We plan to keep the proposed effective date for the updated RCEs, as we do not believe it would be appropriate in this situation to make this provision retroactively effective.

As discussed in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28173 through 28175), in establishing the initial and subsequently updated RCE limits, we included an adjustment to account for differences in salary levels based on the location of the provider using geographic classifications based on the MSAs as defined by the OMB. We assigned an appropriate MSA designation based on the State/county in which the provider is located. We included a table in each of the previous RCE limit notices and rules, whereby each MSA designation was grouped into one of three categories: Metropolitan areas with a population greater than 1 million, metropolitan areas with a population less than 1 million, and nonmetropolitan areas. The MSA designation of the provider is then used to identify the appropriate RCE limit.

To update the current RCE limits by location under the current methodology, we would need to use, as in past updates, the MSA designations that correspond with the update period. However, the OMB has not updated or used MSAs since 2003. We considered continuing to use the MSA designations, as we have in the past, but we would have no way to account for shifts in populations among MSAs because the OMB no longer updates geographic classifications based on MSA designations. The OMB regularly updates the geographic definitions, and the counties included in each area, to account for population shifts due to migrations, birth, and death rates, but currently the OMB uses Core-Based Statistical Area (CBSA) designations rather than MSAs. If we were to continue to use the MSA designation, providers could potentially be underpaid or overpaid if the population of their MSA changed significantly since 2004. Therefore, we determined that, because the MSA designations are no longer updated, it would not be appropriate to continue using the previous location adjustment methodology. The most recent geographic delineations used by the OMB are CBSAs, a term used to refer to both Metropolitan and Micropolitan Statistical Areas. However, CBSA delineations do not match the MSA definitions that were used to develop the initial and subsequently updated RCE limits. As noted above, we have used the AMA PSP data to develop previous and current RCE limits. The AMA PSP data were collected from 1970 to 1980 and included physicians' income, hours worked, and MSA-based population information. The data that have been used to develop and update the RCE limits were developed using MSAs as the geographic unit. It is not possible to exactly crosswalk the MSA designations to the CBSA designations in order to update the RCE limits using the current location adjustment methodology. Even if it were possible to crosswalk the MSAs to the CBSAs, it would not be appropriate to use the MSA-based AMA PSP data to develop CBSA-based RCE limits. There have been significant changes in the populations of the MSA-based locations contained in the AMA PSP data that could not be translated into CBSAs. As such, those data would no longer be valid as the basis to develop RCE limits based on CBSAs.

The OMB has cautioned users about using the new CBSA designations. For instance, in OMB's 2010 “Standards for Delineating Metropolitan and Micropolitan Statistical Areas (CBSAs)” published on June 28, 2010 in the Federal Register (75 FR 37246), OMB states:

“OMB establishes and maintains these areas solely for statistical purposes. In reviewing and revising these areas, OMB does not take into account or attempt to anticipate any public or private sector nonstatistical uses that may be made of the delineations. These areas are not designed to serve as a general-purpose geographic framework applicable for nonstatistical activities or for use in program funding formulas.

“Furthermore, the Metropolitan and Micropolitan Statistical Area Standards do not produce an urban-rural classification, and confusion of these concepts can lead to difficulties in program implementation. Counties included in Metropolitan and Micropolitan Statistical Areas and many other counties may contain both urban and rural territory and populations. . . . OMB urges agencies, organizations, and policy makers to review carefully the goals of nonstatistical programs and policies to ensure that appropriate geographic entities are used to determine eligibility for the allocation of Federal funds.” (Emphasis in original.)

For CMS to accurately update the location-adjusted RCE limits using the CBSAs, as we stated in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28174), we believe it would be necessary to use a new data source for information on physician salaries, specialties, location, and hours worked; and the data would need to be allocated to different geographic areas based on CBSAs. The AMA PSP collected data from a large sample of office-based physicians. We considered using data that are currently collected and publicly available. We could not find a reliable dataset that contained all of the data elements needed to update the location-adjusted RCE limits based on CBSAs. The most reliable data we could find came from the Bureau of Labor Statistics (BLS) Occupational Employment Statistics (OES). The BLS OES data are collected annually and capture a large and diverse population of physicians and corresponding CBSAs. We believe the BLS OES data are the most current, reliable source of income data for physicians. Although the BLS OES is very reliable and collects data points for physician specialties, salary, and location, it does not collect detailed information for all 10 specialties; the “Radiology” and “Pathology” specialties are not separately captured. As such, we did not believe it was appropriate to use the BLS OES data to create updated RCE limits when data would not be available for two of the specialties.

We also weighed the benefit of collecting updated information from physicians (through use of a new nationwide survey) in order to obtain the data necessary for application of an appropriate locality adjustment based on CBSAs against the burden placed on such physicians in providing such data. In order to have a dataset that could accurately capture all the necessary information, we would need to collect data from a large population of physicians, including a sufficient sample size for each physician specialty in each CBSA. We weighed the burden that such a nationwide survey would entail for all physicians, including office-based physicians, to be asked to respond to an in-depth survey regarding their salary, specialty, location, hours worked, and other practice information against the benefit of using updated, CBSA-based information to include a location adjustment for the providers that are subject to the RCE limits.

When the RCE limits were developed in 1983, other than inpatient acute care hospitals paid under the IPPS, most provider types were reimbursed on a reasonable cost basis. Since then, providers such as skilled nursing facilities (SNFs), long-term care hospitals (LTCHs), inpatient rehabilitation facilities (IRFs), inpatient psychiatric facilities (IPFs), and home health agencies (HHAs) that previously were paid on a reasonable cost basis have transitioned to prospective payment systems and are no longer subject to the RCE limits. As of FY 2011 (the most recent cost report year for which we have complete data), our data show that there were only 59 children's hospitals and cancer hospitals and 46 teaching hospitals (that have elected cost reimbursement for their physicians' direct medical and surgical services) that are subject to the RCE limits. As such, we believe the benefit that could be gained by gathering the new data that would be necessary to maintain a location adjustment for the RCE limits is outweighed by the burden of conducting such a comprehensive survey of physicians.

Furthermore, we analyzed how the elimination of the location adjustment would affect the accuracy and appropriateness of the proposed RCE limits. To perform this analysis, we needed a reliable source of physician income data (without a location adjustment) that could be compared to the RCE limits without a location adjustment. We determined that the best available source of physician income data is the mean annual income data for similar RCE physician specialties collected by the BLS OES. As mentioned above, the BLS OES data are collected annually and capture a large and diverse population of physicians. These data are the most current, reliable source of income data by physician specialty. In addition, when comparing salaries, it is important to compare salary amounts that reflect the same number of hours worked per year. Because many physicians do not work a 2,080-hour work year, their salaries may appear higher or lower depending on the number of hours actually worked. The RCE limits are based on physicians who worked a 2,080-hour work year. The BLS OES data also are based on a 2,080-hour work year; therefore, we believe that comparing the RCE limits to these BLS OES data is appropriate for purposes of our analysis.

We performed an analysis comparing RCE limits for 2012, calculated without a location adjustment and solely for purposes of the analysis, to the most recently published (at the time of the analysis) BLS OES physician mean annual income data for the same year, to determine whether RCE limits based on the AMA PSP data, but without a location adjustment, would continue to reasonably reflect mean annual physician income data. For 2012, the BLS OES had income information for 8 of the 10 RCE specialties, which include the “Total” category; the BLS OES data did not capture the “Radiology” and “Pathology” specialties. We searched for another reliable data source for “Radiology” and “Pathology” but we could not find one with sufficient data elements to compare with the RCE limits. We used the MEI to update the RCE limits for these eight specialties to 2012 without including the location factor. We then compared these 2012 RCE limits to the 2012 BLS OES data for these same eight specialties. As shown in the table below, we found that the RCE limits ranged from 10.41 percent above the BLS OES mean annual income data to 3.58 percent below the BLS OES data. Only three of the eight specialties had RCE limits slightly less than the then-current BLS OES mean annual wages for their specialty. The remaining five specialties had RCE limits above the current BLS OES mean annual wages for the specialties.

Analysis Chart

Specialty                   RCE Limits updated to 2012*   BLS OES Mean 2012 annual wage   Percent difference
Total                       $206,300                      $184,820                        10.41%
General/Family Practice     174,600                       180,850                         −3.58%
Internal Medicine           192,700                       191,520                         0.61%
Surgery                     240,300                       230,540                         4.06%
Pediatrics                  165,500                       167,640                         −1.29%
OB/GYN                      231,200                       216,760                         6.25%
Radiology                   265,200                       N/A                             N/A
Psychiatry                  176,800                       177,520                         −0.41%
Anesthesiology              233,500                       232,820                         0.29%
Pathology                   253,900                       N/A                             N/A
* These limits were calculated using the methodology only for purposes of this impact analysis.
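
As a point of reference, the percent differences shown in the chart are consistent with computing each gap relative to the updated RCE limit. The following minimal sketch reproduces that comparison using the figures from the chart; the two specialties without BLS OES data are omitted.

```python
# Minimal sketch reproducing the "Percent difference" column of the analysis
# chart above. The published percentages are consistent with using the updated
# RCE limit as the denominator.
data = {
    # specialty: (RCE limit updated to 2012, BLS OES mean 2012 annual wage)
    "Total": (206_300, 184_820),
    "General/Family Practice": (174_600, 180_850),
    "Internal Medicine": (192_700, 191_520),
    "Surgery": (240_300, 230_540),
    "Pediatrics": (165_500, 167_640),
    "OB/GYN": (231_200, 216_760),
    "Psychiatry": (176_800, 177_520),
    "Anesthesiology": (233_500, 232_820),
}

for specialty, (rce, bls) in data.items():
    pct_diff = (rce - bls) / rce * 100
    print(f"{specialty}: {pct_diff:.2f}%")   # e.g. Total: 10.41%
```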

The RCE amounts updated to 2012 and the BLS OES numbers for 2012 varied only slightly, and in most cases, the RCE limit was higher than the BLS OES mean annual wage. Based on this analysis, as we stated in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28175), we believe that RCE limits calculated using the AMA PSP data, and our proposed elimination of the location adjustment for the updated RCE limits, would result in RCE limits that are a reasonable reflection of mean annual physician income and would continue to ensure that providers subject to the RCE limits are paid in a fair and accurate manner.

Because there are a relatively small number of providers currently affected by the RCE limits and because, as discussed above, we believe the revised RCE limits without a location adjustment would continue to ensure appropriate payment to such providers, we believe that eliminating the location adjustment would have a minimal overall effect on providers subject to the RCE limits and on the industry as a whole.

For the reasons discussed above, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28175), we proposed to eliminate the location adjustment under the RCE limit methodology, and to revise § 415.70(b) of the regulations to remove consideration of the “type of location” as part of the methodology used to establish RCE limits.

Comment: One commenter suggested CMS work with the BLS to obtain the information needed to calculate the RCE limits with a location adjustment. One commenter suggested that CMS develop an alternative method of establishing a location adjustment.

Response: We plan to evaluate the BLS Occupational Employment Statistics and any other alternative data sources to further determine if a location adjustment is a viable option for future RCE updates.

Comment: A few commenters suggested that CMS keep the location adjustment as part of the RCE limits. They stated that location-adjusted RCE limits continue to be important in capturing accurate physician salary costs for all providers because all hospitals apply the RCE limits to physician salaries on Worksheet A-8-2 of the Medicare cost report. A few commenters expressed concern over the accuracy of costs, such as GME costs, that would result from applying RCE limits without a location adjustment.

Response: RCE limits currently have a payment impact on 105 Medicare providers, including 8 cancer hospitals, 51 children's hospitals and 46 electing teaching amendment (ETA) hospitals that elected cost reimbursement for their physicians' direct medical and surgical services. While it is true that all hospitals use the RCE limits on Worksheet A-8-2, for hospitals paid under the IPPS, the application of the RCE limits on Worksheet A-8-2 does not have a Medicare payment impact. Specifically, Worksheet C that is used for payment purposes calculates cost-to-charge ratios for IPPS hospitals using data prior to the application of the RCE limits on Worksheet A-8-2. Therefore, RCE limits have no effect on payments to providers paid under the IPPS. For the 46 ETA hospitals, Worksheet D-5 is used to apply the RCE limits to determine the proper payment on a reasonable cost basis of direct medical and surgical services of the physician. Given the current limitations of the location designation data described in the proposed rule, we believe it is appropriate to eliminate the location adjustment to the RCE limits. Based on the analysis discussed above and in the proposed rule, we believe that the RCE limits calculated without a location adjustment are a reasonable reflection of mean annual physician income and will continue to ensure that providers subject to the RCE limits are paid in a fair and accurate manner. Because of this, and because the RCE limits impact a relatively small number of providers, we believe that eliminating the location adjustment will have a minimal overall effect on providers subject to the RCE limits and on the industry as a whole. While a few commenters expressed concern over the accuracy of GME costs, we note that, under § 415.70(a)(2) of the regulations, RCE limits do not apply to costs of physician compensation attributable to approved GME programs that are payable under §§ 413.75 through 413.83.

After consideration of the public comments we received, in this final rule, we are adopting as final the proposed methodology for establishing the RCE limits. We are setting forth the final updated RCE limits on the amount of allowable compensation for services furnished by physicians to providers (and for ETA hospitals, for direct medical and surgical services of physicians) for cost reporting periods beginning on or after January 1, 2015. To calculate these final RCE limits, we used the same methodology that was used to calculate the original and previous updates to the RCE limits, but did not apply an adjustment based on geographical classification. As noted earlier, this methodology was derived from the 1982 working paper. We used the mean physician income by specialty from that working paper to calculate the RCE limits without adjusting for geographical classification. We then updated these data by the CPI-U (from 1982 to 1997) and then by the MEI (from 1998 to 2015) to compute the updated RCE limits. The RCE limits implemented by this final rule vary slightly from those in the proposed rule due to a more recent estimate of the MEI for 2015.
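
The update chain described above (working-paper mean income by specialty, carried forward by the CPI-U through 1997 and by the MEI from 1998 through 2015) amounts to a cumulative product of annual update factors. The sketch below illustrates only that computation; the base income and the index factors are hypothetical placeholders, not the published CPI-U or MEI values.

```python
# Illustrative sketch of the update chain used to compute the final RCE limits:
# a 1982 working-paper mean income, updated by the CPI-U through 1997 and by
# the MEI from 1998 through 2015. The factors below are hypothetical
# placeholders, not the actual published CPI-U or MEI update factors.

def updated_rce_limit(base_income_1982, cpiu_factors, mei_factors):
    """Apply annual CPI-U factors (1982-1997), then MEI factors (1998-2015)."""
    limit = base_income_1982
    for factor in cpiu_factors:   # e.g. 1.030 for a 3.0 percent CPI-U increase
        limit *= factor
    for factor in mei_factors:    # e.g. 1.018 for a 1.8 percent MEI update
        limit *= factor
    return limit

# Hypothetical example with flat 3.0 percent CPI-U and 1.8 percent MEI updates.
print(round(updated_rce_limit(85_000, [1.030] * 15, [1.018] * 18)))
```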

The chart below sets forth the final updated RCE limits on the amount of allowable compensation for services furnished by physicians to providers for cost reporting periods beginning on or after January 1, 2015, established using the same methodology that was used to calculate the original and previous updates to the RCE limits, but not applying an adjustment based on geographical classification.

Final CY 2015 RCE Limits

Specialty                   CY 2015 RCE limit
Total                       $211,500
General/Family Practice     179,000
Internal Medicine           197,500
Surgery                     246,400
Pediatrics                  169,700
OB/GYN                      237,100
Radiology                   271,900
Psychiatry                  181,300
Anesthesiology              239,400
Pathology                   260,300

In addition, we are adopting as final our proposed revision of § 415.70(b) of the regulations to eliminate consideration of the type of location as part of the methodology to establish RCE limits for cost reporting periods beginning on or after January 1, 2015.

D. Critical Access Hospitals (CAHs)

1. Background

Sections 1820 and 1861(mm) of the Act, as amended by section 4201 of the Balanced Budget Act (BBA) of 1997, replaced the Essential Access Community Hospitals and Rural Primary Care Hospitals (EACH/RPCH) program with the Medicare Rural Hospital Flexibility Program (MRHFP), under which a qualifying facility can be designated as a CAH. CAHs participating in the MRHFP must meet the conditions for designation by the State and be certified by the Secretary in accordance with section 1820 of the Act. Further, in accordance with section 1820(e)(3) of the Act, a CAH must meet other criteria that the Secretary specifies.

The regulations that govern the conditions of participation (CoPs) for CAHs under the statutory requirements of section 1820 are codified at 42 CFR Part 485, Subpart F.

2. Proposed and Final Policy Changes Related to Reclassification as Rural for CAHs

Under section 1820(c)(2)(B)(i) of the Act, a facility is eligible for designation as a CAH only if it is located in a county or equivalent unit of local government in a rural area (as defined in section 1886(d)(2)(D) of the Act), or is being treated as being located in a rural area in accordance with section 1886(d)(8)(E) of the Act. The regulations implementing this location requirement are located at § 485.610(b). The regulations governing the process for a facility located in an urban area to apply for reclassification as a rural facility under section 1886(d)(8)(E) of the Act are located at § 412.103.

As discussed in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28054 through 28064), we proposed to implement the most recently published OMB delineations announced in OMB Bulletin No. 13-01. (We refer readers to section III.B. of the preamble of this final rule for a discussion of our final decision to implement the new OMB delineations announced in OMB Bulletin No. 13-01.) As previously stated, a facility must be located in a rural area in order to be eligible for designation as a CAH. Therefore, a new OMB delineation that redesignates an area from rural to urban affects the status of a facility that is currently a CAH and had met the CAH location requirements prior to implementation of the new OMB delineation. A facility that is located in an urban area cannot remain a CAH unless it is reclassified as rural under § 412.103 of the regulations. In both the FY 2005 IPPS final rule (69 FR 49221 through 49222 and 69 FR 60242 and 60252) and the FY 2010 IPPS/LTCH PPS final rule (74 FR 43939 through 43940), we amended the regulations at § 412.103(a) and § 485.610(b) to provide for a transition period during which CAHs that had previously been located in rural areas but, as a result of new OMB delineations, were now located in urban areas, could reclassify as rural under § 412.103. Specifically, in both the FY 2005 IPPS final rule and the FY 2010 IPPS/LTCH PPS final rule, we provided for a 2-year period during which a CAH located in an urban area as a result of the new OMB delineations could continue participating without interruption as a CAH, thereby allowing the CAH sufficient time to reclassify as rural under § 412.103. If the facility did not reclassify as a rural facility by the end of that 2-year period, the CAH would not be able to retain its CAH status beyond that 2-year period. However, under the FY 2005 IPPS final rule and the FY 2010 IPPS/LTCH PPS final rule, the application of the regulation was limited to October 1, 2004 through September 30, 2006, and October 1, 2009 through September 30, 2011, respectively. As a result, in the absence of a new amendment to the regulations each time there are new OMB delineations, a CAH that becomes located in an urban area as a result of those OMB delineations would not be given 2 years to reclassify as rural under § 412.103 of the regulations.

In the FY 2010 IPPS/LTCH PPS final rule (74 FR 43940), we stated that we would consider whether it would be appropriate to propose, in future IPPS rulemaking, to revise § 485.610 and § 412.103 to provide for a transition period any time a CAH that was formerly located in a rural area is designated as being located in an urban area as a result of the redesignation of its county from rural to urban. In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28176), we stated that after further consideration, we believe that it is appropriate to propose to change the regulations to provide for a transition period that is not restricted to a timeframe, but rather can be applied any time a facility that is currently designated as a CAH becomes located in an urban area as a result of a new OMB delineation.

Therefore, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28176), we proposed that, effective October 1, 2014, a CAH that was previously located in a rural area but is now located in an urban area as a result of a new OMB labor market area delineation will continue to be treated as rural for 2 years from the date the OMB delineation is implemented. Accordingly, we stated in the proposed rule that if the OMB delineations announced in OMB Bulletin No. 13-01 on February 28, 2013 discussed in section III.B. of the preamble of the proposed rule are implemented in this FY 2015 IPPS/LTCH PPS final rule, effective October 1, 2014, any CAH affected by the new OMB delineations in OMB Bulletin No. 13-01 would retain its rural status through September 30, 2016. An affected CAH would be required to reclassify as a rural facility under § 412.103 within that 2-year period in order to continue participating in the Medicare program as a CAH after the 2-year transition period ends. Therefore, taking into consideration the example above, any CAH affected by a new OMB delineation that is implemented in this FY 2015 IPPS/LTCH PPS final rule would be required to reclassify as rural by September 30, 2016, in order to retain its CAH status after September 30, 2016.

To implement this proposed change, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28176), we proposed to revise § 412.103 by adding a new paragraph (a)(6), and to revise § 485.610 by making a conforming change to the introductory text of paragraph (b) and adding a new paragraph (b)(5) to provide for a 2-year transition period that will apply any time a new OMB delineation causes a facility that was previously located in a rural area and is designated as a CAH to be located in an urban area. We stated we believe that this proposal to revise the regulations to automatically provide for a 2-year transition period following the implementation of new OMB delineations is more efficient than providing for a regulatory change limited to a timeframe, and, as a result, will be more effective in reducing any disruption caused by new OMB delineations.

Comment: Commenters supported CMS' proposal to provide for a 2-year transition period to allow CAHs affected by the implementation of new OMB delineations time to reclassify as rural in order to retain their CAH status after the 2-year transition period ends. Several commenters requested that CMS work with and provide notification to affected CAHs to alert them to the need to reclassify as rural in order to retain their CAH status. One commenter asked how new OMB delineations would impact necessary provider CAHs previously reclassified under prior updates.

Another commenter requested that CMS provide for a 3-year transition period to allow affected CAHs additional time to reclassify as rural or to prepare to transition to urban PPS facilities. The commenter stated that the size of CAHs and the number of regulations they must follow make it difficult for these facilities to process and respond to new requirements. The commenter stated that although only a small number of CAHs are affected by the new OMB delineations, those affected require considerable time to locate applicable State law, examine Rural Urban Commuting Area (RUCA) scores, and in general determine whether they are eligible to reclassify as rural facilities. The commenter stated that CMS has a precedent for providing a 3-year transition period because it proposed to apply such a grace period to urban facilities redesignated as rural so that these facilities have time to prepare for lower reimbursement resulting from several factors, including a lower wage index. The commenter stated that CAHs that lose their CAH status would also be subject to these lower payment rates and therefore would also benefit from being provided with a 3-year transition period.

Response: We appreciate the commenters' support of our proposal to provide CAHs affected by new OMB delineations with a 2-year transition period to reclassify as rural in order to retain their CAH status after the 2-year transition period ends. In response to the commenters' request that CMS notify each CAH affected by a change in OMB delineations, we encourage CAHs to contact CMS if they have questions regarding their rural status and whether this status has changed as a result of the implementation of the new OMB delineations as discussed in section III.B. of the preamble of this final rule.

In response to the question concerning necessary provider CAHs, section 1820(c)(2)(B)(i) of the Act requires that in order for a facility to be certified as a CAH, it must be located in a rural area or have reclassified as a rural facility. Therefore, if a necessary provider CAH is located in an urban area as a result of the new OMB delineations implemented in this final rule effective October 1, 2014, that CAH must now reclassify as rural in order to keep its CAH status after September 30, 2016. If a necessary provider CAH had previously reclassified as rural due to a prior change in OMB delineations, that CAH's rural status remains unchanged.

In response to the request to provide affected CAHs with a 3-year transition period during which they could either reclassify as rural or prepare to transition to a PPS facility, we continue to believe that 2 years is the appropriate amount of time for such a transition period. Consistent with the regulation changes made in the FY 2005 and FY 2010 final rules (69 FR 49221 through 49222, 69 FR 60242 and 60252, and 74 FR 43939 through 43940), we believe 2 years is a sufficient period of time for the CAH to work with its State to be designated as rural and engage in any other research it believes is necessary to determine whether it should reclassify as rural. Therefore, we are finalizing our proposal to provide CAHs affected by our implementation of the new OMB delineations with a 2-year transition period during which they must reclassify as rural in order to retain their CAH status after the 2-year period ends.

Comment: Commenters requested that, in addition to providing CAHs affected by the implementation of a new OMB delineation with a 2-year transition period to reclassify as rural, SCHs and MDHs affected by the implementation of a new OMB delineation also be provided with a transition period to reclassify as rural. One commenter requested that CMS clarify that a hospital's SCH status would not be affected by a CAH that is now located in an urban area as a result of a new OMB delineation while that CAH is in its 2-year transition period to reclassify as rural. Specifically, the commenter requested that a CAH not be considered a “like hospital” as defined at § 412.92(c)(2) during its transition period.

Response: We are clarifying that during an affected CAH's 2-year transition period, the facility will continue to be considered a CAH. We respond to the public comments related to transition periods for SCHs and MDHs in sections IV.G.4. and IV.G.5. of the preamble of this final rule.

After consideration of the public comments we received, we are finalizing our policy as proposed to provide for a 2-year transition period for CAHs affected by the implementation of a new OMB delineation during which the CAH must reclassify as rural in order to retain its CAH status after the 2-year transition period ends. To implement this change, we are revising § 412.103 by adding a new paragraph (a)(6), and revising § 485.610 by making a conforming change to the introductory text of paragraph (b) and adding a new paragraph (b)(5) to provide for a 2-year transition period that will apply any time the implementation of a new OMB delineation causes a facility that was previously located in a rural area and is designated as a CAH to be located in an urban area. These regulation changes are effective October 1, 2014. For purposes of applying these regulation changes to the new OMB delineations implemented in this final rule effective October 1, 2014, CAHs affected by these most recent OMB delineations will be treated as CAHs through September 30, 2016, and will have until September 30, 2016, to reclassify as rural in order to keep their CAH status after September 30, 2016.

3. Revision of the Requirements for Physician Certification of CAH Inpatient Services

For inpatient CAH services to be payable under Medicare Part A, section 1814(a)(8) of the Act requires that a physician certify “that the individual may reasonably be expected to be discharged or transferred to a hospital within 96 hours after admission to the critical access hospital.” The regulations implementing this statutory requirement are located at § 424.15.

Prior to FY 2014, this physician certification was required no later than 1 day before the date on which the claim for payment for the inpatient CAH service is submitted. In the FY 2014 IPPS/LTCH PPS final rule, we revised the CAH regulations concerning the timing requirements for certification of inpatient CAH services. Specifically, we revised § 424.15(b) to state that certification begins with the order for inpatient admission. The certification must be completed, signed, and documented in the medical record prior to discharge (78 FR 50970). This change was effective October 1, 2013.

However, in order to provide CAHs with greater flexibility in meeting this certification requirement, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28176 through 28177), we proposed to amend the regulations governing the timing of the 96-hour certification requirement at § 424.15(b) such that physician certification is required no later than 1 day before the date on which the claim for payment for the inpatient CAH service is submitted. That is, we proposed to remove the requirement that certification of the 96-hour requirement must be completed prior to discharge and we proposed to reinstate the timing requirement that was in place prior to October 1, 2013.

We proposed to revise § 424.15(b) to remove the phrase “prior to discharge” and replace it with “no later than 1 day before the date on which the claim for payment for the inpatient CAH service is submitted”. In addition, we proposed to make a conforming amendment to § 424.11(d)(5). Section 424.11(d)(5) states that for all inpatient hospital or critical access hospital inpatient services, including inpatient psychiatric facility services, a delayed certification may not extend past discharge. Because we proposed to change the timing requirement for physician certification of CAH inpatient services at § 424.15(b), such that the certification could be completed past discharge, we proposed to revise § 424.11(d)(5) to remove the phrase “or critical access hospital inpatient”. We sought public comment on these proposed changes to the regulations governing the requirement for physician certification of CAH inpatient services.

Comment: Most commenters supported the proposed change to allow physician certification to be completed 1 day prior to when the claim for the inpatient service is submitted. Commenters requested that CMS provide additional flexibility and avoid further confusion by clarifying that CAHs have until no later than 1 day prior to the day on which the claim for the inpatient service is submitted to complete all certification requirements. One commenter stated that the proposed change could cause inaccurate and delayed chart entries because the certification may take place 30, 60, or 90 days after the inpatient is discharged. The commenter recommended that physician certification be completed within 24 hours of admission and that the medical record be used to meet all certification requirements. One commenter stated that asking a physician to certify his or her expectation for an individual's length of stay after the individual's inpatient stay has exceeded 96 hours will create additional confusion and will be met with greater resistance from physicians. Commenters asked for clarification in understanding how the proposal would help CAHs if the certification is still required to state that the individual will be discharged or transferred to another hospital within 96 hours after admission to the CAH.

Response: We appreciate the commenters' support of our proposal. In response to commenters who requested that CMS clarify that all certification requirements can be met no later than 1 day prior to when the claim is submitted, we are revising our proposed amendment to § 424.15(b) to provide that a CAH has until 1 day prior to when the claim for the inpatient service is submitted to complete all certification requirements. In order to finalize this policy, we are amending the regulation text at § 424.11(d)(5) to remove the phrase “or critical access hospital inpatient.” In addition, we are revising the regulations at § 424.15(b) to state that certification begins with the order for inpatient admission. All certification requirements must be completed, signed, and documented in the medical record no later than 1 day before the date on which the claim for payment for the inpatient CAH service is submitted. We believe these changes are consistent with the 96-hour certification requirement and the existing CoP requirements.

In response to commenters' concerns about providing a delayed certification, the policy finalized in this rule requires that all certification requirements be completed no later than 1 day prior to when the claim for the inpatient service is submitted. Therefore, CAHs are not precluded from completing these certification requirements in advance of this deadline if they believe an earlier completion of certification requirements is appropriate. We note that we are not making any changes related to the order requirements for admission and that in accordance with § 412.3, an order is required before or at the time of admission to admit an individual as an inpatient. In addition, we refer readers to the CY 2015 OPPS/ASC proposed rule, specifically section “XVI. Proposed Revision of the Requirements for Physician Certification of Hospital Inpatient Services Other Than Psychiatric Inpatient Services” (79 FR 41056 through 41058). In the CY 2015 OPPS/ASC proposed rule, we proposed to require inpatient admission orders as a condition of payment based upon our general rulemaking authority under section 1871 of the Act rather than as an element of the physician certification under section 1814(a)(3) of the Act. In addition, in the CY 2015 OPPS/ASC proposed rule, we proposed to change our interpretation of section 1814(a)(3) of the Act to require a physician certification only for long-stay cases and outlier cases. In that rule, we proposed that 20 days is an appropriate minimum threshold for physician certification and we proposed to define long-stay cases as cases with stays of 20 days or longer. These proposed changes refer to the general physician certification requirements under section 1814(a)(3) of the Act and do not address the 96-hour certification requirement at section 1814(a)(8) of the Act.

Comment: Although many commenters supported the proposed change, many commenters indicated that they continue to have significant concerns with the 96-hour certification requirement and that the proposed change does not do enough to alleviate these concerns. Commenters stated they continue to support the Critical Access Hospital Relief Act of 2014, which would remove the 96-hour certification requirement for payment. Commenters requested that CMS exercise its discretion and make clear it will not enforce the 96-hour certification requirement because as long as this requirement is enforced, CAHs may not be eligible for Medicare payment. One commenter stated that occasionally admitting a patient who is expected to stay longer than 96 hours is permissible and should be paid. Commenters stated that physicians have been given the impossible task of coordinating the 96-hour certification requirement for payment with the 2-midnight policy and that, in some cases, the physician must certify that the patient will be transferred or discharged within a 49-hour timeframe. Another commenter stated that the 96-hour certification requirement is obsolete and does not recognize advancements in services which CAHs provide, including telehealth services. Commenters requested that CMS seek a legislative change that would align the certification requirement for payment with the CAH CoP requirement, which requires an annual average length of stay of 96 hours. Commenters stated that the certification requirement for payment could be met by requiring that the CAH certify that it has the appropriate resources and staff to treat the inpatient. Commenters stated that the CAH program was established to provide individuals living in rural areas with access to critical health care services so that these individuals can receive high quality and cost efficient care close to home even though providing this type of care may prove to be unprofitable for a CAH. Commenters stated that CAHs provide services that may require longer lengths of stay, and while the provision of these services does not violate the CoP requirement for an annual average length of stay of 96 hours, CAHs are prevented from providing these types of services because they cannot meet the 96-hour certification requirement for payment. Commenters stated they are concerned about their ability to treat patients, employ new providers, and maintain services essential to their community.

Commenters expressed concern about the impact of the 96-hour certification requirement for payment on surgical procedures. Commenters stated CAHs have put much effort into providing these procedures so that beneficiaries, particularly elderly individuals, can receive these services close to home. One commenter stated that surgeons who practice in rural areas rely on performing specific surgical procedures such as colon resections. The commenter stated that if these surgeons are only able to provide short-stay procedures and can no longer provide procedures that require longer lengths of stay, they would likely discontinue practicing at CAHs. One commenter stated that delaying the 96-hour certification requirement is not a resolution because it does not eliminate the fact that a surgeon will be unable to admit an individual to a CAH if he or she ethically believes that the individual will need 5 days as an inpatient.

One commenter recommended CMS withdraw the policy related to the 96-hour certification requirement for payment in the final rule for several reasons. The commenter stated that the policy jeopardizes a physician's ability to care for his or her patient as required by the patient's condition because admission should be based on medical judgment once an individual's condition and symptoms are evaluated. The commenter stated that implementation of the policy will result in dissatisfaction and confusion because patients will have to become accustomed to new hospitals and new medical staff and a decline in patient satisfaction scores is something from which a hospital may not be able to recover. The commenter stated that although the 96-hour certification requirement is in statute, it was not enforced by CMS until FY 2014 and that CAHs were not given advance notification of the enforcement and there has been little preparation, training or guidance from CMS until very recently. The commenter noted that medical staff of its member CAHs are angry and frustrated especially because of the detrimental effect of the 96-hour certification requirement on their patients.

Response: As stated earlier in this preamble, we believe the policy we are finalizing in this rule is consistent with the 96-hour certification requirement and the existing CoP requirements. The remainder of this response provides a review of the 96-hour certification requirement.

For inpatient CAH services, section 1814(a)(8) of the Act requires for Medicare Part A payment that “in the case of inpatient critical access hospital services, a physician certifies that the individual may reasonably be expected to be discharged or transferred to a hospital within 96 hours after admission to the critical access hospital.” Because this statutory requirement is based on an expectation, if a physician certifies in good faith that an individual may reasonably be expected to be discharged or transferred to a hospital within 96 hours after admission to the CAH, and something unforeseen then occurs that causes the individual to stay longer at the CAH, Medicare will pay for the costs of treating that patient and there would not be a problem with regard to the CAH designation as long as that individual's stay does not cause the CAH to exceed its 96-hour annual average CoP requirement. However, if a physician cannot in good faith certify that an individual may reasonably be expected to be discharged or transferred within 96 hours after admission to the CAH, the CAH will not receive Medicare Part A payment for any portion of that individual's inpatient stay.

In addition, time as an outpatient at the CAH is not included in applying the 96-hour requirement, nor does time in a CAH swing bed, which is being used to provide skilled nursing services, count towards the 96-hour requirement. The clock for the 96 hours only begins once the individual is admitted to the CAH as an inpatient.
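
The timing rule described above can be illustrated with a short sketch: only time spent as a CAH inpatient counts toward the 96-hour window, so the measurement starts at inpatient admission and excludes any preceding outpatient time or swing-bed (skilled nursing) time. The timestamps and field names below are hypothetical.

```python
# Minimal sketch of how time counted toward the 96-hour window would be
# measured: the clock starts at inpatient admission to the CAH; prior
# outpatient time and CAH swing-bed (skilled nursing) time do not count.
# The timestamps are hypothetical.
from datetime import datetime

def hours_as_cah_inpatient(inpatient_admission, discharge_or_transfer):
    return (discharge_or_transfer - inpatient_admission).total_seconds() / 3600

admitted = datetime(2014, 10, 3, 14, 0)    # inpatient admission to the CAH
discharged = datetime(2014, 10, 7, 10, 0)  # discharged or transferred to a hospital
print(hours_as_cah_inpatient(admitted, discharged))  # 92.0 -> within the 96-hour window
```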

After consideration of the public comments we received, we are finalizing a policy that a CAH is required to complete all physician certification requirements no later than 1 day before the date on which the claim for the inpatient service is submitted. In order to finalize this change, we are amending the regulation text at § 424.11(d)(5) to remove the phrase “or critical access hospital inpatient.” In addition, we are revising the regulations at § 424.15(b) to state that certification begins with the order for inpatient admission. All certification requirements must be completed, signed, and documented in the medical record no later than 1 day before the date on which the claim for payment for the inpatient CAH service is submitted. These changes are effective October 1, 2014.

VII. Changes to the Long-Term Care Hospital Prospective Payment System (LTCH PPS) for FY 2015

A. Background of the LTCH PPS

1. Legislative and Regulatory Authority

Section 123 of the Medicare, Medicaid, and SCHIP (State Children's Health Insurance Program) Balanced Budget Refinement Act of 1999 (BBRA) (Pub. L. 106-113) as amended by section 307(b) of the Medicare, Medicaid, and SCHIP Benefits Improvement and Protection Act of 2000 (BIPA) (Pub. L. 106-554) provides for payment for both the operating and capital-related costs of hospital inpatient stays in long-term care hospitals (LTCHs) under Medicare Part A based on prospectively set rates. The Medicare prospective payment system (PPS) for LTCHs applies to hospitals that are described in section 1886(d)(1)(B)(iv) of the Act, effective for cost reporting periods beginning on or after October 1, 2002.

Section 1886(d)(1)(B)(iv)(I) of the Act defines a LTCH as “a hospital which has an average inpatient length of stay (as determined by the Secretary) of greater than 25 days.” Section 1886(d)(1)(B)(iv)(II) of the Act also provides an alternative definition of LTCHs: specifically, a hospital that first received payment under section 1886(d) of the Act in 1986 and has an average inpatient length of stay (LOS) (as determined by the Secretary of Health and Human Services (the Secretary)) of greater than 20 days and has 80 percent or more of its annual Medicare inpatient discharges with a principal diagnosis that reflects a finding of neoplastic disease in the 12-month cost reporting period ending in FY 1997.

Section 123 of the BBRA requires the PPS for LTCHs to be a “per discharge” system with a diagnosis-related group (DRG) based patient classification system that reflects the differences in patient resources and costs in LTCHs.

Section 307(b)(1) of the BIPA, among other things, mandates that the Secretary shall examine, and may provide for, adjustments to payments under the LTCH PPS, including adjustments to DRG weights, area wage adjustments, geographic reclassification, outliers, updates, and a disproportionate share adjustment.

In the August 30, 2002 Federal Register, we issued a final rule that implemented the LTCH PPS authorized under the BBRA and BIPA (67 FR 55954). For the initial implementation of the LTCH PPS (FYs 2003 through 2007), the system used information from LTCH patient records to classify patients into distinct long-term care diagnosis-related groups (LTC-DRGs) based on clinical characteristics and expected resource needs. Beginning in FY 2008, we adopted the Medicare severity long-term care diagnosis-related groups (MS-LTC-DRGs) as the patient classification system used under the LTCH PPS. Payments are calculated for each MS-LTC-DRG, and provisions are made for appropriate payment adjustments. Payment rates under the LTCH PPS are updated annually and published in the Federal Register.

The LTCH PPS replaced the reasonable cost-based payment system under the Tax Equity and Fiscal Responsibility Act of 1982 (TEFRA) (Pub. L. 97-248) for payments for inpatient services provided by a LTCH with a cost reporting period beginning on or after October 1, 2002. (The regulations implementing the TEFRA reasonable cost-based payment provisions are located at 42 CFR Part 413.) With the implementation of the PPS for acute care hospitals authorized by the Social Security Amendments of 1983 (Pub. L. 98-21), which added section 1886(d) to the Act, certain hospitals, including LTCHs, were excluded from the PPS for acute care hospitals and were paid their reasonable costs for inpatient services subject to a per discharge limitation or target amount under the TEFRA system. For each cost reporting period, a hospital-specific ceiling on payments was determined by multiplying the hospital's updated target amount by the number of total current year Medicare discharges. (Generally, in section VII. of the preamble of this final rule, when we refer to discharges, we describe Medicare discharges.) The August 30, 2002 final rule further details the payment policy under the TEFRA system (67 FR 55954).
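
The TEFRA ceiling described above is a simple product of the hospital-specific updated target amount and the number of current-year Medicare discharges. A minimal sketch, with hypothetical figures:

```python
# Minimal sketch of the TEFRA hospital-specific payment ceiling described
# above: the updated target amount per discharge multiplied by the number of
# current-year Medicare discharges. Figures are hypothetical.

def tefra_ceiling(updated_target_amount_per_discharge, medicare_discharges):
    return updated_target_amount_per_discharge * medicare_discharges

print(tefra_ceiling(24_000, 850))  # 20,400,000 ceiling for the cost reporting period
```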

In the August 30, 2002 final rule, we provided for a 5-year transition period from payments under the TEFRA system to payments under the LTCH PPS. During this 5-year transition period, a LTCH's total payment under the PPS was based on an increasing percentage of the Federal rate with a corresponding decrease in the percentage of the LTCH PPS payment that is based on reasonable cost concepts, unless a LTCH made a one-time election to be paid based on 100 percent of the Federal rate. Beginning with LTCHs' cost reporting periods beginning on or after October 1, 2006, total LTCH PPS payments are based on 100 percent of the Federal rate.
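
The 5-year transition described above blends the Federal rate payment with the payment based on reasonable cost concepts, with the Federal share increasing each year until it reaches 100 percent. The sketch below shows the blend mechanics only; the year-by-year percentages and dollar amounts are illustrative placeholders rather than the actual transition schedule, which is set out in the August 30, 2002 final rule.

```python
# Illustrative sketch of the 5-year transition blend: each year's payment is a
# weighted mix of the Federal rate payment and the payment based on reasonable
# cost concepts, with the Federal share increasing each year. The schedule
# below is a placeholder for illustration, not the actual transition schedule.

FEDERAL_SHARE_BY_TRANSITION_YEAR = {1: 0.20, 2: 0.40, 3: 0.60, 4: 0.80, 5: 1.00}

def blended_payment(transition_year, federal_rate_payment, reasonable_cost_payment):
    share = FEDERAL_SHARE_BY_TRANSITION_YEAR[transition_year]
    return share * federal_rate_payment + (1 - share) * reasonable_cost_payment

# Hypothetical example: year 3 of the transition.
print(blended_payment(3, 30_000, 27_000))  # 0.6*30,000 + 0.4*27,000 = 28,800.0
```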

In addition, in the August 30, 2002 final rule, we presented an in-depth discussion of the LTCH PPS, including the patient classification system, relative weights, payment rates, additional payments, and the budget neutrality requirements mandated by section 123 of the BBRA. The same final rule that established regulations for the LTCH PPS under 42 CFR Part 412, Subpart O, also contained LTCH provisions related to covered inpatient services, limitation on charges to beneficiaries, medical review requirements, furnishing of inpatient hospital services directly or under arrangement, and reporting and recordkeeping requirements. We refer readers to the August 30, 2002 final rule for a comprehensive discussion of the research and data that supported the establishment of the LTCH PPS (67 FR 55954).

We refer readers to the FY 2012 IPPS/LTCH PPS final rule (76 FR 51733 through 51743) for a chronological summary of the main legislative and regulatory developments affecting the LTCH PPS through the annual update cycles prior to the FY 2014 rulemaking cycle. In addition, in this final rule, we discuss the provisions of the Pathway for SGR Reform Act of 2013 (Pub. L. 113-67), enacted on December 26, 2013, that affect the LTCH PPS. In section VII.I.2. of the preamble of this final rule, we discuss the provisions of section 1206(a) of Public Law 113-67, which amended section 1886(m) of the Act by adding paragraph (6) and established, among other things, patient-level criteria for payments under the LTCH PPS for implementation beginning with FY 2016. In section VII.E. of the preamble of this final rule, we discuss the provisions of section 1206(b)(1) of Public Law 113-67, which provide for the retroactive reinstatement and extension, for an additional 4 years, of the moratorium on the full implementation of the 25-percent threshold payment adjustment policy (except for “grandfathered” hospitals-within-hospitals (HwHs), which are permanently exempt from this policy). In section VII.G. of the preamble of this final rule, we discuss the provisions of section 1206(b)(2) of Public Law 113-67 (as amended by section 112(b) of the Protecting Access to Medicare Act (Pub. L. 113-93)), which, subject to certain defined exceptions, provide for statutory moratoria on the establishment of new LTCHs and LTCH satellite facilities and a new statutory moratorium on the increase in the number of hospital beds in LTCHs or LTCH satellite facilities for the period beginning April 1, 2014 and ending September 30, 2017. In section IX.C. of the preamble of this final rule, we discuss the provisions of section 1206(c) of Public Law 113-67, which amended the LTCH Quality Reporting Program established under section 1886(m)(5) of the Act by requiring the Secretary to establish a functional status quality measure to evaluate the change in mobility among inpatients requiring ventilator support no later than October 1, 2015. In section VII.H. of the preamble of this final rule, we discuss the findings of a review of payments to certain LTCHs (that is, LTCHs classified under subclause (II) of section 1886(d)(1)(B)(iv) of the Act) that was conducted in accordance with section 1206(d) of Public Law 113-67, and finalize a policy to apply a payment adjustment under the LTCH PPS to “subclause (II)” LTCHs beginning in FY 2015 that will result in payments to this type of LTCH resembling payments under the reasonable cost TEFRA payment system model.

2. Criteria for Classification as an LTCH

a. Classification as an LTCH

Under the regulations at § 412.23(e)(1), to qualify to be paid under the LTCH PPS, a hospital must have a provider agreement with Medicare. Furthermore, § 412.23(e)(2)(i), which implements section 1886(d)(1)(B)(iv)(I) of the Act, requires that a hospital have an average Medicare inpatient length of stay of greater than 25 days to be paid under the LTCH PPS. Alternatively, § 412.23(e)(2)(ii) states that, for cost reporting periods beginning on or after August 5, 1997, a hospital that was first excluded from the PPS in 1986 and can demonstrate that at least 80 percent of its annual Medicare inpatient discharges in the 12-month cost reporting period ending in FY 1997 have a principal diagnosis that reflects a finding of neoplastic disease must have an average inpatient length of stay for all patients, including both Medicare and non-Medicare inpatients, of greater than 20 days.
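
The two qualification tests described above can be expressed as a short check. This is an illustrative sketch only; the function and field names are hypothetical and do not represent actual CMS certification logic.

```python
# Illustrative check of the two LTCH classification tests described above
# (§ 412.23(e)(2)). Field names are hypothetical; this is a sketch, not the
# actual certification logic used by CMS.

def qualifies_as_ltch(avg_medicare_los_days,
                      first_excluded_in_1986=False,
                      pct_neoplastic_fy1997=0.0,
                      avg_all_patient_los_days=0.0):
    # Primary test: average Medicare inpatient length of stay greater than 25 days.
    if avg_medicare_los_days > 25:
        return True
    # Alternative test: hospital first excluded from the PPS in 1986, with at
    # least 80 percent of FY 1997 Medicare discharges reflecting a principal
    # diagnosis of neoplastic disease and an average length of stay for all
    # patients (Medicare and non-Medicare) greater than 20 days.
    return (first_excluded_in_1986
            and pct_neoplastic_fy1997 >= 0.80
            and avg_all_patient_los_days > 20)

print(qualifies_as_ltch(27.3))                    # True (primary test)
print(qualifies_as_ltch(22.0, True, 0.85, 21.5))  # True (alternative test)
```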

b. Hospitals Excluded From the LTCH PPS

The following hospitals are paid under special payment provisions, as described in § 412.22(c) and, therefore, are not subject to the LTCH PPS rules:

  • Veterans Administration hospitals.
  • Hospitals that are reimbursed under State cost control systems approved under 42 CFR Part 403.
  • Hospitals that are reimbursed in accordance with demonstration projects authorized under section 402(a) of the Social Security Amendments of 1967 (Pub. L. 90-248) (42 U.S.C. 1395b-1) or section 222(a) of the Social Security Amendments of 1972 (Pub. L. 92-603) (42 U.S.C. 1395b-1 (note)) (Statewide all-payer systems, subject to the rate-of-increase test at section 1814(b) of the Act).
  • Nonparticipating hospitals furnishing emergency services to Medicare beneficiaries.

3. Limitation on Charges to Beneficiaries

In the August 30, 2002 final rule, we presented an in-depth discussion of beneficiary liability under the LTCH PPS (67 FR 55974 through 55975). In the RY 2005 LTCH PPS final rule (69 FR 25676), we clarified that the discussion of beneficiary liability in the August 30, 2002 final rule was not intended to establish rates or payments for, or to define, Medicare-eligible expenses. Under § 412.507, if the Medicare payment to the LTCH is the full LTC-DRG payment amount, consistent with other established hospital prospective payment systems, a LTCH may not bill a Medicare beneficiary for more than the deductible and coinsurance amounts as specified under §§ 409.82, 409.83, and 409.87 and for items and services specified under § 489.30(a). However, under the LTCH PPS, Medicare will only pay for days for which the beneficiary has coverage until the short-stay outlier (SSO) threshold is exceeded. Therefore, if the Medicare payment was for a SSO case (§ 412.529) that was less than the full LTC-DRG payment amount because the beneficiary had insufficient remaining Medicare days, the LTCH could also charge the beneficiary for services delivered on those uncovered days (§ 412.507).

4. Administrative Simplification Compliance Act (ASCA) and Health Insurance Portability and Accountability Act (HIPAA) Compliance

Claims submitted to Medicare must comply with both the Administrative Simplification Compliance Act (ASCA) (Pub. L. 107-105), and the Health Insurance Portability and Accountability Act of 1996 (HIPAA) (Pub. L. 104-191). Section 3 of the ASCA requires that the Medicare Program deny payment under Part A or Part B for any expenses incurred for items or services “for which a claim is submitted other than in an electronic form specified by the Secretary.” Section 1862(h) of the Act (as added by section 3(a) of the ASCA) provides that the Secretary shall waive such denial in two specific types of cases and may also waive such denial “in such unusual cases as the Secretary finds appropriate” (68 FR 48805). Section 3 of the ASCA operates in the context of the HIPAA regulations, which include, among other provisions, the transactions and code sets standards requirements codified under 45 CFR Parts 160 and 162 (generally known as the Transactions Rule). The Transactions Rule requires covered entities, including covered health care providers, to conduct certain electronic health care transactions according to the applicable transactions and code sets standards.

The Department of Health and Human Services has a number of initiatives designed to encourage and support the adoption of health information technology and promote nationwide health information exchange to improve health care. The Office of the National Coordinator for Health Information Technology (ONC) leads these efforts in collaboration with other agencies, including CMS and the Office of the Assistant Secretary for Planning and Evaluation (ASPE). Through a number of activities, including several open government initiatives, HHS is promoting the adoption of electronic health record (EHR) technology certified under the ONC Health Information Technology (HIT) Certification Program developed to support secure, interoperable health information exchange. The HIT Policy Committee (a Federal Advisory Committee) has recommended areas in which HIT certification under the ONC HIT Certification Program would help support providers that are ineligible for the Medicare and Medicaid EHR Incentive Programs, such as long-term and postacute care (including LTCHs) and behavioral health care providers. We believe that the use of certified EHRs by LTCHs (and other types of providers that are ineligible for the Medicare and Medicaid EHR Incentive Programs) can effectively and efficiently help providers improve internal care delivery practices, support the exchange of important information across care partners and during transitions of care, and can enable the reporting of electronically specified clinical quality measures (eCQMs) (as described elsewhere in this rule). More information on the ONC HIT Certification Program and efforts to develop standards applicable to LTCHs is available on the ONC and CMS Web sites.

B. Medicare Severity Long-Term Care Diagnosis-Related Group (MS-LTC-DRG) Classifications and Relative Weights for FY 2015

1. Background

Section 123 of the BBRA requires that the Secretary implement a PPS for LTCHs (that is, a per discharge system with a diagnosis-related group (DRG)-based patient classification system reflecting the differences in patient resources and costs). Section 307(b)(1) of the BIPA modified the requirements of section 123 of the BBRA by requiring that the Secretary examine “the feasibility and the impact of basing payment under such a system [the long-term care hospital (LTCH) PPS] on the use of existing (or refined) hospital DRGs that have been modified to account for different resource use of LTCH patients, as well as the use of the most recently available hospital discharge data.”

When the LTCH PPS was implemented for cost reporting periods beginning on or after October 1, 2002, we adopted the same DRG patient classification system (that is, the CMS DRGs) that was utilized at that time under the IPPS. As a component of the LTCH PPS, we refer to this patient classification system as the “long-term care diagnosis-related groups (LTC-DRGs).” Although the patient classification system used under the LTCH PPS and the IPPS is the same, the relative weights are different. The established relative weight methodology and data used under the LTCH PPS result in relative weights under the LTCH PPS that reflect “the differences in patient resource use . . .” of LTCH patients (section 123(a)(1) of the BBRA (Pub. L. 106-113)).

As part of our efforts to better recognize severity of illness among patients, in the FY 2008 IPPS final rule with comment period (72 FR 47130), the MS-DRGs and the Medicare severity long-term care diagnosis-related groups (MS-LTC-DRGs) were adopted under the IPPS and the LTCH PPS, respectively, effective beginning October 1, 2007 (FY 2008). For a full description of the development, implementation, and rationale for the use of the MS-DRGs and MS-LTC-DRGs, we refer readers to the FY 2008 IPPS final rule with comment period (72 FR 47141 through 47175 and 47277 through 47299). (We note that, in that same final rule, we revised the regulations at § 412.503 to specify that for LTCH discharges occurring on or after October 1, 2007, when applying the provisions of 42 CFR Part 412, Subpart O applicable to LTCHs for policy descriptions and payment calculations, all references to LTC-DRGs would be considered a reference to MS-LTC-DRGs. For the remainder of this section, we present the discussion in terms of the current MS-LTC-DRG patient classification system unless specifically referring to the previous LTC-DRG patient classification system that was in effect before October 1, 2007.)

The MS-DRGs adopted in FY 2008 represent an increase in the number of DRGs by 207 (that is, from 538 to 745) (72 FR 47171). The MS-DRG classifications are updated annually. There are currently 751 MS-DRG groupings. After finalizing the proposed changes to the MS-DRG groupings described in section II.G. of this preamble, there are a total of 753 MS-DRG groupings for FY 2015. Consistent with section 123 of the BBRA, as amended by section 307(b)(1) of the BIPA, and § 412.515 of the regulations, we used information derived from LTCH PPS patient records to classify LTCH discharges into distinct MS-LTC-DRGs based on clinical characteristics and estimated resource needs. We then assigned an appropriate weight to the MS-LTC-DRGs to account for the difference in resource use by patients exhibiting the case complexity and multiple medical problems characteristic of LTCHs. Below we provide a general summary of our existing methodology for determining the FY 2015 MS-LTC-DRG relative weights under the LTCH PPS.

In a departure from the IPPS, and as discussed in greater detail below in section VII.B.3.f. of this preamble, we are continuing to use low-volume MS-LTC-DRGs (that is, MS-LTC-DRGs with less than 25 LTCH cases) in determining the MS-LTC-DRG relative weights because LTCHs do not typically treat the full range of diagnoses as do acute care hospitals. For purposes of determining the relative weights for the large number of low-volume MS-LTC-DRGs, we grouped all of the low-volume MS-LTC-DRGs into five quintiles based on average charge per discharge. (A detailed discussion of the initial development and application of the quintile methodology appears in the August 30, 2002 LTCH PPS final rule (67 FR 55978).) Under our existing methodology, we accounted for adjustments to payments for short-stay outlier (SSO) cases (that is, cases where the covered length of stay at the LTCH is less than or equal to five-sixths of the geometric average length of stay for the MS-LTC-DRG). Furthermore, we made adjustments to account for nonmonotonically increasing weights, when necessary. That is, theoretically, cases under the MS-LTC-DRG system that are more severe require greater expenditure of medical care resources and will result in higher average charges such that, in the severity levels within a base MS-LTC-DRG, the relative weights should increase monotonically with severity from the lowest to highest severity level. (We discuss nonmonotonicity in greater detail and our methodology to adjust the MS-LTC-DRG relative weights to account for nonmonotonically increasing relative weights in section VII.B.3.g. (Step 6) of this preamble.)

2. Patient Classifications into MS-LTC-DRGs

a. Background

The MS-DRGs (used under the IPPS) and the MS-LTC-DRGs (used under the LTCH PPS) are based on the CMS DRG structure. As noted above in this section, we refer to the DRGs under the LTCH PPS as MS-LTC-DRGs although they are structurally identical to the MS-DRGs used under the IPPS.

The MS-DRGs are organized into 25 major diagnostic categories (MDCs), most of which are based on a particular organ system of the body; the remainder involve multiple organ systems (such as MDC 22, Burns). Within most MDCs, cases are then divided into surgical DRGs and medical DRGs. Surgical DRGs are assigned based on a surgical hierarchy that orders operating room (O.R.) procedures or groups of O.R. procedures by resource intensity. The GROUPER software program does not recognize all ICD-9-CM procedure codes as procedures affecting DRG assignment. That is, procedures that are not surgical (for example, EKGs), or minor surgical procedures (for example, a biopsy of skin and subcutaneous tissue (procedure code 86.11)) do not affect the MS-LTC-DRG assignment based on their presence on the claim.

Generally, under the LTCH PPS, a Medicare payment is made at a predetermined specific rate for each discharge and that payment varies by the MS-LTC-DRG to which a beneficiary's stay is assigned. Cases are classified into MS-LTC-DRGs for payment based on the following six data elements:

  • Principal diagnosis;
  • Additional or secondary diagnoses;
  • Surgical procedures;
  • Age;
  • Sex; and
  • Discharge status of the patient.

Through FY 2010, the number of diagnosis and procedure codes considered for MS-DRG assignment was limited to nine and six, respectively. However, for claims submitted on the 5010 format beginning January 1, 2011, we increased the capacity to process diagnosis and procedure codes up to 25 diagnoses and 25 procedures. This includes one principal diagnosis and up to 24 secondary diagnoses for severity of illness determinations. We refer readers to section II.G.11.c. of the preamble of the FY 2011 IPPS/LTCH PPS final rule for a complete discussion of this change (75 FR 50127).

Under the HIPAA transactions and code sets regulations at 45 CFR Parts 160 and 162, covered entities must comply with the adopted transaction standards and operating rules specified in Subparts I through S of Part 162. Among other requirements, by January 1, 2012, covered entities were required to use the ASC X12 Standards for Electronic Data Interchange Technical Report Type 3—Health Care Claim: Institutional (837), May 2006, ASC X12N/005010X223, and Type 1 Errata to Health Care Claim: Institutional (837) ASC X12 Standards for Electronic Data Interchange Technical Report Type 3, October 2007, ASC X12N/005010X223A1 for the health care claims or equivalent encounter information transaction (45 CFR 162.1102).

HIPAA requires covered entities to use the applicable medical data code set requirements when conducting HIPAA transactions (45 CFR 162.1000). Currently, upon the discharge of the patient, the LTCH must assign appropriate diagnosis and procedure codes from the most current version of the International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM). For additional information on the ICD-9-CM coding system, we refer readers to the FY 2008 IPPS final rule with comment period (72 FR 47241 through 47243 and 47277 through 47281). We also refer readers to the detailed discussion on correct coding practices in the August 30, 2002 LTCH PPS final rule (67 FR 55981 through 55983). Additional coding instructions and examples are published in the Coding Clinic for ICD-9-CM, a product of the American Hospital Association. (We refer readers to section II.G.13. of the preamble of this final rule for additional information on the annual revisions to the ICD-9-CM codes.)

Providers use the code sets under the ICD-9-CM coding system to report diagnoses and procedures for Medicare hospital inpatient services under the MS-DRG system. We have been discussing the conversion to the ICD-10 coding system for many years. In the FY 2015 IPPS/LTCH PPS proposed rule, we referred readers to section II.G.1. of the preamble of that proposed rule for additional information on the implementation of the ICD-10 coding system.

Comment: One commenter requested that CMS develop a crosswalk between ICD-9-CM codes and ICD-10 codes to specifically assist LTCH providers in determining the appropriate MS-LTC-DRGs that are affected as a result of the transition to ICD-10-PCS. The commenter stated that additional guidance is needed regarding the specific MS-LTC-DRGs that LTCHs should concentrate their efforts on during the delay in the implementation of ICD-10-PCS.

Response: As noted above, the MS-LTC-DRGs under the LTCH PPS are structurally identical to the MS-DRGs used under the IPPS. For a detailed discussion of the conversion from the ICD-9-CM to the ICD-10-PCS code set and the ICD-9-CM to ICD-10 MS-DRGs, we refer readers to section II.G.1.a. of the preamble of this FY 2015 IPPS/LTCH PPS final rule. Included in this discussion are all the ICD-10 resources publicly available via the Internet on the CMS ICD-10 Web site: http://www.cms.gov/Medicare/Coding/ICD10/index.html. For example, the General Equivalence Mappings (GEMs) that consist of forward and backward mappings of ICD-9-CM and ICD-10-PCS coding sets are available for providers to review their current list of ICD-9-CM codes and map (or crosswalk) them to the appropriate available ICD-10-PCS codes. However, we note that the GEMs are not a substitute for coding from actual medical record documentation using the ICD-10-PCS code set. We also have held several ICD-10-PCS National Provider Calls where interested parties can listen to past presentations and review the accompanying slide presentations available. We refer readers to the following Web site: http://www.cms.gov/Medicare/Coding/ICD10/CMS-Sponsored-ICD-10-Teleconferences.html.

To create the MS-DRGs (and by extension, the MS-LTC-DRGs), base DRGs were subdivided according to the presence of specific secondary diagnoses designated as complications or comorbidities (CCs) into one, two, or three levels of severity, depending on the impact of the CCs on resources used for those cases. Specifically, there are sets of MS-DRGs that are split into 2 or 3 subgroups based on the presence or absence of a CC or a major complication or comorbidity (MCC). We refer readers to section II.D. of the FY 2008 IPPS final rule with comment period for a detailed discussion about the creation of MS-DRGs based on severity of illness levels (72 FR 47141 through 47175).

Medicare administrative contractors (MACs) enter the clinical and demographic information submitted by LTCHs into their claims processing systems and subject this information to a series of automated screening processes called the Medicare Code Editor (MCE). These screens are designed to identify cases that require further review before assignment into a MS-LTC-DRG can be made. During this process, certain cases are selected for further development (74 FR 43949).

After screening through the MCE, each claim is classified into the appropriate MS-LTC-DRG by the Medicare LTCH GROUPER software on the basis of diagnosis and procedure codes and other demographic information (age, sex, and discharge status). The GROUPER software used under the LTCH PPS is the same GROUPER software program used under the IPPS. Following the MS-LTC-DRG assignment, the Medicare contractor determines the prospective payment amount by using the Medicare PRICER program, which accounts for hospital-specific adjustments. Under the LTCH PPS, we provide an opportunity for LTCHs to review the MS-LTC-DRG assignments made by the Medicare contractor and to submit additional information within a specified timeframe as provided in § 412.513(c).

The GROUPER software is used both to classify past cases to measure relative hospital resource consumption to establish the MS-LTC-DRG relative weights and to classify current cases for purposes of determining payment. The records for all Medicare hospital inpatient discharges are maintained in the MedPAR file. The data in this file are used to evaluate possible MS-DRG and MS-LTC-DRG classification changes and to recalibrate the MS-DRG and MS-LTC-DRG relative weights during our annual update under both the IPPS (§ 412.60(e)) and the LTCH PPS (§ 412.517), respectively.

b. Changes to the MS-LTC-DRGs for FY 2015

As specified by our regulations at § 412.517(a), which require that the MS-LTC-DRG classifications and relative weights be updated annually, and consistent with our historical practice of using the same patient classification system under the LTCH PPS as is used under the IPPS, we proposed to update the MS-LTC-DRG classifications effective October 1, 2014, through September 30, 2015 (FY 2015) consistent with the proposed changes to specific MS-DRG classifications (that is, proposed GROUPER Version 32.0). We did not receive any public comments on this proposal. Therefore, we are adopting the proposal without modification in this final rule. In accordance with § 412.517(a) and consistent with our historical practice, we are updating the MS-LTC-DRG classifications effective October 1, 2014, through September 30, 2015 (FY 2015) consistent with the changes to specific MS-DRG classifications presented in section II.G. of this preamble (that is, GROUPER Version 32.0). Therefore, the MS-LTC-DRGs for FY 2015 presented in this final rule are the same as the MS-DRGs that are being used under the IPPS for FY 2015. In addition, because the MS-LTC-DRGs for FY 2015 are the same as the MS-DRGs for FY 2015, the other changes that affect MS-DRG (and by extension MS-LTC-DRG) assignments under GROUPER Version 32.0 as discussed in section II.G. of the preamble of this final rule, including the changes to the MCE software and the ICD-9-CM coding system, also are applicable under the LTCH PPS for FY 2015.

3. Development of the FY 2015 MS-LTC-DRG Relative Weights

a. General Overview of the Development of the MS-LTC-DRG Relative Weights

One of the primary goals for the implementation of the LTCH PPS is to pay each LTCH an appropriate amount for the efficient delivery of medical care to Medicare patients. The system must be able to account adequately for each LTCH's case-mix in order to ensure both fair distribution of Medicare payments and access to adequate care for those Medicare patients whose care is more costly (67 FR 55984). To accomplish these goals, we have annually adjusted the LTCH PPS standard Federal prospective payment system rate by the applicable relative weight in determining payment to LTCHs for each case.

The basic methodology used to develop the MS-LTC-DRG relative weights is generally consistent with the general methodology established when the LTCH PPS was implemented in the August 30, 2002 LTCH PPS final rule (67 FR 55989 through 55991), with the exception of some modifications of our historical procedures for assigning relative weights in cases of zero volume and/or nonmonotonicity resulting from the adoption of the MS-LTC-DRGs. (For details on the modifications to our historical procedures for assigning relative weights in cases of zero volume and/or nonmonotonicity, we refer readers to the FY 2008 IPPS final rule with comment period (72 FR 47289 through 47295) and the FY 2009 IPPS final rule (73 FR 48542 through 48550).) Under the LTCH PPS, relative weights for each MS-LTC-DRG are a primary element used to account for the variations in cost per discharge and resource utilization among the payment groups (§ 412.515). To ensure that Medicare patients classified to each MS-LTC-DRG have access to an appropriate level of services and to encourage efficiency, we calculate a relative weight for each MS-LTC-DRG that represents the resources needed by an average inpatient LTCH case in that MS-LTC-DRG. For example, cases in a MS-LTC-DRG with a relative weight of 2 will, on average, cost twice as much to treat as cases in a MS-LTC-DRG with a relative weight of 1.
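
To illustrate the arithmetic behind that example, the following Python sketch applies a hypothetical standard Federal rate to two relative weights. The rate shown is a placeholder for illustration only (it is not the FY 2015 LTCH PPS standard Federal rate), and the sketch omits the wage-index, outlier, and other adjustments that apply to actual payments.

    # Minimal sketch: how an MS-LTC-DRG relative weight scales the
    # unadjusted LTCH PPS payment. The standard Federal rate below is a
    # hypothetical placeholder, not the FY 2015 rate, and wage-index,
    # outlier, and other adjustments are omitted.
    standard_federal_rate = 40000.00  # hypothetical rate, for illustration only

    def unadjusted_payment(relative_weight):
        # Payment before any wage-index, outlier, or other adjustments.
        return standard_federal_rate * relative_weight

    # A case in an MS-LTC-DRG with a relative weight of 2.0 is paid, on
    # average, twice as much as a case with a relative weight of 1.0.
    print(unadjusted_payment(1.0))  # 40000.0
    print(unadjusted_payment(2.0))  # 80000.0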

b. Development of the MS-LTC-DRG Relative Weights for FY 2015

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50755 through 50760), we presented our policies for the development of the MS-LTC-DRG relative weights for FY 2014. The basic methodology we used to develop the FY 2014 MS-LTC-DRG relative weights was the same as the methodology we used to develop the FY 2013 MS-LTC-DRG relative weights in the FY 2013 IPPS/LTCH PPS final rule and was consistent with the general methodology established when the LTCH PPS was implemented in the August 30, 2002 LTCH PPS final rule (67 FR 55989 through 55991). In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28181 through 28187), we proposed to continue to use our existing methodology to determine the MS-LTC-DRG relative weights for FY 2015, including the application of established policies related to the data, the hospital-specific relative value methodology, the treatment of severity levels in the MS-LTC-DRGs, low-volume and no-volume MS-LTC-DRGs, adjustments for nonmonotonicity, and the steps for calculating the proposed MS-LTC-DRG relative weights with a budget neutrality factor.

Beginning with the FY 2008 update, we established a budget neutrality requirement for the annual update to the MS-LTC-DRG classifications and relative weights at § 412.517(b) (in conjunction with § 412.503), such that estimated aggregate LTCH PPS payments would be unaffected, that is, would be neither greater than nor less than the estimated aggregate LTCH PPS payments that would have been made without the classification and relative weight changes (72 FR 26882 through 26884). Consistent with § 412.517(b), we proposed to continue to apply our established two-step budget neutrality methodology. As such, the proposed update to the MS-LTC-DRG classifications and relative weights for FY 2015 was based on the FY 2014 MS-LTC-DRG classifications and relative weights established in Table 11 listed in section VI. of the Addendum to the FY 2014 IPPS/LTCH PPS final rule (78 FR 51002).

Comment: A few commenters recommended that CMS review its calculation of the proposed FY 2015 MS-LTC-DRG relative weights with the proposed budget neutrality factor to confirm that those weights resulted in no change in aggregate LTCH PPS payments under § 412.517. The commenters made this request after performing their own analysis of the proposed relative weight calculations. One commenter performed a comparative analysis using the LTCH discharges from the MedPAR data and its estimate of LTCH PPS payments using the FY 2014 MS-LTC-DRG relative weights and the proposed FY 2015 MS-LTC-DRG relative weights, and found an aggregate reduction in LTCH PPS payments, with the majority of that reduction due to the proposed decrease in the relative weight for MS-LTC-DRG 207. Another commenter's analysis found a reduction in the proposed relative weight for 11 of the 20 most frequently utilized MS-LTC-DRGs, which the commenter believed suggested that the proposed MS-LTC-DRG relative weights result in an aggregate decrease in LTCH PPS payments. Because these commenters believed that their analyses revealed an estimated aggregate decrease in LTCH PPS payments, they further believed that the proposed MS-LTC-DRG relative weights are not "budget neutral" and, therefore, are not consistent with the requirement under § 412.517(b) that CMS ensure that estimated LTCH PPS payments are not affected by the annual update to the MS-LTC-DRG classifications and relative weights. We note that the commenters did not comment on any of our specific proposals related to the determination of the MS-LTC-DRG relative weights for FY 2015, which include our calculation of the normalization factor and the budget neutrality factor determined under the proposed application of our two-step budget neutrality methodology (discussed in Step 7 of section VII.B.3.g. of the proposed rule).

Response: We appreciate the commenters' analysis of the proposed MS-LTC-DRG relative weight calculations. In consideration of these public comments, we have reviewed the application of our methodology and the calculation of the MS-LTC-DRG relative weights for FY 2015. We found no methodological or computational errors. In particular, in light of the commenter's focus on MS-LTC-DRG 207, we reviewed our budget neutrality calculations to ensure that the proposed decrease in the relative weight for MS-LTC-DRG 207 was accurately reflected in our aggregate LTCH PPS payment estimates. As described in Step 7 under section VII.B.3.g. of the proposed rule, after determining and applying the normalization factor, we compared estimated aggregate LTCH PPS payments using the FY 2014 MS-LTC-DRGs and relative weights to estimated aggregate LTCH PPS payments using the proposed FY 2015 MS-LTC-DRGs and relative weights, using LTCH claims data from the December 2013 update of the FY 2013 MedPAR file. Prior to the application of the proposed budget neutrality factor, we estimated that aggregate LTCH PPS payments using the proposed FY 2015 MS-LTC-DRGs and normalized relative weights would have resulted in an increase in aggregate LTCH PPS payments in FY 2015. To remove this estimated increase in aggregate LTCH PPS payments, we determined that a factor of 0.995275 needed to be applied to each of the proposed normalized FY 2015 MS-LTC-DRG relative weights. Therefore, we disagree with the commenters that the proposed MS-LTC-DRG relative weights are not "budget neutral" and are not consistent with the budget neutrality requirement under § 412.517(b). As noted above, the commenters did not comment specifically on our calculation of the normalization factor and the budget neutrality factor determined under the proposed application of our two-step budget neutrality methodology.

The budget neutrality provision under § 412.517(b) requires that estimated aggregate LTCH PPS payments be unaffected, that is, be neither greater than nor less than the estimated aggregate LTCH PPS payments that would have been made without the classification and relative weight changes (72 FR 26882 through 26884). Consistent with § 412.517(b), we proposed to continue to apply our established two-step budget neutrality methodology. Under both steps of this methodology, based on the best data available, we assess the aggregate effects of the annual classification and relative weight changes. Specifically, as described in the proposed rule, in the first step we determine a normalization factor to ensure that estimated payments are not affected by changes in the composition of case types or the changes to the classification system, using a ratio of average CMIs calculated across all LTCH PPS cases used for recalibration. Similarly, in the second step, the comparison of estimated aggregate LTCH PPS payments used to determine the budget neutrality factor is based on the sum of the estimated payments for all LTCH claims in the specified database. While the commenter is correct that the proposed relative weights for 11 of the 20 most frequently utilized MS-LTC-DRGs (or approximately 55 percent) are decreasing (which includes MS-LTC-DRG 207), the LTCH cases in those MS-LTC-DRGs account for less than 60 percent of the LTCH claims. When the analysis is expanded to the 50 most frequently utilized MS-LTC-DRGs, which include over 80 percent of the LTCH claims, the percentage of MS-LTC-DRGs with a proposed decrease in their relative weights drops to approximately 45 percent. This demonstrates that the scope of MS-LTC-DRGs included in such an analysis can yield contrary results. Therefore, we disagree with the commenter's assertion that its analysis of the proposed relative weights for 11 of the 20 most frequently utilized MS-LTC-DRGs is an indication that the proposed MS-LTC-DRG relative weights will result in an aggregate decrease in LTCH PPS payments and, therefore, are not budget neutral.
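
The two-step logic described above can be illustrated with the following Python sketch. The case records, relative weights, and payment estimate are hypothetical and greatly simplified; in particular, because the simplified payment estimate is strictly proportional to the relative weight, the second-step factor computed here equals 1, whereas the actual payment estimates also reflect SSO and other payment policies and therefore yield a distinct budget neutrality factor (0.995275 for FY 2015).

    # Sketch of the two-step budget neutrality adjustment, using
    # hypothetical cases. Each case carries its FY 2014 relative weight
    # and its recalibrated (pre-adjustment) FY 2015 relative weight.
    cases = [
        {"fy2014_weight": 1.10, "fy2015_weight": 1.20},
        {"fy2014_weight": 0.90, "fy2015_weight": 0.85},
        {"fy2014_weight": 1.50, "fy2015_weight": 1.60},
    ]

    # Step 1: normalization factor -- the ratio of the average case-mix
    # index (CMI) under the prior weights to the average CMI under the
    # recalibrated weights, across all cases used for recalibration.
    avg_cmi_2014 = sum(c["fy2014_weight"] for c in cases) / len(cases)
    avg_cmi_2015 = sum(c["fy2015_weight"] for c in cases) / len(cases)
    normalization_factor = avg_cmi_2014 / avg_cmi_2015

    def estimated_payment(weight, rate=40000.00):
        # Hypothetical, strictly proportional payment estimate; the
        # actual estimate also reflects SSO and other payment policies.
        return rate * weight

    # Step 2: budget neutrality factor -- the ratio of estimated
    # aggregate payments under the prior weights to estimated aggregate
    # payments under the normalized new weights.
    payments_2014 = sum(estimated_payment(c["fy2014_weight"]) for c in cases)
    payments_2015 = sum(
        estimated_payment(c["fy2015_weight"] * normalization_factor) for c in cases
    )
    budget_neutrality_factor = payments_2014 / payments_2015

    # The final weight for each MS-LTC-DRG is the recalibrated weight
    # multiplied by both factors.
    final_weights = [
        c["fy2015_weight"] * normalization_factor * budget_neutrality_factor
        for c in cases
    ]
    print(round(normalization_factor, 6), round(budget_neutrality_factor, 6))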

In this FY 2015 IPPS/LTCH PPS final rule, after consideration of public comments we received, as proposed, we are continuing to apply our established methodology to develop the MS-LTC-DRG relative weights for FY 2015. Specifically, we are finalizing our proposed methodology for developing the FY 2015 MS-LTC-DRG relative weights without modification, including the proposed application of established policies related to the data, hospital-specific relative value methodology, the treatment of severity levels in the MS-LTC-DRGs, low-volume and no-volume MS-LTC-DRGs, adjustments for nonmonotonicity, and the proposed steps for calculating the MS-LTC-DRG relative weights with a budget neutrality factor. Below we present the methodology that we are continuing to use to determine the MS-LTC-DRG relative weights for FY 2015, which is consistent with the methodology presented in the FY 2014 IPPS/LTCH PPS final rule. In addition, after consideration of the public comments we received, we are adopting as final the continued application of our established two-step budget neutrality methodology, which is based on the current year MS-LTC-DRG classifications and relative weights (that is, the annual update to the MS-LTC-DRG classifications and relative weights for FY 2015 is based on the FY 2014 MS-LTC-DRG classifications and relative weights established in Table 11 listed in section VI. of the Addendum to the FY 2014 IPPS/LTCH PPS final rule (78 FR 51002)). For additional information on the established two-step budget neutrality methodology, we refer readers to the FY 2008 IPPS final rule (72 FR 47295 through 47296).

c. Data

For the FY 2014 IPPS/LTCH PPS final rule (78 FR 50755), to calculate the MS-LTC-DRG relative weights for FY 2014, we obtained total charges from FY 2012 Medicare LTCH bill data from the December 2012 update of the FY 2012 MedPAR file, which were the best available data at that time, and used the finalized Version 31.0 of the GROUPER to classify LTCH cases. As stated previously in this section, this approach is consistent with our proposals regarding the continued application of established policies related to the data as presented in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28181 through 28182), which we are finalizing without modification in this final rule. Consistent with our historical practice, to calculate the MS-LTC-DRG relative weights for FY 2015 in this final rule, we obtained total charges from the FY 2013 Medicare LTCH bill data from the March 2014 update of the FY 2013 MedPAR file, which are the best available data at this time, and used Version 32.0 of the GROUPER to classify LTCH cases.

In this final rule and consistent with our historical methodology, we excluded the data from LTCHs that are all-inclusive rate providers and LTCHs that are reimbursed in accordance with demonstration projects authorized under section 402(a) of Public Law 90-248 or section 222(a) of Public Law 92-603. Furthermore, consistent with our historical practice, we excluded Medicare Advantage (Part C) claims, which are now included in the MedPAR files, in the calculations for the relative weights under the LTCH PPS that are used to determine payments for Medicare fee-for-service claims. Specifically, we did not use any claims from the MedPAR files that had a GHO Paid indicator value of “1,” which effectively removed Medicare Advantage claims from the relative weight calculations. Accordingly, in the development of the FY 2015 MS-LTC-DRG relative weights in this final rule, we excluded the data of 12 all-inclusive rate providers and one LTCH that is paid in accordance with demonstration projects that had claims in the March 2014 update of the FY 2013 MedPAR file, as well as any Medicare Advantage claims.

d. Hospital-Specific Relative Value (HSRV) Methodology

By nature, LTCHs often specialize in certain areas, such as the care of ventilator-dependent patients, the treatment of infections, and wound care. Some case types (MS-DRGs) may be treated, to a large extent, in hospitals that have relatively high (or low) charges. This nonrandom distribution of cases with relatively high (or low) charges in specific MS-LTC-DRGs has the potential to inappropriately distort the measure of average charges. As stated previously in this section, this approach is consistent with our proposals regarding the continued use of the HSRV methodology as presented in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28182), which we are finalizing without modification in this final rule. Therefore, in this final rule, to account for the fact that cases may not be randomly distributed across LTCHs, consistent with the methodology we have used since the implementation of the LTCH PPS, we are continuing to use a hospital-specific relative value (HSRV) methodology to calculate the MS-LTC-DRG relative weights for FY 2015. We believe this method removes this hospital-specific source of bias in measuring LTCH average charges (67 FR 55985). Specifically, under this methodology, we reduce the impact of the variation in charges across providers on any particular MS-LTC-DRG relative weight by converting each LTCH's charge for a case to a relative value based on that LTCH's average charge.

Under the HSRV methodology, we standardize charges for each LTCH by converting its charges for each case to hospital-specific relative charge values and then adjusting those values for the LTCH's case-mix. The adjustment for case-mix is needed to rescale the hospital-specific relative charge values (which, by definition, average 1.0 for each LTCH). The average relative weight for a LTCH is its case-mix, so it is reasonable to scale each LTCH's average relative charge value by its case-mix. In this way, each LTCH's relative charge value is adjusted by its case-mix to an average that reflects the complexity of the cases it treats relative to the complexity of the cases treated by all other LTCHs (the average case-mix of all LTCHs).

In accordance with our established methodology, we are continuing to standardize charges for each case by first dividing the adjusted charge for the case (adjusted for SSOs under § 412.529 as described in section VII.B.3.g. (Step 3) of this preamble) by the average adjusted charge for all cases at the LTCH in which the case was treated. SSO cases are cases with a length of stay that is less than or equal to five-sixths the average length of stay of the MS-LTC-DRG (§ 412.529 and § 412.503). The average adjusted charge reflects the average intensity of the health care services delivered by a particular LTCH and the average cost level of that LTCH. The resulting ratio is multiplied by that LTCH's case-mix index to determine the standardized charge for the case (67 FR 55989).

Multiplying the resulting ratio by the LTCH's case-mix index accounts for the fact that the same relative charges are given greater weight at a LTCH with higher average costs than they would at a LTCH with low average costs, which is needed to adjust each LTCH's relative charge value to reflect its case-mix relative to the average case-mix for all LTCHs. Because we standardize charges in this manner, we count charges for a Medicare patient at a LTCH with high average charges as less resource intensive than they would be at a LTCH with low average charges. For example, a $10,000 charge for a case at a LTCH with an average adjusted charge of $17,500 reflects a higher level of relative resource use than a $10,000 charge for a case at a LTCH with the same case-mix, but an average adjusted charge of $35,000. We believe that the adjusted charge of an individual case more accurately reflects actual resource use for an individual LTCH because the variation in charges due to systematic differences in the markup of charges among LTCHs is taken into account.
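
The standardization arithmetic in that example can be expressed in the following Python sketch. The case-mix index of 1.0 assumed for both LTCHs is for illustration only; the actual case-mix indexes are determined iteratively, as described in Step 4 of section VII.B.3.g. of this preamble.

    # Sketch of the hospital-specific relative value (HSRV)
    # standardization step, using the $10,000-charge example above.
    # A case-mix index of 1.0 is assumed for both LTCHs purely for
    # illustration.
    def standardized_charge(case_charge, hospital_avg_adjusted_charge, case_mix_index):
        # Convert the case's charge to a hospital-specific relative
        # value, then rescale by the hospital's case-mix index.
        return (case_charge / hospital_avg_adjusted_charge) * case_mix_index

    # The same $10,000 charge at two LTCHs with different average adjusted charges.
    high_charge_ltch = standardized_charge(10000, 35000, case_mix_index=1.0)
    low_charge_ltch = standardized_charge(10000, 17500, case_mix_index=1.0)

    # The charge counts as less resource-intensive at the LTCH whose
    # charges are higher on average.
    print(round(high_charge_ltch, 4))  # 0.2857
    print(round(low_charge_ltch, 4))   # 0.5714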

e. Treatment of Severity Levels in Developing the MS-LTC-DRG Relative Weights

For purposes of determining the MS-LTC-DRG relative weights, under our historical methodology, there are three different categories of MS-LTC-DRGs based on volume of cases within specific MS-LTC-DRGs: (1) MS-LTC-DRGs with at least 25 cases are each assigned a unique relative weight; (2) low-volume MS-LTC-DRGs (that is, MS-LTC-DRGs that contain between 1 and 24 cases based on a given year's claims data) are grouped into quintiles (as described below) and assigned the relative weight of the quintile; and (3) no-volume MS-LTC-DRGs (that is, MS-LTC-DRGs to which no cases in the given year's claims data are assigned) are cross-walked to other MS-LTC-DRGs based on clinical similarities and assigned the relative weight of the cross-walked MS-LTC-DRG (as described in greater detail below). As stated previously in this section, this approach is consistent with our proposals regarding the continued use of our existing methodology related to the treatment of severity levels as presented in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28182), which we are finalizing without modification in this final rule.

Therefore, in this final rule, we are continuing to utilize these same three categories of MS-LTC-DRGs for purposes of the treatment of severity levels in determining the MS-LTC-DRG relative weights for FY 2015. (We provide in-depth discussions of our policy regarding weight-setting for low-volume MS-LTC-DRGs in section VII.B.3.f. of the preamble of this final rule and for no-volume MS-LTC-DRGs, under Step 5 in section VII.B.3.g. of the preamble of this final rule.) Furthermore, in determining the FY 2015 MS-LTC-DRG relative weights, when necessary, we made adjustments to account for nonmonotonicity, as discussed in greater detail below in Step 6 of section VII.B.3.g. of this preamble. We refer readers to the discussion in the FY 2010 IPPS/RY 2010 LTCH PPS final rule for our rationale for including an adjustment for nonmonotonicity (74 FR 43953 through 43954).

f. Low-Volume MS-LTC-DRGs

In order to account for MS-LTC-DRGs with low volume (that is, with fewer than 25 LTCH cases), consistent with our existing methodology for purposes of determining the FY 2015 MS-LTC-DRG relative weights, we are continuing to employ the quintile methodology for low-volume MS-LTC-DRGs, such that we grouped the “low-volume MS-LTC-DRGs” (that is, MS-LTC-DRGs that contained between 1 and 24 cases annually) into one of five categories (quintiles) based on average charges (67 FR 55984 through 55995 and 72 FR 47283 through 47288). As stated previously in this section, this approach is consistent with our proposals regarding the continued use of our existing methodology for the treatment of low-volume MS-LTC-DRGs as presented in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28182 through 28183), which we are finalizing without modification in this final rule. Therefore, in determining the FY 2015 MS-LTC-DRG relative weights in this final rule, in cases where the initial assignment of a low-volume MS-LTC-DRG to a quintile results in nonmonotonicity within a base-DRG, in order to ensure appropriate Medicare payments, consistent with our historical methodology, we made adjustments to the treatment of low-volume MS-LTC-DRGs to preserve monotonicity, as discussed in detail below in section VII.B.3.g. (Step 6) of the preamble of this final rule.

In this final rule, using LTCH cases from the March 2014 update of the FY 2013 MedPAR file (which is currently the best available data), we identified 295 MS-LTC-DRGs that contained between 1 and 24 cases. These MS-LTC-DRGs were then divided among the 5 low-volume quintiles, each containing 59 MS-LTC-DRGs (295/5 = 59). We assigned a low-volume MS-LTC-DRG to a specific low-volume quintile by sorting the low-volume MS-LTC-DRGs in ascending order by average charge in accordance with our established methodology. Based on the data available for this final rule, the number of MS-LTC-DRGs with less than 25 cases was evenly divisible by 5, and therefore, it was not necessary to employ our historical methodology for determining which of the low-volume quintiles contain an additional low-volume MS-LTC-DRG. Specifically, for this final rule, after sorting the low-volume MS-LTC-DRGs in ascending order by average charge, we assigned the first fifth (1st through 59th) of low-volume MS-LTC-DRGs (with the lowest average charges) to Quintile 1. The MS-LTC-DRGs with the highest average charges were assigned to Quintile 5. Table 13A, which is listed in section VI. of the Addendum to this final rule and is available via the Internet, lists the composition of the low-volume quintiles for MS-LTC-DRGs for FY 2015.
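
The quintile assignment described above can be illustrated with the following Python sketch. The MS-LTC-DRG numbers and average charges are hypothetical, and the sketch assumes, as was the case for this final rule, that the number of low-volume MS-LTC-DRGs divides evenly by five.

    # Sketch of assigning low-volume MS-LTC-DRGs (1 to 24 cases) to five
    # quintiles by ascending average charge per discharge. The DRG
    # numbers and average charges below are hypothetical.
    low_volume = {
        101: 18000.0, 102: 52000.0, 103: 27000.0, 104: 31000.0, 105: 90000.0,
        106: 22000.0, 107: 45000.0, 108: 61000.0, 109: 15000.0, 110: 74000.0,
    }

    # Sort by average charge, lowest first.
    ordered = sorted(low_volume, key=low_volume.get)

    # Split the ordered list into 5 equal groups: Quintile 1 holds the
    # lowest-charge MS-LTC-DRGs, Quintile 5 the highest-charge MS-LTC-DRGs.
    per_quintile = len(ordered) // 5
    quintile_of = {}
    for i, drg in enumerate(ordered):
        quintile_of[drg] = i // per_quintile + 1

    print(quintile_of)  # e.g., DRG 109 -> quintile 1, DRG 105 -> quintile 5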

Accordingly, in order to determine the FY 2015 relative weights for the MS-LTC-DRGs with low volume, we used the five low-volume quintiles described above. We determined a relative weight and (geometric) average length of stay for each of the five low-volume quintiles using the methodology that we applied to the MS-LTC-DRGs (25 or more cases), as described below in section VII.B.3.g. of the preamble of this final rule. We assigned the same relative weight and average length of stay to each of the low-volume MS-LTC-DRGs that made up an individual low-volume quintile. We note that, as this system is dynamic, it is possible that the number and specific type of MS-LTC-DRGs with a low volume of LTCH cases will vary in the future. Furthermore, we note that we will continue to monitor the volume (that is, the number of LTCH cases) in the low-volume quintiles to ensure that our quintile assignments used in determining the MS-LTC-DRG relative weights result in appropriate payment for such cases and do not result in an unintended financial incentive for LTCHs to inappropriately admit these types of cases.

g. Steps for Determining the FY 2015 MS-LTC-DRG Relative Weights

In this final rule, we determined the FY 2015 MS-LTC-DRG relative weights based on our existing methodology. (For additional information on the original development of this methodology, and modifications to it since the adoption of the MS-LTC-DRGs, we refer readers to the August 30, 2002 LTCH PPS final rule (67 FR 55989 through 55995) and the FY 2010 IPPS/RY 2010 LTCH PPS final rule (74 FR 43951 through 43966).) As stated previously in this section, this approach is consistent with our proposals regarding the continued use of our existing methodology to determine the FY 2015 MS-LTC-DRG relative weights as presented in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28183 through 28187), which we are finalizing without modification in this final rule.

In summary, to determine the FY 2015 MS-LTC-DRG relative weights, we grouped LTCH cases to the appropriate MS-LTC-DRG, while taking into account the low-volume quintile (as described above). After grouping the cases to the appropriate MS-LTC-DRG (or low-volume quintile), we calculated the FY 2015 relative weights by first removing statistical outliers and cases with a length of stay of 7 days or less (Steps 1 and 2 below). Next, we adjusted the number of cases in each MS-LTC-DRG (or low-volume quintile) for the effect of SSO cases (Step 3 below). After removing statistical outliers (Step 1 below) and cases with a length of stay of 7 days or less (Step 2 below), the SSO adjusted discharges and corresponding charges were then used to calculate “relative adjusted weights” for each MS-LTC-DRG (or low-volume quintile) using the HSRV method. Below we discuss in detail the steps for calculating the FY 2015 MS-LTC-DRG relative weights. We note that, as we discussed in section VII.B.3.c. of the preamble of this final rule, we excluded the data of all-inclusive rate LTCHs, LTCHs that are paid in accordance with demonstration projects, and any Medicare Advantage claims in the March 2014 update of the FY 2013 MedPAR file.

Step 1—Remove statistical outliers.

The first step in the calculation of the FY 2015 MS-LTC-DRG relative weights is to remove statistical outlier cases. Consistent with our historical relative weight methodology, we are continuing to define statistical outliers as cases that are outside of 3.0 standard deviations from the mean of the log distribution of both charges per case and the charges per day for each MS-LTC-DRG. These statistical outliers were removed prior to calculating the relative weights because we believe that they may represent aberrations in the data that distort the measure of average resource use. Including those LTCH cases in the calculation of the relative weights could result in an inaccurate relative weight that does not truly reflect relative resource use among the MS-LTC-DRGs. (For additional information on this step of the relative weight methodology, we refer readers to 67 FR 55989 and 74 FR 43959.)
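
The trim described in this step can be illustrated with the following Python sketch for a single MS-LTC-DRG. The case records are hypothetical, and population standard deviations are used for simplicity.

    import math
    from statistics import mean, pstdev

    # Sketch of the Step 1 trim within a single MS-LTC-DRG: drop cases
    # whose log charges per case or log charges per day fall outside
    # 3.0 standard deviations of the mean. The case records below are
    # hypothetical.
    def remove_statistical_outliers(cases, sd_limit=3.0):
        # cases: list of dicts with "charges" and "los" (covered length of stay, days)
        log_case = [math.log(c["charges"]) for c in cases]
        log_day = [math.log(c["charges"] / c["los"]) for c in cases]
        mean_case, sd_case = mean(log_case), pstdev(log_case)
        mean_day, sd_day = mean(log_day), pstdev(log_day)
        kept = []
        for c, lc, ld in zip(cases, log_case, log_day):
            within_case = abs(lc - mean_case) <= sd_limit * sd_case
            within_day = abs(ld - mean_day) <= sd_limit * sd_day
            if within_case and within_day:
                kept.append(c)
        return kept

    sample = [{"charges": 40000.0 + 1000 * i, "los": 30} for i in range(12)]
    sample.append({"charges": 5000000.0, "los": 9})  # an aberrant case
    print(len(remove_statistical_outliers(sample)))  # 12 -- the aberrant case is removed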

Step 2—Remove cases with a length of stay of 7 days or less.

The MS-LTC-DRG relative weights reflect the average of resources used on representative cases of a specific type. Generally, cases with a length of stay of 7 days or less do not belong in a LTCH because these stays do not fully receive or benefit from treatment that is typical in a LTCH stay, and full resources are often not used in the earlier stages of admission to a LTCH. If we were to include stays of 7 days or less in the computation of the FY 2015 MS-LTC-DRG relative weights, the value of many relative weights would decrease and, therefore, payments would decrease to a level that may no longer be appropriate. We do not believe that it would be appropriate to compromise the integrity of the payment determination for those LTCH cases that actually benefit from and receive a full course of treatment at a LTCH by including data from these very short stays. Therefore, consistent with our historical relative weight methodology, in determining the FY 2015 MS-LTC-DRG relative weights, we removed LTCH cases with a length of stay of 7 days or less. (For additional information on this step of the relative weight methodology, we refer readers to 67 FR 55989 and 74 FR 43959.)

Step 3—Adjust charges for the effects of SSOs.

After removing cases with a length of stay of 7 days or less, we were left with cases that have a length of stay of greater than or equal to 8 days. As the next step in the calculation of the FY 2015 MS-LTC-DRG relative weights, consistent with our historical relative weight methodology, we adjusted each LTCH's charges per discharge for those remaining cases for the effects of SSOs (as defined in § 412.529(a) in conjunction with § 412.503).

In this final rule, we made this adjustment by counting an SSO case as a fraction of a discharge based on the ratio of the length of stay of the case to the average length of stay for the MS-LTC-DRG for non-SSO cases. This has the effect of proportionately reducing the impact of the lower charges for the SSO cases in calculating the average charge for the MS-LTC-DRG. This process produces the same result as if the actual charges per discharge of an SSO case were adjusted to what they would have been had the patient's length of stay been equal to the average length of stay of the MS-LTC-DRG.

Counting SSO cases as full discharges with no adjustment in determining the FY 2015 MS-LTC-DRG relative weights would lower the FY 2015 MS-LTC-DRG relative weight for affected MS-LTC-DRGs because the relatively lower charges of the SSO cases would bring down the average charge for all cases within a MS-LTC-DRG. This would result in an “underpayment” for non-SSO cases and an “overpayment” for SSO cases. Therefore, we adjusted for SSO cases under § 412.529 in this manner because it results in more appropriate payments for all LTCH cases. (For additional information on this step of the relative weight methodology, we refer readers to 67 FR 55989 and 74 FR 43959.)
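
The fractional-discharge counting described above can be illustrated with the following Python sketch for a single MS-LTC-DRG. The charges and lengths of stay are hypothetical, and for simplicity the SSO threshold is computed from the average length of stay of the non-SSO cases rather than from the geometric average length of stay published for the MS-LTC-DRG under § 412.529.

    # Sketch of the Step 3 SSO adjustment for one MS-LTC-DRG, with
    # hypothetical charges and lengths of stay. An SSO case is counted
    # as a fraction of a discharge equal to its length of stay divided
    # by the average length of stay of the non-SSO cases.
    cases = [
        {"charges": 60000.0, "los": 30},
        {"charges": 64000.0, "los": 34},
        {"charges": 20000.0, "los": 10},   # short-stay outlier
    ]

    non_sso_avg_los = 32.0                          # average LOS of the non-SSO cases
    sso_threshold = (5.0 / 6.0) * non_sso_avg_los   # simplified threshold, about 26.7 days

    adjusted_discharges = 0.0
    total_charges = 0.0
    for c in cases:
        if c["los"] <= sso_threshold:
            # Count the SSO case as a fractional discharge.
            adjusted_discharges += c["los"] / non_sso_avg_los
        else:
            adjusted_discharges += 1.0
        total_charges += c["charges"]

    # Average charge per SSO-adjusted discharge for the MS-LTC-DRG.
    print(round(total_charges / adjusted_discharges, 2))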

Step 4—Calculate the FY 2015 MS-LTC-DRG relative weights on an iterative basis.

Consistent with our historical relative weight methodology, we calculated the FY 2015 MS-LTC-DRG relative weights using the HSRV methodology, which is an iterative process. First, for each LTCH case, we calculated a hospital-specific relative charge value by dividing the SSO adjusted charge per discharge (see Step 3) of the LTCH case (after removing the statistical outliers (see Step 1) and LTCH cases with a length of stay of 7 days or less (see Step 2)) by the average charge per discharge for the LTCH in which the case occurred. The resulting ratio was then multiplied by the LTCH's case-mix index to produce an adjusted hospital-specific relative charge value for the case. An initial case-mix index value of 1.0 was used for each LTCH.

For each MS-LTC-DRG, we calculated the FY 2015 relative weight by dividing the average of the adjusted hospital-specific relative charge values (from above) for the MS-LTC-DRG by the overall average hospital-specific relative charge value across all cases for all LTCHs. Using these recalculated MS-LTC-DRG relative weights, each LTCH's average relative weight for all of its cases (that is, its case-mix) was calculated by dividing the sum of all the LTCH's MS-LTC-DRG relative weights by its total number of cases. The LTCHs' hospital-specific relative charge values (from above) were then multiplied by the hospital-specific case-mix indexes. The hospital-specific case-mix adjusted relative charge values were then used to calculate a new set of MS-LTC-DRG relative weights across all LTCHs. This iterative process was continued until there was convergence between the relative weights produced at adjacent steps, for example, when the maximum difference was less than 0.0001.
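
The iterative calculation can be illustrated with the following Python sketch. The LTCHs, MS-LTC-DRGs, and charges are hypothetical, each record is treated as one full discharge (the SSO fractional counting from Step 3 is omitted), and the number of iterations is capped for safety.

    # Sketch of the iterative HSRV relative weight calculation (Step 4)
    # with hypothetical cases. For simplicity, each record counts as one
    # full discharge.
    cases = [
        {"ltch": "A", "drg": 207, "charges": 120000.0},
        {"ltch": "A", "drg": 189, "charges": 45000.0},
        {"ltch": "A", "drg": 189, "charges": 50000.0},
        {"ltch": "B", "drg": 207, "charges": 200000.0},
        {"ltch": "B", "drg": 189, "charges": 80000.0},
    ]
    hospitals = {c["ltch"] for c in cases}
    drgs = {c["drg"] for c in cases}

    cmi = {h: 1.0 for h in hospitals}  # start every LTCH at a case-mix index of 1.0
    weights = {}

    for _ in range(100):  # iterate until the weights converge
        # Hospital-specific relative charge value for each case: the case's
        # charges divided by the LTCH's average charge, scaled by the
        # LTCH's current case-mix index.
        avg_charge = {
            h: sum(c["charges"] for c in cases if c["ltch"] == h)
            / sum(1 for c in cases if c["ltch"] == h)
            for h in hospitals
        }
        rel = [(c, c["charges"] / avg_charge[c["ltch"]] * cmi[c["ltch"]]) for c in cases]

        # Relative weight: the average adjusted value within the
        # MS-LTC-DRG divided by the overall average adjusted value.
        overall = sum(v for _, v in rel) / len(rel)
        new_weights = {
            d: (sum(v for c, v in rel if c["drg"] == d)
                / sum(1 for c in cases if c["drg"] == d)) / overall
            for d in drgs
        }

        # Recompute each LTCH's case-mix index as the average of the new
        # weights across its cases.
        cmi = {
            h: sum(new_weights[c["drg"]] for c in cases if c["ltch"] == h)
            / sum(1 for c in cases if c["ltch"] == h)
            for h in hospitals
        }

        # Converged when no weight moves by 0.0001 or more between passes.
        if weights and max(abs(new_weights[d] - weights[d]) for d in drgs) < 0.0001:
            weights = new_weights
            break
        weights = new_weights

    print({d: round(w, 4) for d, w in weights.items()})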

Step 5—Determine a FY 2015 relative weight for MS-LTC-DRGs with no LTCH cases.

As we stated above, we determined the FY 2015 relative weight for each MS-LTC-DRG using the total Medicare allowable charges reported in the best available LTCH claims data (that is, the March 2014 update of the FY 2013 MedPAR file for this final rule). Using these data, we identified the MS-LTC-DRGs for which there were no LTCH cases in the database, such that no patients who would have been classified to those MS-LTC-DRGs were treated in LTCHs during FY 2013 and, therefore, no charge data were available for these MS-LTC-DRGs. Therefore, in the process of determining the MS-LTC-DRG relative weights, we were unable to calculate relative weights for the MS-LTC-DRGs with no LTCH cases using the methodology described in Steps 1 through 4 above. However, because patients with a number of the diagnoses under these MS-LTC-DRGs may be treated at LTCHs, consistent with our historical methodology, we assigned a relative weight to each of the no-volume MS-LTC-DRGs based on clinical similarity and relative costliness (with the exception of “transplant” MS-LTC-DRGs and “error” MS-LTC-DRGs, as discussed below). (For additional information on this step of the relative weight methodology, we refer readers to 67 FR 55991 and 74 FR 43959 through 43960.)

In general, we determined FY 2015 relative weights for the MS-LTC-DRGs with no LTCH cases in the March 2014 update of the FY 2013 MedPAR file used in this final rule (that is, “no-volume” MS-LTC-DRGs) by cross-walking each no-volume MS-LTC-DRG to another MS-LTC-DRG with a calculated relative weight (determined in accordance with the methodology described above). Then, the “no-volume” MS-LTC-DRG was assigned the same relative weight (and average length of stay) of the MS-LTC-DRG to which it was cross-walked (as described in greater detail below).

Of the 753 MS-LTC-DRGs for FY 2015, we identified 237 MS-LTC-DRGs for which there are no LTCH cases in the database (including the 8 “transplant” MS-LTC-DRGs and 2 “error” MS-LTC-DRGs). As stated above, we assigned relative weights for each of the 237 no-volume MS-LTC-DRGs (with the exception of the 8 “transplant” MS-LTC-DRGs and the 2 “error” MS-LTC-DRGs, which are discussed below) based on clinical similarity and relative costliness to one of the remaining 516 (753−237= 516) MS-LTC-DRGs for which we were able to determine relative weights based on FY 2013 LTCH claims data using the steps described above. (For the remainder of this discussion, we refer to the “cross-walked” MS-LTC-DRGs as the MS-LTC-DRGs to which we cross-walked one of the 237 “no volume” MS-LTC-DRGs, with the exception of the 8 “transplant” MS-LTC-DRGs and the 2 “error” MS-LTC-DRGs, for purposes of determining a relative weight.) Then, we assigned the no-volume MS-LTC-DRG the relative weight of the cross-walked MS-LTC-DRG. (As explained below in Step 6, when necessary, we made adjustments to account for nonmonotonicity.)

For this final rule, we cross-walked the no-volume MS-LTC-DRG to an MS-LTC-DRG for which there were LTCH cases in the March 2014 update of the FY 2013 MedPAR file, and to which it was similar clinically in intensity of use of resources and relative costliness as determined by criteria such as care provided during the period of time surrounding surgery, surgical approach (if applicable), length of time of surgical procedure, postoperative care, and length of stay. We evaluated the relative costliness in determining the applicable MS-LTC-DRG to which a no-volume MS-LTC-DRG was cross-walked in order to assign an appropriate relative weight for the no-volume MS-LTC-DRGs in FY 2015. (For more details on our process for evaluating relative costliness, we refer readers to the FY 2009 IPPS final rule (73 FR 48543).) We believe that, in the rare event that a few LTCH cases were grouped to one of the no-volume MS-LTC-DRGs in FY 2015, the relative weights assigned based on the cross-walked MS-LTC-DRGs would result in an appropriate LTCH PPS payment because the crosswalks, which are based on clinical similarity and relative costliness, generally require equivalent relative resource use.

We then assigned the relative weight of the cross-walked MS-LTC-DRG as the relative weight for the no-volume MS-LTC-DRG such that both of these MS-LTC-DRGs (that is, the no-volume MS-LTC-DRG and the cross-walked MS-LTC-DRG) have the same relative weight for FY 2015. We note that if the cross-walked MS-LTC-DRG had 25 cases or more, its relative weight, which was calculated using the methodology described in Steps 1 through 4 above, was assigned to the no-volume MS-LTC-DRG as well. Similarly, if the MS-LTC-DRG to which the no-volume MS-LTC-DRG was cross-walked had 24 or less cases and, therefore, was designated to one of the low-volume quintiles for purposes of determining the relative weights, we assigned the relative weight of the applicable low-volume quintile to the no-volume MS-LTC-DRG such that both of these MS-LTC-DRGs (that is, the no-volume MS-LTC-DRG and the cross-walked MS-LTC-DRG) have the same relative weight for FY 2015. (As we noted above, in the infrequent case where nonmonotonicity involving a no-volume MS-LTC-DRG resulted, additional adjustments as described in Step 6 were required in order to maintain monotonically increasing relative weights.)

For this final rule, a list of the no-volume MS-LTC-DRGs and the MS-LTC-DRGs to which each was cross-walked (that is, the cross-walked MS-LTC-DRGs) for FY 2015 is shown in Table 13B, which is listed in section VI. of the Addendum to this final rule and is available via the Internet.

To illustrate this methodology for determining the relative weights for the FY 2015 MS-LTC-DRGs with no LTCH cases, we are providing the following example, which refers to the no-volume MS-LTC-DRGs crosswalk information for FY 2015 provided in Table 13B.

Example:

There were no cases in the FY 2013 MedPAR file used for this final rule for MS-LTC-DRG 61 (Acute Ischemic Stroke with Use of Thrombolytic Agent with MCC). We determined that MS-LTC-DRG 70 (Nonspecific Cerebrovascular Disorders with MCC) was clinically similar to MS-LTC-DRG 61 and comparable in resource use. Therefore, we assigned the FY 2015 relative weight of MS-LTC-DRG 70, 0.8632, to MS-LTC-DRG 61 (obtained from Table 11, which is listed in section VI. of the Addendum to this final rule and is available via the Internet).

Again, we note that, because this system is dynamic, the number of MS-LTC-DRGs with no LTCH cases may vary in the future. We used the most recent available claims data in the MedPAR file to identify the no-volume MS-LTC-DRGs and to determine the relative weights in this final rule.

Furthermore, for FY 2015, consistent with our historical relative weight methodology, we are establishing a relative weight of 0.0000 for the following transplant MS-LTC-DRGs: Heart Transplant or Implant of Heart Assist System with MCC (MS-LTC-DRG 1); Heart Transplant or Implant of Heart Assist System without MCC (MS-LTC-DRG 2); Liver Transplant with MCC or Intestinal Transplant (MS-LTC-DRG 5); Liver Transplant without MCC (MS-LTC-DRG 6); Lung Transplant (MS-LTC-DRG 7); Simultaneous Pancreas/Kidney Transplant (MS-LTC-DRG 8); Pancreas Transplant (MS-LTC-DRG 10); and Kidney Transplant (MS-LTC-DRG 652). This is because Medicare will only cover these procedures if they are performed at a hospital that has been certified for the specific procedures by Medicare and presently no LTCH has been so certified. At the present time, we include these eight transplant MS-LTC-DRGs in the GROUPER program for administrative purposes only. Because we use the same GROUPER program for LTCHs as is used under the IPPS, removing these MS-LTC-DRGs would be administratively burdensome. (For additional information regarding our treatment of transplant MS-LTC-DRGs, we refer readers to the RY 2010 LTCH PPS final rule (74 FR 43964).)

Step 6—Adjust the FY 2015 MS-LTC-DRG relative weights to account for nonmonotonically increasing relative weights.

As discussed earlier in this section, the MS-DRGs contain base DRGs that have been subdivided into one, two, or three severity of illness levels. Where there are three severity levels, the most severe level has at least one secondary diagnosis code that is referred to as an MCC (that is, major complication or comorbidity). The next lower severity level contains cases with at least one secondary diagnosis code that is a CC (that is, complication or comorbidity). Those cases without an MCC or a CC are referred to as “without CC/MCC.” When data do not support the creation of three severity levels, the base MS-DRG is either subdivided into two levels or not subdivided at all. A two-level subdivision may consist of the MS-DRG with CC/MCC and the MS-DRG without CC/MCC, or, alternatively, of the MS-DRG with MCC and the MS-DRG without MCC.

In those base MS-LTC-DRGs that are split into either two or three severity levels, cases classified into the “without CC/MCC” MS-LTC-DRG are expected to have a lower resource use (and lower costs) than the “with CC/MCC” MS-LTC-DRG (in the case of a two-level split) or both the “with CC” and the “with MCC” MS-LTC-DRGs (in the case of a three-level split). That is, theoretically, cases that are more severe typically require greater expenditure of medical care resources and will result in higher average charges. Therefore, in the three severity levels, relative weights should increase by severity, from lowest to highest. If the relative weights decrease as severity increases (that is, if within a base MS-LTC-DRG, an MS-LTC-DRG with CC has a higher relative weight than one with MCC, or the MS-LTC-DRG “without CC/MCC” has a higher relative weight than either of the others), they are nonmonotonic. We continue to believe that utilizing nonmonotonic relative weights to adjust Medicare payments would result in inappropriate payments because the payment for the cases in the higher severity level in a base MS-LTC-DRG (which are generally expected to have higher resource use and costs) would be lower than the payment for cases in a lower severity level within the same base MS-LTC-DRG (which are generally expected to have lower resource use and costs). Therefore, in determining the FY 2015 MS-LTC-DRG relative weights in this final rule, consistent with our historical methodology, we combined MS-LTC-DRG severity levels within a base MS-LTC-DRG for the purpose of computing a relative weight when necessary to ensure that monotonicity was maintained. For a comprehensive description of our existing methodology to adjust for nonmonotonicity, we refer readers to the FY 2010 IPPS/RY 2010 LTCH PPS final rule (74 FR 43964 through 43966). Any adjustments for nonmonotonicity that were made in determining the FY 2015 MS-LTC-DRG relative weights in this final rule by applying this methodology are denoted in Table 11, which is listed in section VI. of the Addendum to this final rule and is available via the Internet.
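
As an illustration only, the following simplified sketch shows one way the monotonicity check and the combining of severity levels could be expressed. The data and the combination rule (a case-weighted average applied to all severity levels of the base MS-LTC-DRG) are assumptions made for illustration; readers should consult the FY 2010 IPPS/RY 2010 LTCH PPS final rule for the complete methodology.

```python
# Simplified sketch (not the full CMS methodology): flag nonmonotonic
# relative weights within a base MS-LTC-DRG and, where found, combine the
# severity levels into a single case-weighted weight. The data and the
# combination rule are illustrative assumptions only.

# Hypothetical (relative weight, case count) pairs ordered from lowest to
# highest severity, e.g. [without CC/MCC, with CC, with MCC].
severity_levels = [(0.95, 40), (1.30, 55), (1.22, 70)]

def is_monotonic(levels):
    weights = [w for w, _ in levels]
    return all(lower <= higher for lower, higher in zip(weights, weights[1:]))

def enforce_monotonicity(levels):
    if is_monotonic(levels):
        return levels
    # Combine the severity levels and assign the case-weighted average weight
    # to each, which restores (weakly) increasing weights by severity.
    total_cases = sum(n for _, n in levels)
    combined = sum(w * n for w, n in levels) / total_cases
    return [(round(combined, 4), n) for _, n in levels]

print(enforce_monotonicity(severity_levels))
```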

Step 7—Calculate the FY 2015 budget neutrality factor.

In accordance with the regulations at § 412.517(b) (in conjunction with § 412.503), the annual update to the MS-LTC-DRG classifications and relative weights is done in a budget neutral manner such that estimated aggregate LTCH PPS payments would be unaffected, that is, would be neither greater than nor less than the estimated aggregate LTCH PPS payments that would have been made without the MS-LTC-DRG classification and relative weight changes. (For a detailed discussion on the establishment of the budget neutrality requirement for the annual update of the MS-LTC-DRG classifications and relative weights, we refer readers to the RY 2008 LTCH PPS final rule (72 FR 26881 and 26882).)

The MS-LTC-DRG classifications and relative weights are updated annually based on the most recent available LTCH claims data to reflect changes in relative LTCH resource use (§ 412.517(a) in conjunction with § 412.503). Under the budget neutrality requirement at § 412.517(b), for each annual update, the MS-LTC-DRG relative weights are uniformly adjusted to ensure that estimated aggregate payments under the LTCH PPS would not be affected (that is, decreased or increased). Consistent with that provision, we are updating the MS-LTC-DRG classifications and relative weights for FY 2015 based on the most recent available LTCH data, and applying a budget neutrality adjustment in determining the FY 2015 MS-LTC-DRG relative weights.

To ensure budget neutrality in the update to the MS-LTC-DRG classifications and relative weights under § 412.517(b), we are continuing to use our established two-step budget neutrality methodology. As discussed previously in this section, this approach is consistent with our proposals regarding the continued use of our existing methodology to calculate the FY 2015 budget neutrality factor for the FY 2015 MS-LTC-DRG relative weights as presented in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28183 through 28187), which, after consideration of the public comments we received, we are finalizing without modification in this final rule.

In this final rule, in the first step of our MS-LTC-DRG budget neutrality methodology, for FY 2015, we calculated and applied a normalization factor to the recalibrated relative weights (the result of Steps 1 through 6 above) to ensure that estimated payments were not affected by changes in the composition of case types or the changes to the classification system. That is, the normalization adjustment is intended to ensure that the recalibration of the MS-LTC-DRG relative weights (that is, the process itself) neither increases nor decreases the average CMI.

To calculate the normalization factor for FY 2015 (the first step of our budget neutrality methodology), we used the following three steps: (1.a.) we used the most recent available LTCH claims data (FY 2013) and grouped them using the FY 2015 GROUPER (Version 32.0) and the recalibrated FY 2015 MS-LTC-DRG relative weights (determined in Steps 1 through 6 of the Steps for Determining the FY 2015 MS-LTC-DRG Relative Weights above) to calculate the average CMI; (1.b.) we grouped the same LTCH claims data (FY 2013) using the FY 2014 GROUPER (Version 31.0) and FY 2014 MS-LTC-DRG relative weights and calculated the average CMI; and (1.c.) we computed the ratio of these average CMIs by dividing the average CMI for FY 2014 (determined in Step 1.b.) by the average CMI for FY 2015 (determined in Step 1.a.). In determining the MS-LTC-DRG relative weights for FY 2015, each recalibrated MS-LTC-DRG relative weight was multiplied by 1.12464 (determined in Step 1.c.) in the first step of the budget neutrality methodology, which produced “normalized relative weights.”
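
The following sketch is illustrative only and is not part of the payment methodology; it traces Steps 1.a. through 1.c. under stated assumptions. The claim groupings and weights ("D1" through "D3") are hypothetical stand-ins; only the resulting factor of 1.12464 cited above comes from this final rule.

```python
# Illustrative sketch (not CMS code) of the normalization step. The claim
# groupings and weights below are hypothetical stand-ins ("D1"-"D3" are not
# real MS-LTC-DRGs); only the factor of 1.12464 cited in the text comes
# from the rule.

def average_cmi(grouped_claims, weights):
    """Average case-mix index: the mean relative weight across grouped claims."""
    return sum(weights[drg] for drg in grouped_claims) / len(grouped_claims)

# The same FY 2013 claims grouped two ways (hypothetical assignments).
claims_under_v32 = ["D1", "D2", "D1", "D3"]   # FY 2015 GROUPER (Version 32.0)
claims_under_v31 = ["D1", "D2", "D1", "D3"]   # FY 2014 GROUPER (Version 31.0)

recalibrated_fy2015_weights = {"D1": 0.90, "D2": 1.60, "D3": 1.10}  # Steps 1-6 output
fy2014_weights = {"D1": 1.00, "D2": 1.85, "D3": 1.25}

cmi_fy2015 = average_cmi(claims_under_v32, recalibrated_fy2015_weights)  # Step 1.a.
cmi_fy2014 = average_cmi(claims_under_v31, fy2014_weights)               # Step 1.b.
normalization_factor = cmi_fy2014 / cmi_fy2015                           # Step 1.c.

# In the rule, this ratio works out to 1.12464; the "normalized" weights are
# the recalibrated weights multiplied by that factor.
normalized_weights = {drg: round(w * normalization_factor, 4)
                      for drg, w in recalibrated_fy2015_weights.items()}
print(round(normalization_factor, 5), normalized_weights)
```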

In the second step of our MS-LTC-DRG budget neutrality methodology, we determined a budget neutrality factor to ensure that estimated aggregate LTCH PPS payments (based on the most recent available LTCH claims data) after reclassification and recalibration (that is, the FY 2015 MS-LTC-DRG classifications and relative weights) are equal to estimated aggregate LTCH PPS payments before reclassification and recalibration (that is, the FY 2014 MS-LTC-DRG classifications and relative weights). Accordingly, consistent with our existing methodology, we used FY 2013 discharge data to simulate payments and compared estimated aggregate LTCH PPS payments using the FY 2014 MS-LTC-DRGs and relative weights to estimated aggregate LTCH PPS payments using the FY 2015 MS-LTC-DRGs and relative weights. Specifically, for this final rule, as discussed previously in section VII.B.3.c. of this preamble, we used LTCH claims data from the March 2014 update of the FY 2013 MedPAR file, as these are the best available data at this time.

For this final rule, we determined the FY 2015 budget neutrality adjustment factor using the following three steps: (2.a.) we simulated estimated total LTCH PPS payments using the normalized relative weights for FY 2015 and GROUPER Version 32.0 (as described above); (2.b.) we simulated estimated total LTCH PPS payments using the FY 2014 GROUPER (Version 31.0) and the FY 2014 MS-LTC-DRG relative weights in Table 11 of the Addendum to the FY 2014 IPPS/LTCH PPS final rule available on the Internet (78 FR 51002); and (2.c.) we calculated the ratio of these estimated total LTCH PPS payments by dividing the estimated total LTCH PPS payments using the FY 2014 GROUPER (Version 31.0) and the FY 2014 MS-LTC-DRG relative weights (determined in Step 2.b.) by the estimated total LTCH PPS payments using the FY 2015 GROUPER (Version 32.0) and the normalized MS-LTC-DRG relative weights for FY 2015 (determined in Step 2.a.). In determining the FY 2015 MS-LTC-DRG relative weights, each normalized relative weight was multiplied by a budget neutrality factor of 0.9956326 (determined in Step 2.c.) in the second step of the budget neutrality methodology to determine the budget neutral FY 2015 relative weight for each MS-LTC-DRG.
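
The following sketch is illustrative only; it traces Steps 2.a. through 2.c. under stated assumptions. The payment simulation is reduced to "standard Federal rate times relative weight" for each case, which omits the various LTCH PPS payment adjustments, and all inputs are hypothetical; only the factor of 0.9956326 cited above comes from this final rule.

```python
# Illustrative sketch (not CMS code) of the second budget neutrality step.
# The payment simulation is reduced to "standard Federal rate x relative
# weight" per case, which omits the LTCH PPS payment adjustments; all inputs
# are hypothetical except the factor of 0.9956326 cited in the text.

HYPOTHETICAL_FEDERAL_RATE = 40000.00

claims = ["D1", "D2", "D1", "D3"]                                 # same FY 2013 claims, grouped both ways
fy2014_weights = {"D1": 1.00, "D2": 1.85, "D3": 1.25}             # FY 2014 GROUPER (V31.0) weights
normalized_fy2015_weights = {"D1": 1.03, "D2": 1.86, "D3": 1.26}  # output of the normalization step (hypothetical)

def simulate_total_payments(grouped_claims, weights):
    return sum(HYPOTHETICAL_FEDERAL_RATE * weights[drg] for drg in grouped_claims)

payments_before = simulate_total_payments(claims, fy2014_weights)            # Step 2.b.
payments_after = simulate_total_payments(claims, normalized_fy2015_weights)  # Step 2.a.
budget_neutrality_factor = payments_before / payments_after                  # Step 2.c.

# In the rule, this ratio is 0.9956326; the budget neutral FY 2015 weight for
# each MS-LTC-DRG is its normalized weight multiplied by this factor.
final_fy2015_weights = {drg: round(w * budget_neutrality_factor, 4)
                        for drg, w in normalized_fy2015_weights.items()}
print(round(budget_neutrality_factor, 7), final_fy2015_weights)
```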

Accordingly, in determining the FY 2015 MS-LTC-DRG relative weights in this final rule, consistent with our existing methodology, we applied a normalization factor of 1.12464 and a budget neutrality factor of 0.9956326 (computed as described above). Table 11, which is listed in section VI. of the Addendum to this final rule and is available via the Internet, lists the MS-LTC-DRGs and their respective relative weights, geometric mean length of stay, five-sixths of the geometric mean length of stay (used to identify SSO cases under § 412.529(a)), and the “IPPS Comparable Thresholds” (used in determining SSO payments under § 412.529(c)(3)), for FY 2015 (which reflect both the normalization factor of 1.12464 and the budget neutrality factor of 0.9956326).
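
For example, a hypothetical recalibrated relative weight of 1.0000 would be adjusted as follows (the starting weight is illustrative; the two factors are those stated above):

```python
# Worked example: applying both Step 7 factors to a hypothetical
# recalibrated relative weight of 1.0000.

normalization_factor = 1.12464
budget_neutrality_factor = 0.9956326

recalibrated_weight = 1.0000
final_weight = recalibrated_weight * normalization_factor * budget_neutrality_factor
print(round(final_weight, 4))   # 1.1197
```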

C. LTCH PPS Payment Rates for FY 2015

1. Overview of Development of the LTCH Payment Rates

The basic methodology for determining LTCH PPS Federal prospective payment rates is set forth at § 412.515 through § 412.536. In this section, we discuss the factors that we are using to update the LTCH PPS standard Federal rate for FY 2015, that is, effective for LTCH discharges occurring on or after October 1, 2014, and on or before September 30, 2015.

For further details on the development of the FY 2003 standard Federal rate when the LTCH PPS was initially implemented, we refer readers to the August 30, 2002 LTCH PPS final rule (67 FR 56027 through 56037). For subsequent updates to the LTCH PPS standard Federal rate as implemented under § 412.523(c)(3), we refer readers to the following final rules: RY 2004 LTCH PPS final rule (68 FR 34134 through 34140); RY 2005 LTCH PPS final rule (69 FR 25682 through 25684); RY 2006 LTCH PPS final rule (70 FR 24179 through 24180); RY 2007 LTCH PPS final rule (71 FR 27819 through 27827); RY 2008 LTCH PPS final rule (72 FR 26870 through 27029); RY 2009 LTCH PPS final rule (73 FR 26800 through 26804); FY 2010 IPPS/RY 2010 LTCH PPS final rule (74 FR 44021 through 44030); FY 2011 IPPS/LTCH PPS final rule (75 FR 50443 through 50444); FY 2012 IPPS/LTCH PPS final rule (76 FR 51769 through 51773); FY 2013 IPPS/LTCH PPS final rule (77 FR 53479 through 53481); and FY 2014 IPPS/LTCH PPS final rule (78 FR 50760 through 50765).

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28187 through 28190), we presented our proposals related to the update to the LTCH PPS standard Federal rate for FY 2015, which included the proposed annual market basket update to the LTCH PPS standard Federal rate. Consistent with our historical practice of using the best data available, we also proposed to use more recent data, if available, to determine the FY 2015 annual market basket update to the LTCH PPS standard Federal rate in the final rule. We did not receive any public comments in response to these proposals and, therefore, are adopting the proposals as final without modification in this final rule, using the most recent available data.

The update to the LTCH PPS standard Federal rate for FY 2015 is presented in section V.A. of the Addendum to this final rule. The components of the annual market basket update to the LTCH PPS standard Federal rate for FY 2015 are discussed below, including the reduction to the annual update for LTCHs that fail to submit quality reporting data for FY 2015 as required by the statute (as discussed in section VII.C.2.c. of the preamble of this final rule). Furthermore, as discussed in section VII.C.3. of the preamble of this final rule, for FY 2015, in addition to the update factor, under the final year of the 3-year phase-in under the current regulations at § 412.523(d)(3), we are making a one-time prospective adjustment to the standard Federal rate for FY 2015 so that the effect of any significant difference between the data used in the original computations of budget neutrality for FY 2003 and more recent data to determine budget neutrality for FY 2003 is not perpetuated in the prospective payment rates for future years. In addition, as discussed in section V.A. of the Addendum of this final rule, we are making an adjustment to the standard Federal rate to account for the estimated effect of the changes to the area wage level adjustment for FY 2015 on estimated aggregate LTCH PPS payments, in accordance with § 412.523(d)(4). (We refer readers to the discussion of the reduction to the annual update for LTCHs that fail to submit quality reporting data under section VII.C.2.c. of the preamble of this final rule, the application of the one-time prospective adjustment under the final year of the 3-year phase-in under section VII.C.3. of this preamble, and the budget neutrality adjustment for changes in the area wage levels under section V.A. of the Addendum of this final rule.)

2. FY 2015 LTCH PPS Annual Market Basket Update

a. Overview

Historically, the Medicare program has used a market basket to account for price increases in the services furnished by providers. The market basket used for the LTCH PPS includes both operating and capital-related costs of LTCHs because the LTCH PPS uses a single payment rate for both operating and capital-related costs. As discussed in the FY 2013 IPPS/LTCH PPS final rule (77 FR 53468 through 53476), we adopted the newly created FY 2009-based LTCH-specific market basket for use under the LTCH PPS beginning in FY 2013. For additional details on the historical development of the market basket used under the LTCH PPS, we refer readers to the FY 2013 IPPS/LTCH PPS final rule (77 FR 53467 through 53468) and this preamble.

Section 3401(c) of the Affordable Care Act provides for certain adjustments to any annual update to the standard Federal rate and refers to the timeframes associated with such adjustments as a “rate year” (which are discussed in more detail in section VII.C.2.b. of the preamble of this final rule). We note that because the annual update to the LTCH PPS policies, rates, and factors now occurs on October 1, we adopted the term “fiscal year” (FY) rather than “rate year” (RY) under the LTCH PPS beginning October 1, 2010, to conform with the standard definition of the Federal fiscal year (October 1 through September 30) used by other PPSs, such as the IPPS (75 FR 50396 through 50397). Although the language of sections 3004(a), 3401(c), 10319, and 1105(b) of the Affordable Care Act refers to years 2010 and thereafter under the LTCH PPS as “rate year,” consistent with our change in the terminology used under the LTCH PPS from “rate year” to “fiscal year,” for purposes of clarity, when discussing the annual update for the LTCH PPS, including the provisions of the Affordable Care Act, we use “fiscal year” rather than “rate year” for 2011 and subsequent years.

b. Revision of Certain Market Basket Updates as Required by the Affordable Care Act

Section 1886(m)(3)(A) of the Act, as added by section 3401(c) of the Affordable Care Act, specifies that, for rate year 2010 and each subsequent rate year through 2019, any annual update to the standard Federal rate shall be reduced:

  • For rate year 2010 through 2019, by the “other adjustment” specified in sections 1886(m)(3)(A)(ii) and (m)(4) of the Act; and
  • For rate year 2012 and each subsequent year, by the productivity adjustment (which we refer to as “the multifactor productivity (MFP) adjustment”) described in section 1886(b)(3)(B)(xi)(II) of the Act.

Section 1886(m)(3)(B) of the Act provides that the application of paragraph (3) of section 1886(m) of the Act may result in the annual update being less than zero for a rate year, and may result in payment rates for a rate year being less than such payment rates for the preceding rate year.

Section 1886(b)(3)(B)(xi)(II) of the Act defines the MFP adjustment as equal to the 10-year moving average of changes in annual economy-wide, private nonfarm business multifactor productivity (as projected by the Secretary for the 10-year period ending with the applicable fiscal year, calendar year, cost reporting period, or other annual period). Under our methodology, the end of the 10-year moving average of changes in the MFP coincides with the end of the appropriate FY update period. In addition, the MFP adjustment that is applied in determining any annual update to the LTCH PPS standard Federal rate is the same adjustment that is required to be applied in determining the applicable percentage increase under the IPPS under section 1886(b)(3)(B)(i) of the Act, as they are both based on a fiscal year. The MFP adjustment is derived using a projection of MFP that is currently produced by IHS Global Insight, Inc. (IGI). (For additional details on the development of the MFP adjustment and its application under the LTCH PPS, we refer readers to the FY 2012 IPPS/LTCH PPS final rule (76 FR 51691 through 51692 and 51770 through 51771).)
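
As a rough, non-authoritative sketch of this construct, the 10-year moving average could be computed as follows. The projected annual MFP changes shown are hypothetical placeholders (they are not IGI projections), chosen only so that the average matches the 0.5 percentage point figure used for FY 2015 in this final rule.

```python
# Rough sketch of the MFP adjustment construct: a 10-year moving average of
# projected annual percent changes in economy-wide, private nonfarm business
# multifactor productivity, ending with the applicable fiscal year. The
# values below are hypothetical placeholders, not IGI projections.

projected_mfp_changes = {   # FY -> projected percent change (hypothetical)
    2006: 0.9, 2007: 0.6, 2008: -0.3, 2009: -0.8, 2010: 1.8,
    2011: 0.4, 2012: 0.5, 2013: 0.3, 2014: 0.6, 2015: 1.0,
}

def mfp_adjustment(ending_fy: int, changes: dict) -> float:
    """10-year moving average of MFP changes for the period ending with ending_fy."""
    window = [changes[fy] for fy in range(ending_fy - 9, ending_fy + 1)]
    return sum(window) / len(window)

print(round(mfp_adjustment(2015, projected_mfp_changes), 1))   # 0.5
```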

For FY 2015, as we proposed, we are continuing to use our methodology for calculating and applying the MFP adjustment to determine the annual update to the LTCH PPS standard Federal rate for FY 2015. (For details on the development of the MFP adjustment, including our finalized methodology for calculating and applying the MFP adjustment, we refer readers to the FY 2012 IPPS/LTCH PPS final rule (76 FR 51689 through 51692).)

c. Adjustment to the Annual Update to the LTCH PPS Standard Federal Rate under the Long-Term Care Hospital Quality Reporting (LTCHQR) Program

1. Background

In accordance with section 1886(m)(5) of the Act, as added by section 3004(a) of the Affordable Care Act, the Secretary established the Long-Term Care Hospital Quality Reporting (LTCHQR) Program. (As noted above, although the language of section 3004(a) of the Affordable Care Act refers to years 2011 and thereafter under the LTCH PPS as “rate year,” consistent with our change in the terminology used under the LTCH PPS from “rate year” to “fiscal year,” for purposes of clarity, when discussing the annual update for the LTCH PPS, including the provisions of the Affordable Care Act, we use “fiscal year” rather than “rate year” for 2011 and subsequent years.) Under the LTCHQR Program, as required by section 1886(m)(5)(A)(i) of the Act, for FY 2014 and each subsequent year, in the case of an LTCH that does not submit quality reporting data to the Secretary in accordance with section 1886(m)(5)(C) of the Act with respect to such a year, any annual update to a standard Federal rate for discharges for the hospital during the year, and after application of section 1886(m)(3) of the Act, shall be reduced by 2.0 percentage points. Section 1886(m)(5)(A)(ii) of the Act provides that the application of the 2.0 percentage points reduction may result in an annual update that is less than 0.0 for a year, and may result in LTCH PPS payment rates for a year being less than such LTCH PPS payment rates for the preceding year. Furthermore, section 1886(m)(5)(B) of the Act specifies that the 2.0 percentage points reduction is applied in a noncumulative manner, such that any reduction made under section 1886(m)(5)(A) of the Act shall apply only with respect to the year involved, and shall not be taken into account in computing the LTCH PPS payment amount for a subsequent year. For additional information on the history of the LTCHQR Program, including the statutory authority and the selected measures, we refer readers to section IX.C. of the preamble of this final rule.

2. Reduction to the Annual Update to the LTCH PPS Standard Federal Rate under the LTCHQR Program

Consistent with section 1886(m)(5)(A)(i) of the Act, for FY 2014 and subsequent fiscal years, for LTCHs that do not submit quality reporting data under the LTCHQR Program with respect to such a fiscal year, any annual update to a standard Federal rate for discharges for the LTCH during the fiscal year and after application of the market basket update adjustments required by section 1886(m)(3) of the Act, is further reduced by 2.0 percentage points. That is, in establishing an update to the LTCH PPS standard Federal rate for FY 2014 and subsequent fiscal years, the full LTCH PPS market basket increase estimate, subject to an adjustment based on changes in economy-wide productivity (“the MFP adjustment”) required under section 1886(m)(3)(A)(i) of the Act and an additional reduction required by sections 1886(m)(3)(A)(ii) and 1886(m)(4) of the Act, is further reduced by 2.0 percentage points for LTCHs that fail to submit quality reporting data under the LTCHQR Program. The reduction in the annual update to the LTCH PPS standard Federal rate for failure to report quality data under the LTCHQR Program for FY 2014 and subsequent fiscal years is codified under § 412.523(c)(4) of the regulations.

Specifically, consistent with section 1886(m)(5)(A)(i) of the Act, under § 412.523(c)(4)(i), for an LTCH that does not submit quality reporting data in the form and manner and at the time specified by the Secretary under the LTCHQR Program, the annual update to the standard Federal rate under § 412.523(c)(3) is further reduced by 2.0 percentage points. In addition, consistent with section 1886(m)(5)(A)(ii) of the Act, § 412.523(c)(4)(ii) specifies that any reduction of the annual update to the standard Federal rate under § 412.523(c)(4)(i) will apply only to the fiscal year involved and will not be taken into account in computing the annual update to the standard Federal rate for a subsequent fiscal year. Lastly, consistent with section 1886(m)(5)(B) of the Act, under § 412.523(c)(4)(iii), the application of any reduction of the annual update to the standard Federal rate under § 412.523(c)(4)(i) may result in an annual update that is less than 0.0 percent for a fiscal year, and may result in payment rates for a fiscal year that would be less than such payment rates for the preceding rate year.

We discuss the application of the 2.0 percentage point reduction under § 412.523(c)(4)(i) in our discussion of the annual market basket update to the LTCH PPS standard Federal rate for FY 2015 below in section VII.C.2.e. of the preamble of this final rule.

d. Market Basket Under the LTCH PPS for FY 2015

Under the authority of section 123 of the BBRA as amended by section 307(b) of the BIPA, in the FY 2013 IPPS/LTCH PPS final rule (77 FR 53468), we adopted a newly created FY 2009-based LTCH-specific market basket for use under the LTCH PPS beginning in FY 2013. The FY 2009-based LTCH-specific market basket is based solely on the Medicare cost report data submitted by LTCHs and, therefore, specifically reflects the cost structures of only LTCHs. For additional details on the development of the FY 2009-based LTCH-specific market basket, we refer readers to the FY 2013 IPPS/LTCH PPS final rule (77 FR 53467 through 53476).

For FY 2015, as we proposed, we are continuing to use the FY 2009-based LTCH-specific market basket to update the LTCH PPS for FY 2015. We continue to believe that the FY 2009-based LTCH-specific market basket appropriately reflects the cost structure of LTCHs for the reasons discussed when we adopted the FY 2009-based LTCH-specific market basket for use under the LTCH PPS in the FY 2013 IPPS/LTCH PPS final rule (77 FR 53467 through 53476).

e. Annual Market Basket Update for LTCHs for FY 2015

Consistent with our historical practice and as we proposed, we estimate the market basket update and the MFP adjustment based on IGI's forecast using the most recent available data. Based on IGI's second quarter 2014 forecast, the FY 2015 full market basket estimate for the LTCH PPS using the FY 2009-based LTCH-specific market basket is 2.9 percent. Using our established methodology for determining the MFP adjustment, the current estimate of the MFP adjustment for FY 2015 based on IGI's second quarter 2014 forecast is 0.5 percent, as discussed in section IV.B. of the preamble of this final rule. In addition, consistent with our historical practice of using the best available data, as we proposed, we used the most recent data available to estimate the market basket update and the MFP adjustment for FY 2015 in this final rule.

For FY 2015, section 1886(m)(3)(A)(i) of the Act requires that any annual update to the standard Federal rate be reduced by the productivity adjustment (“the MFP adjustment”) described in section 1886(b)(3)(B)(xi)(II) of the Act. Consistent with the statute, we are reducing the full FY 2015 market basket update by the FY 2015 MFP adjustment. To determine the market basket update for LTCHs for FY 2015, as reduced by the MFP adjustment, consistent with our established methodology, as we proposed, we subtracted the FY 2015 MFP adjustment from the FY 2015 market basket update. Furthermore, sections 1886(m)(3)(A)(ii) and 1886(m)(4)(E) of the Act require that any annual update to the standard Federal rate for FY 2015 be reduced by the “other adjustment” described in paragraph (4), which is 0.2 percentage point for FY 2015. Therefore, following application of the productivity adjustment, as we proposed, we are reducing the adjusted market basket update (that is, the full market basket increase less the MFP adjustment) by the “other adjustment” specified by sections 1886(m)(3)(A)(ii) and 1886(m)(4) of the Act. (For additional details on our established methodology for adjusting the market basket increase by the MFP adjustment and the “other adjustment” required by the statute, we refer readers to the FY 2012 IPPS/LTCH PPS final rule (76 FR 51771).)

As discussed previously in section VII.C.2.c. of the preamble of this final rule, for FY 2015, section 1886(m)(5) of the Act requires that, for LTCHs that do not submit quality reporting data under the LTCHQR Program, any annual update to a standard Federal rate, after application of the adjustments required by section 1886(m)(3) of the Act, is further reduced by 2.0 percentage points. Therefore, in determining the update to the LTCH PPS standard Federal rate for FY 2015 for LTCHs that fail to submit quality reporting data under the LTCHQR Program, the full LTCH PPS market basket increase estimate, subject to an adjustment based on changes in economy-wide productivity (“the MFP adjustment”) as required under section 1886(m)(3)(A)(i) of the Act and an additional reduction required by sections 1886(m)(3)(A)(ii) and 1886(m)(4) of the Act, will also be further reduced by 2.0 percentage points.

In this final rule, in accordance with the statute, we are reducing the FY 2015 full market basket estimate of 2.9 percent (based on IGI's second quarter 2014 forecast of the FY 2009-based LTCH-specific market basket) by the FY 2015 MFP adjustment (that is, the 10-year moving average of MFP for the period ending FY 2015, as described in section IV.B. of the preamble of this final rule) of 0.5 percentage point (based on IGI's second quarter 2014 forecast). Following application of the productivity adjustment, the adjusted market basket update of 2.4 percent (2.9 percent minus 0.5 percentage point) is then reduced by 0.2 percentage point, as required by sections 1886(m)(3)(A)(ii) and 1886(m)(4)(E) of the Act. Therefore, in this final rule, under the authority of section 123 of the BBRA as amended by section 307(b) of the BIPA, we are establishing an annual market basket update under the LTCH PPS for FY 2015 of 2.2 percent (that is, the most recent estimate of the LTCH PPS market basket update of 2.9 percent, less the MFP adjustment of 0.5 percentage point, and less the 0.2 percentage point required under section 1886(m)(4)(E) of the Act), provided the LTCH submits quality reporting data in accordance with section 1886(m)(5) of the Act. Accordingly, consistent with our proposal, we are revising § 412.523(c)(3) by adding a new paragraph (xi), which specifies that the standard Federal rate for FY 2015 is the standard Federal rate for the previous LTCH PPS year updated by 2.2 percent, and as further adjusted, as appropriate, as described in § 412.523(d). For LTCHs that fail to submit quality reporting data under the LTCHQR Program, under § 412.523(c)(3)(xi) in conjunction with § 412.523(c)(4), we are further reducing the annual update to the LTCH PPS standard Federal rate by 2.0 percentage points in accordance with section 1886(m)(5) of the Act. Accordingly, consistent with our proposal, we are establishing an annual update to the LTCH PPS standard Federal rate of 0.2 percent (that is, 2.2 percent minus 2.0 percentage points) for FY 2015 for LTCHs that fail to submit quality reporting data under the LTCHQR Program. As stated above, consistent with our historical practice of using the best available data, we used the most recent data available to establish an annual update to the LTCH PPS standard Federal rate for FY 2015 under § 412.523(c)(3)(xi) in this final rule. (We note that we also are adjusting the FY 2015 standard Federal rate by applying a one-time prospective adjustment under the final year of the 3-year phase-in under § 412.523(d)(3) (discussed in section VII.C.3. of the preamble of this final rule) and by an area wage level budget neutrality factor in accordance with § 412.523(d)(4) (as discussed in section V.B.5. of the Addendum of this final rule).)
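
The arithmetic behind these figures can be summarized as follows, using the percentages and percentage-point reductions stated above:

```python
# Worked arithmetic for the FY 2015 annual update, using the figures cited
# in the text (expressed in percent/percentage points).

market_basket_update = 2.9        # FY 2009-based LTCH-specific market basket estimate
mfp_adjustment = 0.5              # productivity adjustment
other_adjustment = 0.2            # section 1886(m)(4)(E) "other adjustment"
ltchqr_reduction = 2.0            # reduction for failure to submit quality data

update_reporting = market_basket_update - mfp_adjustment - other_adjustment
update_nonreporting = update_reporting - ltchqr_reduction

print(round(update_reporting, 1))      # 2.2 percent for LTCHs that submit quality data
print(round(update_nonreporting, 1))   # 0.2 percent for LTCHs that do not
```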

3. Adjustment for the Final Year of the Phase-In of the One-Time Prospective Adjustment to the Standard Federal Rate under § 412.523(d)(3)

In the August 30, 2002 LTCH PPS final rule (67 FR 55954), we set forth regulations implementing the LTCH PPS based upon the broad authority granted to the Secretary under section 123 of the BBRA (as amended by section 307(b) of the BIPA). Section 123(a)(1) of the BBRA required that the system “maintain budget neutrality.” The statutory budget neutrality requirement means that estimated aggregate payments under the LTCH PPS for FY 2003 would be equal to the estimated aggregate payments that would have been made if the LTCH PPS were not implemented for FY 2003. The methodology for determining the LTCH PPS standard Federal rate for FY 2003 that would “maintain budget neutrality” is described in considerable detail in the August 30, 2002 final rule (67 FR 56027 through 56037). Our methodology for estimating payments for the purposes of budget neutrality calculations used the best available data, and necessarily reflected several assumptions (for example, costs, inflation factors, and intensity of services provided) in estimating aggregate payments that would have been made if the LTCH PPS had not been implemented (without accounting for certain statutory provisions that affect the level of payments to LTCHs in years prior to the implementation of the LTCH PPS, as required by the statute).

In the August 30, 2002 final rule, we also stated our intentions to monitor LTCH PPS payment data to evaluate whether later data varied significantly from the data available at the time of the original budget neutrality calculations (for example, data related to inflation factors, intensity of services provided, or behavioral response to the implementation of the LTCH PPS). To the extent the later data significantly differed from the data employed in the original calculations, the aggregate amount of payments during FY 2003 based on later data may be higher or lower than the estimates upon which the budget neutrality calculations were based. Therefore, in that same final rule, under the broad authority conferred upon the Secretary in developing the LTCH PPS, including the authority for establishing appropriate adjustments, under section 123(a)(1) of the BBRA, as amended by section 307(b) of the BIPA, we provided in § 412.523(d)(3) of the regulations for the possibility of making a one-time prospective adjustment to the LTCH PPS rates, so that the effect of any significant difference between actual payments and estimated payments for the first year of the LTCH PPS would not be perpetuated in the LTCH PPS rates for future years. We refer readers to the FY 2013 IPPS/LTCH PPS final rule (77 FR 53487 through 53488) for a complete discussion of the history of the development of the one-time prospective adjustment to the LTCH PPS standard Federal rate at § 412.523(d)(3).

In the FY 2013 IPPS/LTCH PPS final rule (77 FR 53495), we finalized our policy to make a one-time prospective adjustment to the standard Federal rate so that it will be permanently reduced by approximately 3.75 percent to account for the estimated difference between projected aggregate FY 2003 LTCH PPS payments and the projected aggregate payments that would have been made in FY 2003 under the TEFRA payment system if the LTCH PPS had not been implemented. Specifically, using the methodology we adopted in that same final rule, we determined that permanently applying a factor of 0.9625 (that is, a permanent reduction of approximately 3.75 percent) to the standard Federal rate is necessary to ensure estimated total FY 2003 LTCH PPS payments equal estimated total FY 2003 TEFRA payments consistent with our stated policy goal of the one-time prospective adjustment under § 412.523(d)(3) (that is, to ensure that the difference between estimated total FY 2003 LTCH PPS payments and estimated total FY 2003 TEFRA payments is not perpetuated in the LTCH PPS payment rates in future years). (We refer readers to the FY 2013 IPPS/LTCH PPS final rule (77 FR 53487 through 53502) for a complete discussion of the evaluation approach, methodology, and determination of the one-time prospective adjustment to the LTCH PPS standard Federal rate at § 412.523(d)(3).)

Given the magnitude of this adjustment, in the FY 2013 IPPS/LTCH PPS final rule (77 FR 53501 through 53502), under § 412.523(d)(3), we established a policy to phase in the permanent adjustment of 0.9625 to the standard Federal rate over a 3-year period. To achieve a permanent adjustment of 0.9625, under the phase-in of this adjustment, in that same final rule, we explained that we will apply a factor of 0.98734 to the standard Federal rate in each year of the 3-year phase-in, that is, in FY 2013 (which does not apply to payments for discharges occurring on or after October 1, 2012, and on or before December 28, 2012, consistent with current law), FY 2014, and FY 2015. By applying a permanent factor of 0.98734 to the standard Federal rate in each year for FYs 2013, 2014, and 2015, we will completely account for the entire adjustment by having applied a cumulative factor of 0.9625 (calculated as 0.98734 × 0.98734 × 0.98734 = 0.9625) to the standard Federal rate. Accordingly, under § 412.523(d)(3), we applied a permanent factor of 0.98734 to the standard Federal rate in both FY 2013 and FY 2014 under the established 3-year phase-in of the one-time prospective adjustment.
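
The cumulative effect of the three annual factors can be verified with a simple calculation:

```python
# Arithmetic check of the 3-year phase-in: applying the annual factor of
# 0.98734 in each of FY 2013, FY 2014, and FY 2015 yields the full
# permanent adjustment of approximately 0.9625.

annual_factor = 0.98734
cumulative_adjustment = annual_factor ** 3
print(round(cumulative_adjustment, 4))   # 0.9625
```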

In this final rule, for FY 2015, as we proposed, we are applying a permanent one-time prospective adjustment factor of 0.98734 to the standard Federal rate for FY 2015 under the last year of the 3-year phase-in of the one-time prospective adjustment, in accordance with the existing regulations under § 412.523(d)(3).

4. Summary of Other Public Comments Received on the Proposed LTCH PPS Payment Rates for FY 2015

We received a number of public comments that were not within the scope of the proposed rule; we thank the commenters for providing that feedback. We also received a few public comments on issues related to the proposed LTCH PPS payment rates for FY 2015, but these issues were not specifically addressed by the proposals and related discussion presented in the FY 2015 IPPS/LTCH PPS proposed rule.

Comment: One commenter requested that CMS provide, under the LTCH PPS, an additional payment for end-stage renal disease (ESRD) patients under the same circumstances as under the IPPS, noting that section 1881(b) of the Act does not limit the adjustment to subsection (d) hospitals. The commenter indicated that the information and analysis it previously provided to CMS support its request for this additional payment amount.

Response: Although this comment is beyond the scope of the proposed rule, we note that we responded to the issue that this commenter raised in a detailed response in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50767). As discussed in that final rule, based on our analysis of FY 2012 LTCH PPS claims data, we continue to believe that the costs of treating ESRD patients in LTCHs are adequately reflected in the data used to determine the MS-LTC-DRG relative weights for non-dialysis MS-LTC-DRGs, and that the additional resources associated with renal dialysis treatments are included in LTCH PPS payments. Therefore, we are not adopting the commenter's request to provide an additional payment for ESRD patients under the LTCH PPS.

D. Revision of LTCH PPS Geographic Classifications

1. Background

As discussed in the August 30, 2002 LTCH PPS final rule, which implemented the LTCH PPS (67 FR 56015 through 56019), in establishing an adjustment for area wage levels, the labor-related portion of an LTCH's standard Federal payment rate is adjusted by using an appropriate wage index based on the labor market area in which the LTCH is located. Specifically, the application of the LTCH PPS area wage-level adjustment, which is codified under existing § 412.525(c) of the regulations, is based on the location of the LTCH—either in an “urban” area or a “rural” area. Currently, under the LTCH PPS, as codified under § 412.503 of the regulations, an “urban area” is defined as a Metropolitan Statistical Area (which includes a Metropolitan division, where applicable) as defined by the Executive Office of Management and Budget (OMB), and a “rural area” is defined as any area outside of an urban area.

In the RY 2006 LTCH PPS final rule (70 FR 24184 through 24185), we revised § 412.525(c) to update the labor market area definitions used under the LTCH PPS, effective for discharges occurring on or after July 1, 2005, based on OMB's Core-Based Statistical Area (CBSA) designations (“CBSA designations”), which are based on 2000 Census data. We made this revision because we believed that the CBSA designations (geographic classifications) would ensure that the LTCH PPS wage index adjustment most appropriately accounts for and reflects the relative hospital wage levels in the geographic area of the hospital as compared to the national average hospital wage level. We noted that these were the same CBSA designations implemented for acute care hospitals under the IPPS, which were codified under § 412.64(b) of the regulations, beginning in FY 2005. (For a further discussion of the CBSA-based labor market area designations currently used under the LTCH PPS, we refer readers to the RY 2006 LTCH PPS final rule (70 FR 24182 through 24191).) We have generally updated the LTCH PPS CBSA designations annually since they were adopted for RY 2006 when updates from OMB were available (73 FR 26812 through 26814, 74 FR 44023 through 44204, and 75 FR 50444 through 50445).

In OMB Bulletin No. 10-2, issued on December 1, 2009, OMB announced that the CBSA changes in that bulletin would be the final update prior to the 2010 Census of Population and Housing. We adopted those changes under the LTCH PPS in the FY 2011 IPPS/LTCH PPS final rule (75 FR 50444 through 50445), effective October 1, 2010. We continued to use these CBSA designations for FYs 2012 and 2013 (76 FR 51808 and 77 FR 53710, respectively). New OMB labor market area delineations (which we refer to in this section as “new OMB delineations”) based on 2010 standards and the 2010 Decennial Census data were announced by OMB on February 28, 2013. OMB issued Bulletin No. 13-01, which announced revisions to the delineation of Metropolitan Statistical Areas, Micropolitan Statistical Areas, and Combined Statistical Areas, and provided guidance on the uses of the delineation of these labor market areas. (For a copy of this bulletin, we refer readers to the following Web site: http://www.whitehouse.gov/sites/default/files/omb/bulletins/2013/b-13-01.pdf.) This bulletin specifically provides the delineations of all Metropolitan Statistical Areas (MSAs), Metropolitan Divisions, Micropolitan Statistical Areas, Combined Statistical Areas, and New England City and Town Areas in the United States and Puerto Rico based on the standards published in the Federal Register on June 28, 2010 (75 FR 37246 through 37252) and 2010 Census data. (We note that, as discussed in section III.B. of the preamble of this final rule, consistent with the terminology used in OMB Bulletin No. 13-01 and the standards published in the Federal Register on June 28, 2010, when referencing the new OMB geographic boundaries of Metropolitan Statistical Areas (MSAs) based on 2010 standards, we are using the term “new OMB delineations” rather than the term “CBSA-based labor market area definitions” that we have used in the past to refer to OMB geographic boundaries of statistical areas (75 FR 37249).)

As discussed in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50994 through 50995), in order to implement these changes for the LTCH PPS (as in the case of the IPPS), it is necessary to identify the new OMB delineations for each county and hospital in the country. While the revisions OMB published on February 28, 2013, are not as sweeping as the changes OMB announced in 2003, the February 28, 2013 bulletin does contain a number of significant changes. For example, under the new OMB delineations, there are new CBSAs, urban counties that have become rural, rural counties that have become urban, and existing CBSAs that have been split apart and moved to other CBSAs. Because the update was not issued until February 28, 2013, and it was necessary for the changes made by the update and their ramifications to be extensively reviewed and verified, we were unable to complete such a lengthy process in time for the FY 2014 rulemaking cycle. That is, by the time the update was issued, the FY 2014 IPPS/LTCH PPS proposed rule was in the advanced stages of development, and the proposed FY 2014 LTCH PPS wage indexes based on the CBSA designations that are currently used under the LTCH PPS had been developed. Therefore, we did not propose to use the changes to the LTCH PPS CBSA designations for FY 2014 based on the new OMB delineations. Rather, to allow for sufficient time to assess the new changes and their ramifications, we stated that we intended to propose the adoption of the new OMB delineations and the corresponding changes to the wage index based on those delineations under the LTCH PPS for FY 2015 through notice and comment rulemaking, consistent with the approach used under the IPPS (78 FR 50994 through 50995). In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28191 through 28194), we proposed to adopt the new OMB delineations announced in the February 28, 2013 OMB Bulletin No. 13-01, effective for FY 2015 under the LTCH PPS. As discussed below, after consideration of the public comments we received, in this final rule, under the authority of section 123 of the BBRA, as amended by section 307(b) of the BIPA, we are adopting the new OMB delineations announced in the February 28, 2013 OMB Bulletin No. 13-01, effective for FY 2015 under the LTCH PPS, as proposed without modification. We note that this policy is consistent with the approach being adopted under the IPPS, as discussed in section III.B. of the preamble of this final rule.

2. Use of the New OMB Labor Market Area Delineations (“New OMB Delineations”)

Historically, Medicare prospective payment systems have utilized labor market area definitions developed by the OMB. As discussed above, the CBSA designations currently used under the LTCH PPS are based on the most recent market area definitions issued by the OMB. The OMB reviews its market area definitions/delineations based on data from the preceding decennial census to reflect more recent population changes. As discussed above and in section III.B. of the preamble of this final rule, the new OMB delineations are based on the OMB's latest market area delineations based on the 2010 Decennial Census data. Because we believe that the OMB's latest labor market area delineations are the best available data that reflect the local economies and wage levels of the areas in which hospitals are currently located, as we proposed, we are adopting the new OMB delineations based on the 2010 Decennial Census data under the LTCH PPS, beginning in FY 2015, for the reasons discussed below (which are consistent with the IPPS policy discussed in section III.B. of the preamble of this final rule).

When we implemented the wage index adjustment under § 412.525(c) for the LTCH PPS, and updated the LTCH PPS labor market area definitions based on the CBSA designations beginning in RY 2006, we explained that the LTCH PPS wage index adjustment was intended to reflect the relative hospital wage levels in the geographic area of the hospital as compared to the national average hospital wage level. (We refer readers to the RY 2003 LTCH PPS final rule (67 FR 56016) and the RY 2006 LTCH PPS final rule (70 FR 24184).) Because we believe that the new OMB delineations based on 2010 Decennial Census data reflect the most recent available geographic classifications (market area delineations), as we proposed, we are revising the geographic classifications used under the LTCH PPS based on these new OMB delineations to ensure that the LTCH PPS wage index adjustment continues to most appropriately account for and reflect the relative hospital wage and wage-related costs in the geographic area of the hospital as compared to the national average hospital wage and wage-related costs. Specifically, as we proposed, we are adopting the new OMB delineations (as discussed in greater detail below), effective for LTCH PPS discharges occurring on or after October 1, 2014 (that is, effective for FY 2015). As we noted in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28191), the application of the LTCH PPS area wage-level adjustment under existing § 412.525(c) is made on the basis of the location of the LTCH—either in an “urban” area or a “rural” area, as those terms are defined under existing § 412.503. Under § 412.503, an “urban area” is defined as a Metropolitan Statistical Area as defined by OMB, and a “rural area” is defined as any area outside of an urban area. Therefore, we did not make any changes to the existing regulations under this policy.

As discussed in section III.B. of this preamble, while CMS and other stakeholders have explored potential alternatives to the current CBSA-based labor market system, no consensus has been achieved regarding how best to implement a replacement system. While we recognize that MSAs are not designed specifically to define labor market areas, we believe that they do represent a useful proxy for this purpose. Consistent with the approach taken for the IPPS, we have used MSAs to define labor market areas for purposes of Medicare wage indices under the LTCH PPS since its implementation in FY 2003. MSAs also are used to define labor market areas for purposes of the wage index for many of the other Medicare payment systems (for example, the IRF PPS, the SNF PPS, the HHA PPS, the OPPS, and the IPF PPS). (We refer readers to the RY 2006 LTCH PPS final rule (70 FR 24184).) Therefore, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28191 through 28194), under the authority of section 123 of the BBRA, as amended by section 307(b) of the BIPA, we proposed to adopt the new OMB delineations as described in the February 28, 2013 OMB Bulletin No. 13-01, effective for FY 2015 under the LTCH PPS. In addition, we proposed to use the new OMB delineations to calculate area wage indexes in a manner that is consistent with the CBSA-based methodologies finalized in the RY 2006 LTCH PPS final rule, as refined in subsequent rulemaking. We also proposed to implement a transitional wage index policy (as discussed in more detail below) for LTCHs that would experience a negative payment impact due to the adoption of the new OMB delineations. This proposed policy, including the transitional wage index policy, is consistent with the policy proposed under the IPPS for FY 2015, as discussed in section III.B. of this preamble.

Comment: A few commenters supported the proposal to adopt the new OMB delineations and to use these new OMB delineations to calculate area wage indexes effective for FY 2015 under the LTCH PPS. We did not receive any public comments opposing the proposed adoption of the new OMB delineations under the LTCH PPS. We also note that we did not receive any public comments that specifically addressed the details of our proposals with regard to the adoption of the new OMB labor market area delineations relating to Micropolitan Statistical Areas, urban counties that would become rural, rural counties that would become urban, or urban counties that moved to a different urban CBSA. (We refer readers to the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28192 through 28193) for details regarding these proposals.) A few commenters also commented on the proposed transitional wage index policy, which we discuss below in section VII.D.2.e. of this preamble.

Response: We appreciate the commenters' support for the proposal to adopt the new OMB delineations under the LTCH PPS, as we believe that the new OMB delineations based on 2010 Decennial Census data reflect the most recent data available to define geographic classifications (market area delineations) for LTCHs and ensure that the LTCH PPS wage index adjustment continues to most appropriately account for and reflect the relative hospital wage and wage-related costs in the geographic area of the hospital as compared to the national average hospital wage and wage-related costs. Therefore, under the authority of section 123 of the BBRA, as amended by section 307(b) of the BIPA, in this final rule, we are adopting the new OMB delineations as described in the February 28, 2013 OMB Bulletin No. 13-01, effective for FY 2015 under the LTCH PPS, as we proposed without modification. We also are using these new OMB delineations to calculate area wage indexes in a manner that is consistent with the CBSA-based methodologies finalized in the RY 2006 LTCH PPS final rule, as refined in subsequent rulemaking. In addition, as discussed below in section VII.D.2.e. of this preamble, after consideration of the public comments we received, as we proposed, we are implementing a budget neutral transitional wage index policy for LTCHs that will experience a negative payment impact due to the use of the new OMB delineations. This policy, including the transitional wage index policy, is consistent with the policy being adopted under the IPPS, as presented in section III.B. of the preamble of this final rule. The discussion below focuses on issues related to the use of the new OMB delineations to define labor market areas for purposes of the wage index adjustment under the LTCH PPS and, as we explained in the proposed rule, is consistent with what is being adopted under the IPPS.

a. Micropolitan Statistical Areas

When we adopted the CBSA designations under the LTCH PPS in RY 2006, we discussed CMS' consideration of whether to use Micropolitan Statistical Areas to define the labor market areas for the purpose of the LTCH PPS wage index. OMB defines a “Micropolitan Statistical Area” as a CBSA “associated with at least one urban cluster that has a population of at least 10,000, but less than 50,000” (70 FR 24183). We refer to these areas as “Micropolitan Areas.” After conducting an extensive impact analysis, we determined that the best course of action would be to treat all hospitals located in “Micropolitan Areas” as “rural,” and to include these hospitals in the calculation of each State's rural wage index. Because Micropolitan Areas tend to encompass smaller population centers and contain fewer hospitals than MSAs, we determined that if Micropolitan Areas were to be treated as separate labor market areas, the IPPS wage index would include drastically more single-provider labor market areas. This larger number of labor market areas with fewer providers could create instability in year-to-year wage index values for a large number of hospitals; could reduce the averaging effect of the wage index, lessening some of the efficiency incentive inherent in a system based on the average hourly wages for a large number of hospitals; and could arguably create an inequitable system when so many hospitals would have wage indexes based solely on their own wage data while other hospitals' wage indexes would be based on an average hourly wage across many hospitals. For these reasons, we adopted a policy to include Micropolitan Areas in the State's rural wage area, and have continued this policy through the present. (We refer readers to the RY 2006 LTCH PPS final rule (70 FR 24187).)

Based upon the 2010 Decennial Census data, a number of rural and urban counties have joined or have become Micropolitan Areas, while other counties that once were part of a Micropolitan Area under previous OMB CBSA designations have become either urban or rural under the new OMB delineations. Overall, there are fewer Micropolitan Areas (541) under the new OMB delineations based on 2010 Decennial Census data than existed under the data from the 2000 Census (581). As discussed in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28192), we believe that it is appropriate to continue the policy established in the RY 2006 LTCH PPS final rule, and we are treating Micropolitan Areas as rural labor market areas under the LTCH PPS. These areas continue to be defined as having relatively small urban cores (populations of 10,000-49,999). We do not believe that it would be appropriate to calculate a separate wage index for areas that typically may include only a few hospitals, for the reasons set forth in the RY 2006 LTCH PPS final rule, as discussed above.

As previously noted, we did not receive any public comments on our proposals relating to the adoption of the new OMB labor market area delineations with regard to Micropolitan Statistical Areas. Therefore, we are adopting these policies as final without modification in this final rule. In conjunction with our policy to adopt the new OMB labor market area delineations, under the authority of section 123 of the BBRA, as amended by section 307(b) of the BIPA, for FY 2015, we are continuing to treat Micropolitan Areas as “rural,” and will assign the Micropolitan Area the statewide rural wage index for the State in which the LTCH is located. We also are establishing that, beginning in FY 2015, the wage data for any IPPS hospitals located in the Micropolitan Areas will be included in the calculation of each State's LTCH PPS rural area wage index. (As discussed in section V.B.2. of the Addendum to this final rule, the LTCH PPS area wage index values are calculated using the wage data of IPPS hospitals.) We note that this policy is consistent with the policy adopted under the IPPS discussed in section III.B.2.a. of the preamble of this final rule. For a discussion of our policies to moderate the impact of our adoption of the new OMB delineations under the LTCH PPS, we refer readers to section VII.D.2.e. of the preamble of this final rule.

b. Urban Counties That Became Rural under the New OMB Labor Market Area Delineations

Under the new OMB delineations, which are based upon 2010 Decennial Census data, for FY 2015, we found that a number of counties (or county equivalents) that were defined as “urban” under the previous CBSA designations are now defined as “rural” under the new OMB delineations. As discussed in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28193) and in section III.B. of this preamble, an analysis of the new OMB delineations shows that a total of 37 counties (and county equivalents) that were considered to be part of an “urban” CBSA will now be considered to be located in a “rural” area, beginning in FY 2015, based on the new OMB delineations. We refer readers to a table presented in section III.B.2.b. of the preamble of this final rule that lists the 37 urban counties that are defined as rural under our adoption of the new OMB delineations.

As previously noted, we did not receive any public comments on our proposals relating to the adoption of the new OMB labor market area delineations with regard to urban counties that would become rural. Therefore, we are adopting these policies as final without modification in this final rule. Under our adoption of the new OMB delineations for the LTCH PPS, we are establishing that LTCHs located in any of the 37 counties listed in the table under section III.B.2.b. of the preamble of this final rule will be considered “rural,” and will receive their respective State's rural area wage index for FY 2015 under the LTCH PPS. We note that there are currently no LTCHs located in any of the 37 counties listed in the table that will be considered to be located in a “rural” area beginning in FY 2015. The wage data for any IPPS hospitals located in those 37 counties will be considered “rural” when calculating the respective State's LTCH PPS rural area wage index beginning in FY 2015. (As discussed in section V.B.2. of the Addendum to this final rule, the LTCH PPS area wage index values are calculated using the area wage data of IPPS hospitals.) We note that this policy is consistent with the policy adopted under the IPPS discussed in section III.B.2.b. of the preamble of this final rule. We refer readers to section VII.D.2.e. of this preamble for a discussion of our policies to moderate the impact of our policy to implement the new OMB delineations under the LTCH PPS.

c. Rural Counties That Became Urban under the New OMB Labor Market Area Delineations

In using the new OMB labor market area delineations (which are based upon 2010 Decennial Census data) for FY 2015, we found that a number of counties (or county equivalents) that were defined as “rural” under the previous OMB definitions (that is, CBSA designations) will be considered “urban” based on the adoption of the new OMB delineations. As discussed in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28193) and in section III.B.2.c. of the preamble of this final rule, an analysis of the new OMB labor market area delineations shows that a total of 105 counties (and county equivalents) that were previously located in “rural” areas are now located in an “urban” area under the new OMB delineations. We refer readers to a table in section III.B.2.c. of the preamble of this final rule that lists the 105 “rural” counties that will now be located in an “urban” area based on our policy to adopt the new OMB delineations. There are currently no LTCHs located in the 105 “rural” counties listed in that table.

As previously noted, we did not receive any public comments on our proposals relating to the adoption of the new OMB labor market area delineations with regard to rural counties that would become urban. Therefore, we are adopting these policies as final without modification in this final rule. Under our adoption of the new OMB labor market area delineations, we are establishing that LTCHs located in any of those 105 counties will now be included in their new respective “urban” CBSAs and will receive the respective “urban” CBSA's area wage index. We also are establishing that, beginning in FY 2015, the wage data for any IPPS hospitals located within those 105 counties will now be included in the calculation of the LTCH PPS area wage index for those hospitals' respective “urban” CBSAs. (As discussed in section V.B.2. of the Addendum to this final rule, the LTCH PPS area wage index values are calculated using the area wage data of IPPS hospitals.) We note that this policy is consistent with the policy adopted under the IPPS discussed in section III.B.2.c. of the preamble of this final rule. We refer readers to section VII.D.2.e. of the preamble of this final rule for a discussion of our policies to moderate the impact of our policy to implement the new OMB delineations under the LTCH PPS.

d. Urban Counties Moved to a Different Urban CBSA under the New OMB Labor Market Area Delineations

In addition to “rural” counties that became “urban” and “urban” counties that became “rural” under the new OMB delineations, we found that several urban counties shifted from one urban CBSA to another urban CBSA. In certain cases, the new OMB delineations involved a change only in the CBSA name or code, while the CBSA continued to encompass the same constituent counties. However, in other cases, under the new OMB delineations, some counties are shifted between existing urban CBSAs and new urban CBSAs, changing the constituent makeup of those CBSAs. For example, in some cases, entire CBSAs are subsumed by another CBSA. In other cases, some CBSAs have counties that are split off to become part of a different urban CBSA or to form entirely new labor market areas. We refer readers to section III.B.2.d. of the preamble of this final rule for additional information, including examples, on urban counties that have moved from one urban CBSA to a different urban CBSA under the new OMB delineations. As discussed in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28193), LTCHs located in these affected counties that will move from one urban CBSA to a different urban CBSA under our policy to adopt the new OMB delineations will experience both negative and positive impacts in regard to the LTCH's specific area wage index values. We refer readers to section VII.D.2.e. of this preamble for a discussion of our policies to moderate the impact imposed upon hospitals because of our policy to adopt the new OMB labor market area delineations under the LTCH PPS. As previously noted, we did not receive any public comments on our proposals relating to the adoption of the new OMB labor market area delineations with regard to urban counties that moved to a different urban CBSA. Therefore, we are adopting these policies as final without modification in this final rule.

e. Transition Period

As indicated above, overall, we believe that our policy to adopt the new OMB delineations will result in LTCH PPS wage index values being more representative of the actual costs of labor in a given area. However, as we discussed in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28193), we also recognize that some LTCHs would experience decreases in their area wage index values as a result of our policy, while many LTCHs would have higher area wage index values under our policy. To mitigate the impact imposed upon hospitals, we have in the past provided for transition periods when adopting changes that have significant payment implications, particularly large negative impacts. While we believe that using the new OMB delineations would create a more accurate payment adjustment for differences in area wage levels, we also recognize that adopting such changes may cause some short-term instability in LTCH PPS payments. Therefore, under the authority of section 123 of the BBRA, as amended by section 307(b) of the BIPA, we proposed to implement a transitional wage index policy for LTCHs that would experience a decrease in their area wage index values due to our proposal to adopt the new OMB delineations under the LTCH PPS. Specifically, we proposed a 1-year transitional wage index policy under which any LTCH that would experience a decrease in its area wage index value solely due to the adoption of the new OMB delineations would receive a “50/50 blended area wage index” value that would be calculated as the sum of 50 percent of the wage index computed under the FY 2014 CBSA designations and 50 percent of the wage index computed under the new OMB delineations proposed for FY 2015. Furthermore, we proposed that the transitional wage index policy would be applied in a budget neutral manner, consistent with the existing requirement under § 412.525(c)(2) that any changes to the adjustment for differences in area wage levels will be made in a budget neutral manner. We also presented a proposed methodology for calculating an area wage level adjustment budget neutrality factor for FY 2015 that included the proposed 50/50 blended wage index as applicable (79 FR 28193 through 28194).

Comment: Commenters that supported the proposed adoption of the new OMB delineations under the LTCH PPS also supported our proposed transitional wage index policy for LTCHs that would experience a negative payment impact due to the adoption of the new OMB delineations. While the commenters conveyed their appreciation for the proposed transitional wage index policy to help mitigate any negative financial ramifications, they requested that the proposed transitional wage index policy be extended beyond FY 2015 to allow hospitals more time to operationally adjust to the change to their area wage adjustment. We note that we did not receive any public comments on our proposal to apply the proposed transitional wage index policy in a budget neutral manner, or on our proposed methodology for calculating an area wage level adjustment budget neutrality factor for FY 2015 that included the proposed 50/50 blended wage index as applicable.

Response: We appreciate the commenters' support for the proposed transitional wage index policy for LTCHs that would experience a negative payment impact due to the adoption of the new OMB delineations. While we understand the commenters' concern regarding the potential financial impact, as we explained in the proposed rule, the revisions under the new OMB delineations are not as extensive as the changes that OMB announced in 2003 that were adopted under the IPPS in FY 2005 with a 1-year transition and adopted under the LTCH PPS in RY 2006 with no additional transitional policy other than the transitional wage index policy in effect at that time. While it is our longstanding policy to provide temporary adjustments to mitigate negative impacts from the adoption of new policies or procedures, we continue to believe that the 1-year “50/50 blended wage index” transitional policy provides an adequate safeguard against any significant payment reductions, allows for sufficient time to make operational changes for future fiscal years, and provides a reasonable balance between mitigating some short-term instability in LTCH PPS payments and improving the accuracy of the payment adjustment for differences in area wage levels.

While we acknowledge that some LTCHs will experience a reduction in their wage index as a result of the adoption of the new OMB delineations, we also point out that several LTCHs will experience an increase in their wage index based on the adoption of the new OMB delineations. Because the new OMB delineations reflect the most recent data available to define geographic classifications (market area delineations) for LTCHs, we believe that the wage index values computed under those delineations will result in more appropriate payments to providers by more accurately accounting for and reflecting the differences in area wage levels (that is, the relative hospital wage and wage-related costs in the geographic area of the hospital as compared to the national average hospital wage and wage-related costs). Because we believe that the implementation of the new OMB delineations will create more accurate representations of an LTCH's labor market areas and result in LTCH PPS wage index values being more representative of the actual costs of labor in a given area, we believe that it is important to implement the wage index values calculated under the new OMB delineations with as short a transition as possible. Extending the transitional “50/50 blended wage index” policy beyond FY 2015 would only further delay the improved accuracy of area wage level adjustments to LTCH PPS payments under the new OMB delineations. In addition, because the transitional 50/50 blended wage index policy is applied in a budget neutral manner, all LTCH PPS payments are reduced to offset the additional payments that result under the transitional policy. For these reasons, we are not adopting the commenters' suggestion to extend the proposed transitional 50/50 blended wage index policy beyond FY 2015.

Therefore, in this final rule, under the authority of section 123 of the BBRA, as amended by section 307(b) of the BIPA, we are adopting a 1-year transitional wage index policy for LTCHs that will experience a decrease in their area wage index values due to our policy to adopt the new OMB delineations under the LTCH PPS, as we proposed without modification. In addition, we are finalizing our proposal to apply the transitional area wage index policy in a budget neutral manner, and our methodology for calculating an area wage level adjustment budget neutrality factor for FY 2015, which includes the proposed 50/50 blended wage index as applicable, as proposed without modification.

Under the transitional wage index policy that we are establishing for FY 2015, we computed a blended area wage index value for any LTCH that will experience a decrease in its area wage index value solely due to the adoption of the new OMB delineations. That is, for purposes of determining an LTCH's area wage index for FY 2015, we computed LTCH PPS wage index values using the area wage data (discussed in section V.B.4. of the Addendum to this final rule) under both the FY 2014 CBSA designations and the FY 2015 new OMB delineations based on the 2010 OMB Decennial Census data. For each LTCH, we compared these two wage indexes. If an LTCH's wage index for FY 2015 under the new OMB delineations was lower than the LTCH's wage index under the FY 2014 CBSA designations, we are establishing that, for FY 2015, the LTCH will be paid based on a blended wage index that is computed as the sum of 50 percent of each of the two wage index values described above (referred to as the 50/50 blended wage index). If an LTCH's wage index for FY 2015 under the new OMB delineations is higher than the LTCH's wage index under the FY 2014 CBSA designations, we are establishing that, for FY 2015, the LTCH will be paid based on 100 percent of the wage index under the FY 2015 new OMB delineations (and will not receive the 50/50 blended wage index).
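To make the comparison described above concrete, the following is a minimal illustrative sketch of the transitional determination. The wage index values shown are hypothetical and are not drawn from Tables 12A through 12D or from any hospital's actual data.

```python
# Minimal illustrative sketch of the FY 2015 transitional (50/50 blended) wage index
# determination described above. All wage index values are hypothetical.
def fy2015_ltch_wage_index(wi_fy2014_cbsa: float, wi_fy2015_new_omb: float) -> float:
    """Return the wage index used for payment in FY 2015 under the 1-year transition.

    If the wage index under the new OMB delineations is lower than the wage index
    under the FY 2014 CBSA designations, the LTCH receives a 50/50 blend of the two
    values; otherwise, it receives 100 percent of the new OMB delineation value.
    """
    if wi_fy2015_new_omb < wi_fy2014_cbsa:
        return 0.5 * wi_fy2014_cbsa + 0.5 * wi_fy2015_new_omb
    return wi_fy2015_new_omb


# Hypothetical examples:
print(fy2015_ltch_wage_index(0.9200, 0.8800))  # approximately 0.9000 (blend applied)
print(fy2015_ltch_wage_index(0.9200, 0.9500))  # 0.9500 (no blend; full new OMB value)
```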

Furthermore, we are establishing that the transitional area wage index policy will be adopted in a budget neutral manner. Under § 412.525(c)(2), any changes to the adjustment for differences in area wage levels will be made in a budget neutral manner such that estimated aggregate FY 2015 LTCH PPS payments are unaffected; that is, they will be neither greater than nor less than estimated aggregate LTCH PPS payments without such changes to the area wage level adjustment. Under this policy, we determine an area wage level adjustment budget neutrality factor that is applied to the standard Federal rate (under § 412.523(d)(4)) to ensure that any changes to the area wage level adjustments are budget neutral, such that any changes to the wage index values or labor-related share would not result in any change (increase or decrease) in estimated aggregate LTCH PPS payments. Because our transitional wage index policy for LTCHs that will experience a decrease in their area wage index values solely as a result of our finalized policy to adopt the new OMB delineations under the LTCH PPS will result in an increase in estimated aggregate LTCH PPS payments relative to estimated aggregate payments without the transitional policy, we are including the finalized 50/50 blended wage index values in our calculations of the area wage level adjustment budget neutrality factor that is applied to the standard Federal rate to ensure that any changes to the area wage level adjustment are budget neutral. Specifically, consistent with our established methodology, we used the following methodology to determine an area wage level adjustment budget neutrality factor for FY 2015:

  • Step 1—We simulated estimated aggregate LTCH PPS payments using the FY 2014 wage index values as established in Tables 12A and 12B for the FY 2014 IPPS/LTCH PPS final rule (which is available via the Internet on the CMS Web site) and the FY 2014 labor-related share of 62.537 percent as established in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50996).
  • Step 2—We simulated estimated aggregate LTCH PPS payments using the FY 2015 wage index values as shown in Tables 12A through 12D for this final rule (which are available via the Internet on the CMS Web site), including the transitional 50/50 blended wage index values, if applicable (as discussed above and in section V.B.4. of the Addendum of this final rule), and the FY 2015 labor-related share of 62.306 percent (as discussed in section V.B.3. of the Addendum to this final rule).
  • Step 3—We determined the ratio of these estimated total LTCH PPS payments by dividing the estimated total LTCH PPS payments using the FY 2014 area wage level adjustments (calculated in Step 1) by the estimated total LTCH PPS payments using the FY 2015 area wage level adjustments (calculated in Step 2) to determine the FY 2015 area wage level adjustment budget neutrality factor.
  • Step 4—We applied the FY 2015 area wage level adjustment budget neutrality factor from Step 3 to the FY 2015 LTCH PPS standard Federal rate after the application of the FY 2015 annual update as discussed in section V.A.2. of the Addendum to this final rule.

As explained above, we are applying this factor in determining the FY 2015 standard Federal rate to ensure that the updates to the area wage level adjustment for FY 2015 will be implemented in a budget neutral manner. For this final rule, using the steps in the methodology described above, we determined a FY 2015 area wage level adjustment budget neutrality factor of 1.0016703.
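The following minimal sketch restates the four-step methodology above in code. The aggregate payment totals and standard Federal rate shown are hypothetical placeholders used only to show the structure of the calculation; the only figure taken from this final rule is the final factor of 1.0016703.

```python
# Illustrative sketch of the area wage level adjustment budget neutrality calculation.
# Payment totals and the standard Federal rate below are hypothetical placeholders.

# Step 1: simulated aggregate LTCH PPS payments using the FY 2014 wage index values
# and the FY 2014 labor-related share of 62.537 percent (hypothetical total).
payments_fy2014_wage_adjustment = 5_500_000_000

# Step 2: simulated aggregate LTCH PPS payments using the FY 2015 wage index values
# (including the 50/50 blended values, where applicable) and the FY 2015
# labor-related share of 62.306 percent (hypothetical total).
payments_fy2015_wage_adjustment = 5_490_830_000

# Step 3: the budget neutrality factor is the ratio of the Step 1 total to the Step 2 total.
budget_neutrality_factor = payments_fy2014_wage_adjustment / payments_fy2015_wage_adjustment

# Step 4: apply the factor to the FY 2015 standard Federal rate after the annual update.
hypothetical_updated_standard_federal_rate = 41_000.00
adjusted_rate = hypothetical_updated_standard_federal_rate * budget_neutrality_factor

print(round(budget_neutrality_factor, 7))  # approximately 1.00167 with these placeholder totals
```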

We note that this transitional wage index policy under our policy to adopt the new OMB delineations for FY 2015 under the LTCH PPS is consistent with the policies adopted under the IPPS presented in sections III.B.2.e.(5) and (6) of the preamble of this final rule. As noted previously in section VII.D.2.b. of the preamble of this final rule, there are currently no LTCHs located in an “urban” county that became “rural” under the policy to adopt the new OMB delineations. Therefore, as we discussed in the FY 2015 IPPS/LTCH PPS proposed rule, we are not establishing a transitional wage index policy that is consistent with the IPPS policy presented in section III.B.2.e.(2) of the preamble of this final rule for hospitals that are currently located in an “urban” county that became “rural” under the adoption of the new OMB delineations. We also note that we are not establishing any transitional policies under the LTCH PPS that are consistent with those presented under the IPPS for hospitals with a reclassification or redesignation as discussed in section III.B.2.e.(3) of the preamble of this final rule, or for hospitals deemed urban under section 1886(d)(8)(B) of the Act as discussed in section III.B.2.e.(4) of the preamble of this final rule, as those reclassifications, redesignations, and statutory deeming provisions are not applicable to LTCHs.

E. Reinstatement and Extension of Certain Payment Rules for LTCH Services—The 25-Percent Threshold Payment Adjustment

1. Background

Section 1206(b)(1)(A) of the Pathway for SGR Reform Act of 2013 (Pub. L. 113-67), enacted on December 26, 2013, provides for the retroactive reinstatement and extension, for an additional 4 years, of the moratorium on the full implementation of the 25-percent threshold payment adjustment (hereinafter referred to as “the 25-percent policy”) under the LTCH PPS established under section 114(c) of the MMSEA, as amended by section 4302(a) of the ARRA and sections 3106(c) and 10312(a) of the Affordable Care Act. In addition, section 1206(b)(1)(B) of Pub. L. 113-67 provides for a permanent exemption from the application of the 25-percent policy for certain grandfathered co-located LTCHs.

Section 1206(b)(1)(C) of Public Law 113-67 also requires that “. . . [n]ot later than 1 year before the end of the 9-year period referred to in section 114(c)(1) of the Medicare, Medicaid, and SCHIP Extension Act of 2007 (42 U.S.C. 1395ww note), as amended by subparagraph (B) [of section 1206 of Pub. L. 113-67], the Secretary of Health and Human Services shall submit to Congress a report on the need for any further extensions (or modifications of the extensions) of the 25 percent rule described in sections 412.534 and 412.536 of title 42, Code of Federal Regulations, particularly taking into account the application of section 1886(m)(6) of the Social Security Act, as added by subsection (a)(1) [of section 1206 of Pub. L. 113-67].”

The 25-percent policy is a payment adjustment under the LTCH PPS, originally established in our regulations at 42 CFR 412.534 for LTCHs and LTCH satellite facilities and their co-located referring hospitals in the FY 2005 IPPS final rule (69 FR 49191), and at 42 CFR 412.536 for all other LTCHs and referring hospitals in the RY 2008 LTCH PPS final rule (72 FR 26870), based on analyses of Medicare discharge data that indicated that patterns of patient shifting appeared to be occurring more for provider financial advantage than for patient benefit. In order to discourage such activity, a payment adjustment was applied to LTCH discharges of patients who were admitted to the LTCH from the same referring hospital in excess of an applicable percentage threshold, which was to transition to a 25-percent threshold after specified phase-in periods. (For rural and single-urban LTCHs and those with MSA-dominant referring hospitals, a 50-percent threshold was applied.) Under this policy, discharges in excess of the threshold are paid at an “IPPS equivalent” rate, instead of the much higher LTCH PPS rate. (We refer readers to detailed discussions of the 25-percent policy for LTCH HwHs and LTCH satellite facilities in the FY 2005 IPPS final rule (69 FR 49191 through 49214) and its application to all other LTCHs in the RY 2008 LTCH PPS final rule (72 FR 26919 through 26944).)
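As a simplified illustration of how the threshold comparison described above operates, the following sketch is provided. The function, its inputs, and the default threshold value are hypothetical; the actual adjustment is applied under the detailed rules of §§ 412.534 and 412.536, including the calculation of the “IPPS equivalent” payment amount, which is not shown here.

```python
# Minimal sketch of the 25-percent threshold comparison. All names and inputs are
# hypothetical; this does not reproduce the regulatory payment calculation itself.
def rate_for_discharges_from_one_referring_hospital(discharges_from_referring_hospital: int,
                                                    total_medicare_discharges: int,
                                                    applicable_threshold: float = 0.25) -> str:
    """Indicate which rate applies to discharges admitted from a single referring
    hospital, given the applicable percentage threshold (25 percent generally;
    50 percent for rural and single-urban LTCHs and those with MSA-dominant
    referring hospitals)."""
    admitted_share = discharges_from_referring_hospital / total_medicare_discharges
    if admitted_share > applicable_threshold:
        return "discharges above the threshold are paid at an IPPS equivalent rate"
    return "all discharges are paid at the full LTCH PPS rate"


# Hypothetical example: 60 of 200 Medicare discharges (30 percent) admitted from the
# same referring hospital exceeds a 25-percent threshold.
print(rate_for_discharges_from_one_referring_hospital(60, 200))
```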

The results of the different rulemaking schedules in effect when §§ 412.534 and 412.536 were implemented (FY 2005 (October 1, 2004) and RY 2007 (July 1, 2006), respectively) are as follows: for co-located LTCHs and LTCH satellite facilities governed under § 412.534, the 25-percent policy was effective for cost reporting periods beginning on or after October 1, 2005 (“October” LTCHs); for LTCHs and LTCH satellite facilities governed under § 412.536, the 25-percent policy was effective for cost reporting periods beginning on or after July 1, 2007 (“July” LTCHs). In addition, even though grandfathered LTCH HwHs and LTCH satellite facilities are governed under § 412.534(h), they are “July” LTCHs because the 25-percent policy was applied to these facilities in the RY 2008 LTCH PPS final rule.

Section 114(c) of the MMSEA, as amended by section 4302(a) of the ARRA and sections 3106(c) and 10312(a) of the Affordable Care Act, provided for a 5-year moratorium on the full application of the 25-percent policy that expired for some LTCHs and LTCH satellite facilities for cost reporting periods beginning on or after October 1, 2012 (“October” LTCHs) and for other LTCHs and LTCH satellite facilities for cost reporting periods beginning on or after July 1, 2012 (“July” LTCHs). (For a detailed description of the moratorium on the application of the 25-percent policy, we refer readers to the May 22, 2008 Interim Final Rule with Comment Period (73 FR 29699 through 29704) and the August 27, 2009 Interim Final Rule with Comment Period for the ARRA, which was published in the FY 2010 IPPS final rule and Changes to the LTCH PPS and Rate Years 2010 and 2009 Rates final rule (74 FR 43990 through 43992).)

The expiration of the statutory moratorium for both “July” and “October” LTCHs was delayed because CMS established regulatory extensions in the FY 2013 IPPS/LTCH PPS final rule (77 FR 53483 through 53484), as amended by the FY 2013 IPPS/LTCH PPS correcting amendment (77 FR 63751 through 63753). Specifically, we established a 1-year extension (that is, for cost reporting periods beginning on or after October 1, 2012, and before October 1, 2013) on the full application of the 25-percent policy for “October” LTCHs. For those “July” LTCHs that would have been affected by the “gap” between the expiration of the statutory moratorium (for cost reporting periods beginning on or after July 1, 2012) and our prospective regulatory relief (for cost reporting periods beginning on or after October 1, 2012), we also provided for an additional moratorium based on LTCH discharges occurring on or after October 1, 2012 and ending at the start of the LTCHs' next cost reporting period. For those “July” LTCHs with cost reporting periods beginning on or after October 1, 2012, the regulatory extension of the statutory moratorium, described above, effective for the hospital's first cost reporting period beginning on or after October 1, 2012, resulted in seamless coverage for that group. However, for those “July” LTCHs with cost reporting periods beginning on or after July 1, 2012, and before October 1, 2012, that would have otherwise been subject to the “gap” between the expiration of the statutory moratorium and the effective date of the regulatory moratoria, we established a second regulatory moratorium effective with discharges occurring beginning October 1, 2012, through the end of the LTCH's cost reporting period (that is, the end of the cost reporting period that began on or after July 1, 2012, and before October 1, 2012). Therefore, by providing for the above described regulatory extension for “July” LTCHs, we eliminated the distinction between “July” and “October” LTCHs, which resulted in the 25-percent policy being applied for all cost reporting periods beginning on or after October 1, 2012, following the expiration of the moratorium. For more details about these moratoria, we refer readers to the FY 2013 IPPS/LTCH PPS final rule (77 FR 53483 through 53484).

Because we did not extend the regulatory moratorium on the 25-percent policy in the FY 2014 IPPS/LTCH PPS final rule, the full application of the payment adjustment policy was effective for all LTCHs (both “October” and “July” LTCHs) for cost reporting periods beginning on or after October 1, 2013 (78 FR 50772).

2. Implementation of Section 1206(b)(1) of Pub. L. 113-67

As stated earlier, section 1206(b)(1)(A) of Public Law 113-67 provides an additional amendment to section 114(c) of the MMSEA, as amended by section 4302(a) of the ARRA and sections 3106(c) and 10312(a) of the Affordable Care Act, that extends the “original” statutory moratorium on the full implementation of the 25-percent policy to a total of 9 years from the original effective dates established by the MMSEA (July 1 or October 1, 2007, as applicable). As a result, the lapse of the regulatory moratorium on the full implementation of the 25-percent policy is moot. This “seamless” statutory moratorium provides relief until cost reporting periods beginning on or after July 1, or October 1, 2016, as applicable. Section 1206(b)(1)(B) provides a permanent exemption from the 25-percent policy for certain grandfathered co-located LTCHs. In this final rule, based on the statutory changes made by sections 1206(b)(1)(A) and (b)(1)(B) of Public Law 113-67, we are making conforming amendments to the regulations governing application of the 25-percent policy. Specifically, we are revising §§ 412.534(c)(1)(i) and (c)(1)(ii), (c)(2), (c)(3), (d)(1) and (d)(1)(i), (d)(2), (d)(3), (e)(1) and (e)(1)(i), (e)(2), (e)(3), the introductory text of paragraph (h), (h)(4), and (h)(5) and removing paragraph (h)(6); and removing paragraphs (a)(1)(iii) and (a)(2)(ii), revising (a)(2), and removing paragraph (a)(3) of § 412.536 to reflect the statutory changes.

Comment: One commenter suggested that any additional costs to the Medicare program resulting from the moratorium on full implementation of the 25-percent policy be absorbed by the hospitals that receive the benefit from the extension of this moratorium. The commenter questioned whether this absorption of costs could be accomplished by a hospital-specific adjustment similar to the one presently used for failure to submit quality data, and whether the suggested adjustment amount could be calculated based on a facility's compliance with the 25-percent policy.

Response: We appreciate the commenter's suggestion, but note that these suggestions are beyond the scope of the proposals presented in the FY 2015 IPPS/LTCH PPS proposed rule. We believe that Congress specified how we are to implement this policy when it instructed the Secretary to extend the relief provided by section 114(c) of the MMSEA of 2007, and its amendments, until the moratorium expires, or in the case of certain grandfathered LTCHs, indefinitely. Neither the provisions of section 114(c) of the MMSEA of 2007 nor its amendments include any measures to absorb any Medicare program costs associated with the moratorium on the full application of this policy. We do not believe that further regulatory initiatives are appropriate at this time.

Comment: Several commenters urged CMS to repeal the 25-percent policy immediately. Some commenters reasoned that “Congress has not required the partial implementation of the 25 percent rule, but rather has prohibited the full implementation of the 25 percent rule.” Other commenters believed that applying the 25-percent policy after patient-level criteria are implemented would “violate” the provisions in Public Law 113-67 that require use of patient-level criteria to determine which cases receive standard or site neutral Medicare payments. Some commenters also believed that the 25-percent policy is unnecessary recognizing the forthcoming changes to the LTCH PPS, and stated that the 25-percent policy would reduce the payment distinctions between the number of cases receiving payments based on standard payment rates and the number of cases receiving payments based on site-neutral payment rates, thereby “weakening” the incentives that the commenters believed Congress intended to impose under the statute.

Response: Although we initially implemented the 25-percent policy under §§ 412.534 and 412.536 of the regulations through our general rulemaking authority, the 25-percent policy is now mandated under section 114(c) of the MMSEA, as amended. This statutory moratorium currently expires effective with cost reporting periods beginning on July 1, 2016, or October 1, 2016, as applicable. Therefore, CMS does not have the authority to “repeal” a statutory provision. As discussed in the May 22, 2008 interim final rule with comment period, and as we further discussed in the FY 2010 IPPS/LTCH final rule (74 FR 43980 through 43986), we believe that section 114(c)(1) of the MMSEA provided a 3-year delay in the application of §§ 412.534 and 412.536 to “only two categories of LTCHs . . . [s]imilarly, the 3-year relief . . . in section 114(c)(2) in the form of increased thresholds . . . was narrowly targeted to only those `applicable LTCHs and LTCH satellite facilities,' that is, those `subject to the transition rules under § 412.534(g) of title 42 Code of Federal Regulations' ” (74 FR 43982). In fact, with the enactment of the extension of the original moratorium under section 1206(b)(1)(A) of Public Law 113-67, and the extended relief it provides from the 25-percent policy, Congress added only one specific change to the provisions of the original moratorium, that is, the permanent exemption of grandfathered LTCHs from the 25-percent policy under section 1206(b)(1)(B). We also note that there is an additional provision of the statute that addresses the continued viability of the 25-percent policy, at least until the initial implementation of the new payment framework under the LTCH PPS. Specifically, section 1206(b)(1)(C) requires CMS to submit a report to Congress “[n]ot later than 1 year before the end of the 9-year period referred to in section 114(c) of the Medicare, Medicaid, and SCHIP Extension Act of 2007 . . . on the need for any further extensions (or modifications of the extensions) of the 25 percent rule . . . particularly taking into account the application of section 1886(m)(6) of the Social Security Act as added by subsection (a)(1).” In response to the commenters' expressed concerns relating to an “overlapping” of the full implementation of the 25-percent policy and the new payment framework specified under section 1206(a) of Public Law 113-67, we assure the commenters that any such interplay will be fully considered during the development of the required July 2015 Report to Congress. This date is at least a full year prior to the expiration of the current moratorium. Furthermore, as the statutory payment methodology revisions to the LTCH PPS will be phased in under a “blended” payment methodology effective with LTCH cost reporting periods beginning during FY 2016, there still may be a need for the 25-percent policy during that phase-in period, although our study may conclude that this policy is not required after full implementation of the new statutory payment methodology under the LTCH PPS.

F. Discussion of the “Greater Than 3-Day Interruption of Stay” Policy and the Transfer to Onsite Providers Policies Under the LTCH PPS

The interrupted stay policy is a payment adjustment that has been included under the LTCH PPS from its inception; that is, for cost reporting periods beginning on or after October 1, 2002 (FY 2003). In this discussion, we use the terms “interrupted stay” and “interruption of stay” interchangeably. An “interruption of stay” occurs when, during the course of an LTCH hospitalization, a patient is discharged to an inpatient acute care hospital, an IRF, or a SNF for treatment or services not available at the LTCH for a specified period, followed by readmission to the same LTCH. We refer readers to the RY 2003 LTCH PPS final rule (67 FR 56002). When we established this policy, we believed that, if the length of stay at the intervening facility was within a specified number of days, the readmission to the LTCH represented a continuation of the initial treatment, a stay in which an “interruption” occurred, rather than a new admission. If an “interruption of stay” occurred, payment for both “halves” of the LTCH discharge was then “bundled,” and Medicare would make one payment based on the second date of discharge. Specifically, under this policy, we established a fixed-day threshold, which applied to the specified number of days a Medicare beneficiary spends as an inpatient at an acute care hospital, an IRF, or a SNF. In the RY 2003 LTCH PPS final rule, we explained that we were implementing this policy because we wanted “. . . to reduce the incentives inherent in a discharged-based prospective payment system of `shifting' patients between Medicare-covered sites of care in order to maximize Medicare payments. This policy is particularly appropriate for LTCHs because, as a group, these hospitals differ considerably in the range of services offered such that where some LTCHs may be able to handle certain acute conditions, others will need to transfer their patients to acute care hospitals.

“For instance, some LTCHs are equipped with operating rooms and intensive care units and are capable of performing minor surgeries. However, other LTCHs are unable to provide those services and will need to transfer the beneficiary to an acute care hospital.” We believed that our policy also provided for a patient “who no longer requires hospital-level care, but is not ready to return to the community,” and who “. . . could be transferred to a SNF.” (We refer readers to the RY 2003 LTCH PPS final rule (67 FR 56002).)

In the regulations under 42 CFR 412.531, we defined two types of interruptions of stays. Under § 412.531(a)(1), “[a] 3-day or less interruption of stay” means a stay at a LTCH during which a Medicare inpatient is discharged from the LTCH to an acute care hospital, IRF, SNF, or the patient's home and readmitted to the same LTCH within 3 days of the discharge from the LTCH. Under the “3-day or less interruption of stay” policy, the fixed-day threshold period begins with the calendar date of discharge from the LTCH and ends not later than midnight of the third day. If an LTCH patient's “interruption” exceeds this threshold, payment is governed by the “greater than 3-day interruption of stay” policy. (We refer readers to the RY 2005 LTCH PPS final rule (69 FR 25690 through 25700), the RY 2006 LTCH PPS final rule (70 FR 24206), and the RY 2007 LTCH PPS final rule (71 FR 27872 through 27875) for detailed discussions of the 3-day or less interruption of stay policy.) In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28196), we did not propose to revise the 3-day or less category of interrupted stays, but we mention the policy here to clarify the distinction between the 3-day or less interruption of stay policy and the greater than 3-day interruption of stay policy that we proposed to revise in our proposed rule.

The “greater than 3-day interruption of stay” policy is defined under § 412.531(a)(2) as a stay during which a Medicare inpatient is transferred upon discharge to an acute care hospital, an IRF, or a SNF for treatment or services that are not available in the long-term care hospital and returns to the same long-term care hospital within the applicable fixed-day period specified in regulations under § 412.531(a)(2)(i) through (a)(2)(iii). For a discharge to an acute care hospital, the applicable fixed-day period is between 4 and 9 consecutive days; the counting of the days begins on the calendar day of discharge from the LTCH and ends on the 9th day when the patient is readmitted to the LTCH. For a discharge to an IRF, the applicable fixed-day period is between 4 and 27 consecutive days; the counting of the days begins on the calendar day of discharge from the LTCH and ends on the 27th day. For a discharge to a SNF, the applicable fixed-day period is between 4 and 45 consecutive days; the counting of the days begins on the calendar day of discharge from the LTCH and ends on the 45th day. We refer readers to our proposed rule for a more detailed description of the derivation of our day thresholds (79 FR 28196).
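The following minimal sketch summarizes the fixed-day thresholds described above. The function and variable names are hypothetical, and the sketch intentionally omits details of § 412.531 such as the precise counting conventions and the treatment of discharges to the patient's home under the 3-day or less policy.

```python
# Illustrative sketch of the interrupted stay day thresholds under the LTCH PPS.
# Destination keys and function names are hypothetical labels, not regulatory terms.
GREATER_THAN_3_DAY_LIMITS = {"acute care hospital": 9, "IRF": 27, "SNF": 45}

def classify_readmission(days_away: int, destination: str) -> str:
    """Classify a readmission to the same LTCH by the length of the interruption."""
    if days_away <= 3:
        return "3-day or less interruption of stay (single LTCH PPS payment)"
    limit = GREATER_THAN_3_DAY_LIMITS.get(destination)
    if limit is not None and days_away <= limit:
        return "greater than 3-day interruption of stay (single MS-LTC-DRG payment)"
    return "treated as a new LTCH admission (separate payment)"


# Hypothetical examples:
print(classify_readmission(2, "acute care hospital"))   # 3-day or less interruption
print(classify_readmission(7, "acute care hospital"))   # greater than 3-day interruption
print(classify_readmission(30, "SNF"))                  # greater than 3-day interruption
print(classify_readmission(12, "acute care hospital"))  # new admission (exceeds 9 days)
```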

Under the greater than 3-day interrupted stay policy, if an LTCH readmission occurs within the fixed-day period both halves of the LTCH discharge are treated as a single discharge for the purposes of payment under the LTCH PPS. In such instances, the beneficiary's readmittance to the LTCH is paid for with a single MS-LTC-DRG payment that covers the initial admission to the LTCH and the subsequent readmission. That is, a single Medicare payment is made for the entire two-part discharge. Payment to the acute care hospital, the IRF, or the SNF is then made in accordance with the applicable payment policies for those providers when the interruption of stay exceeds 3 days. Therefore, we balanced the payment incentives of both the LTCH and the acute care hospital, the IRF, or the SNF to which the LTCH patient might be discharged before being readmitted to the LTCH.

As we discussed in the RY 2003 LTCH PPS final rule (67 FR 56007), our concerns about patient shifting were significantly increased in the context of transfers between co-located LTCHs and LTCH satellite facilities, or for LTCH hospital-within-hospital transfers. Collectively, we refer to these arrangements as transfers to “onsite” providers. In the regulations under § 412.532(b), we define a facility that is “co-located” or “onsite” as a hospital, satellite facility, unit, or SNF that occupies space in a building also used by another hospital or unit or in one or more buildings on the same campus, as defined in § 413.65(a)(2), as buildings used by another hospital or unit. Under this LTCH PPS policy, if more than 5 percent of the Medicare patients discharged from an LTCH during a cost reporting period were discharged to an “onsite” SNF, IRF, or psychiatric facility, or to an “onsite” acute care hospital, and directly readmitted to the same LTCH, the LTCH would be paid one MS-LTC-DRG payment to cover both LTCH discharges, regardless of the length of the interrupted stay. As is the case in regard to the greater than 3-day interruption of stay policy, payment to an acute care hospital, an IRF, or a SNF would not be affected under the 5-percent policy.
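As a simplified illustration of the 5-percent test described above, the following sketch uses hypothetical discharge counts; it does not reflect the cost report settlement process through which the adjustment was applied, and the variable names are illustrative only.

```python
# Minimal sketch of the 5-percent test under 42 CFR 412.532 (hypothetical counts).
onsite_discharges_with_direct_readmission = 12   # discharges to an onsite provider followed by direct readmission
total_medicare_discharges = 200                  # total Medicare discharges in the cost reporting period

exceeds_threshold = onsite_discharges_with_direct_readmission / total_medicare_discharges > 0.05
print(exceeds_threshold)  # True: 6 percent exceeds the 5-percent threshold, so each such
                          # two-part stay would be paid as a single MS-LTC-DRG payment
```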

Our concern about patient shifting among “onsite” providers did not originate with the implementation of the LTCH PPS. The LTCH 5-percent policy under § 412.532 was recodified from an earlier regulation under § 413.40(a)(3), which applied a payment adjustment to hospitals paid under the TEFRA payment system, including LTCHs, to address inappropriate discharges of patients to a host hospital paid under the IPPS from an excluded hospital-within-a-hospital (such as a LTCH) that culminated in a readmission to the hospital-within-a-hospital. We refer readers to the FY 2000 IPPS final rule, the RY 2003 LTCH PPS final rule, and the FY 2015 IPPS/LTCH PPS proposed rule for a detailed description of the 5-percent policy, its initial application under the TEFRA payment system, and our policy concerns (64 FR 41535, 67 FR 56007 through 56014, and 79 FR 28196 through 28197).

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28196), we proposed to revise our policies on interrupted stays. Specifically, we proposed to modify the fixed-day thresholds under the greater than 3-day interruption of stay policy to provide for a 30-day fixed threshold as an “acceptable standard” for determining a linkage between an index discharge and a readmission from an inpatient facility as specified under this policy (that is, an IPPS hospital, an IRF, or a SNF), consistent with the intervals presently used in two recently implemented Medicare initiatives: the Hospital Readmissions Reduction Program and the Hospital Inpatient Quality Reporting Program. (We refer readers to our proposed rule for a description of these two policies (79 FR 28197).) We also proposed to remove our regulation at § 412.532, Special payment provisions for patients who are transferred to onsite providers and readmitted to a long-term care hospital, stating that, as an “after the fact” payment adjustment (that is, one applied following cost report settlement), we believed that this policy had a limited impact on provider behavior, and that our proposed changes to the interrupted stay policy would make it unnecessary.

Comment: Commenters objected to the CMS proposal to modify the fixed-day threshold for the greater than 3-day interrupted stay policy. The commenters provided many reasons for their objections to the proposal, including that:

  • CMS should first implement the new statutory framework under Public Law 113-67 that applies patient-level criteria to payments under the LTCH PPS in FY 2016 and then “. . . assess whether any problems related to the interrupted stay policy exist under the transformed payment system.”
  • CMS used an “inappropriate analogy” in its proposal to change the fixed-day threshold under the greater than 3-day interrupted stay policy to 30 days because the referenced thresholds for the Hospital IQR Program and the Hospital Readmissions Reduction Program are used under the IPPS, but not under the LTCH PPS. Therefore, the commenters believed that 30 days is an inappropriate benchmark for the LTCH PPS policy. The commenters further noted that the LTCH PPS greater than 3-day interrupted stay policy applies a payment adjustment when an LTCH discharges a patient for access to clinical services not available at the LTCH and the patient is readmitted to the LTCH within the fixed-day threshold. In contrast, under the Hospital Readmissions Reduction Program, a payment reduction is applied to the hospital's payment if the patient returns to the hospital for care within the fixed-day threshold, and it was not expected that the patient would return to the hospital for continuation of care in relation to the most recent discharge. The commenters specifically stated that “the interrupted stay thresholds are intended . . . to define a point at which the care required for a current episode of illness changes significantly enough to warrant `re-setting the clock' to an entirely separate episode of care for the subsequent readmission . . .” to the LTCH. They added that “The 30-day readmission threshold, on the other hand, can be likened to a 30-day warranty period during which a readmission could indicate suboptimal quality of care during the initial admission.” The commenters believed that comparing the interrupted stay policy to the readmissions initiatives would result in “crucial incongruence” because the two policies address fundamentally different clinical care scenarios. Furthermore, the commenters stated that a clinical threshold is not the same as a quality initiative. Some commenters stated that CMS had not demonstrated that an LTCH stay interrupted by 30 days at an IPPS hospital followed by a readmission to the LTCH constitutes a single episode of care or hospital stay. Several commenters asserted that “the agency's previous research contradicts this premise upon which the proposed policy change is based.”
  • The proposal did not include an adequate discussion of CMS' rationale for the proposed change. The commenters asserted that the publicly available data sets did not provide adequate information for stakeholders to study the potential impact on hospitals of this proposed policy. The commenters noted that the inclusion of such material in the FY 2012 proposal enabled LTCH stakeholders to meaningfully comment in response to the proposals. Furthermore, the commenters believed that, in combination with the policy changes that will be implemented in FY 2016, LTCHs would be subject to significant financial and operational upheaval if this new policy were implemented as well.
  • CMS did not offer evidence to indicate that LTCHs have been overpaid under the current policy or adequate data detailing the impact this proposed policy would have on LTCHs. The commenters suggested a more detailed impact analysis for this policy, including whether patient access to care would be harmed.
  • Given that the potential impact imposed upon LTCHs based on the proposal to change the fixed-day threshold from 9 to 30 days for an intervening IPPS stay is so “drastic,” if CMS were to finalize such a policy, there should be a 3-year transition period from the current policy, and CMS should change the MS-LTC-DRG relative payment weights to account for the resulting changes in LTCH treatment costs and Medicare payments.

Response: We appreciate the commenters' responses. After careful consideration of the public comments we received, we agree with the commenters who indicated that, in light of the forthcoming modifications to the LTCH PPS, a major revision to the existing greater than 3-day interrupted stay policy may be premature at this time. We will take the other comments we received into consideration in preparation for any potential future rulemaking on this issue.

Despite our decision to not finalize our proposal to change the fixed-day threshold under the LTCH PPS greater than 3-day interrupted stay policy, our goal continues to be to help ensure that readmission decisions are made on a clinical basis and not based on payment considerations. During the past several years, the Office of the Inspector General (OIG) has been evaluating the effects of the interrupted stay policies for LTCHs, primarily focusing on readmissions from acute care hospitals. In the OIG's March 28, 2014 report, “Vulnerabilities in Medicare's Interrupted-Stay Policy” (OEI-04-12-00490), the OIG stated that “we identified several vulnerabilities in the LTCH interrupted-stay policy, including inappropriate payments (and) financial incentives to delay readmissions. . . .” The report further stated that “. . . 59 LTCHs had a high number of readmissions after the fixed-day period. . . .” (We refer readers to the Executive Summary of the OIG's March 28, 2014 report for further details.) The report also noted that “[f]orty-five of the 59 LTCHs were part of a chain, and 23 of these LTCHs were part of the same chain . . . For 50 of these 59 LTCHs, the number of returns doubled immediately after the fixed-day period.” (We refer readers to page 17 of the OIG's March 28, 2014 report for further details.) The OIG recommended, among other things, that CMS take appropriate action regarding LTCHs with a high number of readmissions immediately after the fixed-day period and LTCHs with a high number of readmissions following multiple short intervening facility stays.

In our response to the OIG's report, CMS agreed that LTCH readmission decisions should be based on the patient's clinical needs and not the hospital's financial benefit. We stated that if we find evidence that an individual hospital or chain is making readmission decisions based on financial considerations rather than the patient's clinical needs, we would take the appropriate action in those cases to rectify the inconsistencies in adhering to the current policy. In addition, as noted earlier, we will consider potential changes to the greater than 3-day interrupted stay policy as we gain experience under the new framework for the LTCH PPS.

Comment: Commenters supported the proposal to remove the regulation at § 412.532 (Special payment provisions for patients who are transferred to onsite providers and readmitted to a long-term care hospital), noting that the existing greater than 3-day interrupted stay policy addresses many of CMS' concerns about patient shifting. Commenters also indicated that they believed that the patient-level criteria that we will be implementing for FY 2016 will result in changes to LTCH PPS that further reduce the need for this policy.

Response: We appreciate the commenters' support. After consideration of the public comments we received, we are finalizing our proposal to remove the regulatory requirements under § 412.532 because we believe that this policy has had a limited impact on provider behavior.

In summary, in this final rule, we are not finalizing our proposal to change the fixed-day threshold under the greater than 3-day interrupted stay policy under §§ 412.531(a)(2) and (b)(4) of the regulations. However, we are finalizing the proposal to remove § 412.532 in its entirety and to make a conforming change to § 412.525 by removing and reserving paragraph (d)(3), which references payments under § 412.532.

G. Moratoria on the Establishment of LTCHs and LTCH Satellite Facilities and on the Increase in the Number of Beds in Existing LTCHs or LTCH Satellite Facilities

As previously noted, Public Law 113-67 was enacted on December 26, 2013. Section 1206(b)(2) of Public Law 113-67 amended section 114(d) of the MMSEA of 2007, as previously amended by section 4302 of the American Recovery and Reinvestment Act (ARRA) of 2009 (Pub. L. 111-5) and sections 3106(b) and 10312(b) of the Affordable Care Act (Pub. L. 111-148). As further amended by section 112(b) of the Protecting Access to Medicare Act of 2014 (Pub. L. 113-93), section 114(d) of the MMSEA includes “new” statutory moratoria on the establishment of new LTCHs and LTCH satellite facilities, and on the increase in the number of hospital beds in existing LTCHs and LTCH satellite facilities, “for the period beginning April 1, 2014 and ending September 30, 2017,” which mirror nearly identical provisions of the “expired” moratoria under section 114(d)(1) of the MMSEA, as amended by section 4302 of the ARRA and sections 3106(b) and 10312(b) of the Affordable Care Act. These prior, yet nearly identical, provisions of section 114(d)(1) of the MMSEA, as amended by the ARRA and the Affordable Care Act, expired on December 28, 2012. For clarity and brevity, we will refer to the “expired” moratoria or moratorium to reference those that expired on December 28, 2012, and the “new” moratoria or moratorium to reference those that began on April 1, 2014, as applicable, throughout this discussion.

The primary difference between the “expired” moratoria and the “new” moratoria is that, while the “expired” moratoria provided for specific exceptions to both the moratorium on the establishment of new LTCHs and LTCH satellite facilities and the moratorium on increases in the number of beds in existing LTCHs and LTCH satellite facilities, the “new” moratoria provide an exception only to the moratorium on the establishment of new LTCHs and LTCH satellite facilities. No exceptions are provided under the “new” moratorium on increases in the number of hospital beds in existing LTCHs and LTCH satellite facilities. (For a detailed description of the “expired” moratoria provisions (including the applicable exceptions) that were in effect from December 29, 2007 through December 28, 2012, we refer readers to the May 22, 2008 Interim Final Rule with Comment Period (73 FR 29705 through 29708).)

In light of the expiration date of the “expired” moratoria on December 28, 2012, and the effective date of the “new” moratoria on April 1, 2014, there has been a period of time in which new LTCHs and LTCH satellite facilities have been allowed to be established, and during which time there may have been increases in the number of hospital beds in LTCHs and LTCH satellite facilities. In accordance with section 114(d)(1) of the MMSEA, as amended by section 112(b) of Public Law 113-93, for the period beginning April 1, 2014 through September 30, 2017, CMS will be unable to designate any hospital as an LTCH or recognize a new LTCH satellite facility as such, unless one of the exceptions (described below) is met.

Additionally, as of April 1, 2014, in accordance with sections 114(d)(6) and (d)(7) of the MMSEA, as amended by section 112(b) of Public Law 113-93, an existing LTCH may not increase the number of its hospital beds. This moratorium will extend through September 30, 2017, and is not subject to any exceptions.

To qualify for an exception under the “new” moratorium to establish a new LTCH or LTCH satellite facility during the timeframe between April 1, 2014, and September 30, 2017, a hospital or entity must meet the following criteria:

  • The hospital or entity must have begun its qualifying period for payment as an LTCH under 42 CFR 412.23(e) before April 1, 2014.
  • The hospital or entity must have a binding written agreement with an outside, unrelated party for the actual construction, renovation, lease, or demolition for an LTCH, and must have expended, before April 1, 2014, at least 10 percent of the estimated cost of the project or, if less, $2,500,000.
  • The hospital or entity must have obtained, before April 1, 2014, an approved certificate of need in a State where one is required.

While this exception only applies to the “new” moratorium on the establishment of new LTCHs and LTCH satellite facilities under section 114(d)(7) of the MMSEA, as amended by section 112(b) of Public Law 113-93, the mechanics of the exception are analogous to those established under the “expired” moratorium, which ended in 2012. The “expired” moratoria were implemented in a May 22, 2008 Interim Final Rule with Comment Period (73 FR 29704 through 29707). As discussed in that rule, some of the terminology in the statutory provision was internally inconsistent. A strictly literal reading of the statutory language under section 114(d)(2) of the MMSEA, as amended by section 4302 of the ARRA and sections 3106(b) and 10312(b) of the Affordable Care Act, presented practical challenges for implementation in light of the established LTCH classification criteria under § 412.23(e) of the regulations. Therefore, we adopted interpretations that we believed would reasonably reconcile seemingly inconsistent provisions and that would result in a logical and workable mandate. Specifically, as drafted, the exception only applies to a hospital or entity when it is already classified as an “LTCH.” Such entities will not need an exception to the moratorium on becoming an “LTCH” because they will already be an LTCH. As such, we are interpreting this provision under the new exception as we interpreted the exceptions to the “expired” moratorium. We discuss our interpretations below.

At the outset of this discussion, we want to clarify which provisions of section 114(d) of the MMSEA, as amended, were subject to the “expired” moratoria, as distinguished from those which are subject to the “new” moratoria. Sections 114(d)(2) and 114(d)(3) of the MMSEA, as amended, only address exceptions under the “expired” moratoria. Section 114(d)(6) of the MMSEA, as amended, defines when the exceptions addressed in sections 114(d)(2) and 114(d)(3) expired. Section 114(d)(7) of the MMSEA addresses the exception under the “new” moratorium on the establishment of new LTCHs and LTCH satellite facilities. There are no exceptions to the “new” moratorium on the increases in the number of beds in existing LTCHs and LTCH satellite facilities, as noted above.

Section 114(d)(7)(A) of the MMSEA, as amended, mirrors the expired provisions of section 114(d)(2)(A). Both provisions refer to an LTCH that began its qualifying period for payment as a “long-term care hospital” on or before a given date. However, a hospital would not be classified as an LTCH during that qualifying period; the facility or entity would typically be classified as an IPPS hospital. For a full discussion of our rationale for interpreting section 114(d)(2)(A) of the MMSEA to refer to an IPPS hospital meeting the stated requirements, we refer readers to our May 22, 2008 Interim Final Rule with Comment Period (73 FR 29704 through 29707) regarding the implementation of the “expired” moratorium. In this final rule, we are applying the same rationale in regard to the interpretation of section 114(d)(7)(A), that is, we are interpreting the provision to refer to an acute care hospital meeting the stated requirements as the hospital or entity seeking classification as an LTCH. As we did when interpreting the same language under the “expired” moratorium exception under section 114(d)(2)(A) of the MMSEA, as amended by section 4302 of the ARRA and sections 3106(b) and 10312(b) of the Affordable Care Act, we note that the exception under section 114(d)(7)(A) of the MMSEA cannot provide any relief to LTCH satellite facilities because there is no “qualifying period” for the establishment of an LTCH satellite facility for payment as an LTCH under § 412.23(e). Therefore, an LTCH satellite facility cannot meet the stated requirements for an exception under section 114(d)(7)(A) of the MMSEA.

Section 114(d)(7)(B) of the MMSEA specifies the conditions for an exception to the moratorium on the establishment of new LTCHs and LTCH satellite facilities for a hospital or entity that: (1) has a binding written agreement with an outside, unrelated party for the actual construction, renovation, lease, or demolition for an LTCH; and (2) has expended, before the date of enactment of Public Law 113-93 (April 1, 2014), “at least 10 percent of the estimated cost of the project (or, if less, $2,500,000).” As drafted, this provision is also problematic. In cases in which a hospital has not yet been built, but there is a binding written agreement for the actual construction of a hospital that is intended to be classified as an LTCH, the entity hiring those who would complete the construction would not yet be classified as an LTCH. Prior to the designation or classification of a hospital or an entity as an LTCH, a hospital must first be established and certified and must then complete the procedures specified under § 412.23(e) in order to qualify as an LTCH, at which point the hospital would be reclassified as an LTCH.

In accordance with our interpretation of section 114(d)(2)(B) of the MMSEA, as amended by section 4302 of the ARRA and sections 3106(b) and 10312(b) of the Affordable Care Act, we are interpreting the contracting and expenditure provisions under section 114(d)(7)(B) of the MMSEA, as added by section 112(b) of Public Law 113-93, to apply to the hospital/entity requesting an exception to the moratorium on the establishment of new LTCHs and LTCH satellite facilities between April 1, 2014, and September 30, 2017—the entity that would be classified as an LTCH if it meets the stated requirements. That entity must have a binding written agreement with an outside unrelated party for the actual construction, renovation, lease, or demolition for converting the hospital to an LTCH, and it must have expended at least 10 percent of the estimated cost of the project (or, if less, $2,500,000) by the date of enactment of Public Law 113-93—April 1, 2014.

Furthermore, with regard to the first prong, as when we implemented the “expired” moratoria, we continue to believe that the use of the term “actual” in the context of the “actual construction, renovation, lease, or demolition” indicates that the provision focuses only on the specific actions cited in the statute, and does not include those actions that are being contemplated or are not yet substantially underway. Although we are aware that a hospital or some other type of entity may enter into binding written agreements regarding services and items (for example, feasibility studies or land purchase) and incur costs for those services and items prior to actual construction, renovation, lease or demolition, we believe that those services or items are not included in what we are permitted to consider under the statutory language of the exception requirements.

With respect to the second prong, the statute specifies that the hospital or entity must have “expended” at least 10 percent of the estimated cost of the project (or, if less, $2,500,000) by April 1, 2014. As we did in regard to the interpretation of section 114(d)(2)(B) of the MMSEA, as amended by section 4302 of the ARRA and section 3106(b) and 10312(b) of the Affordable Care Act, we are interpreting the phrase “cost of the project” to mean the activities enumerated in the first prong: “the actual construction, renovation, lease, or demolition for a long-term care hospital.” That is, the statute requires the hospital or entity to have spent the amount specified in the statute on the actual construction, renovation, lease, or demolition for the contemplated LTCH. Furthermore, as we did previously in regard to the interpretation of section 114(d)(2)(B) of the MMSEA, as amended by section 4302 of the ARRA and sections 3106(b) and 10312(b) of the Affordable Care Act, because the statute uses the phrase “has expended” (that is, a past tense phrase), we are limiting funds counting toward the 10 percent or $2,500,000 minimum to those funds that have actually been transferred as payment for the stated aspects of the project prior to April 1, 2014, as opposed to merely obligating capital and posting the cost of the project on its books. We believe that the provision addressed the concept of “obligate” in the first prong of the test where the statute specifies “a binding written agreement . . . for the actual construction, renovation, lease, or demolition of the long-term care hospital. . . .” and there is no reason to believe that the second prong of the test, which requires the “expenditure” of 10 percent of the project or, if less, $2,500,000, was intended as a redundancy. The ability to post the expense on the hospital's or entity's books could be satisfied by merely having a binding written agreement under the first prong of section 114(d)(7)(B) of the MMSEA. The fact that a second requirement is included that involves an expenditure indicates that an additional threshold must be met.
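
For illustration only, the following is a minimal sketch of the two thresholds described above for the expenditure prong of section 114(d)(7)(B); the function name and the dollar figures in the example are hypothetical, and any actual determination is made by the applicant's MAC and the CMS Regional Office on a case-by-case basis.

```python
# Illustrative sketch only; names and inputs are hypothetical.
# The second prong of section 114(d)(7)(B): funds actually expended before
# April 1, 2014, on the actual construction, renovation, lease, or demolition
# must be at least 10 percent of the estimated project cost or, if less, $2,500,000.

def meets_expenditure_prong(estimated_project_cost: float,
                            expended_before_apr_1_2014: float) -> bool:
    """Check the expenditure prong under the interpretation described above.

    Only amounts actually transferred as payment (not merely obligated or
    booked) for the four enumerated activities count toward the total.
    """
    required = min(0.10 * estimated_project_cost, 2_500_000)
    return expended_before_apr_1_2014 >= required

# Example: a $30 million construction project would require $2,500,000
# (the lesser of $3,000,000 and $2,500,000) to have been expended.
print(meets_expenditure_prong(30_000_000, 2_600_000))  # True
print(meets_expenditure_prong(30_000_000, 2_400_000))  # False
```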

Finally, section 114(d)(7)(C) of the MMSEA includes an exception to the moratorium if an LTCH, prior to April 1, 2014, has “obtained an approved certificate of need in a State where one is required”. As discussed above, we are applying this exception requirement to the entity that is requesting approval for an exception to the moratorium on the establishment of new LTCHs and LTCH satellite facilities between April 1, 2014, and September 30, 2017—the entity that would be classified as an “LTCH” if the stated requirements are met.

However, with that said, we are clarifying what kind of certificate of need we will accept under the provisions of section 114(d)(7) of the MMSEA. We believe that the certificate of need exception applies to a “hospital” or entity that was actively engaged in developing an LTCH, as evidenced by the fact that either an entity that wanted to create an LTCH but did not exist as a hospital prior to April 1, 2014, had obtained a certificate of need for an LTCH by the date of enactment, or an existing hospital had obtained a certificate of need to convert the hospital into a new LTCH by that date. We are not applying this exception requirement to a hospital that was already in existence prior to the date of enactment of Public Law 113-93 and that had previously obtained an approved certificate of need for a hospital (other than an LTCH) prior to April 1, 2014. We believe that Congress intended the exception to the moratorium to save those entities that were already actively engaged in becoming an LTCH. The fact that a hospital may have had a certificate of need issued to it years before April 1, 2014, to operate a hospital (other than an LTCH) is not indicative of such active engagement, and, we believe, is outside of what is contemplated in these LTCH-specific statutory provisions. We are only applying this exception requirement where the certificate of need is specifically for an LTCH. Because the certificate of need process is controlled at the State level, in determining whether the hospital or entity has obtained an approved certificate of need prior to April 1, 2014, we will consult the applicable State on a case-by-case basis for that determination.

Decisions regarding the application of these moratoria and the exceptions provided within the provisions of section 114(d) of the MMSEA will be handled on a case-by-case basis by the applicant's MAC and the CMS Regional Office. Final instructions on implementing the moratoria will be posted following the publication of this final rule.

In accordance with these policies, in this final rule we also are revising our regulations under § 412.23(e)(6) and (e)(7) to include a description of the “new” moratoria, which is in effect from April 1, 2014, through September 30, 2017, on the establishment of new LTCHs and LTCH satellite facilities (with specific exceptions), and on increasing the number of beds in existing LTCHs and existing LTCH satellite facilities.

Comment: Several commenters urged CMS to establish a regulatory exception to the statutory moratorium on the increase in the number of beds in existing LTCHs and LTCH satellite facilities. The commenters pointed out that, unlike the “expired” moratoria, the “new” moratoria under section 1206(b)(2) of Public Law 113-67 do not provide for such exceptions. The commenters further stated that when the statute was further amended by section 112(b) of the PAMA of 2014, Congress elected to provide an exception for the establishment of new LTCHs and LTCH satellite facilities, but not for the increase in the number of LTCH beds. Specifically, the commenters requested that CMS provide two regulatory exceptions to the moratorium to allow for the increase in the number of beds in existing LTCHs and LTCH satellite facilities if: (1) the LTCH has a binding written agreement as of the date of the enactment of this paragraph with an outside, unrelated party for the actual construction, renovation, lease or demolition for long-term care hospital beds, and has expended, before such date of enactment, at least 10 percent of the estimated cost of the project (or, if less, $2,500,000) (the “binding written agreement exception”); or (2) if the LTCH has obtained an approved certificate of need (CON) from the State where one is required on or before the date of enactment (the “CON exception”). The commenters believed that the creation of these exceptions would be within CMS' authority because: (1) the statute is “ambiguous” and, therefore, CMS may exercise its authority under the general rulemaking provisions under sections 1102 and 1871 of the Act to “resolve the ambiguity”; (2) with the enactment of section 112(b) of the PAMA, the effective date of the new moratorium on the increase in the number of beds in existing LTCHs and LTCH satellite facilities was changed from January 1, 2015, to April 1, 2014, which creates a disadvantage for those LTCHs that were in the process of increasing the number of beds in their facilities based on “reasonable reliance” on the January 1, 2015 effective date; (3) Congress acted in haste when enacting the PAMA, and not including a bed number exception was an error; and (4) the health needs of the critically ill Medicare beneficiary population will go unmet without these additional beds.

Response: We do not agree with the commenters' assertion regarding CMS' authority to establish two regulatory exceptions to the statutory moratorium on the increase in the number of beds in existing LTCHs and LTCH satellite facilities. Unlike the “expired” moratoria, section 1206(b)(2) of Public Law 113-67, which established the “new” moratoria, expressly noted that such exceptions would not apply under the “new” moratoria. We refer readers to section 1206(b)(2)(B) of Public Law 113-67. When the statute was further amended by section 112(b) of the PAMA of 2014, Congress elected to provide exceptions only for the establishment of new LTCHs and LTCH satellite facilities, not for the increase in the number of LTCH beds. We do not believe that these two laws, read in concert, are ambiguous. Congress explicitly addressed the former exceptions as they relate to the “new” moratorium. In doing so, Congress demonstrated its awareness of the prior exceptions, and by stating that those exceptions do not apply under the “new” moratorium while concurrently offering no new exceptions, indicated that it intended to provide none. Furthermore, absent evidence to the contrary, there is no reason for CMS to presume that Congress' decision to provide exceptions under the “new” moratoria only for the establishment of new LTCHs and LTCH satellite facilities, and not for the increase in the number of LTCH beds, was anything other than intentional. The commenters did not present any evidence of this nature. Therefore, in the absence of some indication that Congress intended to depart from its specific statement under section 1206(b)(2)(B) of Public Law 113-67 limiting the application of exceptions, in contrast to the exceptions it expressly created for the moratorium on the establishment of new LTCHs and LTCH satellite facilities, we see no reason to infer that the absence of any exceptions to the moratorium on the increase in the number of beds in existing LTCHs and LTCH satellite facilities was anything other than intentional.

Furthermore, in response to the commenters' “reasonable reliance” assertions, while we may understand the commenters' concerns regarding wasted resources, such concerns do not permit us to offer rulemaking that would be contrary to the express intent of Congress. Finally, while we understand the commenters' concerns regarding access to care for Medicare beneficiaries, we believe that Congress would have provided exceptions if it believed that beneficiary access to LTCH and LTCH satellite facility beds would be negatively impacted. Furthermore, we expect that Congress would address any unanticipated access issues, should these issues arise. Therefore, we disagree with the commenters' assertions.

Comment: One commenter urged CMS to revise its interpretation of the exceptions provisions under the moratorium on the development of new LTCHs and LTCH satellite facilities so as to include “ownership” of the property in the list of permitted activities that could be included in the criteria for qualifying for the “binding written agreement” exception. The commenter also urged CMS to include the purchase of architectural plans as a necessary element that would count towards quantifying the total expenditure amount.

Response: In the FY 2015 IPPS/LTCH PPS proposed rule, we noted that the “new” moratorium on the development of new LTCHs and LTCH satellite facilities provided under section 1206(b)(2) of Public Law 113-67, as amended by section 112(b) of the PAMA, and incorporated as section 114(d)(7) of the MMSEA, “. . . mirrors the expired provisions of section 114(d)(2)(A)” of the “expired” moratorium. Because Congress used the identical wording for these provisions, we proposed to apply the same interpretation of the exceptions provisions that we used for the “expired” moratorium in regard to the “new” moratorium. (We refer readers to the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28198).) The specific exception provision that the commenters are questioning is outlined under section 114(d)(7)(B) of the MMSEA, as amended, and is defined as the “binding written agreement” exception. Section 114(d)(7)(B) of the MMSEA of 2007, as amended, specifies one of the qualifying criteria for this exception, namely, the requirement for the facility to have a binding written agreement with an outside, unrelated party for the actual construction, renovation, lease, or demolition for a long-term care hospital, and to have expended, before the date of the enactment of the PAMA, at least 10 percent of the estimated cost of the project (or, if less, $2,500,000).

After we implemented the provisions of the “expired” moratorium in the May 22, 2008 interim final rule with comment period (73 FR 29699), commenters on the FY 2010 IPPS/LTCH PPS rulemaking urged CMS to revise its interpretation of the “binding written agreement” exception under section 114(d)(2)(B) of the MMSEA to include “. . . feasibility studies, land purchases, architectural fees, attorneys' fees, appraisals, purchase of rights of way, as well as other activities that occur during the development of a hospital . . .” At that time, we stated in our response that “. . . Congress expressly specified only four `actual' activities in the statute.” We also believe, as we stated in the May 22, 2008 interim final rule with comment period, that the use of the term “actual” in the context of the exception provisions' terminology of “actual construction, renovation, lease, or demolition” limits the activities that Congress considers to represent a substantial commitment to that particular project of developing an LTCH or an LTCH satellite facility. By using the term “cost of the project,” we believe that the statute refers to the activities enumerated in the first prong, “the `actual' construction, renovation, lease, or demolition for a long-term care hospital.” (We refer readers to the FY 2010 IPPS/LTCH PPS final rule with comment period (74 FR 43988).) Our interpretation of the exception provisions for a “binding written agreement” under the “expired” moratorium was implemented in FY 2008 with the publication of the May 22, 2008 interim final rule with comment period (73 FR 29699), and finalized in the FY 2010 IPPS/LTCH PPS final rule with comment period (74 FR 43754). While we understand that our longstanding interpretation of the language in this exception may cause hardship to developing LTCHs and LTCH satellite facilities that seek to qualify for the exception under the “expenditure” prong, we continue to believe that only the specific costs cited in the statute may be considered in evaluating and granting exceptions to the “new” moratorium. Furthermore, we believe that by using the same language in the exceptions provisions under the “new” moratorium that was used in the provisions of the “expired” moratorium, Congress had reason to expect that CMS would apply the same interpretation under the “new” exception provisions as was applied under the “expired” moratorium exception provisions. If Congress disagreed with our interpretation, or believed that other costs should be included or considered in determining whether an LTCH or LTCH satellite facility would qualify for this exception to the moratorium, it could have revised the language used under section 112(b) of the PAMA, which applies the exceptions, accordingly.

H. Evaluation and Treatment of LTCHs Classified Under Section 1886(d)(1)(B)(iv)(II) of the Act

Section 1206(d) of the Pathway for SGR Reform Act (Pub. L. 113-67) instructs the Secretary to evaluate payments and regulations governing “hospitals which are classified under subclause (II) of subsection (d)(1)(B)(iv) . . .” as part of the annual rulemaking for payment rates under subsection (d) of section 1886 of the Act for FY 2015 or FY 2016. (We refer to hospitals “classified under subclause (II) of subsection (d)(1)(B)(iv) . . .” as “subclause (II) LTCHs.”) Based on the results of this evaluation, the Secretary is authorized to adjust the payment rates under section 1886(b)(3) of the Act for this type of hospital (such as by applying a payment adjustment such that the payments resemble those under a “TEFRA-payment model”). To implement such a payment adjustment, the Secretary would have to propose changes to the existing regulations governing subclause (II) LTCHs.

For this final rule, under the requirements of section 1206(d)(1) of Public Law 113-67 to evaluate the payment rates and regulations governing subclause (II) LTCHs, we have reviewed Medicare data from the only hospital meeting the statutory definition of a subclause (II) LTCH. As a result of these analyses, we are applying a payment adjustment to subclause (II) LTCHs beginning in FY 2015, which will result in payments for this category of LTCHs that resemble a payment based upon a TEFRA payment model (that is, a reasonable cost payment, subject to a ceiling).

Section 4417(b) of the BBA established the meaning of “subsection (d) hospitals,” which are paid under the IPPS, and in doing so, excluded two categories of hospitals that experience extended average inpatient lengths of stay. It also authorized the Secretary to define how an average inpatient length of stay would be calculated for these excluded hospitals. These provisions are included under sections 1886(d)(1)(B)(iv)(I) and (d)(1)(B)(iv)(II) of the Act, and the two categories of hospitals are generally referred to as subclause (I) and subclause (II) LTCHs.

Subclause (I) LTCHs are required to have an average inpatient length of stay that is greater than 25 days. Subclause (II) LTCHs are only required to have an average inpatient length of stay of greater than 20 days. The subclause (II) LTCH definition further limited the classification of a subclause (II) LTCH by including the requirement that the LTCH must have been first excluded from the IPPS in CY 1986, and treated a Medicare inpatient population in which 80 percent of the discharges in the 12-month reporting period ending in Federal FY 1997 had a principal diagnosis that reflected a finding of neoplastic disease. This statutory requirement is implemented under 42 CFR 412.23(e)(2)(ii).

In establishing the category of subclause (II) LTCHs, Congress essentially authorized special treatment of a hospital that, since 1986, had focused on the provision of palliative care to Medicare beneficiaries diagnosed with end-stage cancer. In consideration of the distinction between hospitals qualifying as LTCHs, either as a subclause (I) LTCH or a subclause (II) LTCH, we established different standards for calculating the average inpatient length of stay for these two categories of LTCHs. We calculate the greater than 25-day average length of stay criterion using only Medicare claims data for subclause (I) LTCHs. However, for subclause (II) LTCHs, we calculate the average length of stay based on the entire patient population. We refer readers to the RY 2003 LTCH PPS final rule (67 FR 55974) for a full discussion of our rationale for implementing these average length of stay calculation methodologies.

The theoretical foundations of any PPS are based on a system of averages, where the costs of some cases may exceed the payment, while other cases' costs will be less than the payment, creating an adequate balance in payments. Therefore, it is assumed that a hospital paid under a PPS would be able to maintain a balance of patients that will allow the hospital to achieve fiscal stability. With that said, in developing the LTCH PPS we were aware that a per-discharge PPS that pays the same amount for every case in a specific MS-LTC-DRG could encourage hospitals to make decisions based on financial considerations (such as prematurely discharging patients to reduce the cost of such cases). Because per discharge payments under the LTCH PPS are based on the extended lengths of stay that characterize LTCHs, at the outset of the LTCH PPS, we established a short-stay outlier (SSO) policy under which we apply a payment adjustment for LTCH discharges with lengths of stay that do not exceed 5/6 of the geometric average length of stay of the MS-LTC-DRG. Similarly, we were aware that there would be exceptionally expensive cases that could create financial disincentives to treat such patients and, therefore, we adopted a high-cost outlier (HCO) policy as well. However, given the nature of a subclause (II) LTCH's patient population, it may not be reasonable to expect a balancing of more and less costly cases, as these LTCHs are generally only treating a subset of very sick patients. As such, we modified our original SSO payment policy for subclause (II) LTCHs, and we exempted this category of LTCHs from additional changes to the SSO policy to account for the extremely high percentage of cases that our data analysis revealed would have been subject to our SSO policy if that policy were to be applied to subclause (II) LTCHs.
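
For illustration only, the following is a minimal sketch of the SSO identification rule described above (a case is identified as an SSO when its length of stay does not exceed 5/6 of the geometric average length of stay for its MS-LTC-DRG); the adjustment formulas themselves are set out in § 412.529 and are not reproduced here, and the length-of-stay values shown are hypothetical.

```python
# Illustrative sketch of the SSO identification rule described above;
# the length-of-stay values used in the example are hypothetical.

def is_short_stay_outlier(length_of_stay_days: float,
                          geometric_avg_los_days: float) -> bool:
    """A case is an SSO if its length of stay does not exceed 5/6 of the
    geometric average length of stay for its MS-LTC-DRG."""
    return length_of_stay_days <= (5.0 / 6.0) * geometric_avg_los_days

# Example: for an MS-LTC-DRG with a 30-day geometric average length of stay,
# the SSO threshold is 25 days, so a 20-day stay would be an SSO case.
print(is_short_stay_outlier(20, 30))  # True
print(is_short_stay_outlier(27, 30))  # False
```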

In accordance with the requirements of section 1206(d)(1) of Public Law 113-67, we conducted an evaluation of the payment rates and regulations governing subclause (II) LTCHs. We analyzed MedPAR claims data for FY 2010 and estimated Medicare costs incurred by the one LTCH currently classified as a subclause (II) LTCH, a 225-bed LTCH located in New York. We also evaluated the same metrics for two comparison groups of LTCHs, that is, approximately 40 LTCHs located in the same census region (that is, the Northeast Census Region, which includes Connecticut, Maine, New Jersey, and Pennsylvania), and approximately 25 LTCHs in the same bed size category (that is, between 150 and 250 beds) in order to assess the distinctions between a subclause (I) LTCH and a subclause (II) LTCH. For purposes of this analysis, LTCH PPS payments were calculated from the payment field in the MedPAR claims data, and the estimated costs for those claims were calculated using the covered charges and CCRs in the Provider-Specific File (PSF) that correlate to the discharge date on each claim. We calculated the aggregate average margins (ratio of payment to costs) for the subclause (II) LTCH and for the two comparison groups of LTCHs using the calculated FY 2010 costs and payments. Our analysis found that, under current LTCH PPS payment policy, the subclause (II) LTCH has much lower margins than comparable LTCHs located in the Northeast Census Region or LTCHs with 150-250 beds. Specifically, the subclause (II) LTCH had a negative margin for its Medicare patients paid under the LTCH PPS in FY 2010, while both the Northeast Census Region LTCHs and LTCHs with 150-250 beds had positive aggregate margins for their Medicare patients paid under the LTCH PPS for the same period.
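
For illustration only, the following is a minimal sketch of the cost and payment-to-cost computations described above, assuming claim-level payment and covered-charge amounts and a cost-to-charge ratio (CCR) matched from the Provider-Specific File; the claim values shown are hypothetical and are not drawn from the actual analysis.

```python
# Illustrative sketch only; the claims below are hypothetical, not actual
# MedPAR data. Estimated cost per claim = covered charges x CCR from the
# Provider-Specific File matched to the claim's discharge date.

claims = [
    # (ltch_pps_payment, covered_charges, ccr)
    (35_000.0, 120_000.0, 0.30),
    (28_000.0, 95_000.0, 0.32),
    (41_000.0, 160_000.0, 0.28),
]

total_payments = sum(payment for payment, _, _ in claims)
total_costs = sum(charges * ccr for _, charges, ccr in claims)

# Aggregate payment-to-cost ratio; a value below 1.0 means payments did not
# cover estimated costs (that is, a negative Medicare margin).
payment_to_cost_ratio = total_payments / total_costs
print(f"Aggregate payment-to-cost ratio: {payment_to_cost_ratio:.2f}")
```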

In our evaluation of subclause (II) LTCHs under the LTCH PPS, in accordance with the requirements of section 1206(d) of Public Law 113-67, we also compared the types of patients treated at subclause (I) and subclause (II) LTCHs. The top five MS-LTC-DRGs for patients treated at the subclause (II) LTCH in FY 2010 account for almost one-third of all of its Medicare discharges. Four of the top five MS-LTC-DRGs for the subclause (II) LTCH involve a neoplastic disease, and its case-mix differs significantly from the subclause (I) LTCHs, which had large proportions of ventilator and respiratory patients. The five most common MS-LTC-DRGs for the subclause (I) LTCHs were: Respiratory system diagnosis with ventilator support 96+ hours (MS-LTC-DRG 207); Pulmonary edema and respiratory failure (MS-LTC-DRG 189); Septicemia or severe sepsis without ventilator support 96+ hours with MCC (MS-LTC-DRG 870); Skin ulcers with MCC (MS-LTC-DRG 592); and Respiratory system diagnosis with ventilator support < 96 hours (MS-LTC-DRG 208). In comparison, for the subclause (II) LTCH, the five most common MS-LTC-DRGs were: Respiratory neoplasms with CC (MS-LTC-DRG 181); Digestive malignancy with CC (MS-LTC-DRG 375); Respiratory neoplasms with MCC (MS-LTC-DRG 180); Organic disturbances & mental retardation (MS-LTC-DRG 884); and Malignancy, female reproductive system w CC (MS-LTC-DRG 755). These data highlight significant differences between a subclause (I) LTCH and a subclause (II) LTCH based on patient-mix and Medicare margins, notwithstanding the considerations that have been made in structuring the current LTCH regulations to acknowledge the uniqueness of an LTCH meeting the statutory definition of a subclause (II) LTCH.

In evaluating “both the payment rates and regulations governing hospitals which are classified under subclause (II) . . . ,” as required by section 1206(d) of Public Law 113-67, we also analyzed the impacts of upcoming changes to the LTCH PPS under section 1206(a) of Public Law 113-67. In discussing these analyses, we note that, as discussed in section VII.I.2. of the preamble of this final rule, we are not making any specific policy and payment changes in this final rule to implement the provisions of section 1206(a) of Public Law 113-67. We intend to establish policies related to the types of LTCH cases expected to meet the legislative patient-level criteria for the “standard LTCH PPS payment” and cases expected to meet the criteria for the “site neutral” payments under the LTCH PPS in the FY 2016 rulemaking cycle. Although we are not making any specific policy or payment changes in this final rule related to the provisions of section 1206(a) of Public Law 113-67 at this time, we discuss these provisions in this section because they relate to our analysis of the LTCH PPS payment rates and regulations governing subclause (II) LTCHs.

Absent the adoption of policies for the implementation of section 1206(d) of Public Law 113-67, the payment changes required by section 1206(a) of Public Law 113-67 will apply to subclause (II) LTCHs beginning with discharges occurring in cost reporting periods beginning on or after October 1, 2015 (that is, FY 2016 and beyond). Due to the changes required by the provisions of section 1206(a) of Public Law 113-67 (discussed at greater length under section VII.I. of the preamble of this final rule), beginning in FY 2016, only those LTCH discharges meeting specified patient-level clinical criteria will be paid a “standard LTCH PPS payment amount.” Discharges not meeting those criteria will be paid based on a “site neutral” payment amount (the lesser of the “IPPS comparable” amount, as applied under our SSO policy at § 412.529, or 100 percent of the estimated costs of the case). The statutory requirements to be paid the “standard LTCH PPS payment amount” are that the LTCH discharge does not have a principal diagnosis relating to a psychiatric diagnosis or to rehabilitation, and:

  • The stay in the LTCH was immediately preceded by a discharge from an acute care hospital that included at least 3 days in an intensive care unit (ICU); or
  • The stay in the LTCH was immediately preceded by a discharge from an acute care hospital and the patient's LTCH stay is assigned to an MS-LTC-DRG based on the receipt of ventilator services of at least 96 hours.

Furthermore, section 1206(a)(1)(C)(ii)(II) of Public Law 113-67 specifies that, effective with cost reporting periods beginning in FY 2020 or later, any LTCH with an “LTCH discharge payment percentage” that demonstrates that more than 50 percent of that LTCH's discharges were paid for based on the “site neutral” payment rate will subsequently be paid for all discharges at the rate “. . . that would apply under subsection (d) for the discharge if the hospital were a subsection (d) hospital.” We refer readers to section VII.I. of the preamble of this final rule for a further discussion of the provisions of section 1206(a) of Public Law 113-67.
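
For illustration only, the following is a minimal sketch of the statutory payment structure summarized above: the patient-level criteria for the “standard LTCH PPS payment amount,” the “site neutral” amount as the lesser of the “IPPS comparable” amount or 100 percent of estimated costs, and the 50 percent “LTCH discharge payment percentage” test. The data structure, field names, and classification helper are hypothetical and do not reflect the detailed criteria we intend to propose in the FY 2016 rulemaking cycle.

```python
# Illustrative sketch of the payment structure under section 1206(a) of
# Pub. L. 113-67 as summarized above. All inputs are hypothetical; the
# detailed criteria will be addressed in the FY 2016 rulemaking cycle.

from dataclasses import dataclass

@dataclass
class Discharge:
    psych_or_rehab_principal_dx: bool
    preceded_by_acute_stay: bool
    icu_days_in_preceding_stay: int
    ventilator_96_hours_or_more: bool
    standard_ltch_pps_payment: float
    ipps_comparable_amount: float
    estimated_cost: float

def meets_patient_level_criteria(d: Discharge) -> bool:
    """Discharge qualifies for the standard LTCH PPS payment amount."""
    if d.psych_or_rehab_principal_dx or not d.preceded_by_acute_stay:
        return False
    return d.icu_days_in_preceding_stay >= 3 or d.ventilator_96_hours_or_more

def payment(d: Discharge) -> float:
    if meets_patient_level_criteria(d):
        return d.standard_ltch_pps_payment
    # Site neutral amount: lesser of the IPPS comparable amount or
    # 100 percent of the estimated cost of the case.
    return min(d.ipps_comparable_amount, d.estimated_cost)

def site_neutral_share(discharges: list[Discharge]) -> float:
    """Share of discharges paid at the site neutral rate; under the FY 2020
    provision, a share above 50 percent triggers payment of all discharges
    at the subsection (d) (IPPS) rate."""
    n_site_neutral = sum(1 for d in discharges
                         if not meets_patient_level_criteria(d))
    return n_site_neutral / len(discharges)
```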

In light of these forthcoming statutory changes, we evaluated MedPAR claims data from the only hospital meeting the statutory definition of a subclause (II) LTCH for FY 2010 to project the impact of the revisions to the LTCH PPS made by section 1206(a) of Public Law 113-67. Our simulations included analyses of the potential financial impact of applying the patient-level criteria and “site neutral” payment policies to a subclause (II) LTCH, and the financial impact on payments if that LTCH were to be paid for more than 50 percent of its discharges at the “site neutral” payment rate. In conducting this analysis in the absence of rules implementing the changes mandated by section 1206(a) of Public Law 113-67, we assumed that there would be no changes in LTCH admission patterns in response to the LTCH PPS payment changes required by section 1206(a) of Public Law 113-67. Furthermore, we used the FY 2010 claims data for the subclause (II) LTCH and the two LTCH comparison groups described above in order to compare the potential effects of the payment changes under the LTCH PPS required by section 1206(a) of Public Law 113-67 between subclause (I) LTCHs and subclause (II) LTCHs. We simulated payments for those discharges that would be expected to meet the legislative patient-level criteria for the “standard LTCH PPS payment” and for discharges that would be expected to receive “site neutral” payments under the LTCH PPS. Our analysis found that the subclause (II) LTCH would be expected to have significantly fewer (approximately one-fifth as many) discharges that would be expected to meet the legislative patient-level criteria for the “standard LTCH PPS payment” than the comparison groups of subclause (I) LTCHs (that is, Northeast Census Region LTCHs and LTCHs with 150-250 beds).

Additionally, we analyzed the potential effects of the “LTCH discharge payment percentage” provision under the requirements of section 1206(a)(1)(C)(ii)(II) of Public Law 113-67, as noted above. We evaluated FY 2010 claims data from the subclause (II) LTCH to project the potential impact of this provision. Our simulations projected that a significant number of the subclause (II) LTCH's FY 2010 claims would be expected to receive “site neutral” payments under the LTCH PPS (as described above) and, therefore, that this provision would have a significant negative financial impact on the subclause (II) LTCH's payments. Without considerable behavioral changes, the subclause (II) LTCH would be expected to have more than 50 percent of its discharges paid based on a “site neutral” payment and, therefore, would receive a payment adjustment under the provisions of section 1206(a)(1)(C)(ii)(II) of Public Law 113-67 for all of its discharges. Furthermore, our analysis revealed that, given the particular medical profile of its patient population, the “subsection (d)” comparable payment amount under the payment adjustment required by section 1206(a)(1)(C)(ii)(II) of Public Law 113-67 would not likely cover the costs for a significant number of its discharges. Consequently, our analysis shows that the subclause (II) LTCH is projected to experience a large negative aggregate average margin for its Medicare discharges under the payment changes required by section 1206(a) of Public Law 113-67.

Based on our findings under our evaluation of payments to subclause (II) LTCHs under the LTCH PPS and consistent with the provisions of section 1206(d) of Public Law 113-67, we evaluated adjustments that could be applied to ensure appropriate payments to a subclause (II) LTCH under the LTCH PPS. This analysis included consideration of a reasonable cost-based model, such as the TEFRA payment system under which certain PPS-excluded hospitals (such as children's and cancer hospitals) are currently paid. The TEFRA payment system, which was established under the provisions of Public Law 97-248, is implemented under the regulations at 42 CFR 413.40.

In addition to governing the current payment of certain PPS-excluded hospitals, the TEFRA payment system was also previously used to pay LTCHs prior to the implementation of the LTCH PPS. As described in the RY 2003 LTCH PPS final rule (67 FR 55957), the TEFRA payment system was “. . . established [to make] payments based on hospital-specific limits for inpatient operating costs. A ceiling on payments to such hospitals is determined by calculating the product of a facility's base year costs (the year on which its target reimbursement limit is based) per discharge, updated to the current year by a rate-of-increase percentage, and multiplied by the number of total current year discharges.” (A detailed discussion of target amount payment limits under Public Law 97-248 can be found in the September 1, 1983 final rule published in the Federal Register (48 FR 39746).) Under the TEFRA payment system, in accordance with section 1886(g) of the Act, Medicare allowable capital costs are paid on a reasonable cost basis. We refer readers to the FY 2015 IPPS/LTCH PPS proposed rule for a detailed description of our analysis and evaluation of the application of the TEFRA payment model to a subclause (II) LTCH (79 FR 28202 through 28203). We note that in describing our estimated operating and capital payments under the TEFRA payment system principles in the proposed rule, we mistakenly stated that we used FY 2010 cost report data when those estimates were determined using FY 2011 cost report data.

Our analysis of the subclause (II) LTCH's projected payments under a TEFRA-payment model indicated that such payments would reasonably cover the costs for most of its discharges, and consequently, the subclause (II) LTCH is not projected to experience a negative aggregate margin for its Medicare discharges, unlike our projections under both the current LTCH PPS and the forthcoming payment changes to the LTCH PPS required by section 1206(a) of Public Law 113-67.

In the above analyses, we evaluated both the current regulations and the anticipated payment rates under various statutorily mandated policies for FY 2016 as they apply to a subclause (II) LTCH under the LTCH PPS, based on FY 2010 discharge data, including payments, costs, and case-mix. As discussed above, our evaluation indicates that, given the required patient-mix for a subclause (II) LTCH, the forthcoming changes to the LTCH PPS are likely to result in a financial situation that is not sustainable for the subclause (II) LTCH evaluated above. Furthermore, our analysis also shows that current LTCH PPS payments for a subclause (II) LTCH, even taking into account the considerations that have been made in structuring current LTCH PPS policies to acknowledge the uniqueness of a subclause (II) LTCH, may not be sufficient to cover the costs incurred for the treatment of patients of the particular medical profile of the subclause (II) patient population prescribed by the statute. Furthermore, we believe that in establishing subclause (II) LTCHs, Congress endorsed support of the unique mission of this particular category of hospital. In fact, while mandating a significant revision to the LTCH PPS under section 1206(a) of Public Law 113-67, under section 1206(d) of the same statute, Congress directed the Secretary to evaluate the impact of the LTCH PPS on subclause (II) LTCHs, and, based on those findings, authorized the Secretary to adjust payment rates and other regulations, as appropriate, for this category of LTCHs.

Accordingly, in recognition of the subclause (II) LTCH's current estimated payment-to-cost ratio under the LTCH PPS and further anticipated losses that would likely otherwise occur under the forthcoming statutory changes to the LTCH PPS, which would render this type of specially recognized facility fiscally untenable, we believe that it is appropriate to exercise the authority under section 1206(d)(2) of Public Law 113-67. Therefore, in this final rule, for cost reporting periods beginning on or after October 1, 2014 (FY 2015 and beyond), we are applying a payment adjustment to subclause (II) LTCH payments under the LTCH PPS such that these LTCH PPS payments will resemble payments made under the reasonable cost-based TEFRA payment system. We believe that it is appropriate to apply this payment adjustment for a subclause (II) LTCH's first cost reporting period beginning on or after October 1, 2014, rather than discharges occurring on or after October 1, 2014, because it is consistent with the annual update of the hospital-specific limits (ceiling) for inpatient operating costs under the TEFRA payment system (as described below). We are implementing this payment adjustment for subclause (II) LTCHs in the regulations by adding new § 412.526 under 42 CFR Part 412, Subpart O.

Specifically, in this final rule we are establishing new regulations under § 412.526 that will provide that, for cost reporting periods beginning on or after October 1, 2014, payments to a “subclause (II)” LTCH will be made under the LTCH PPS under Subpart O of Part 412, as adjusted. This adjusted payment amount will generally be equivalent to an amount determined under the reasonable cost-based reimbursement rules for both operating and capital-related costs under 42 CFR Part 413. As described above, Medicare payments for inpatient operating costs under the reasonable-cost based TEFRA payment system are subject to a hospital-specific ceiling on payments that is determined as the product of a hospital's base year costs per discharge (“target amount per discharge”), updated to the current year by a rate-of-increase percentage, and multiplied by the number of its Medicare discharges for the year. Medicare allowable inpatient capital-related costs are paid on a reasonable cost basis, in accordance with section 1886(g) of the Act.

Under this payment adjustment under new § 412.526 for inpatient operating costs, the adjusted payment amount will generally be determined in accordance with the cited provisions of § 413.40. Accordingly, we are establishing a “target amount” for a subclause (II) LTCH for purposes of calculating a hospital-specific ceiling on payments for inpatient operating costs under this payment adjustment. We will determine such a target amount based on the subclause (II) LTCH's target amount that was used to determine its payments for inpatient operating costs under the TEFRA payment system prior to the implementation of the LTCH PPS, updated by the TEFRA payment system rate-of-increase percentages under § 413.40(c)(3). Furthermore, in determining a subclause (II) LTCH's target amount for purposes of this payment adjustment, consistent with the statute (as explained below), we are not including the increases to LTCHs' TEFRA target amounts and caps provided for by section 307(a) of the BIPA. As discussed previously, prior to the implementation of the LTCH PPS, section 307(a) of the BIPA provided a 2-percent increase to the wage-adjusted 75th percentile cap on the TEFRA target amounts for existing LTCHs for cost reporting periods beginning in FY 2001 and a 25-percent increase to the hospital-specific TEFRA target amounts for LTCHs, subject to the increased 75th percentile cap. Section 307(a)(2) of the BIPA also specifies that the 2-percent increase to the 75th percentile cap and the 25-percent increase to the TEFRA target amounts were not to be taken into account in the development and implementation of the LTCH PPS. Therefore, consistent with the statutory requirement under section 307(a)(2) of the BIPA, under new § 412.526, we will determine a subclause (II) LTCH's updated target amount based on its FY 2000 TEFRA payment system target amount, the year prior to when the increases under section 307(a) of the BIPA were effective. Using its FY 2000 TEFRA payment system target amount will ensure that the increases provided for by section 307(a) of the BIPA will not be included in the LTCH PPS payments to subclause (II) LTCHs under this LTCH PPS payment adjustment. This approach for excluding those increases to the TEFRA payment system target amounts is consistent with the methodology that was used to develop the one-time prospective adjustment to the standard Federal rate in which we calculated what amount would have been paid under the TEFRA payment system had the LTCH PPS not been implemented (77 FR 53497 through 53500). Therefore, under the payment adjustment for subclause (II) LTCHs under new § 412.526, we will determine a FY 2015 LTCH PPS target amount by updating the subclause (II) LTCH's FY 2000 TEFRA target amount using the applicable rate-of-increase percentages for FYs 2001 through 2015 established under § 413.40(c)(3).

In addition, as with the TEFRA payment system, we will pay for inpatient capital-related costs in accordance with the regulations under 42 CFR Part 413, under which Medicare allowable capital costs are paid on a reasonable cost basis, consistent with section 1886(g) of the Act.

Comment: Several commenters supported the proposed policy to apply a payment adjustment to subclause (II) LTCH payments modeled on the TEFRA payment system. In addition, the commenters suggested that CMS provide the authority for this LTCH to request and receive an adjustment to its rate-of-increase ceiling, as specified in our TEFRA regulations at 42 CFR 413.40(e), (g), and (i) for other hospitals paid on a TEFRA basis “. . . to address circumstances that arise that are beyond a hospital's control and render an applicable TEFRA ceiling amount inadequate.”

Response: We have evaluated the provisions specified by the commenters and considered the fiscal circumstances of the one subclause (II) LTCH that will be affected by the payment system revisions finalized in this final rule. In response to the commenters' concerns, we believe that it would be reasonable to consider circumstances that may arise that are beyond a hospital's control and that may render an applicable LTCH PPS ceiling amount inadequate. Therefore, we are adding new paragraph (c)(5)(i) under new § 412.526 entitled “Adjustments for Extraordinary circumstances.” Paragraph (c)(5)(i)(A) under new § 412.526 states that CMS may adjust the ceiling determined under paragraph (c)(1) of the section for one or more cost reporting periods when unusual inpatient operating costs have resulted in the hospital exceeding its ceiling imposed under this section due to extraordinary circumstances beyond the hospital's control. These circumstances include, but are not limited to, strikes, fire, earthquakes, floods, or similar unusual occurrences with substantial cost effects.

The other suggestions recommended by the commenters deal with the LTCH's ability to request an adjustment to its allowed LTCH PPS rate-of-increase ceiling, if its costs during a specific period are no longer comparable to those of the base year, and the authority to request a new base year for its LTCH PPS target amount. Because our data reveal that, on average, for the past 6 years, this LTCH's costs have been considerably below the amount that OACT calculated as its FY 2015 target amount, we believe that these additional features are unnecessary at this time. Moreover, if future data indicate that a change is warranted, we will consider proposing to add these features to our policy in future rulemaking.

In summary, for cost reporting periods beginning on or after October 1, 2014, we are establishing that payment to a “subclause (II)” LTCH will be made under the LTCH PPS, as adjusted. The adjusted payment amount will be comprised of an amount determined under the reasonable cost-based reimbursement rules for both operating and capital-related costs in accordance with the cited portions of Part 413. Under this payment adjustment, Medicare inpatient operating costs will be reimbursed on a reasonable cost basis, subject to a ceiling; that is, subject to an aggregate upper limit on the amount of a hospital's net Medicare inpatient operating costs that will be recognized for payment purposes. For each cost reporting period, the ceiling on payments for Medicare inpatient operating costs will be determined by multiplying the updated target amount for that period by the number of LTCH PPS discharges during that period. For cost reporting periods beginning during FY 2015, the target amount will be equal to the hospital's target amount determined under § 413.40(c)(4) for its cost reporting period beginning during FY 2000, updated by the applicable annual rate-of-increase percentages specified in § 413.40(c)(3) to the subject period (that is, for FYs 2001 through 2015). For subsequent cost reporting periods, the target amount will equal the hospital's target amount for the previous cost reporting period updated by the applicable annual rate-of-increase percentage specified in § 413.40(c)(3) for the subject cost reporting period. Payment for Medicare allowable inpatient capital-related costs under this payment adjustment will be made on a reasonable cost basis, in accordance with the cited portions of 42 CFR Part 413. In this final rule, we are codifying the provisions of this payment adjustment to subclause (II) LTCHs under new § 412.526 of the regulations. We are adding paragraph (c)(5), which establishes the general rules for requesting adjustments and also includes a provision to provide adjustments for unusual costs arising from extraordinary circumstances. In addition, we are making conforming changes to § 412.521(a)(2) to cross reference this payment adjustment under new § 412.526.
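
For illustration only, the following is a minimal sketch of the ceiling computation summarized above. The FY 2000 target amount per discharge, the annual rate-of-increase percentages, and the discharge count are hypothetical placeholders; the actual update factors are those established under § 413.40(c)(3) for each fiscal year, and allowable inpatient capital-related costs are paid separately on a reasonable cost basis.

```python
# Illustrative sketch of the ceiling on Medicare inpatient operating cost
# payments described above. The base target amount, the annual
# rate-of-increase percentages, and the discharge count are hypothetical;
# the actual factors are those established under 42 CFR 413.40(c)(3).

fy2000_target_amount_per_discharge = 40_000.00                       # hypothetical
annual_rate_of_increase = {fy: 0.030 for fy in range(2001, 2016)}    # hypothetical 3.0%/yr

def updated_target_amount(base_amount: float, factors: dict[int, float]) -> float:
    """Compound the base-year target amount per discharge through FY 2015."""
    amount = base_amount
    for fy in sorted(factors):
        amount *= 1.0 + factors[fy]
    return amount

fy2015_target = updated_target_amount(fy2000_target_amount_per_discharge,
                                      annual_rate_of_increase)

# Ceiling for the cost reporting period = updated target amount per discharge
# multiplied by the number of LTCH PPS discharges in that period.
medicare_discharges = 1_200                                          # hypothetical
operating_ceiling = fy2015_target * medicare_discharges

# Operating costs are paid on a reasonable cost basis subject to this ceiling.
print(f"FY 2015 target amount per discharge: ${fy2015_target:,.2f}")
print(f"Ceiling on inpatient operating payments: ${operating_ceiling:,.2f}")
```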

I. Description of Statutory Framework for Patient-Level Criteria-Based Payment Adjustment Under the LTCH PPS Under Pub. L. 113-67

1. Overview

In the FY 2014 IPPS/LTCH PPS proposed rule (78 FR 27668 through 27676), we presented a description of our research on the development of patient-level and facility-level criteria for LTCHs and a potential framework for developing changes to the LTCH PPS. The framework was based on the preliminary findings of two projects conducted by Kennell and Associates (Kennell) and its subcontractor, RTI, under the guidance of CMS' Center for Medicare and Medicaid Innovation (the Innovation Center). We stated that we believed that the findings from these projects, in large part, could be used to identify the subpopulation of Medicare beneficiaries that should form the core group of patients under the LTCH PPS (that is, a chronically critically ill/medically complex (CCI/MC) framework for the LTCH PPS). Although this research was not completed at the time of issuance of the FY 2014 IPPS/LTCH PPS proposed rule, we solicited feedback from LTCH stakeholders in that proposed rule on the description of the interim framework, and indicated that any public comments submitted would be evaluated and considered by our contractors with the expectation of formulating a proposal for FY 2015 based on this research (78 FR 27668 through 27676).

Section 1206(a) of Public Law 113-67 amended section 1886(m) of the Act by adding paragraph (6), which establishes patient-level criteria for payments under the LTCH PPS for implementation beginning in FY 2016. Therefore, our prior intention to present a proposal for a CCI/MC framework for the LTCH PPS (as discussed in the FY 2014 IPPS/LTCH PPS proposed and final rules) in the FY 2015 IPPS/LTCH PPS proposed rule was superseded. Accordingly, we did not propose revisions to the LTCH PPS based upon the Kennell/RTI framework for FY 2015. Rather, we stated that we intend to propose to implement the requirements established by section 1206(a) of Public Law 113-67 in the FY 2016 LTCH PPS rulemaking cycle. (We note that the final report on the CCI/MC framework developed by Kennell/RTI under our research contract is expected to be available later this year and will be made available to the public through a Web site.)

We refer readers to section VII.I.2. of the preamble of the proposed rule in which we summarized the statutory provisions of section 1206(a) of Public Law 113-67 (79 FR 28204). In section VII.I.2. of the preamble of this final rule, we discuss several significant issues arising from these statutory changes to the LTCH PPS, on which we requested stakeholder feedback prior to developing our proposals for FY 2016 implementation. We intend to propose the specific policy and payment changes that will be necessary to implement the provisions of Public Law 113-67 for cost reporting periods beginning on or after October 1, 2015, during the FY 2016 rulemaking cycle. Although we did not propose to make any policy and payment changes mandated by section 1206(a)(1) of Public Law 113-67 in the FY 2015 IPPS/LTCH PPS proposed rule, in light of the degree of the forthcoming changes, in section VII.I.3. of the preamble of the proposed rule, we discussed some of the changes and requested public feedback to inform our proposals for FY 2016.

2. Additional LTCH PPS Issues

The LTCH PPS was originally established for cost reporting periods beginning on or after October 1, 2002, by section 123(a) of the BBRA (Pub. L. 106-113) and section 307(b) of the BIPA (Pub. L. 106-554). (We also refer readers to section 1886(m) of the Act, as added by section 114(e) of the MMSEA.) Section 307(b) of the BIPA granted the Secretary considerable authority in developing the LTCH PPS, specifying that the Secretary shall “. . . examine and may provide for appropriate adjustments to the long-term hospital payment system, including adjustments to DRG weights, area wage adjustments, geographic reclassification, outliers, updates, and a disproportionate share adjustment. . . .”

Accordingly, as we evaluate the revisions to the LTCH PPS required by section 1206(a)(1) of Public Law 113-67, we believe that the broad authority granted by the original statutory mandates continues to permit us to modify, if appropriate, the methodologies for our payment determinations under the LTCH PPS. (We refer readers to the RY 2003 LTCH PPS final rule (67 FR 55954), which describes the development and implementation of the LTCH PPS for FY 2003.) Specifically, section 1206(a) of Public Law 113-67 establishes two distinct payment groups for LTCH discharges under the revised system: discharges meeting specified patient-level criteria that will be paid under the “standard LTCH PPS payment amount” and all other patient discharges that will be paid under the “site neutral” payment rate and methodology (discussed above). In setting the payment rates and factors under the LTCH PPS as required by section 1206(a) of Public Law 113-67 for certain LTCH PPS payment adjustments, such as the MS-LTC-DRG relative weights and high-cost outlier payments, we plan to evaluate whether it would be appropriate to modify our historical methodology to account for the establishment of the two distinct payment methodologies for LTCHs. For example, we intend to examine whether, beginning in FY 2016, it is still appropriate to include data for all LTCH PPS cases, including “site neutral” payment cases, in our methodology for setting relative payment weights for MS-LTC-DRGs. We also intend to explore the need for changes to the LTCH PPS high-cost outlier payment policies. Given that, for a number of LTCH patients, payment will be made based on the lower of the “IPPS comparable” per diem payment and the estimated cost of the case, we will need to decide whether to maintain a single high-cost outlier “target” for all LTCH PPS cases (including “site neutral” payment cases) or whether it may be more appropriate to establish separate high-cost outlier “targets” for each of the two payment groups under the revised LTCH PPS. Our existing methodology for calculating the MS-LTC-DRG relative weights is discussed during the annual rulemaking cycle and was, most recently, included in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50753 through 50760). Our detailed description of our existing high-cost outlier payment policy, which has remained the same since being implemented, can be found in the RY 2003 LTCH PPS final rule (67 FR 56022 through 56027). (We note that our methodology for calculating the MS-LTC-DRG relative payment weights for FY 2015 can be found in section VII.B.3. of the preamble of this final rule, and our policies under the high-cost outlier payment policy for FY 2015 can be found in section V.D. of the Addendum to this final rule.)
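
As context for the question of whether to maintain a single high-cost outlier “target” or separate “targets” for the two payment groups, the following is a minimal sketch of the general structure of a fixed-loss high-cost outlier payment. The fixed-loss amount and the marginal cost factor shown are illustrative placeholder values only; the actual parameters are established through rulemaking and are not the parameters that will be proposed for FY 2016.

```python
# Illustrative sketch of a fixed-loss high-cost outlier calculation.
# The fixed_loss_amount and marginal_cost_factor values are placeholders;
# actual parameters are established through rulemaking.

def high_cost_outlier_payment(estimated_cost: float,
                              base_payment: float,
                              fixed_loss_amount: float,
                              marginal_cost_factor: float = 0.80) -> float:
    """Return the additional outlier payment for a case.

    A case qualifies when its estimated cost exceeds the outlier threshold
    (the base payment for the case plus the fixed-loss amount); the outlier
    payment is a fraction of the estimated costs above that threshold.
    """
    threshold = base_payment + fixed_loss_amount
    if estimated_cost <= threshold:
        return 0.0
    return marginal_cost_factor * (estimated_cost - threshold)

# Example with placeholder values: a case costing $90,000 against a $40,000
# base payment and a $15,000 fixed-loss amount yields an outlier payment of
# 0.80 x (90,000 - 55,000) = $28,000.
print(high_cost_outlier_payment(90_000, 40_000, 15_000))
```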

In the FY 2015 IPPS/LTCH PPS proposed rule, we stated that we were interested in receiving feedback from LTCH stakeholders on our plans to evaluate whether it would be appropriate to modify any of our historical methodologies as we implement the payment changes to the LTCH PPS under section 1206(a) of Public Law 113-67. In particular, we were interested in public feedback on the issues mentioned earlier (that is, policies relating to establishing the relative payment weights and high-cost outliers) so that we may evaluate various options in preparation for developing proposals to implement the statutory changes beginning in FY 2016.

Comment: In response to our request for feedback from LTCH stakeholders, numerous commenters addressed the setting of relative payment weights for MS-LTC-DRGs and establishing a high-cost outlier policy under the new LTCH PPS framework. MedPAC urged CMS to establish “. . . new LTCH base payment rates and new relative payment weights for each MS-LTC-DRG based solely on the most recent available standardized data associated with discharges meeting the specified patient-level criteria.” MedPAC stated that the change in methodology required by the new LTCH PPS framework should not result in increased aggregate payments for the cases paid under the standard LTCH PPS rate under the new LTCH PPS framework. MedPAC also recommended that both standard and site neutral payments receive high-cost outliers, and that total outlier payments under the LTCH PPS continue to account for 8 percent of total LTCH payments for all cases (both payment types combined) with the “same uniform national fixed-loss amount . . . applied to both cases being paid the standard LTCH PPS payment amount and to cases being paid the site neutral amount.” Most of the other commenters recommended that the high-cost outlier threshold and MS-LTC-DRG relative payment weights be calculated only using data from cases that meet the patient-level criteria established by section 1206 of Public Law 113-67; that is, cases for whom Medicare will make standard payments under the LTCH PPS, without including data on “site neutral” payments. Some of the commenters urged CMS to focus on keeping payments for standard cases at the same payment level as they have recently been, and recommended focusing only on standard cases for the calculation of the high-cost outlier threshold and for establishing MS-LTC-DRG relative payment weights. Other commenters recommended setting the fixed-loss threshold for high-cost outliers at 8 percent initially and then readjusting the threshold as more data become available. Several commenters conducted individual analyses and specifically recommended setting the fixed-loss threshold at 8 percent for each of the two payment types, standard and site neutral. A number of commenters made recommendations regarding specific aspects of the law. Other commenters opined that site neutral payments should be based on a full IPPS payment rather than the lesser of an IPPS comparable payment and the estimated costs of the case. Many commenters expressed concern regarding the severity of illness of many LTCH patients for whom site neutral payments would be made under the new LTCH PPS framework, and noted that the costs of treating such patients would not be covered under the statutory framework and could result in patient access problems for LTCH care. Other commenters suggested that the patient-level criteria that would have to be met in order for an LTCH to receive the standard payment rate be expanded to include severe wound care patients and diabetes diagnoses with post-surgical complications. Several commenters suggested that the statutory language be clarified regarding the application of IPPS ICU and CCU codes 020X and 021X to determine compliance with the 3-day criteria, and urged CMS to consider all categories within those codes. Several commenters requested that CMS hold public meetings for stakeholders to address the issues presented by the implementation of section 1206 of Public Law 113-67.

Response: We appreciate the commenters' thoughtful and detailed feedback, particularly those comments received regarding setting relative payment weights for MS-LTC-DRGs and establishing a high-cost outlier policy under the new LTCH PPS framework. In preparation for proposing the new LTCH PPS framework in the FY 2016 IPPS/LTCH PPS proposed rule, we will consider these suggestions and respond to stakeholders' concerns with openness and transparency.

Comment: MedPAC included additional comments on CMS' SSO policy in light of the new LTCH PPS framework that it believed were appropriate for inclusion in this final rule. MedPAC believed that the existing SSO policy, which pays an adjusted amount for cases with lengths of stay less than or equal to five-sixths of the geometric average length of stay for the MS-LTC-DRG, provides an incentive for LTCHs to make discharge decisions based on financial gain rather than purely clinical reasons. MedPAC analyses of LTCH discharge patterns indicate that the frequency of discharges rises sharply immediately after the SSO threshold. Once the statutory changes to the LTCH PPS are implemented, MedPAC recommended limiting the application of the existing SSO policy solely to cases paid under the standard LTCH PPS rate, and modifying the SSO policy to reduce the existing financial incentives by lowering the payment penalty for discharging patients before the SSO threshold. MedPAC recommended adopting the methodology used under the IPPS transfer policy; that is, payment for the first day of an SSO case would be twice the per diem rate for the MS-LTC-DRG, with payment for each additional day set at the per diem rate, up to a maximum of the full standard per discharge payment, which would be reached 1 day before the average length of stay for the MS-LTC-DRG. For LTCH cases paid based on the site neutral payment methodology under the forthcoming statutory framework, MedPAC suggested that CMS adopt the short-stay policies that apply under the IPPS. Another commenter urged CMS to consider implementing a number of the SSO suggestions made by MedPAC.

Response: We appreciate MedPAC's detailed and thoughtful suggestions.
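
For illustration only, the following minimal sketch renders the IPPS-transfer-style per diem approach that MedPAC described in its comment; it is not an adopted or proposed CMS payment methodology, and the function name, parameters, and dollar figures are hypothetical.

    # Illustrative sketch of the per diem approach MedPAC described (IPPS
    # transfer policy style); not an adopted or proposed CMS methodology.
    # All names and figures are hypothetical.

    def medpac_style_sso_payment(full_ms_ltc_drg_payment: float,
                                 average_length_of_stay: float,
                                 covered_days: int) -> float:
        """Pay twice the per diem for the first day and the per diem for each
        additional day, capped at the full per discharge payment, which is
        reached one day before the average length of stay."""
        if covered_days <= 0:
            return 0.0
        per_diem = full_ms_ltc_drg_payment / average_length_of_stay
        uncapped = per_diem * (covered_days + 1)  # the first day counts double
        return min(full_ms_ltc_drg_payment, uncapped)

    # Hypothetical example: a $40,000 full payment and a 20-day average length
    # of stay give a $2,000 per diem; a 10-day stay would be paid $22,000, and
    # the full $40,000 would be reached at 19 days (one day before the average).
    print(medpac_style_sso_payment(40_000.00, 20.0, 10))
    print(medpac_style_sso_payment(40_000.00, 20.0, 19))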

J. Technical Change

In this final rule, we are updating the legislative authorities cited for the regulations governing the LTCH PPS under Subpart O of Part 412. Specifically, we are adding references under new paragraphs (a)(4), (a)(5), and (a)(6) of § 412.500 of the regulations to the revisions to the Act made by section 4302(a) of Public Law 111-5, sections 3106(a) and 10312(a) of Public Law 111-148, and section 1206 of Public Law 113-67, respectively.

VIII. Administrative Appeals by Providers and Judicial Review

A. Proposed and Final Changes Regarding the Claims Required in Provider Cost Reports and for Provider Administrative Appeals

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 27978), we proposed to revise the cost reporting regulations in 42 CFR Part 413, Subpart B, by requiring a provider to include an appropriate claim for a specific item in its Medicare cost report in order to receive or potentially qualify for Medicare payment for the specific item. If the provider's cost report does not include an appropriate claim for a specific item, we proposed that payment for the item will not be included in the notice of program reimbursement (NPR) issued by the Medicare administrative contractor (MAC) (formerly known as fiscal intermediary and herein referred to as “contractor”) or in any decision or order issued by a reviewing entity (as defined in 42 CFR 405.1801(a) of the regulations) in an administrative appeal filed by the provider. In addition, we proposed to revise the appeals regulations in 42 CFR Part 405, Subpart R, by eliminating the requirement that a provider must include an appropriate claim for a specific item in its cost report in order to meet the dissatisfaction requirement for jurisdiction before the Provider Reimbursement Review Board (Board), and by specifying the procedures for Board review of whether the provider's cost report meets the proposed substantive reimbursement requirement of an appropriate cost report claim for a specific item. We also proposed technical revisions to other Board appeal regulations to conform those regulations to the main revisions (described above) to the cost reporting regulations and the provider appeal regulations, and proposed similar revisions to the Part 405, Subpart R regulations for appeals before the contractor hearing officers. We proposed that these revisions to the cost reporting regulations and the provider appeals regulations would apply to provider cost reporting periods beginning on or after the effective date of the final IPPS annual update rule.

We received numerous public comments expressing varied legal and procedural opinions in response to our proposals to revise the cost reporting regulations and the provider appeals regulations. The concerns raised by commenters about the breadth of the proposed provisions, and the questions raised in public comments about the interpretations we provided in the preamble to the proposed rule, have given us an opportunity to more fully analyze the issues presented. Therefore, we are not finalizing our proposals to revise the cost reporting regulations and the provider appeals regulations as set forth in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 27978). We note that, in this final rule, we are not addressing public comments received with respect to the provisions of the proposed rule that we are not finalizing at this time. Rather, we will address them at a later time, in a subsequent rulemaking document, as appropriate.

B. Proposed and Final Changes To Conform Terminology From “Intermediary” to “Contractor”

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 27978), we proposed to conform the terminology in Part 405, Subpart R and all subparts of Part 413 from “intermediary” or “fiscal intermediary” to “contractor” pursuant to sections 1816, 1874A and 1878 of the Act.

We did not receive any public comments on our proposal to conform the terminology in Part 405, Subpart R and all subparts of Part 413 from “intermediary” or “fiscal intermediary” to “contractor” pursuant to sections 1816, 1874A and 1878 of the Act. Therefore, we are finalizing that conforming change in terminology as proposed.

C. Technical Correction to § 405.1835 of the Regulations and Corresponding Amendment to § 405.1811 of the Regulations

1. Background and Technical Correction to §§ 405.1811 and 405.1835 of the Regulations

Section 1878(a) of the Act allows providers to appeal to the Provider Reimbursement Review Board (the Board) final determinations of program reimbursement made by a contractor, as well as certain final determinations by the Secretary involving payment under section 1886(d) (the inpatient hospital prospective payment system) and section 1886(b) (commonly known as the Tax Equity and Fiscal Responsibility Act of 1982 (TEFRA) payment system) of the Act. In addition, by regulation, providers are given the right to appeal to the Board or to contractor hearing officers certain other determinations.

Under section 1878(a)(1)(A), (2), and (3) of the Act, and § 405.1835(a)(1), (2), and (3)(i) of the regulations, a provider may obtain a Board hearing if it meets three jurisdictional requirements: (1) the provider is dissatisfied with a final determination of the contractor or the Secretary; (2) the amount in controversy is at least $10,000; and (3) the provider files a request for a hearing to the Board within 180 days of notice of the final determination of the contractor or the Secretary. The same jurisdictional requirements govern provider appeals to contractor hearing officers under § 405.1811(a)(1), (a)(2), and (a)(3)(i) of the regulations, except that the amount in controversy requirement is at least $1,000 but less than $10,000.

However, the statutory requirements for Board jurisdiction are somewhat different if the provider does not receive a final determination of the contractor on a timely basis. Under section 1878(a)(1)(B), (a)(2), and (a)(3) of the Act, a provider may obtain a Board hearing if: (1) the provider does not receive a final determination of the contractor on a timely basis; (2) the amount in controversy is at least $10,000; and (3) the provider files a request for a hearing to the Board within 180 days after notice of the contractor's final determination would have been received if such contractor determination had been issued on a timely basis. Moreover, § 405.1835(a)(3)(ii) of the regulations provides that a contractor determination is not timely if it is not issued, through no fault of the provider, within 12 months of the contractor's receipt of the provider's perfected cost report or amended cost report (as specified in § 413.24(f) of the regulations). The same jurisdictional requirements govern provider appeals to contractor hearing officers, based on an untimely contractor determination, under § 405.1811(a), except that the amount in controversy requirement is at least $1,000 but less than $10,000.

As noted, section 1878(a)(1)(A) of the Act requires that the provider “is dissatisfied with a final determination” of the contractor or the Secretary. However, section 1878(a)(1)(B) of the Act does not require provider dissatisfaction for Board appeals based on an untimely final contractor determination.

Before a 2008 final rule (73 FR 30190; May 23, 2008) substantially amended the appeals rules, the regulations tracked fully the statute as to whether provider dissatisfaction was a prerequisite for Board jurisdiction. In the 2007 edition of the appeals regulations, § 405.1835(a) addressed the requirements for Board appeals of final contractor determinations, and referred to § 405.1841(a), which required the provider to set forth its dissatisfaction with specific aspects of the contractor determination. Thus, consistent with section 1878(a)(1)(A) of the Act, § 405.1835(a) and § 405.1841(a) of the 2007 regulations required provider dissatisfaction for Board appeals of final contractor determinations.

By contrast, Board appeals based on untimely contractor determinations were addressed in § 405.1835(c), which did not reference provider dissatisfaction. Instead, § 405.1835(c) simply provided that notwithstanding the provisions of paragraph (a)(1) of the section, the provider also has a right to a hearing before the Board if an intermediary's determination is not rendered within 12 months after receipt of a provider's perfected cost report or amended cost report provided such delay was not occasioned by the fault of the provider. Thus, as with section 1878(a)(1)(B) of the Act, § 405.1835(c) of the 2007 regulations did not require provider dissatisfaction for Board appeals based on untimely final contractor determinations.

In the 2008 final rule (73 FR 30190), § 405.1835 was substantially revised, § 405.1841 was removed, and the prior provisions in paragraph (c) of § 405.1835 for Board appeals based on untimely contractor determinations were also eliminated. As amended, § 405.1835(a) now states that a provider has a right to a Board hearing “only if” three criteria are satisfied. First, the provider must have “preserved its right to claim dissatisfaction with the amount of Medicare payment” by making a cost report claim for the item in dispute, or by “self-disallowing” the item by listing it as a “protested amount” in the cost report. Second, the amount in controversy must be at least $10,000. Third, the Board must receive the provider's hearing request within 180 days after the provider received the final determination of the intermediary or the Secretary. However, if a final contractor determination is not issued (through no fault of the provider) within 12 months of the contractor's receipt of the provider's perfected cost report or amended cost report, a Board hearing must be requested within 180 days after the expiration of that 12 month period. Under the existing regulations, provider dissatisfaction is a requirement for Board jurisdiction over appeals based on an untimely contractor determination, as well as for appeals of a final determination of the contractor or the Secretary.

As amended by the 2008 final rule (73 FR 30190), § 405.1835(a)'s provisions for Board appeals based on untimely contractor determinations no longer track fully the provisions for such appeals in section 1878(a)(1)(B) of the Act. Specifically, § 405.1835(a) of the regulations now requires provider dissatisfaction as a condition for Board jurisdiction over appeals based on an untimely contractor determination, but section 1878(a)(1)(B) of the Act does not impose a provider dissatisfaction requirement for such appeals.

When this difference between § 405.1835(a) of the regulations and section 1878(a)(1)(B) of the Act came to our attention, we looked into this matter. After reviewing the 2008 final rule and the corresponding parts of the 2004 proposed rule (69 FR 35716; June 25, 2004), we determined that the inclusion in § 405.1835(a) of a provider dissatisfaction requirement for Board appeals based on an untimely contractor determination reflects an inadvertent error in the drafting of the 2008 final rule and the 2004 proposed rule.

In this final rule, we are revising § 405.1835 of the regulations to eliminate provider dissatisfaction as a requirement for Board jurisdiction over appeals based on untimely contractor determinations. This is simply a technical correction inasmuch as this amendment to § 405.1835 conforms the regulations to the provisions in section 1878(a)(1)(B) of the Act for Board appeals based on an untimely contractor determination. In effect, this amendment to § 405.1835 of the regulations restores the full conformity of the regulations with the statutory requirements for Board jurisdiction over appeals based on untimely contractor determinations—a conformity that obtained before the 2008 final rule (73 FR 30190) inadvertently imposed a provider dissatisfaction requirement for Board appeals based on untimely contractor determinations. Moreover, in order to maintain consistency between the regulations for Board appeals and the rules for contractor hearing officer appeals, we also are revising § 405.1811 of the regulations to eliminate provider dissatisfaction as a requirement for contractor hearing officer jurisdiction over appeals based on untimely contractor determinations.

2. Waiver of Notice of Proposed Rulemaking

We ordinarily publish a notice of proposed rulemaking in the Federal Register to provide a period for public comment before the provisions of a rule take effect in accordance with section 553(b) of the Administrative Procedure Act (APA) (5 U.S.C. 553(b)). However, we can waive this notice and comment procedure if the Secretary finds, for good cause, that the notice and comment process is impracticable, unnecessary, or contrary to the public interest, and incorporates a statement of the finding and the reasons therefor in the notice.

We find it unnecessary to undertake notice-and-comment rulemaking for the above-described revisions because those revisions are simply technical corrections that bring § 405.1835 of the Board appeals regulations into full conformity with section 1878(a)(1)(B) of the Act, and maintain consistency between § 405.1811 of the intermediary (contractor) hearing officer appeals regulations and § 405.1835 of the Board appeals regulations. The revisions do not represent changes in policy, nor do they have a substantive effect, and the public interest would be best served by timely correction of these technical errors. Therefore, we find good cause to waive notice and comment procedures.

3. Effective Date and Applicability Date; Finality and Reopening

The technical correction to § 405.1835 of the Board appeals regulations and the corresponding revision to § 405.1811 of the intermediary (contractor) hearing officer appeals regulations are effective October 1, 2014. The revisions to § 405.1835 of the Board appeals regulations and § 405.1811 of the intermediary (contractor) hearing officer appeals regulations are applicable, subject to the rules of administrative finality and reopening in § 405.1807 and § 405.1885 of the regulations, to appeals pending or filed on or after the August 21, 2008 effective date of the 2008 final rule (73 FR 30190).

The technical correction to § 405.1835 of the Board appeals regulations and the corresponding revision to § 405.1811 of the intermediary (contractor) hearing officer appeals regulations apply automatically to appeals, based on an untimely contractor determination, pending or filed on or after the October 1, 2014 effective date of this final rule. If the Board or the Administrator of CMS finally dismissed an appeal, based on an untimely contractor determination, due to the provider's failure to make an appropriate cost report claim for the matter at issue, the provider may ask the Board or the Administrator, as applicable, to reopen such decision pursuant to § 405.1885 of the regulations and apply this technical correction to § 405.1835 of the Board appeals regulations, provided that such final jurisdictional dismissal decision was issued no more than 3 years before the October 1, 2014 effective date of this final rule. Similarly, if the contractor hearing officer or the CMS reviewing official finally dismissed an appeal, based on an untimely contractor determination, due to the provider's failure to make an appropriate cost report claim for the matter at issue, the provider may ask the contractor hearing officer or the CMS reviewing official, as applicable, to reopen such decision pursuant to § 405.1885 of the regulations and apply this technical correction to § 405.1811 of the intermediary (contractor) hearing officer appeals regulations, provided that such final jurisdictional dismissal decision was issued no more than 3 years before the October 1, 2014 effective date of this final rule.

We believe that, because the above-described regulatory amendments are simply technical corrections that do not make substantive changes to the regulations for appeals to the Board and the contractor hearing officers, the public interest is served by correcting the inadvertent drafting errors in the 2008 final rule's provisions for appeals to the Board and the contractor hearing officers based on untimely contractor determinations. As technical corrections to the 2008 final rule, we believe the above-described amendments to § 405.1811 and § 405.1835 should apply as of the August 21, 2008 effective date of the 2008 final rule, subject to the rules of administrative finality and reopening in § 405.1807 and § 405.1885 of the regulations.

We believe that fixing the applicability date, subject to the rules of administrative finality and reopening in § 405.1807 and § 405.1885 of the regulations, of these amendments by reference to the August 21, 2008 effective date of the 2008 final rule is not impermissibly retroactive in effect because the amendments simply correct and clarify longstanding agency policy and practice, and are procedural in nature. For example, we refer readers to Heimmermann v. First Union Mortgage Corp., 305 F.3d 1257, 1260-61 (11th Cir. 2002) (a rule clarifying the law, especially in an unsettled or confusing area of the law, is not a substantive change in the law, and thus the rule may apply to matters that preceded issuance of the rule.) However, if the above-described amendments to § 405.1811 and § 405.1835 were deemed a retroactive application of a substantive change to a regulation, section 1871(e)(1)(A) of the Act permits retroactive application of a substantive change to a regulation if the Secretary determines that such retroactive application is necessary to comply with statutory requirements or that failure to apply the change retroactively would be contrary to the public interest. We have determined that any retroactive application of these amendments to § 405.1811 and § 405.1835 is necessary to ensure full compliance with the statutory provisions for Board appeals based on untimely contractor determinations (under section 1878(a)(1)(B) of the Act). We have further determined that it would be in the public interest to apply these amendments, subject to the rules of administrative finality and reopening in § 405.1807 and § 405.1885 of the regulations, to Board appeals and contractor hearing officer appeals that were initiated or pending on or after the August 21, 2008 effective date of the 2008 final rule. The alternative, of not applying these amendments to § 405.1811 and § 405.1835 to Board appeals and contractor hearing officer appeals that were initiated or pending on or after the August 21, 2008 effective date of the 2008 final rule, would be inconsistent with the statutory provisions for Board appeals based on untimely contractor determinations (under section 1878(a)(1)(B) of the Act) and would undermine the public interest in maintaining consistency between the requirements for Board appeals and contractor hearing officer appeals.

IX. Quality Data Reporting Requirements for Specific Providers and Suppliers

We seek to promote higher quality and more efficient health care for Medicare beneficiaries. This effort is supported by the adoption of widely agreed-upon quality measures. We have worked with relevant stakeholders to define quality measures for most settings and to measure various aspects of care for most Medicare beneficiaries. These measures assess structural aspects of care, clinical processes, patient experiences with care, care coordination, and patient outcomes.

We have implemented quality reporting programs for multiple care settings, including:

  • Hospital inpatient services under the Hospital Inpatient Quality Reporting (IQR) Program (formerly referred to as the Reporting Hospital Quality Data for Annual Payment Update (RHQDAPU) Program);
  • Hospital outpatient services under the Hospital Outpatient Quality Reporting (OQR) Program (formerly referred to as the Hospital Outpatient Quality Data Reporting Program (HOP QDRP));
  • Care furnished by physicians and other eligible professionals under the Physician Quality Reporting System (PQRS, formerly referred to as the Physician Quality Reporting Program Initiative (PQRI));
  • Inpatient rehabilitation facilities under the Inpatient Rehabilitation Facility Quality Reporting Program (IRF QRP);
  • Long-term care hospitals under the Long-Term Care Hospital Quality Reporting (LTCHQR) Program;
  • PPS-exempt cancer hospitals under the PPS-Exempt Cancer Hospital Quality Reporting (PCHQR) Program;
  • Ambulatory surgical centers under the Ambulatory Surgical Center Quality Reporting (ASCQR) Program;
  • Inpatient psychiatric facilities under the Inpatient Psychiatric Facilities Quality Reporting (IPFQR) Program;
  • Home health agencies under the home health quality reporting program (HH QRP); and
  • Hospice facilities under the Hospice Quality Reporting Program.

We have also implemented the End-Stage Renal Disease Quality Incentive Program and Hospital Value-Based Purchasing Program (described further below) that link payment to performance.

In implementing the Hospital IQR Program and other quality reporting programs, we have focused on measures that have high impact and support CMS and HHS priorities for improved quality and efficiency of care for Medicare beneficiaries. Our goal for the future is to align the clinical quality measure requirements of the Hospital IQR Program with various other Medicare and Medicaid programs, including those authorized by the Health Information Technology for Economic and Clinical Health (HITECH) Act, so that the reporting burden on providers will be reduced. As appropriate, we will consider the adoption of clinical quality measures with electronic specifications so that the electronic collection of performance information is part of care delivery. Establishing such a system will require interoperability between EHRs and CMS data collection systems, additional infrastructural development on the part of hospitals and CMS, and adoption of standards for capturing, formatting, and transmitting the data elements that make up the measures. However, once these activities are accomplished, adoption of measures that rely on data obtained directly from EHRs will enable us to expand the Hospital IQR Program measure set with less cost and reporting burden to hospitals. We believe that in the near future, collection and reporting of data elements through EHRs will greatly simplify and streamline reporting for various CMS quality reporting programs, and that hospitals will be able to switch primarily to EHR-based data reporting for many measures that are currently manually chart-abstracted and submitted to CMS for the Hospital IQR Program.

We also have implemented a Hospital Value-Based Purchasing (VBP) Program under section 1886(o) of the Act. In 2011, we issued the Hospital Inpatient VBP Program final rule (76 FR 26490 through 26547). We most recently adopted additional policies for the Hospital VBP Program in section XIV. of the CY 2014 OPPS/ASC final rule with comment period (78 FR 75120 through 75121). We are finalizing additional policies for this program in section IV.I. of the preamble of this final rule. Under the Hospital VBP Program, hospitals will receive value-based incentive payments based on their quality performance with respect to performance standards for a performance period for the fiscal year involved. The measures under the Hospital VBP Program must be selected from the measures (other than readmission measures) specified under the Hospital IQR Program as required by section 1886(o)(2)(A) of the Act.

In selecting measures for the Hospital IQR Program, we are mindful of the conceptual framework we have described for the Hospital VBP Program. The Hospital IQR Program is linked with the Hospital VBP Program because many of the measures and the reporting infrastructure for the programs overlap. We view the Hospital VBP Program as the next step in promoting higher quality care for Medicare beneficiaries by transforming Medicare from a passive payer of claims into an active purchaser of quality healthcare for its beneficiaries. Value-based purchasing is an important step to revamping how care and services are paid for, moving increasingly toward rewarding better value, outcomes, and innovations instead of merely volume.

We also view the Hospital-Acquired Condition (HAC) payment adjustment program authorized by section 1886(p) of the Act, as added by section 3008 of the Affordable Care Act, and the Hospital VBP Program, as related but separate efforts to reduce HACs. The Hospital VBP Program is an incentive program that awards payments to hospitals based on quality performance on a wide variety of measures, while the HAC Reduction Program creates a payment adjustment resulting in payment reductions for the lowest performing hospitals based on their rates of HACs. Newly finalized policies for the Hospital VBP Program are included in section IV.I. of the preamble of this final rule. Newly finalized policies for the HAC Reduction Program are included in section IV.J. of the preamble of this final rule.

Although we intend to monitor the various interactions of programs authorized by the Affordable Care Act and their overall impact on providers and suppliers, we also view programs that could potentially affect a hospital's Medicaid payment as separate from programs that could potentially affect a hospital's Medicare payment.

In the preamble of this final rule, we are finalizing changes to the following Medicare quality reporting systems:

  • In section IX.A., the Hospital IQR Program.
  • In section IX.B., the PCHQR Program.
  • In section IX.C., the LTCHQR Program.

In addition, in section IX.D. of the preamble of this final rule, we are finalizing changes to the Medicare EHR Incentive Program.

A. Hospital Inpatient Quality Reporting (IQR) Program

1. Background

a. History of the Hospital IQR Program

We refer readers to the FY 2010 IPPS/RY 2010 LTCH PPS final rule (74 FR 43860 through 43861) and the FY 2011 IPPS/LTCH PPS final rule (75 FR 50180 through 50181) for detailed discussions of the history of the Hospital IQR Program, including the statutory history, and to the FY 2014 IPPS/LTCH PPS final rule (78 FR 50789 through 50807) for the measures we have adopted for the Hospital IQR measure set through the FY 2016 payment determination and subsequent years.

b. Maintenance of Technical Specifications for Quality Measures

The technical specifications for the Hospital IQR Program measures, or links to Web sites hosting technical specifications, are contained in the CMS/The Joint Commission (TJC) Specifications Manual for National Hospital Quality Measures (Specifications Manual). This Specifications Manual is posted on the QualityNet Web site at http://www.qualitynet.org/. We generally update the Specifications Manual on a semiannual basis and include in the updates detailed instructions and calculation algorithms for hospitals to use when collecting and submitting data on required measures. These semiannual updates are accompanied by notifications to users, providing sufficient time between the change and the effective date in order to allow users to incorporate changes and updates to the specifications into data collection systems.

The technical specifications for the HCAHPS patient experience of care survey are contained in the current HCAHPS Quality Assurance Guidelines manual, which is available at the HCAHPS On-Line Web site, http://www.hcahpsonline.org. We maintain the HCAHPS technical specifications by updating the HCAHPS Quality Assurance Guidelines manual annually, and include detailed instructions on survey implementation, data collection, data submission and other relevant topics. As necessary, HCAHPS Bulletins are issued to provide notice of changes and updates to technical specifications in HCAHPS data collection systems.

Many of the quality measures used in different Medicare and Medicaid reporting programs are endorsed by the National Quality Forum (NQF). As part of its regular maintenance process for endorsed performance measures, the NQF requires measure stewards to submit annual measure maintenance updates and undergo maintenance of endorsement review every 3 years. In the measure maintenance process, the measure steward (owner/developer) is responsible for updating and maintaining the currency and relevance of the measure and will confirm existing or minor specification changes with NQF on an annual basis. NQF solicits information from measure stewards for annual reviews, and it reviews measures for continued endorsement in a specific 3-year cycle.

The NQF regularly maintains its endorsed measures through annual and triennial reviews, which may result in the NQF making updates to the measures. We believe that it is important to have in place a subregulatory process to incorporate nonsubstantive updates made by the NQF into the measure specifications we have adopted for the Hospital IQR Program so that these measures remain up-to-date. We also recognize that some changes the NQF might make to its endorsed measures are substantive in nature and might not be appropriate for adoption using a subregulatory process.

Therefore, in the FY 2013 IPPS/LTCH PPS final rule (77 FR 53504 through 53505), we finalized a policy under which we use a subregulatory process to make nonsubstantive updates to measures used for the Hospital IQR Program. With respect to what constitutes substantive versus nonsubstantive changes, we expect to make this determination on a case-by-case basis. Examples of nonsubstantive changes to measures might include updated diagnosis or procedure codes, medication updates for categories of medications, broadening of age ranges, and exclusions for a measure (such as the addition of a hospice exclusion to the 30-day mortality measures). We believe that nonsubstantive changes may include updates to NQF-endorsed measures based upon changes to guidelines upon which the measures are based.

We will continue to use rulemaking to adopt substantive updates made to measures we have adopted for the Hospital IQR Program. Examples of changes that we might consider to be substantive would be those in which the changes are so significant that the measure is no longer the same measure, or when a standard of performance assessed by a measure becomes more stringent (for example, changes in acceptable timing of medication, procedure/process, or test administration). Another example of a substantive change would be where the NQF has extended its endorsement of a previously endorsed measure to a new setting, such as extending a measure from the inpatient setting to hospice. These policies regarding what is considered substantive versus nonsubstantive would apply to all measures in the Hospital IQR Program. We also note that the NQF process incorporates an opportunity for public comment and engagement in the measure maintenance process.

We believe this policy adequately balances our need to incorporate updates to Hospital IQR Program measures in the most expeditious manner possible while preserving the public's ability to comment on updates that so fundamentally change an endorsed measure that it is no longer the same measure that we originally adopted.

c. Public Display of Quality Measures

Section 1886(b)(3)(B)(viii)(VII) of the Act, as amended by section 3001(a)(2) of the Affordable Care Act, requires that the Secretary establish procedures for making information regarding measures submitted available to the public after ensuring that a hospital has the opportunity to review its data before they are made public. In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28218 through 28219), we did not propose to change our current policy of reporting data from the Hospital IQR Program as soon as it is feasible on CMS Web sites such as the Hospital Compare Web site (http://www.medicare.gov/hospitalcompare) and/or the interactive https://data.medicare.gov Web site, after a preview period.

The Hospital Compare Web site is an interactive Web tool that assists beneficiaries by providing information on hospital quality of care to those who need to select a hospital. For more information on measures reported to Hospital Compare, please see http://www.medicare.gov/hospitalcompare. Other information not reported to Hospital Compare may be made available on other CMS Web sites such as http://www.cms.hhs.gov/HospitalQualityInits/ or https://data.medicare.gov.

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50777 through 50778), we responded to public comments on what additional quality measures and information featured on Hospital Compare may be highly relevant to patients and other consumers of health care, and how we may better display this information on the Hospital Compare Web site.

2. Removal and Suspension of Hospital IQR Program Measures

a. Considerations in Removing Quality Measures From the Hospital IQR Program

As discussed further below, we generally retain measures from the previous year's Hospital IQR Program measure set for subsequent years' measure sets except when we specifically propose to remove or replace them. As we stated in the FY 2011 IPPS/LTCH PPS final rule (75 FR 50185), the criteria that we consider when determining whether to remove Hospital IQR Program measures are the following: (1) Measure performance among hospitals is so high and unvarying that meaningful distinctions and improvements in performance can no longer be made (“topped-out” measures); (2) availability of alternative measures with a stronger relationship to patient outcomes; (3) a measure does not align with current clinical guidelines or practice; (4) the availability of a measure that is more broadly applicable (across settings, populations, or conditions) for the particular topic, or the availability of a measure that is more proximal in time to desired patient outcomes for the particular topic; (5) performance or improvement on a measure does not result in better patient outcomes; (6) the availability of a measure that is more strongly associated with desired patient outcomes for the particular topic; and (7) collection or public reporting of a measure leads to negative unintended consequences other than patient harm. We also take into account the views of the Measure Applications Partnership (MAP) when determining when a measure should be removed, and we strive to eliminate redundancy of similar measures (77 FR 53505 through 53506).

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28219), we proposed to change the criteria for determining when a measure is “topped-out.” A measure is “topped-out” when measure performance among hospitals is so high and unvarying that meaningful distinctions and improvements in performance can no longer be made (77 FR 53505 through 53506). We do not believe that measuring hospital performance on “topped-out” measures provides meaningful information on the quality of care provided by hospitals. We further believe that quality measures, once “topped-out,” represent care standards that have been widely adopted by hospitals. We believe such measures should be considered for removal from the Hospital IQR Program because their associated reporting burden may outweigh the value of the quality information they provide.

In order to determine “topped-out” status, we proposed to apply the following two criteria, the first of which was previously adopted by the Hospital VBP Program in the Hospital Inpatient VBP Program final rule (76 FR 26496 through 26497), to Hospital IQR Program measures. The second criterion is a modified version of what was previously adopted by the Hospital VBP Program in the above-mentioned final rule, with the change from the “less than” operator (<) to the “less than or equal to” operator (≤):

  • Statistically indistinguishable performance at the 75th and 90th percentiles; and
  • Truncated coefficient of variation ≤ 0.10.

The coefficient of variation (CV) is a common statistic that expresses the standard deviation as a percentage of the sample mean in a way that is independent of the units of observation. Applied to this analysis, a large CV would indicate a broad distribution of individual hospital scores, with large and presumably meaningful differences between hospitals in relative performance. A small CV would indicate that the distribution of individual hospital scores is clustered tightly around the mean value, suggesting that it is not useful to draw distinctions among individual hospitals' measure performance. By adopting “less than or equal to” in our “topped-out” test, we are clarifying the interpretation of the CV when a tie at 0.10 occurs due to rounding. We believe that the proposed criteria distinguish measures that retain significant variation in performance among hospitals from those that do not.

In the Hospital VBP Program context, we used a modified version of the CV, namely a truncated CV, for each measure, in which the 5 percent of hospitals with the lowest scores and the 5 percent of hospitals with the highest scores were first truncated (set aside) before calculating the CV. This was done to avoid undue effects of the highest and lowest outlier hospitals, which, if included, would tend to greatly widen the dispersion of the distribution and make the measure appear to be more reliable or discerning.
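
As a minimal sketch of the truncated CV calculation described above (our own illustration, not the analytic code used for the Hospital IQR or Hospital VBP Programs; it addresses only the second criterion, and the function names and example scores are hypothetical), the lowest and highest 5 percent of hospital scores are set aside before the CV is computed and compared to the 0.10 threshold.

    # Illustrative sketch of the truncated coefficient of variation (CV) test
    # described above; not the analytic code CMS uses. Only the second
    # "topped-out" criterion (truncated CV <= 0.10) is shown. Names and
    # example scores are hypothetical.
    import statistics

    def truncated_cv(hospital_scores, trim_fraction=0.05):
        """Set aside the lowest and highest 5 percent of hospital scores,
        then return the standard deviation divided by the mean."""
        scores = sorted(hospital_scores)
        k = int(len(scores) * trim_fraction)
        trimmed = scores[k:len(scores) - k] if k > 0 else scores
        return statistics.pstdev(trimmed) / statistics.mean(trimmed)

    def meets_cv_criterion(hospital_scores):
        """Second "topped-out" criterion: truncated CV less than or equal to 0.10."""
        return truncated_cv(hospital_scores) <= 0.10

    # Hypothetical example: 40 hospital scores clustered at 0.98, with two low
    # and two high outliers that are set aside by the truncation step.
    example_scores = [0.55, 0.60] + [0.98] * 36 + [1.00, 1.00]
    print(truncated_cv(example_scores))        # 0.0 after truncation
    print(meets_cv_criterion(example_scores))  # True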

Comment: A number of commenters supported the criteria for determining when a measure is “topped-out.” Some commenters specifically noted that removing “topped-out” measures will reduce hospital reporting burden.

Several commenters supported removing “topped-out” chart-abstracted measures. Some commenters specifically supported the removal of structural measures.

Response: We appreciate the commenters' support for removing “topped-out” measures. We will consider removal of topped-out structural measures in future years consistent with our measure removal and topped-out status policies.

Comment: A commenter stated that the process of care measures that are “topped-out” should be removed both in their chart-abstracted and electronic clinical quality measure versions. The commenter believed that interpreting disparate and incorrect performance rates for the measures as reported in their electronic versions is burdensome to stakeholders, and that the specifications for the chart-abstracted and electronic versions of measures would be misaligned, which may lead to issues in capturing the full range of patient care. The commenter also expressed concern about which electronic versions of these measures will be submitted to CMS. Finally, the commenter stated that process of care measures, whether submitted as chart-abstracted or electronic versions, distract from measures of outcomes and hospital-acquired conditions.

Response: We would like to clarify that we consider both the chart-abstracted and the electronically specified versions to be “topped-out.” However, we would like to retain the electronically specified versions of these “topped-out” measures for the following reasons: (1) To align the Hospital IQR Program and the Medicare EHR Incentive Program, (2) to allow us to monitor the effectiveness of measure reporting by EHRs, and (3) to familiarize hospitals with reporting electronically specified measures to us.

As we continue aligning the Hospital IQR Program and the Medicare EHR Incentive Program, we believe collecting these measures on a voluntary basis enables us to continue collecting quality data on these topics while working to minimize reporting burden on participating hospitals. We believe that the benefits outweigh the possible disadvantages to reporting the electronic clinical quality measure versions of these measures. Collecting the electronic version of these measures would prepare hospitals for data submission using our electronic measure specifications prior to electronic clinical quality measures becoming a requirement in the Hospital IQR Program. Retaining the electronic versions of these topped-out measures also creates alignment with the Medicare EHR Incentive Program.

We remind commenters that hospitals could choose whether to submit the voluntary electronic clinical quality measures. We also would allow the voluntary submission of the chart-abstracted version of the “topped-out” measures for those hospitals that prefer to submit measure data in that format. In this way, we believe that we are representing the full range of care provided to patients and responding to commenters' concerns.

We acknowledge the commenter's concern that having multiple versions of a particular electronic clinical quality measure creates confusion for hospitals in determining which one to use. To address this concern, we are modifying our proposal to finalize a policy that hospitals must submit the April 2014 version of the electronic clinical quality measures, as discussed in section IX.A.2.h.(1) of the preamble of this final rule.

Comment: Several commenters urged CMS to consider the broader context and uses of measures before removing them based on quantitative data only, noting that some measures meeting the “topped-out” criteria may still provide value to patients and hospitals.

Response: We agree that both quantitative criteria and clinically-based qualitative criteria should be used in assessing “topped-out” measures. These criteria are part of the existing criteria available to us to determine whether to remove a measure from the Hospital IQR Program. As we stated in the FY 2011 IPPS/LTCH PPS final rule (75 FR 50185), the criteria that we consider when determining whether to remove Hospital IQR Program measures are the following: (1) Measure performance among hospitals is so high and unvarying that meaningful distinctions and improvements in performance can no longer be made (“topped-out” measures); (2) availability of alternative measures with a stronger relationship to patient outcomes; (3) a measure does not align with current clinical guidelines or practice; (4) the availability of a measure that is more broadly applicable (across settings, populations, or conditions) for the particular topic, or the availability of a measure that is more proximal in time to desired patient outcomes for the particular topic; (5) performance or improvement on a measure does not result in better patient outcomes; (6) the availability of a measure that is more strongly associated with desired patient outcomes for the particular topic; and (7) collection or public reporting of a measure leads to negative unintended consequences other than patient harm.

We also take into account the views of the Measure Applications Partnership (MAP) when determining when a measure should be removed, and we strive to eliminate redundancy of similar measures (77 FR 53505 through 53506).

Comment: A few commenters wanted CMS to continue publicly reporting topped-out measures used in pay-for-performance or payment penalty programs or to maintain focus on issues on which hospitals have achieved high performance.

Response: We will allow those hospitals that would like to submit the voluntary measures in chart-abstracted format or as electronic clinical quality measures to do so.

After consideration of the public comments we received, we are finalizing our proposal to update the criteria to determine “topped-out” measure status as proposed.

b. Removal of Hospital IQR Program Measures for the FY 2017 Payment Determination and Subsequent Years

As we continue moving towards including more clinical outcomes measures as opposed to process-of-care measures in the Hospital IQR Program measure set, we have considered removing additional measures using our previously-adopted removal criteria. In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28219 through 28220), we proposed to remove five measures from the Hospital IQR Program for the FY 2017 payment determination and subsequent years, which begins with the CY 2015 reporting period: (1) AMI-1 Aspirin at arrival (NQF #0132); (2) AMI-3 ACEI/ARB for left ventricular systolic dysfunction (NQF #0137); (3) AMI-5 Beta-blocker prescribed at discharge (NQF #0160); (4) SCIP-Inf-6 Appropriate Hair Removal; and (5) Participation in a systematic database for cardiac surgery (NQF #0113).

We proposed to remove the first four measures because they were previously determined to be “topped-out” and suspended (77 FR 53509). We proposed to remove the fifth measure because the MAP recommended the measure's removal in its MAP Pre-Rulemaking Report: 2014 Recommendations on Measures for More than 20 Federal Programs, which is available at: http://www.qualityforum.org/Publications/2014/01/MAP_Pre-Rulemaking_Report_2014_Recommendations_on_Measures_for_More_than_20_Federal_Programs.aspx. The MAP report states that the measure's NQF endorsement has been placed on reserve status because the measure is “topped-out.” The purpose of reserve status is to retain endorsement of reliable and valid quality performance measures that have overall high levels of performance with little variability so that performance could be monitored in the future if necessary to ensure that performance does not decline. This status would apply only to highly credible, reliable, and valid measures that have high levels of performance due to quality improvement actions (often facilitated or motivated through public reporting and other accountability programs). More information about NQF reserve status is available at: https://www.qualityforum.org/docs/Reserve_Endorsement_Status.aspx.

By removing these measures, we would alleviate the maintenance costs and administrative burden to hospitals associated with retaining them. Should we determine that hospital adherence to these practices has unacceptably declined, we would propose to resume data collection in future rulemaking. In addition, we would comply with any requirements imposed by the Paperwork Reduction Act before re-proposing these measures.

We also analyzed the remainder of the Hospital IQR Program measure set for other potential “topped-out” measures using the previously adopted criteria. The analysis was based on the most recent two quarters of clinical process of care data available in the CMS Clinical Data Warehouse for IPPS eligible hospitals, which covers a measurement period from 01/01/2013 to 06/30/2013 (Q1 2013-Q2 2013). Based on this analysis and using the previously adopted criteria, we noted that an additional 15 chart-abstracted measures were “topped-out,” and we proposed to remove them from the measure set for the FY 2017 payment determination and subsequent years.

However, we proposed to retain the electronic clinical quality measure version of 10 of these chart-abstracted measures for Hospital IQR Program reporting as discussed further in section IX.A.7.f. of the preamble of this final rule. As we continue aligning the Hospital IQR Program and the Medicare EHR Incentive Program, we believe collecting these measures on a voluntary basis enables us to continue collecting quality data on these topics while working to minimize reporting burden on participating hospitals.

Further, allowing hospitals the option to electronically report topped-out measures will provide hospitals with an opportunity to test the accuracy of their electronic health record reporting systems. We believe that retaining “topped-out” measures under certain circumstances enables us to continue monitoring the clinical topic covered by the measure to ensure that hospitals continue to maintain high levels of performance. Further, we believe the additional reporting burden associated with retaining these measures is mitigated by retaining electronic versions of those measures, which are more easily reported by hospitals. These 10 measures are denoted in the chart below by an asterisk.

“Topped-Out” Chart-Abstracted Measures Proposed for Removal for the FY 2017 Payment Determination

AMI-1: Aspirin at Arrival (previously suspended) (NQF #0132)
AMI-3: ACEI or ARB for left ventricular systolic dysfunction—Acute Myocardial Infarction (AMI) Patients (previously suspended) (NQF #0137)
AMI-5: Beta-Blocker Prescribed at Discharge for AMI (previously suspended) (NQF #0160)
AMI-8a: Primary PCI received within 90 minutes of hospital arrival* (NQF #0163)
HF-2: Evaluation of left ventricular systolic function (NQF #0135)
PN-6: Initial antibiotic selection for community-acquired pneumonia (CAP) in immunocompetent patients* (NQF #0147)
SCIP-Card-2: Surgery patients on beta blocker therapy prior to arrival who received a beta blocker during the perioperative period (NQF #0284)
SCIP-Inf-1: Prophylactic antibiotic received within one hour prior to surgical incision* (NQF #0527)
SCIP-Inf-2: Prophylactic antibiotic selection for surgical patients* (NQF #0528)
SCIP-Inf-3: Prophylactic antibiotics discontinued within 24 hours after surgery end time (48 hours for cardiac surgery) (NQF #0529)
SCIP-Inf-4: Cardiac surgery patients with controlled postoperative blood glucose (NQF #0300)
SCIP-Inf-6: Surgery patients with appropriate hair removal (previously suspended) (NQF #0301)
SCIP-Inf-9: Urinary catheter removed on Postoperative Day 1 (POD1) or Postoperative Day 2 (POD2) with day of surgery being day zero* (NQF #0453)
SCIP-VTE-2: Surgery Patients Who Received Appropriate Venous Thromboembolism (VTE) Prophylaxis Within 24 Hours Prior to Surgery to 24 Hours After Surgery (NQF #0218)
STK-10: Assessed for rehabilitation* (NQF #0441)
STK-2: Discharged on antithrombotic therapy* (NQF #0435)
STK-3: Anticoagulation therapy for atrial fibrillation/flutter* (NQF #0436)
STK-5: Antithrombotic therapy by the end of hospital day two* (NQF #0438)
VTE-4: Patients receiving unfractionated heparin with doses/labs monitored by protocol*
Participation in a systematic database for cardiac surgery (NQF #0113)
* To be retained as an electronic clinical quality measure.

We welcomed public comments on our proposal to remove these measures.

Comment: Many commenters supported the removal of “topped-out” measures, some saying that by doing so CMS is reducing hospital burden.

Response: We thank the commenters for their support.

Comment: A commenter opposed the removal of the AMI-1 measure. The commenter noted that aspirin administration after a myocardial infarction is a potentially life-saving intervention and should continue to be tracked.

Response: We thank the commenter for the recommendation. We are removing AMI-1 because the measure is “topped-out” and was previously suspended in the FY 2013 IPPS/LTCH PPS final rule (77 FR 53509). We believe that the practice of providing aspirin to patients on arrival to the hospital addressed by this measure continues to be routinely followed. Because the practice measured by the AMI-1 measure is standard procedure among most hospitals, we do not believe that retaining it as a chart-abstracted measure would be of value to hospitals or for monitoring quality performance.

Comment: A commenter opposed the removal of AMI-8a: Primary PCI Received within 90 Minutes of Hospital Arrival because it is “topped-out.” The commenter did not believe that it is appropriate to retire a measure without first finding a replacement measure. The commenter was concerned that the retirement of numerous AMI and heart failure measures may unintentionally shift hospital resources to other measures and adversely affect the quality of care received by these patients.

Response: We respectfully disagree with the commenter that we should not remove a measure until a replacement is found. We believe that we should retire measures once we determine that there is no further value to hospitals or patients because the process of care the measure is monitoring has become standard practice. We believe that removing “topped-out” measures is appropriate and necessary to improve patient care. As we stated in the proposed rule, we believe that quality measures, once “topped-out,” represent care standards that have been widely adopted by hospitals (79 FR 28219). Therefore, it makes sense to remove the “topped-out” measures and adopt other measures which may represent care standards that are not widely adopted by hospitals, but which we believe should be widely adopted.

We invite the commenter to recommend measures for the Hospital IQR Program through the Measures Under Consideration process for our consideration. Information on how to recommend measures for the Hospital IQR Program is available at http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/MMS/CallForMeasures.html.

Comment: One commenter opposed the removal of HF-2: Evaluation of Left Ventricular Systolic Function because it is “topped-out.” With the removal of this measure, the commenter noted that the only heart failure measures left in the program will be the 30-day readmission and 30-day mortality measures. The commenter is concerned that removing this measure will signal to hospitals that heart failure is not a CMS priority.

Response: We respectfully disagree with the commenter that the removal of “topped-out” measures will result in hospitals no longer focusing on the practice the measure is monitoring. Hospitals are committed to providing good quality care to patients and we do not have any indication that they will stop doing so in these areas for which the quality of care measured has become standard practice.

Comment: A commenter suggested that CMS continue to collect chart-abstracted data on SCIP-Inf-3 for another year because it is inappropriate to assume that the measure will be “topped-out” given that the measure had significant data definition changes effective January 1, 2014. SCIP-Inf-3 no longer excludes patients on home antibiotics or patients who do not receive general anesthesia.

Response: We acknowledge that SCIP-Inf-3 no longer excludes patients on home antibiotics; however, our analysis showed that these patients were being excluded by documentation of infection. For this reason, this change was not considered substantive enough to preclude removal of the measure. With regard to the concern about the exclusion for patients who do not receive general anesthesia, the SCIP-Inf-3 measure has never had an exclusion for anesthesia type, so this would have no impact on the measure results and would not change our topped-out status analysis. We continue to believe SCIP-Inf-3 is “topped-out” and should be removed from the Hospital IQR Program.

Comment: Several commenters questioned the removal of SCIP-Inf-4, stating that CMS cannot assess whether the measure is topped-out. These commenters stated that CMS revised the specifications for the SCIP-Inf-4: Cardiac Surgery Patients with Controlled Postoperative Blood Glucose measure to incorporate the recent NQF endorsement maintenance decisions, beginning with January 1, 2014, discharges. These commenters stated that the NQF changed the measure from controlled glucose at 6 a.m. to a more comprehensive measure of controlled glucose 18-24 hours post-cardiac surgery, and required that corrective action be documented if postoperative glucose is over 180 mg/dL. These commenters expressed concern that these substantial changes would change the performance scores.

Response: We acknowledge that there were refinements made to SCIP-Inf-4 that were finalized in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50787 through 50788). The “topped-out” analysis cited in the proposed rule (79 FR 28220) was completed using SCIP-Inf-4 data before these refinements were implemented. Because we do not yet have sufficient data to accurately assess whether this refined measure meets “topped-out” criteria, we are modifying our proposal and will not remove this measure. Instead, we will continue to require reporting on SCIP-Inf-4 in the Hospital IQR Program as previously finalized.

Comment: Several commenters supported the removal of STK-2, STK-3, STK-5, and STK-10.

Response: We thank the commenters for their support. We believe that these four measures are “topped-out” and will be removed from the Hospital IQR Program in their chart-abstracted measure version. Please note, however, that we will continue to accept STK-2, STK-3, STK-5, and STK-10 data as electronic clinical quality measures.

Comment: A commenter opposed the removal of STK-2, STK-3, STK-5, and STK-10 measures because they are “topped-out.” The commenter believed that CMS should allow hospitals to choose whether they wish to report these measures via EHR or via claims registry. The commenter stated that providing hospitals with alternate mechanisms for reporting is important at this juncture, and can allow for the measure developer to identify any issues with the electronic specifications of the measures.

Response: We note that the commenter seeks alternative reporting mechanisms for measures. However, submission via a claims registry, which would be such an alternative reporting mechanism, is not a feasible option at this time as these measures do not have claims-based specifications nor do we have a claims registry for the Hospital IQR Program. Hospitals may report on these measures using the electronic clinical quality measure specifications and submit using QRDA Category I. We believe that these four measures are “topped-out” and should be removed as a requirement from the Hospital IQR Program in their chart-abstracted measure versions.

Comment: A commenter supported the proposal to remove the six “topped-out” measures noted for permanent removal and the four “topped-out,” previously suspended measures proposed for permanent removal. This same commenter did not support the retention of the electronic versions of 10 measures to support the voluntary electronic reporting option, citing the cost of implementing electronic tools and the loss of the convenience of chart-abstracted measures that help the commenter track their performance on these medical conditions. The commenter was also concerned that, without clearly established goals and expectations for core measures by CMS and TJC, there will be discrepancies in performance.

Response: We thank the commenter for this feedback. We appreciate how the commenter is making full use of the “topped-out” measures and applaud their striving toward constant quality improvement. We note, however, that, through alignment with the Medicare EHR Incentive Program, we are encouraging all facilities to move to electronic measures. We also believe that aligning electronic measures across facilities will minimize confusion between quality reporting programs. Regarding the concern that, without clearly established goals and expectations for core measures by CMS and TJC, there will be discrepancies in performance, we appreciate this concern and will take it into consideration during our daily operations.

Comment: A commenter asked CMS to clarify how the SCIP measures can be topped-out for the Hospital IQR Program but required for PPS-exempt Cancer Hospitals (PCHs). The commenter asked whether the measure specifications will be provided in a manual other than the Inpatient Specifications Manual if they are removed from the Hospital IQR Program. The commenter also asked whether the measures will still be programmed into the CMS Abstraction and Reporting Tool (CART).

Response: Although the SCIP measures are “topped-out” under the Hospital IQR Program, for the reasons discussed in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50840 through 50841), we believe that the SCIP measures are appropriate for the PCH setting. At this time, we do not have sufficient data to determine whether these SCIP measures are “topped-out” in the PCH setting, particularly because hospital inpatient facilities and PCHs treat different patient populations and there is no evidence that the SCIP measures are “topped-out” in that setting.

We will assess “topped-out” status of the SCIP measures as part of our PCHQR measure analysis in our annual measures consideration. We believe that this analysis must focus on evidence specific to the PCH setting. We recognize that the PCHQR patient population is exclusively comprised of cancer patients, unlike “subsection (d)” hospitals included in the Hospital IQR Program. We will, however, continue to monitor and evaluate the PCHQR SCIP measures. In addition, we will consider adopting the “topped-out” criteria and measure removal policies for the PCHQR Program similar to those adopted by other quality reporting programs, including the Hospital IQR Program, in future years. We will also support PCHQR program reporting of patient level data to QualityNet by updating the CART tool to reflect the current SCIP measure specifications.

We intend to post SCIP and other PCHQR measures in the PCHQR Specifications Manual. As a result, the existing information technology infrastructure will be available for the PCHQR Program.

Comment: A commenter supported the transition of SCIP-Inf-1, SCIP-Inf-2, and SCIP-Inf-9 to voluntary electronic clinical quality measures.

Response: We thank the commenter for their support.

Comment: Some commenters opposed the proposal to retain the electronic versions of 10 of the “topped-out” chart-abstracted measures to support the voluntary electronic measure reporting option. A commenter stated that the proposed modification in the voluntary electronic reporting program holds the form of the data collected for quality measurement to a higher scientific significance than the data collected as a metric to assess the delivery of care. The commenter stated that this proposal would neither lead to improved hospital quality nor offer us insight on how to improve electronic clinical quality measures. The commenter recommended that CMS work with the Office of the National Coordinator (ONC) and the Agency for Healthcare Research and Quality (AHRQ) to study the feasibility, reliability and validity of electronic clinical quality measures to effectively calculate and report clinical quality measures that are at least as accurate as chart-abstracted measures.

Response: As discussed above, we believe that retaining electronic versions of chart-abstracted measures in certain circumstances enables us to continue monitoring the covered clinical topic while reducing hospitals' reporting burden, and we view both of those actions as desirable. We note further that we are encouraging hospitals to familiarize themselves with the electronic measure submission process by retaining electronic versions of certain measures, and we will also be able to assess differences in clinical quality measure data between the two data capture methods. We believe that understanding any discrepancies between the two data capture methods will help us as we transition to electronic reporting of clinical quality measures. This also will lead to hospitals improving how they report clinical quality data electronically, which can be used to improve patient care.

We respectfully disagree that the proposed measures lack scientific significance. Each measure, as it is fully described, provides evidence of its significance.

We thank the commenter for their suggestion to work with the ONC and AHRQ to study the feasibility, reliability and validity of electronic clinical quality measures to effectively calculate and report clinical quality measures that are at least as accurate as chart-abstracted measures. We will take this suggestion under consideration.

Comment: Some commenters asked CMS to delay adopting “topped-out” measures as voluntary electronic clinical quality measures for one year to allow hospitals time to prepare to collect the measure electronically.

Another commenter suggested that including these measures sends the wrong message about the goals of the Hospital IQR Program and the Stage 3 Meaningful Use Program and inappropriately diverts resources from areas that would more readily benefit from targeted attention. Instead, the commenter recommended that we address further alignment through the advancement of electronic quality measures required for the Medicare EHR Incentive Program. If CMS decides to move forward with this policy, the commenter urged CMS to publicly report the measures somewhere other than Hospital Compare to leave that space for measures that are more meaningful to consumers and purchasers.

Response: We respectfully disagree with the commenters. We do not agree that delaying by one year the adoption of “topped-out” measures as voluntary electronic clinical quality measures would be useful because reporting is voluntary. Any hospital can choose not to report these “topped-out” measures as electronic clinical quality measures. By retaining “topped-out” chart-abstracted measures as voluntary electronic clinical quality measures, we are encouraging hospitals to familiarize themselves with the electronic measure submission process, and we can assess differences in clinical quality measure data between the two data capture methods. Allowing voluntary submission of the “topped-out” measures will help us monitor for declines in performance.

We also disagree with the commenter that the removal of “topped-out” measures will result in hospitals no longer focusing on the practice the measure is monitoring. We believe that hospitals are committed to providing good quality care to patients and we do not have any indication that they will stop doing so in these areas for which the quality of care measured has become standard practice.

We thank the commenter for their suggestion to publicly report the measures somewhere other than Hospital Compare. We will take this suggestion under consideration. We welcome any suggestions commenters have on further aligning the Hospital IQR Program with the EHR Incentive Program.

Comment: A few commenters advised that although CMS may no longer require hospitals to submit data on topped-out measures, hospitals will be required to submit data on measures required by TJC for accreditation. The commenters stated that this lack of alignment creates a burden for hospitals and does not allow hospitals to plan for the future. A commenter encouraged us to work with TJC when proposing measures to remove from the Hospital IQR Program because many of these measures remain core measure reporting requirements for TJC.

Response: We wish to reduce burden on hospitals for reporting “topped-out” measures to us, and believe that our proposal accomplishes that intent and focuses measurement on quality areas that can be improved. We invite the commenter to relay their concerns to TJC as to why TJC requires hospitals to report “topped-out” measures.

Comment: A commenter asked that CMS move cautiously with respect to removing measures and adopting more clinical outcome measures, noting that this should be done with ample opportunity for public comment to ensure these measures are tested and validated prior to adoption. The commenter noted that vetting is important, as hospitals need sufficient lead time to implement measures, especially those with information technology requirements.

Response: We thank the commenter for the suggestion and will provide the public the necessary time period to comment. We have six criteria for determining whether to remove a measure from the Hospital IQR Program, including a measure's “topped-out” status as described above in section IX.A.2.a. of the preamble of this final rule.

We would like to clarify that the public has many opportunities to comment on potential measures through the measure adoption process, which includes the public posting of the MUC (Measures Under Consideration) list, the NQF measure endorsement process, and comments on the annual rulemaking process for the Hospital IQR Program.

Comment: A commenter requested clarification regarding why CMS is proposing to remove all of the suspended/voluntary measures except IMM-1 and if IMM-1 will continue to be suspended for FY 2017.

Response: We proposed to remove the suspended/voluntary measures because of their “topped-out” status. IMM-1 was not proposed for removal because this measure will be reported in another program and we are responding to the need for more harmonized and global clinical quality measures. This measure was finalized for reporting in the PQRS in the CY 2013 Medicare Physician Fee Schedule final rule with comment period (see Table 95 at 77 FR 69215). As we stated above in section IX.A.2.a., “topped-out” status is only one of the six considerations we use in determining whether to remove a clinical quality measure from the Hospital IQR Program.

Comment: One commenter expressed concern that CMS may use a subregulatory process to make “nonsubstantive” updates to measures and that CMS may consider changes to age groups to be “nonsubstantive.” The commenter recommended that any review of changes to include individuals under the age of 18 in measures that were initially developed for adult populations include a process for review by a panel of pediatric experts, opportunity for broad stakeholder comment and appropriate testing of the revised measure.

Response: We thank the commenter for the suggestion. We will consider the suggestion to include a pediatric expert review process when considering the inclusion of the under-18 population in measures that currently include only the adult population.

After consideration of the public comments we received, we are finalizing our policy as proposed with one modification. We are finalizing removal of 19 measures for the FY 2017 payment determination and subsequent years as noted in the chart above with the exception of the SCIP-Inf-4 measure, which we are retaining in the Hospital IQR Program measure set in its chart-abstracted form as previously finalized.

We are also finalizing our proposal to retain reporting for 10 of these “topped-out” measures as electronic clinical quality measures, as noted in the chart above. We believe this approach provides CMS an opportunity to monitor topped-out measures for performance decline. This policy simplifies alignment between the Hospital IQR and Medicare EHR Incentive Programs for eligible hospitals and provides a more straightforward approach to educating stakeholders on electronic reporting options.

3. Process for Retaining Previously Adopted Hospital IQR Program Measures for Subsequent Payment Determinations

We refer readers to the FY 2013 IPPS/LTCH PPS final rule (77 FR 53512 through 53513), for our finalized measure retention policy. When we adopt measures for the Hospital IQR Program beginning with a particular payment determination, these measures are automatically adopted for all subsequent payment determinations unless we propose to remove, suspend, or replace the measures.

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28220) we did not propose any changes to our policy for retaining previously adopted measures for subsequent payment determinations.

4. Additional Considerations in Expanding and Updating Quality Measures Under the Hospital IQR Program

We refer readers to the FY 2013 IPPS/LTCH PPS final rule (77 FR 53510 through 53512) for a discussion of the considerations we use to expand and update quality measures under the Hospital IQR Program. In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28220) we did not propose any changes to the considerations in expanding or updating quality measures.

5. Previously Adopted Hospital IQR Program Measures for the FY 2016 Payment Determination and Subsequent Years

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28220 through 28221), for currently adopted and future condition-specific, claims-based measures, beginning with the FY 2017 payment determination and subsequent years, we proposed to use 3 years of data to calculate measures unless otherwise specified. In other words, this reporting period would apply to all future calculations of condition-specific measures already adopted in the Hospital IQR Program and any condition-specific measures that may be subsequently adopted in future years. The currently adopted, applicable measures are:

  • Hospital 30-day, all-cause, risk-standardized mortality rate (RSMR) following acute myocardial infarction (AMI) hospitalization for patients 18 and older (NQF #0230).
  • Hospital 30-day, all-cause, risk-standardized mortality rate (RSMR) following heart failure (HF) hospitalization for patients 18 and older (NQF #0229).
  • Hospital 30-day, all-cause, risk-standardized mortality rate (RSMR) following pneumonia hospitalization (NQF #0468).
  • Stroke 30-day mortality rate.
  • Hospital 30-Day, All-Cause, Risk-Standardized Mortality Rate (RSMR) following Chronic Obstructive Pulmonary Disease (COPD) Hospitalization (NQF #1893).
  • 30-day, all-cause, risk-standardized readmission rate (RSRR) following Acute Myocardial Infarction (AMI) hospitalization (NQF #0505).
  • 30-day, all-cause, risk-standardized readmission rate (RSRR) following Heart Failure (HF) hospitalization (NQF #0330).
  • 30-day, all-cause, risk-standardized readmission rate (RSRR) following Pneumonia (PN) hospitalization (NQF #0506).
  • 30-day risk-standardized readmission rate (RSRR) following Total Hip/Total Knee Arthroplasty (NQF #1551).
  • 30-day risk-standardized readmission rate (RSRR) following Stroke hospitalization.
  • 30-day risk-standardized readmission rate (RSRR) following COPD hospitalization (NQF #1891).
  • Hip/Knee Complication: Hospital-level Risk-Standardized Complication Rate (RSCR) following Elective Primary Total Hip Arthroplasty (NQF #1550).

We welcomed public comments on our proposal to use 3 years of data to calculate current and future condition-specific, claims-based measures.

Comment: Several commenters supported CMS' proposal to use 3 years of claims-based data for all currently adopted and future condition-specific, claims-based measures, for the FY 2017 payment determination and subsequent years.

Response: We thank the commenters for their support.

After consideration of the public comments we received, we are finalizing our proposal to use 3 years of data to calculate current and future condition-specific, claims-based measures as proposed.

The following table shows measures previously adopted for the Hospital IQR Program, including suspended measures.

Hospital IQR Program Measures Previously Adopted for the FY 2016 Payment Determination and Subsequent Years

Short name | Measure name | NQF No. | FY 2016 payment determination
AMI-1 | Aspirin at Arrival | N/A | Data collection suspended.
AMI-3 | ACEI or ARB for LVSD | NQF #0137 | Data collection suspended.
AMI-5 | Beta-Blocker Prescribed at Discharge | NQF #0160 | Data collection suspended.
AMI-7a | Fibrinolytic Therapy Received Within 30 Minutes of Hospital Arrival | NQF #0164 | Required.
AMI-8a | Primary PCI Received Within 90 Minutes of Hospital Arrival | NQF #0163 | Required.
HF-2 | Evaluation of LVS Function | NQF #0135 | Required.
PN-6 | Initial Antibiotic Selection for community-acquired pneumonia (CAP) in Immunocompetent Patients | NQF #0147 | Required.
SCIP-Inf-1 | Prophylactic Antibiotic Received Within One Hour Prior to Surgical Incision | NQF #0527 | Required.
SCIP-Inf-2 | Prophylactic Antibiotic Selection for Surgical Patients | NQF #0528 | Required.
SCIP-Inf-3 | Prophylactic Antibiotics Discontinued Within 24 Hours After Surgery End Time—Overall Rate | NQF #0529 | Required.
SCIP-Inf-4 | Cardiac Surgery Patients with Controlled Postoperative Blood Glucose | NQF #0300 | Refined measure specifications.
SCIP-Inf-6 | Surgery Patients with Appropriate Hair Removal | NQF #0301 | Data collection suspended.
SCIP-Inf-9 | Urinary catheter removed on Postoperative Day 1 (POD 1) or Postoperative Day 2 (POD 2) with day of surgery being day zero | NQF #0453 | Required.
SCIP-Card-2 | Surgery Patients on Beta-Blocker Therapy Prior to Arrival Who Received a Beta-Blocker During the Perioperative Period | NQF #0284 | Required.
SCIP-VTE-2 | Surgery Patients Who Received Appropriate Venous Thromboembolism Prophylaxis Within 24 Hours Prior to Surgery to 24 Hours After Surgery | NQF #0218 | Required.
CLABSI | National Healthcare Safety Network (NHSN) Central line-associated Bloodstream Infection (CLABSI) Outcome Measure | NQF #0139 | Required.
SSI | American College of Surgeons—Centers for Disease Control and Prevention (ACS-CDC) Harmonized Procedure Specific Surgical Site Infection (SSI) Outcome Measure: Colon procedures; Hysterectomy procedures | NQF #0753 | Required.
CAUTI | National Healthcare Safety Network (NHSN) Catheter-associated Urinary Tract Infection (CAUTI) Outcome Measure | NQF #0138 | Required.
MRSA | National Healthcare Safety Network (NHSN) Facility-wide Inpatient Hospital-onset Methicillin-resistant Staphylococcus aureus (MRSA) Bacteremia Outcome Measure | NQF #1716 | Required.
CDI | National Healthcare Safety Network (NHSN) Facility-wide Inpatient Hospital-onset Clostridium difficile Infection (CDI) Outcome Measure | NQF #1717 | Required.
HCP | Influenza vaccination coverage among healthcare personnel (HCP) | NQF #0431 | Required.
ED-1 | Median time from ED arrival to ED departure for admitted ED patients | NQF #0495 | Required submission, but voluntary electronic clinical quality measure.
ED-2 | Admit Decision Time to ED Departure Time for Admitted Patients | NQF #0497 | Required submission, but voluntary electronic clinical quality measure.
Imm-1 | Pneumococcal Immunization | NQF #1653 | Data collection suspended.
Imm-2 | Influenza Immunization | NQF #1659 | Required.
Stroke-1 | Venous thromboembolism (VTE) prophylaxis | NQF #0434 | Required.
Stroke-2 | Discharged on antithrombotic therapy | NQF #0435 | Required submission, but voluntary electronic clinical quality measure.
Stroke-3 | Anticoagulation therapy for atrial fibrillation/flutter | NQF #0436 | Required submission, but voluntary electronic clinical quality measure.
Stroke-4 | Thrombolytic therapy | NQF #0437 | Required submission, but voluntary electronic clinical quality measure.
Stroke-5 | Antithrombotic therapy by the end of hospital day two | NQF #0438 | Required submission, but voluntary electronic clinical quality measure.
Stroke-6 | Discharged on statin medication | NQF #0439 | Required submission, but voluntary electronic clinical quality measure.
Stroke-8 | Stroke education | N/A | Required submission, but voluntary electronic clinical quality measure.
Stroke-10 | Assessed for rehabilitation | NQF #0441 | Required submission, but voluntary electronic clinical quality measure.
VTE-1 | Venous thromboembolism prophylaxis | NQF #0371 | Required submission, but voluntary electronic clinical quality measure.
VTE-2 | Intensive care unit venous thromboembolism prophylaxis | NQF #0372 | Required submission, but voluntary electronic clinical quality measure.
VTE-3 | Venous thromboembolism patients with anticoagulation overlap therapy | NQF #0373 | Required submission, but voluntary electronic clinical quality measure.
VTE-4 | Patients receiving unfractionated heparin with doses/labs monitored by protocol | N/A | Required submission, but voluntary electronic clinical quality measure.
VTE-5 | VTE discharge instructions | N/A | Required submission, but voluntary electronic clinical quality measure.
VTE-6 | Incidence of potentially preventable VTE | N/A | Required submission, but voluntary electronic clinical quality measure.
PC-01 | Elective delivery (Collected in aggregate, submitted via Web-based tool or electronic clinical quality measure) | NQF #0469 | Required submission, but voluntary electronic clinical quality measure.
MORT-30-AMI | Hospital 30-day, all-cause, risk-standardized mortality rate (RSMR) following acute myocardial infarction (AMI) hospitalization for patients 18 and older | NQF #0230 | Required.
MORT-30-HF | Hospital 30-day, all-cause, risk-standardized mortality rate (RSMR) following heart failure (HF) hospitalization for patients 18 and older | NQF #0229 | Required.
MORT-30-PN | Hospital 30-day, all-cause, risk-standardized mortality rate (RSMR) following pneumonia hospitalization | NQF #0468 | Required.
COPD Mortality | Hospital 30-Day, All-Cause, Risk-Standardized Mortality Rate (RSMR) following Chronic Obstructive Pulmonary Disease (COPD) Hospitalization | NQF #1893 | Required.
STK Mortality | Stroke 30-day mortality rate | N/A | Required.
READM-30-AMI | Hospital 30-day all-cause risk-standardized readmission rate (RSRR) following acute myocardial infarction (AMI) hospitalization | NQF #0505 | Required.
READM-30-HF | Hospital 30-day, all-cause, risk-standardized readmission rate (RSRR) following heart failure hospitalization | NQF #0330 | Required.
READM-30-PN | Hospital 30-day, all-cause, risk-standardized readmission rate (RSRR) following pneumonia hospitalization | NQF #0506 | Required.
READM-30-TH/TKA | Hospital-level 30-day, all-cause risk-standardized readmission rate (RSRR) following elective primary total hip arthroplasty (THA) and/or total knee arthroplasty (TKA) | NQF #1551 | Required.
READM-30-HWR | Hospital-Wide All-Cause Unplanned Readmission (HWR) | NQF #1789 | Required.
COPD READMIT | Hospital 30-Day, All-Cause, Risk-Standardized Readmission Rate (RSRR) following Chronic Obstructive Pulmonary Disease (COPD) Hospitalization | NQF #1891 | Required.
STK READMIT | 30-day risk-standardized readmission rate (RSRR) following Stroke hospitalization | N/A | Required.
MSPB | Payment-Standardized Medicare Spending Per Beneficiary (MSPB) | NQF #2158 | Required.
AMI payment | AMI Payment per Episode of Care | N/A | Required.
Hip/knee complications | Hospital-level risk-standardized complication rate (RSCR) following elective primary total hip arthroplasty (THA) and/or total knee arthroplasty (TKA) | NQF #1550 | Required.
PSI 4 (PSI/NSI) | Death among surgical inpatients with serious, treatable complications | NQF #0351 | Required.
PSI 90 | Patient safety for selected indicators (composite) | NQF #0531 | Required.
Database for Cardiac Surgery | Participation in a systematic database for cardiac surgery | NQF #0113 | Required.
Registry for Nursing Sensitive Care | Participation in a Systematic Clinical Database Registry for Nursing Sensitive Care | N/A | Required.
Registry for General Surgery | Participation in a Systematic Clinical Database Registry for General Surgery | N/A | Required.
Safe Surgery Checklist | Safe Surgery Checklist Use | N/A | Required.
HCAHPS | HCAHPS + CTM-3 | NQF #0166; NQF #0228 | Required.

6. Refinements and Clarification to Existing Measures in the Hospital IQR Program

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28223 through 28226), we proposed to incorporate refinements for several measures that were previously adopted in the Hospital IQR Program. These refinements arose either out of the NQF endorsement maintenance process or during our internal efforts to harmonize measure approaches. The measure refinements include the following: (1) refining the planned readmission algorithm for all seven readmission measures included in the Hospital IQR Program; (2) modifying the hip/knee readmission and complication measure cohorts to exclude index admissions with a secondary fracture diagnosis; and (3) modifying the hip/knee complication measure so that conditions coded as “present on admission” (POA) during the index admission are not counted as complications.

We received one general comment on our proposed refinements.

Comment: One commenter supported CMS' continued refinements to the readmission measures.

Response: We thank the commenter for their support.

a. Refinement of Planned Readmission Algorithm for 30-Day Readmission Measures

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50785 through 50787) we adopted the CMS Planned Readmission Algorithm Version 2.1 (the Algorithm) for the Hospital IQR Program. In the same final rule (78 FR 50785 through 50787, 50790 through 50792, and 50794 through 50798), we also finalized the use of the CMS Planned Readmission Algorithm Version 2.1 in the AMI, HF, PN, THA/TKA, HWR, and COPD measures. This algorithm identifies readmissions that are planned and occur within 30 days of discharge from the hospital. A complete description of the Algorithm, which includes lists of planned diagnoses and procedures, is available at: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/HospitalQualityInits/Measure-Methodology.html in the “Planned Readmission” folder. NQF has endorsed the use of the Algorithm for these measures.

In that final rule (78 FR 50652) and in response to comments, we agreed to continually review the Algorithm and make updates as needed. Since its development, we have identified and made improvements to the Algorithm. As a result, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28223 through 28224), we proposed to use an updated, revised version, the CMS Planned Readmission Algorithm Version 3.0, for the AMI, HF, PN, THA/TKA, HWR, COPD, and Stroke readmission measures for the FY 2015 payment determination and subsequent years. As discussed further below, we also proposed to use Version 3.0 of this algorithm for the CABG readmission measure that we proposed in section IX.A.7.a. of the preamble of the proposed rule to include in the Hospital IQR Program starting in FY 2017.

Version 3.0 incorporates improvements made based on a validation study of the algorithm. Researchers reviewed 634 patients' charts at 7 hospitals, classified readmission as planned or unplanned based on the chart review, and compared the results to the claims-based algorithm's classification of the readmissions. The findings suggested the algorithm was working well but could be improved.

Specifically, the study suggested the need to make small changes to the tables of procedures and conditions that the algorithm uses to classify readmissions as planned or unplanned. The algorithm uses AHRQ's Clinical Classification Software (CCS) to group thousands of procedure and diagnosis codes into fewer categories of related procedures or diagnoses. The algorithm then uses four tables of procedure and diagnosis categories and a flow diagram to classify readmissions as planned or unplanned. Additional information on this software is available at: http://www.hcup-us.ahrq.gov/toolssoftware/ccs/ccs.jsp. For all measures, the first table identifies procedures that, if present in a readmission, classify the readmission as planned. The second table identifies primary discharge diagnoses that always classify readmissions as planned. Because almost all planned admissions are for procedures or surgeries, a third table identifies procedures for which patients are typically admitted; if any of these procedures is coded in the readmission, we classify the readmission as planned as long as the readmission does not have an acute (unplanned) primary discharge diagnosis. The fourth table lists the acute (unplanned) primary discharge diagnoses that disqualify a readmission containing one or more of the potentially planned procedures in the third table from being classified as planned. These tables are structured similarly across all measures, but the specific procedures and conditions they contain vary slightly for certain measures based on clinical considerations for each cohort. The current tables for each measure can be found in the measure methodology reports at: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/HospitalQualityInits/Measure-Methodology.html.

Version 3.0 modifies two of these tables by removing or adding procedures or conditions to improve the accuracy of the algorithm. First, the validation study revealed that the algorithm could be improved by removing two procedure CCS categories from the third table, the potentially planned procedure table: CCS 211—Therapeutic Radiation and CCS 224—Cancer Chemotherapy. Typically, patients do not require admission for scheduled Therapeutic Radiation treatments (CCS 211). The study found that readmissions that were classified as planned because they included Therapeutic Radiation were largely unplanned.

The algorithm was also more accurate when CCS 224—Cancer Chemotherapy was removed from the potentially planned procedure table. The second table of the algorithm classifies all readmissions with a principal diagnosis of Maintenance Chemotherapy as planned. Most patients who receive cancer chemotherapy have both a code for Cancer Chemotherapy (CCS 224) and a principal discharge diagnosis of Maintenance Chemotherapy (CCS 45). In the validation study, the readmissions for patients who received Cancer Chemotherapy (CCS 224) but who did not have a principal diagnosis of Maintenance Chemotherapy were largely unplanned; therefore, removing CCS 224 from the potentially planned procedure table improved the algorithm's accuracy. Accordingly, Version 3.0 removes CCS 211 and CCS 224 from the list of potentially planned procedures to improve the accuracy of the algorithm.

As noted above, the algorithm uses a table of acute principal discharge diagnoses to help identify unplanned readmissions. Readmissions that have a principal diagnosis listed in the table are classified as unplanned, regardless of whether they include a procedure in the potentially planned procedure table. The validation study identified one diagnosis CCS that should be added to the table of acute diagnoses to more accurately identify truly unplanned admissions as unplanned: Hypertension with Complications (CCS 99). Hypertension with complications is a diagnosis that is rarely associated with planned readmissions.

In addition, the validation study identified a subset of ICD-9 diagnosis codes within two CCS diagnosis categories that should be added to the acute diagnosis table to improve the algorithm. CCS 152, Pancreatic Disorders, includes the code for acute pancreatitis; clinically, there is no situation in which a patient with this acute condition would be admitted for a planned procedure. Therefore, Version 3.0 adds the ICD-9 code for acute pancreatitis, 577.0, to the acute primary diagnosis table to better identify unplanned readmissions. Finally, CCS 149, Biliary Tract Disease, is a mix of acute and non-acute diagnoses. Adding the subset of ICD-9 codes within this CCS group that are for acute diagnoses to the list of acute conditions improves the accuracy of the algorithm for these acute conditions while still ensuring that readmissions for planned procedures, like cholecystectomies, are counted accurately as planned. For more detailed information on how the algorithm is structured and the use of tables to identify planned procedures and diagnoses, we refer readers to CMS' Planned Readmission Algorithm Version 2.1: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/HospitalQualityInits/Measure-Methodology.html. As noted above, readers can find the specific Version 3.0 tables for each measure in the measure updates and specifications reports at the above link.
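To make the classification flow described above easier to follow, the following is a minimal, illustrative sketch in Python. It is not CMS code, and the table contents shown are hypothetical placeholders; the authoritative CCS categories and ICD-9 codes for each table are those published in the Planned Readmission Algorithm documentation and the measure-specific reports linked above.

```python
# Illustrative sketch of the four-table classification flow described above.
# All category codes below are placeholders, not the official CMS tables.

# Table 1: procedure categories that always mark a readmission as planned.
ALWAYS_PLANNED_PROCEDURES = {"64"}
# Table 2: primary discharge diagnoses that always mark a readmission as planned.
ALWAYS_PLANNED_DIAGNOSES = {"45"}
# Table 3: potentially planned procedures (Version 3.0 drops CCS 211 and CCS 224).
POTENTIALLY_PLANNED_PROCEDURES = {"152", "158"}
# Table 4: acute (unplanned) primary diagnoses that disqualify Table 3 matches;
# per the text, this table can mix CCS categories and specific ICD-9 codes.
ACUTE_DIAGNOSES = {"99", "577.0"}


def is_planned_readmission(procedure_categories: set[str], primary_diagnosis: str) -> bool:
    """Classify a single readmission as planned (True) or unplanned (False)."""
    # Step 1: any always-planned procedure makes the readmission planned.
    if procedure_categories & ALWAYS_PLANNED_PROCEDURES:
        return True
    # Step 2: an always-planned primary discharge diagnosis makes it planned.
    if primary_diagnosis in ALWAYS_PLANNED_DIAGNOSES:
        return True
    # Step 3: a potentially planned procedure counts as planned only if the
    # primary discharge diagnosis is not acute (Table 4).
    if procedure_categories & POTENTIALLY_PLANNED_PROCEDURES:
        return primary_diagnosis not in ACUTE_DIAGNOSES
    # Otherwise the readmission is counted as unplanned.
    return False


# Example: a potentially planned procedure paired with an acute primary
# diagnosis is classified as unplanned.
print(is_planned_readmission({"152"}, "99"))  # False
```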

We invited public comment on our proposal to use the CMS Planned Readmission Algorithm Version 3.0, for the AMI, HF, PN, THA/TKA, HWR, COPD, and Stroke readmission measures for the FY 2015 payment determination and subsequent years.

Comment: One commenter supported the use of the planned readmission algorithm for the COPD readmission measure. Several commenters believed the updates to the COPD readmission rate calculation will increase the measure's precision.

Response: We thank the commenters for their support.

Comment: Several commenters did not support the proposed changes to the cancer exclusions and urged CMS to continue excluding therapeutic radiation and cancer chemotherapy from readmission penalties. Commenters stated that, given the immunosuppression associated with these conditions and treatments, it is in the best interest of the patients to be sent home as soon as possible because doing so reduces their chances of acquiring hospital-acquired infections, which are often more virulent than community-acquired pathogens. One commenter was concerned that the proposed exclusion may not be able to fully account for the increased readmissions associated with this population, which are often not preventable. Another commenter also noted that some hospitals may treat more patients who receive these treatments compared to other hospitals, which would not be accounted for in the measures. Another commenter did not believe that CMS presented convincing evidence that the cancer codes proposed for exclusion are appropriate to exclude at this time. The commenter urged CMS to report its findings to NQF for a transparent review prior to implementation.

Response: We recognize that cancer care readmissions are often not preventable. In response to commenters' concerns regarding the cancer exclusions and certain hospitals treating more cancer patients than other hospitals, we are removing both CCS 211—Therapeutic Radiation and CCS 224—Cancer Chemotherapy from the potentially planned procedure table of the planned readmission algorithm to improve the accuracy of the algorithm. We are removing Therapeutic Radiation because patients are not typically admitted for therapeutic radiation, and admissions with this treatment in a validation study we conducted of the algorithm were generally unplanned. Further, our validation study showed admissions for people who receive cancer chemotherapy, but do not have a principal diagnosis of maintenance chemotherapy are typically unplanned admissions. Therefore, we expect that removal of CCS 211 and CCS 224 will improve the algorithm's accuracy and we do not anticipate it will have the unintended consequence of discouraging needed cancer care.

We acknowledge that in many cases it is in the best interest of the patients to be sent home as soon as possible, as it reduces their chances of getting hospital-acquired infections that are often more virulent than community-acquired pathogens.

As we are removing these cancer exclusions, we believe that we would not need to report additional information to NQF, as requested by the commenter.

Comment: Several commenters believed that the readmission algorithm is critically important in the appropriate attribution of readmissions. One commenter was disappointed that CMS has not sent the planned readmissions algorithm back to the NQF, and several suggested that CMS seek an ad hoc review before proposing changes to the readmission measures that are used in the Hospital IQR Program and the Hospital Readmissions Reduction Program.

Response: We would like to reassure the commenters that our proposed changes to the readmission algorithm will have minimal effect on how it attributes readmissions. We believe the changes should undergo NQF review as part of the endorsement maintenance and annual update processes for individual measures, rather than through an ad hoc review, because the changes to the algorithm have a minimal effect on the planned readmission rates for each measure, as detailed in the proposed rule (Table IV.H.1) (79 FR 28107 through 28108), and improve the accuracy of the algorithm. We have submitted the Version 3.0 changes related to the heart failure, pneumonia, hip/knee, COPD, and CABG readmission measures to NQF, all under annual update review with the exception of the CABG readmission measures, which are new. For the AMI measure, endorsement maintenance occurred in 2013 prior to CMS' updating the algorithm to Version 3.0; therefore, we will submit the AMI readmission measure with the revised algorithm in the next NQF review cycle.

We acknowledge the commenter's view that the readmission algorithm is critically important in the appropriate attribution of readmissions.

After consideration of the public comments we received, we are finalizing our policy to use the CMS Planned Readmission Algorithm Version 3.0, for the AMI, HF, PN, THA/TKA, HWR, COPD, and Stroke readmission measures for the FY 2015 payment determination and subsequent years as proposed.

b. Refinement of Total Hip Arthroplasty and Total Knee Arthroplasty (THA/TKA) 30-Day Complication and Readmission Measures

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28224 through 28225), for the FY 2015 payment determination and subsequent years, we proposed to refine: (1) the measure outcome and cohort for the Elective Primary THA/TKA All-Cause 30-Day Risk-Standardized Complication Measure (NQF #1550); and (2) the measure cohort for the Elective Primary THA/TKA All-Cause Unplanned 30-Day Risk-Standardized Readmission Measure (NQF #1551).

As part of measure implementation, we conducted a dry run for both the THA/TKA readmission and complication measures in September/October of 2012. More information on the dry run is available at: https://www.qualitynet.org/dcs/BlobServer?blobkey=id&blobnocache=true&blobwhere=1228889945763&blobheader=multipart%2Foctet-stream&blobheadername1=Content-Disposition&blobheadervalue1=attachment%3Bfilename%3DDryRun_HWR-HK_SummRept_122112.pdf&blobcol=urldata&blobtable=MungoBlobs.

During the dry run, several commenters suggested that we evaluate the use of Present on Admission (POA) codes for both the hip/knee readmission and complication measures. We agreed with the suggestion and have been monitoring POA data collection and testing its readiness for use in claims-based measures. We also noted our intent to evaluate the use of POA codes in Hospital IQR Program measures, such as the stroke mortality rate measure, in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50801). We have since tested the use of the POA codes and proposed to incorporate POA codes into the hip/knee complication measure for the FY 2015 payment determination and subsequent years in order to prevent identifying a condition as a complication of care if it was present on admission.

In addition, the THA/TKA Readmission Measure (NQF #1551) adopted for the Hospital IQR Program is currently intended to include only patients who have an elective THA or TKA. The measure currently excludes patients who have a principal discharge diagnosis of femur, hip, or pelvic fracture on their index admission, because hip replacement for hip fracture is not an elective procedure. However, after hospitals reviewed their hospital-specific THA/TKA Readmission Measure data during the national dry run, we learned that hospitals code hip fractures that occur during the same admission as a THA not only as a principal diagnosis but also, alternatively, as a secondary diagnosis, rather than only as a principal diagnosis as currently specified by the measure. According to feedback received from hospitals participating in the dry run, the measure methodology failed to identify, and appropriately exclude, a small number of patients (that is, 0.42 percent of patients in 2009-2010 data) with a hip fracture who had a non-elective total hip arthroplasty, as captured by these secondary diagnoses.

Therefore, to ensure that all such non-elective hip fracture patients are excluded from the measure, we proposed to refine the measure to exclude patients with hip fractures coded as either a principal or secondary diagnosis during the index admission beginning with the FY 2015 payment determination and subsequent years. We believe this refinement is responsive to comments previously received from hospitals (78 FR 50709) and will allow us to accurately exclude patients who were initially admitted for a hip fracture and who then subsequently underwent total hip arthroplasty, making their procedure non-elective.
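As a simple illustration of the refined exclusion, here is a minimal sketch, assuming an index admission is represented only as a list of ICD-9 diagnosis codes; the fracture code set and function name are hypothetical and are not the measure's official specification.

```python
# Illustrative only: placeholder femur/hip/pelvic fracture ICD-9 codes, not the
# measure's official exclusion list.
HIP_FRACTURE_CODES = {"820.8", "808.0"}


def exclude_from_tha_tka_cohort(index_admission_diagnoses: list[str]) -> bool:
    """Return True if the index admission is excluded as non-elective.

    The refinement checks the principal (first-listed) and all secondary
    diagnoses, rather than the principal diagnosis alone.
    """
    return any(code in HIP_FRACTURE_CODES for code in index_admission_diagnoses)


# A THA index admission with a hip fracture coded only as a secondary diagnosis
# is now excluded from the measure cohort.
print(exclude_from_tha_tka_cohort(["715.35", "820.8"]))  # True
```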

We invited public comment on these proposed refinements.

Comment: Many commenters supported these refinements. Specifically, commenters supported CMS' proposals to:

  • Add POA condition codes to the THA/TKA measures, contending that doing so will minimize the misidentification of pre-existing conditions as complications related to the procedure.
  • Exclude from this 30-day readmission measure cohort patients with hip fracture who had a non-elective total hip arthroplasty.
  • Exclude patients who have a hip fracture coded as either a principal or secondary diagnosis during the index admission from the THA/TKA complication and readmission measures.
  • Remove cases where the hip/knee complication was present prior to the relevant admission as such complications should accrue to the hospitals furnishing the procedure prior to follow-up care.
  • Evaluate the performance of the Risk Standardized Readmission and Complication Rate (RSRR and RSCR) measures for total hip and total knee arthroplasty.

Response: We thank the commenters for their support.

Comment: Several commenters appreciated CMS' efforts to make measure improvements but explained that they did not support the update until the measures have completed the NQF measure maintenance process, arguing that changes should not be made through the subregulatory process.

Response: To clarify, since we are using the notice and comment rulemaking process to make these measure refinements here, we are not making these changes using subregulatory methods. We believe these refinements are necessary to ensure that the measure accurately reflects the care provided to patients. We do not believe that we should delay making efforts to improve the measure's accuracy.

Comment: One commenter did not support the modifications to the THA and TKA readmission and complication measures, noting that the need to make corrections reinforces the view that there should be sufficient comprehensive testing before they are adopted for use.

Response: We agree with the commenter that the measures should undergo extensive testing prior to inclusion in reporting programs. The modifications here were identified during field testing of the THA/TKA readmission and complication measures and were incorporated prior to inclusion of the measures in the Hospital IQR Program. In addition, we reevaluate our measures on an annual basis in order to make methodological refinements required by: (1) ongoing changes in clinical practice; (2) coding updates; and (3) evolving input from stakeholders.

Comment: One commenter was concerned about the accuracy of administrative claims data used for the Hip/Knee Complication measure. The commenter suggested that the claims data used for the measure has been known to underreport significant comorbidities, particularly obesity.

Response: We believe that the administrative claims data used for the Hip/Knee Complication measure are accurate. We have validated the AMI, HF, and pneumonia readmission and mortality measures by building comparable models using medical record data for risk adjustment for heart failure patients (National Heart Failure data), AMI patients (Cooperative Cardiovascular Project data), and pneumonia patients (National Pneumonia Project dataset). When the medical record-based models were applied to the corresponding patient population, the hospital risk-standardized rates estimated using the claims-based risk adjustment models had a high level of agreement with the results based on the medical record model. This supports the use of the claims-based models for public reporting.

Regarding the commenters' concern about under-reporting significant co-morbidities, particularly morbid obesity, we have also conducted a medical record validation study of the THA/TKA complications measure. The goal of that study was to determine the overall agreement between arthroplasty patients identified as having a complication (or no complication) in the claims-based measure and those who had a complication (or no complication) also documented in the medical record. Overall measure data agreement was 93 percent (598/644 patients) before any changes were made to the model specifications. After the measure specifications were changed based upon the results of this validation study, the measure agreement between claims data and the medical record was 99 percent (635/644).

We also acknowledge the commenters' concern that obesity is associated with poorer outcomes after joint replacement; however, evidence supports that the greatest potential risk lies in patients who are morbidly obese. Administrative codes for morbid obesity have been shown to have greater sensitivity and specificity than obesity codes overall, with a specificity of 99 percent, and morbid obesity (ICD-9-CM code 278.01) is currently included in the measure risk model.

Horan F. Obesity and joint replacement. J Bone Joint Surg [Br] 2006;88-B:1269-71.

Golinvaux NS, Bohl DD, Basques BA, Fu MC, Gardner EC, Grauer JN. Limitations of administrative databases in spine research: a study in obesity. Spine J. 2014 (in press; available online April 26, 2014).

Comment: Several commenters requested that the Hip/Knee Complication measure be adjusted for socioeconomic status (SES).

Response: We appreciate the commenters' concerns and note that these concerns were addressed in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50653 through 50654 and 50673 through 50674). As described in prior rulemaking, we do not currently risk adjust for SES in the Hospital IQR Program. However, we do risk adjust for comorbidities (that is, coexisting illnesses) and other factors to ensure that hospitals are not penalized for serving populations that are sicker or have higher incidences of chronic disease.

We are aware that there are differing opinions regarding this approach. We appreciate the commenters' suggestions on the importance of addressing SES in the Hospital IQR Program. We have continued to consider and evaluate stakeholder concerns regarding the influence of patient socioeconomic status on clinical quality measures. We refer readers to section IV.H.4. of the preamble of this final rule for a discussion of the use of SES in our quality programs.

After consideration of the public comments we received, we are finalizing the refinements to the THA/TKA measures as proposed.

c. Anticipated Effect of Refinements to Existing Measures

Based on our analyses of discharges between July 2009 and June 2012, our proposal to use the Planned Readmission Algorithm Version 3.0 would have had the following effects on the measures had these changes been applied for the FY 2014 payment determination, as an example. We are sharing this information to provide the public with a sense of the extent to which these refinements to the measures will change the measure scores. As the results show, while the refinements improve the accuracy of the measures, the changes in actual scores are very slight.

The proposed 30-day readmission rate (excluding the planned readmissions) would increase by 0.1 percentage points for AMI; 0.2 percentage points for HF; 0.1 percentage points for PN; 0.1 percentage points for COPD; 0.0 percentage points for hip/knee; 0.1 percentage points for HWR; and 0.0 percentage points for stroke.

The new national measure (unplanned) rate for each condition would have been 18.4 percent for AMI; 23.2 percent for HF; 17.7 percent for PN; 21.1 percent for COPD; 5.4 percent for hip/knee; 16.1 percent for HWR; and 13.8 percent for stroke.

The number of readmissions considered planned (and, therefore, not counted as a readmission) would decrease by 334 for AMI; 1,375 for HF; 981 for PN; 574 for COPD; 309 for hip/knee; 7,417 for HWR; and 242 for stroke.

d. Clarification Regarding Influenza Vaccination for Healthcare Personnel

The Influenza Vaccination Coverage Among Healthcare Personnel (HCP) (NQF #0431) measure was finalized for the Hospital IQR Program in the FY 2012 IPPS/LTCH PPS final rule (76 FR 51633) and for the Hospital Outpatient Quality Reporting (HOQR) Program in the CY 2014 OPPS/ASC final rule with comment period (78 FR 75099). We received public comments regarding the burden of separately collecting and reporting HCP influenza vaccination statuses for both the inpatient and outpatient settings. In response to these concerns, we clarified that, beginning with the 2014-2015 influenza season (CY 2014 reporting period and FY 2016 payment determination), facilities should collect and report a single vaccination count for each healthcare facility by CMS Certification Number (CCN), instead of separately by inpatient or outpatient setting, in order to reduce burden. We announced this clarification regarding how to designate HCP for this measure in an Operational Guidance document, which can be found on our Web page at: http://origin.glb.cdc.gov/nhsn/PDFs/HCP/Operational-Guidance-ACH-HCP-Flu.pdf. Using the CCN will allow healthcare facilities with multiple care settings to simplify data collection and submit a single count applicable across the inpatient and outpatient settings. We will then publicly report the percentage of HCP who received an influenza vaccination per CCN. This single count per CCN will inform the public of the percentage of vaccinated HCP at a particular healthcare facility, which would still provide meaningful data and help to improve the quality of care. Specific details on data submission for this measure can be found at: http://www.cdc.gov/nhsn/acute-care-hospital/hcp-vaccination/ and at http://www.cdc.gov/nhsn/acute-care-hospital/index.html.

(We discussed this clarification in section IX.A.5. of the preamble to the proposed rule (79 FR 28221).)

Comment: Several commenters supported the collection and submission of the influenza vaccination measure as a single facility count, which the commenters agreed will reduce the burden on providers and lead to more meaningful results. One commenter specifically supported the healthcare personnel influenza vaccination coverage clarification because it accommodates injectable and nasal spray vaccines.

Response: We thank the commenters for their support.

Comment: A commenter requested clarification on reporting for the inpatient and outpatient settings, stating that it reports to NHSN separately for these settings through a Facility Organization Identification (Org ID) rather than by CCN. The commenter believed that, after all data have been submitted by Org ID, the CDC will roll up the data reported by Org ID to the CCN level in order to report data to CMS.

Response: We agree with the commenter's assessment and clarify that hospitals should report by enrolled facility, according to their NHSN OrgID, in order to be consistent with CDC NHSN infrastructure. These data are to be reported for all patient care units included within the enrolled facility's OrgID that also share the same CCN (some patient care units within the OrgID may have separate CCNs and those should not be included in these counts). Therefore, data will be submitted to NHSN by facility Org ID, not CCN. CDC will then aggregate the facility level data into a CCN HCP rate and submit aggregate hospital-level measure rates at the CCN level to us on behalf of facilities for Hospital Compare public reporting purposes.
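
For illustration only, the following sketch (written in Python, with entirely hypothetical facility identifiers, field names, and counts; it is not NHSN or CMS code) shows the kind of roll-up described above, in which counts reported for each enrolled facility (OrgID) are summed across facilities sharing a CCN to produce a single CCN-level HCP vaccination rate:

from collections import defaultdict

# Hypothetical facility-level (OrgID) submissions; all values are made up.
facility_reports = [
    {"org_id": "10001", "ccn": "330101", "hcp_vaccinated": 812, "hcp_total": 900},
    {"org_id": "10002", "ccn": "330101", "hcp_vaccinated": 195, "hcp_total": 250},
    {"org_id": "10003", "ccn": "440202", "hcp_vaccinated": 510, "hcp_total": 600},
]

def ccn_vaccination_rates(reports):
    # Sum vaccinated and total HCP counts across all OrgIDs that share a CCN,
    # then compute one vaccination percentage per CCN.
    totals = defaultdict(lambda: {"vaccinated": 0, "total": 0})
    for r in reports:
        totals[r["ccn"]]["vaccinated"] += r["hcp_vaccinated"]
        totals[r["ccn"]]["total"] += r["hcp_total"]
    return {ccn: round(100.0 * t["vaccinated"] / t["total"], 1)
            for ccn, t in totals.items()}

print(ccn_vaccination_rates(facility_reports))
# For these made-up numbers: {'330101': 87.6, '440202': 85.0}

The actual submission and aggregation processes are those described in the CDC NHSN guidance documents cited above.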

After consideration of the public comments we received, we are clarifying that hospitals should report a single count per enrolled facility, and not by CCN, for the previously finalized Influenza Vaccination Coverage Among Healthcare Personnel (HCP) (NQF #0431) measure. We will require facilities to collect and submit a single vaccination count for each health care facility enrolled in NHSN by facility OrgID. This modifies our statement in the proposed rule indicating that facilities should submit data by CCN, and better aligns with the CY 2015 OPPS proposed rule (79 FR 41035) as well as NHSN guidance documents.

7. Additional Hospital IQR Program Measures for the FY 2017 Payment Determination and Subsequent Years

For purposes of the Hospital IQR Program, section 1886(b)(3)(B)(IX)(aa) of the Act requires that any measure specified by the Secretary must have been endorsed by the entity with a contract under section 1890(a) of the Act. However, the statutory requirements under section 1886(b)(3)(B)(IX)(bb) of the Act provide an exception that, in the case of a specified area or medical topic determined appropriate by the Secretary for which a feasible and practical measure has not been endorsed by the entity with a contract under section 1890(a) of the Act, the Secretary may specify a measure that is not so endorsed as long as due consideration is given to measures that have been endorsed or adopted by a consensus organization identified by the Secretary.

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28227 through 28243), we proposed to add a total of 11 measures to the measure set for the FY 2017 payment determination and subsequent years. The first nine new measures are: (1) Hospital 30-day, all-cause, unplanned, risk-standardized readmission rate (RSRR) following coronary artery bypass graft (CABG) surgery (claims-based); (2) Hospital 30-day, all-cause, risk-standardized mortality rate (RSMR) following coronary artery bypass graft (CABG) surgery (claims-based); (3) Hospital-level, risk-standardized 30-day episode-of-care payment measure for pneumonia (claims-based); (4) Hospital-level, risk-standardized 30-day episode-of-care payment measure for heart failure (claims-based); (5) Severe Sepsis and Septic Shock: Management Bundle (NQF #0500) (chart-abstracted); (6) EHDI-1a Hearing Screening Prior to Hospital Discharge (NQF #1354) (electronic clinical quality measure); (7) PC-05 Exclusive Breast Milk Feeding and the subset measure PC-05a Exclusive Breast Milk Feeding Considering Mother's Choice (NQF #0480) (electronic clinical quality measure); (8) CAC-3 Home Management Plan of Care (HMPC) Document Given to Patient/Caregiver (electronic clinical quality measure); and (9) Healthy Term Newborn (NQF #0716) (electronic clinical quality measure).

In addition, to align the Hospital IQR Program with the Medicare EHR Incentive Program for Eligible Hospitals and CAHs and allow hospitals as many measure options as possible that overlap both programs, we proposed to readopt two measures previously removed from the Hospital IQR Program as voluntary electronic clinical quality measures: (10) AMI-2 Aspirin Prescribed at Discharge for AMI (NQF #0142) (electronic clinical quality measure); and (11) AMI-10 Statin Prescribed at Discharge (NQF #0639) (electronic clinical quality measure). These two measures are part of the Stage 2 Medicare EHR Incentive Program measure set for eligible hospitals and CAHs.

The four proposed claims-based measures (1-4, above) were included on a publicly available document entitled “List of Measures Under Consideration for December 1, 2013” in compliance with section 1890A(a)(2) of the Act, and they were reviewed by the MAP in its MAP 2014 Recommendations on Measures for More Than 20 Federal Programs final report, available at: http://www.qualityforum.org/Publications/2014/01/MAP_Pre-Rulemaking_Report__2014_Recommendations_on_Measures_for_More_than_20_Federal_Programs.aspx.

The proposed chart-abstracted measure (5 above) Severe Sepsis and Septic Shock: Management Bundle (NQF #0500) was included in the MAP Pre-Rulemaking Report: 2013 Recommendations on Measures Under Consideration by HHS final report, available at: https://www.qualityforum.org/WorkArea/linkit.aspx?LinkIdentifier=id&ItemID=72738.

The proposed measures 6-9 above were included on a publicly available document entitled “List of Measures Under Consideration for December 1, 2012” in compliance with section 1890A(a)(2) of the Act, and they were reviewed by the MAP in its MAP Pre-Rulemaking Report: 2013 Recommendations on Measures Under Consideration by HHS final report, available at: https://www.qualityforum.org/WorkArea/linkit.aspx?LinkIdentifier=id&ItemID=72738.

Measures 10 and 11 were included on a publicly available document entitled “Measures Under Consideration for Calendar Year 2012” in compliance with section 1890A(a)(2) of the Act, and they were reviewed by the MAP in its Pre-Rulemaking Report: Input on Measures Under Consideration by HHS for 2012 Rulemaking available at https://www.qualityforum.org/Publications/2012/02/MAP_Pre-Rulemaking_Report__Input_on_Measures_Under_Consideration_by_HHS_for_2012_Rulemaking.aspx.

We received a number of comments that apply across the proposed measures and will address those comments first, before individually addressing comments related to specific measures.

Comment: One commenter supported the inclusion of the CABG mortality and readmission, heart failure payment, and pneumonia payment measures in the Hospital IQR Program. The commenter preferred that the measures be NQF-endorsed. Another commenter supported CMS' proposal to increase the number of outcome measures.

Response: We thank the commenters for their support.

Comment: Several commenters expressed concern that four of the five measures proposed for adoption under the Hospital IQR Program were not NQF-endorsed and had not been recommended by the MAP. The most frequently expressed concerns were with regard to the CABG mortality, CABG readmission, heart failure payment, and pneumonia payment measures, although there were several comments addressing the other proposed measures that are not NQF-endorsed. A commenter noted that the NQF process is important to the reliability and validity of the measures used in the programs and to monitoring adverse events.

Response: As described above, we may adopt non-NQF-endorsed measures under the Hospital IQR Program exception authority in section 1886(b)(3)(B)(IX)(bb) of the Act. This provision provides that, in the case of a specified area or medical topic determined appropriate by the Secretary for which a feasible and practical measure has not been endorsed by the entity with a contract under section 1890(a) of the Act, the Secretary may specify a measure that is not so endorsed as long as due consideration is given to measures that have been endorsed or adopted by a consensus organization identified by the Secretary. Although we proposed some measures that are not currently NQF-endorsed, they are pending NQF endorsement. We also considered other available measures that have been endorsed by the NQF and found no other feasible and practical measures. In addition, the MAP has supported or conditionally supported several of the measures. We are actively seeking NQF endorsement for the claims-based measures. More detailed discussions for individual measures are below.

Comment: A commenter requested that CMS outline its standards for conducting an environmental scan of available measures in the absence of an NQF-endorsed measure.

Response: We conduct thorough environmental scans of available measures using a standardized system set out in A Blueprint for the CMS Measures Management System (http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/MMS/MeasuresManagementSystemBlueprint.html). We follow these core processes set out in the Blueprint as we develop, implement, and maintain quality measures. Our process for conducting an environmental scan of existing or related measures is set out below.

First we search for similar or related measures (existing or in development) that will help achieve the quality goals. We keep the search parameters broad to obtain an overall understanding of the measures in existence, including measures that closely meet the contract requirements and other potential sources of information. We then look for measures endorsed and recommended by multi-stakeholder organizations whenever applicable and include a search for measures developed and/or implemented by the private sector. Then we determine what types of measures are needed to promote the quality goals for a particular topic/condition or setting and determine what measurement gaps exist for the topic area, as well as existing measures that may be adopted or adapted for the project. For example, if the objective is the development of immunization measures for use in the home health setting, it will be necessary to identify and review existing home health measures. In addition, it might also be helpful to analyze immunization measures used in other settings such as nursing homes and hospitals.

The CMS Measures Management staff assists in identifying measures in development to ensure that no duplication occurs or to ensure related measures are developed with harmonization in mind. Search parameters include: (1) Measures in the same setting, but for a different topic; (2) Measures in a different setting, but for the same topic; (3) Measures that are constructed in a similar manner; (4) Quality indicators; (5) Accreditation standards; and (6) NQF preferred practices for the same topic.

Searching for existing and related measures may involve two steps: (1) searching databases, and (2) searching for other sources of information, such as performance indicators, accreditation standards, or preferred practices. We use a variety of databases and sources to search for existing and related measures. Below are links to a few readily available sources:

We also search other HHS agency pipeline measures. We search for other sources of information, such as performance indicators, accreditation standards, or preferred practices, that may pertain to the contract topic. Though they may not be as fully developed as quality measures, quality indicators could be further developed into quality measures by providing detailed and precise specifications. Accreditation standards are linked to specific desired outcomes, and measures aligned with those standards may be easier to implement and more readily accepted by providers. Quality measures may also be partially derived from the preferred practices reflected in such standards.

Comment: Several commenters believed that all measures should be risk-adjusted for SES, explaining that failing to risk adjust for SES factors will skew our data measurements and produce inaccurate and unreliable outcomes. One commenter emphasized the need to adjust for SES factors in all outcome measures, arguing that such variables have an impact on patient outcomes but are outside of a hospital's control. The commenter added that CMS has not provided data showing this point to be untrue. One commenter stated that CMS should analyze the differences in performance between safety net providers and other hospitals by determining whether the means of the performance distributions are significantly different, thus resulting in penalties. If they are, the commenter believed that SES risk adjustment would provide incentives for hospitals to improve, as quality differences for reasons outside of a hospital's control would be illuminated.

Another commenter explained that many studies show reliable statistical results that SES is a risk factor for patient outcomes and that we have not demonstrated otherwise. As a result, the commenter believed that not adjusting for this risk factor obscures quality differences. One commenter believed that empirical studies demonstrate that patient SES impacts outcomes and that failure to account for such impact disadvantages hospitals that treat low-SES patients. Another commenter believed that hospitals should not be held accountable for outcomes attributable to patient risk factors. Instead, the commenter believed that risk adjustment should be performed if data stratified by SES show that safety net hospitals are providing poorer care for reasons unrelated to quality.

Another commenter suggested that CMS' argument for not risk adjusting for SES factors is that doing so would hold hospitals serving low-SES areas to a different standard than others. The commenter stated that CMS' belief that risk adjusting for SES obscures true quality differences is based on the assumption that SES is not a risk factor beyond the hospital's control. Another commenter listed unintended consequences that may result from not risk adjusting for SES, which were echoed by several commenters. These potential consequences included not providing care for disadvantaged patients so as not to be labeled a poor performer, shifts in funds to hospitals caring for affluent patients, and consumers avoiding providers labeled poor performers when they are not. Several commenters were concerned that not risk adjusting for SES could result in safety net providers losing scarce resources that are necessary to care for vulnerable patients, which would potentially make disparities worse.

Further, one commenter stated that current CMS measures do not improve quality and that they weaken the social safety net. Another commenter believed that the current policy to exclude “factors related to the disparities in care” from all measures creates a “one size fits all” approach that fundamentally ignores the challenges that many academic health centers face in delivering high-quality care to their entire patient population, regardless of race, income, or other socioeconomic characteristics. Commenters urged CMS to review important studies published about risk adjustment for SES and to revise measure methodology to account for SES. One commenter suggested that CMS comply with the NQF's recommendations related to the use of risk adjustment versus stratification for patient SES.

Response: We have received many comments regarding risk adjusting measures for SES in several quality programs. We appreciate the commenters' concerns and note that these concerns were addressed in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50653 through 50654, 50673 through 50674). As described in prior rulemaking, we do not currently risk adjust for SES in the Hospital IQR Program. However, we do risk adjust for comorbidities (that is, correlated illnesses) and other factors to ensure that hospitals are not penalized for serving populations that are sicker or have higher incidences of chronic disease.

We are aware that there are differing opinions regarding this approach. We appreciate the commenters' suggestions on the importance of addressing SES in the Hospital IQR Program. We have continued to consider and evaluate stakeholder concerns regarding the influence of patient socioeconomic status on clinical quality measures. We refer readers to section IV.H.4. of the preamble of this final rule for further discussion of this issue.

Comment: One commenter stated that “a large proportion of low-income patients sometimes achieve good quality scores even as compared the scores for hospitals that have a lower proportion of low-income patients. But this is simply an anecdotal observation. It is not a statistically acceptable and reliable analysis.”

Response: We thank the commenter for the feedback. We understand this comment to mean that a hospital with a high proportion of low-SES patients can perform well in comparison with hospitals with a relatively low proportion of low-SES patients. We note similar findings in our 2013 Medicare Hospital Quality Chartbook, which follows trends in hospital performance on readmission, mortality, and complication measures, available at: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/HospitalQualityInits/Downloads/-Medicare-Hospital-Quality-Chartbook-2013.pdf. The statement referred to was based on descriptive statistics of the measure scores that can be found in that Chartbook.

The risk adjustment for clinical factors likely captures much of the variation due to SES, thereby attenuating the impact of SES factors on hospitals' results. We continue to monitor related activities at the NQF, such as the July 23, 2014 decision by the NQF Board to approve a trial period to test the impact of sociodemographic factor risk adjustment of performance measures (available at: http://www.qualityforum.org/Press_Releases/2014/NQF_Board_Approves_Trial_Risk_Adjustment.aspx), and in Congress. As we have stated in the past, we are committed to working with the NQF and other stakeholder communities to continuously refine our measures and to address the concerns associated with SES and risk adjustment. We believe that continued collaboration with the stakeholder communities will enable us to identify feasible ways to appropriately address any unintended consequences for providers serving high proportions of low-SES patients.

Comment: A commenter was concerned that CMS proposed several new measures for the Hospital IQR Program that the commenter believes lack the scientific rigor needed for public reporting. However, the commenter did not specify which proposed measures caused concern.

Response: We respectfully disagree with the commenter that the proposed measures lack the scientific rigor needed for public reporting. We believe that these measures, as detailed below, are scientifically rigorous.

Comment: Several commenters did not support the use of the pneumonia payment measure in the Hospital IQR Program since it is not NQF-endorsed. One commenter believed that, because the measure is not NQF endorsed, it is too soon to finalize the measure for the FY 2017 Hospital IQR Program.

Response: We received numerous comments that concerned both the Hospital-level, risk-standardized 30-day episode-of-care payment measure for pneumonia and the Hospital-level, risk-standardized 30-day episode-of-care payment measure for heart failure. We are addressing those comments here first, before addressing comments on the individual measures.

a. Hospital 30-day, All-cause, Unplanned, Risk-Standardized Readmission Rate (RSRR) Following Coronary Artery Bypass Graft (CABG) Surgery

(1) Background

CABG is a priority area for outcomes measure development because it is a common procedure associated with considerable morbidity, mortality, and health care spending. In 2007, there were 114,028 hospitalizations for CABG surgery and 137,721 hospitalizations for combined surgeries for CABG and valve procedures (“CABG plus valve” surgeries) in the U.S.

Drye E, Krumholz H, Vellanky S, Wang Y. Probing New Conditions and Procedures for New Measure Development: Yale New Haven Health Systems Corporation; Center for Outcomes Research and Evaluation; 2009:1-7.

Readmission rates following CABG surgery are high and vary across hospitals. For example, in 2009 Medicare fee-for-service (FFS) data, the median hospital-level risk-standardized readmission rate after CABG was 17.2 percent and ranged from 13.9 percent to 22.1 percent. This is consistent with published data: the average 30-day, all-cause, hospital-level readmission rate in New York State was 16.5 percent and ranged from 8.3 percent to 21.1 percent among all patients who underwent CABG surgery between January 1, 2005 and November 30, 2007. Among patients readmitted within 30 days, 87.3 percent of readmissions were for reasons related to CABG surgery, with a 30-day rate of readmissions due to complications of CABG surgery of 14.4 percent. Patients readmitted within 30 days also experienced a 2.8 percent in-hospital mortality rate during their readmission(s), three-fold higher than the 30-day mortality rate for patients without readmissions. Hence, addressing the causes of readmission will improve outcomes for patients.

Suter L.G., Wang, C., Vellanky S., Potteiger J., Curtis J., Lin Z., Geary L.L., Krumholz H.M., Drye E.D. Hospital-level 30-day All-Cause Unplanned Readmission Following Coronary Artery Bypass Graft Surgery: Report prepared for the Centers for Medicare & Medicaid Services. 2012.

Hannan EL, Zhong Y, Lahey SJ, et al. 30-day readmissions after coronary artery bypass graft surgery in New York State. JACC Cardiovasc Interv. 2011;4(5):569-576.

Ibid.

Readmissions after CABG also impose significant health care costs. In 2007, the Medicare Payment Advisory Commission (MedPAC) published a report to Congress in which it identified the seven conditions associated with the most costly potentially preventable readmissions in the U.S. Among these seven, CABG ranked as having the highest potentially preventable readmission rate within 15 days following discharge (13.5 percent) and the second highest average Medicare payment per readmission ($8,136). The annual cost to Medicare for potentially preventable CABG readmissions was estimated at $151 million.

Medicare Payment Advisory Commission. Report to the Congress: Promoting Greater Efficiency in Medicare. 2007.

Ibid.

High readmission rates and wide variation in these rates suggest that there is room for improvement. Reducing readmissions after CABG surgery has been identified as a target for quality measurement. An all-cause readmission measure for patients who undergo CABG surgery will provide hospitals with an incentive to reduce readmissions through prevention and/or early recognition and treatment of postoperative complications, and improved coordination of peri-operative care and discharge planning.

The specifics of the measure methodology are included in the measure methodology report we have posted on our Web site at: http://cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/HospitalQualityInits/Measure-Methodology.html. We refer readers to the report for further details on the risk-adjustment statistical model.

We proposed to include this non-NQF-endorsed measure in the Hospital IQR Program under the exception authority in section 1886(b)(3)(B)(IX)(bb) of the Act as previously discussed in section IX.A.7. of the preamble of this final rule. Although the proposed measure is not currently NQF-endorsed, we considered available measures that have been endorsed or adopted by the NQF. We also are not aware of any other 30-day, all-cause, unplanned, risk-standardized readmission rate (RSRR) following coronary artery bypass graft (CABG) surgery measures that have been endorsed or adopted by a consensus organization, and found no other feasible and practical measures on this topic. The measure has been reviewed by the MAP and was conditionally supported pending NQF endorsement, as detailed in its Pre-Rulemaking 2014 MAP Recommendations Report, available at: https://www.qualityforum.org/Setting_Priorities/Partnership/MAP_Final_Reports.aspx. This measure was submitted to NQF on February 5, 2014 and is currently under review.

(2) Overview of Measure

The CABG readmission measure assesses hospitals' 30-day, all-cause risk-standardized rate of unplanned readmission following admission for a CABG procedure. In general, the measure uses the same approach to risk adjustment and hierarchical logistic modeling (HLM) methodology that is specified for CMS' other readmission measures previously adopted for this program. Information on how the measure employs HLM can be found in the 2012 CABG Readmission Measure Methodology Report (available at: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/HospitalQualityInits/Measure-Methodology.html).

(3) Data Sources

The proposed measure is claims-based. It uses Medicare administrative data from hospitalizations for Medicare FFS beneficiaries hospitalized for a CABG procedure.

(4) Outcome

The outcome for this measure is 30-day, all-cause readmission, defined as an unplanned subsequent inpatient admission to any applicable acute care facility for any cause within 30 days of the date of discharge from the index hospitalization. This outcome period is consistent with other NQF-endorsed publicly reported readmission measures (AMI, HF, PN, COPD, HWR, and THA/TKA).

The measure assesses all-cause unplanned readmissions (excluding planned readmissions) rather than readmissions for CABG only for several reasons. First, from the patient perspective, a readmission for any reason is likely to be an undesirable outcome of care, even though not all readmissions are preventable. Second, limiting the measure to CABG-related readmissions may focus improvement efforts too narrowly rather than encouraging broader initiatives aimed at improving the overall care within the hospital and transitions from the hospital setting. Moreover, it is often hard to exclude quality issues and accountability based on the documented cause of readmission. For example, a patient who underwent CABG surgery and developed a hospital-acquired infection may ultimately be readmitted for sepsis. It would be inappropriate to consider such a readmission to be unrelated to the care the patient received for their CABG surgery. Finally, while the measure does not presume that each readmission is preventable, interventions generally have shown reductions in all types of readmissions.

Sharma G, Kuo YF, Freeman JL, Zhang DD, Goodwin JS. Outpatient Follow-up Visit and 30-Day Emergency Department Visit and Readmission in Patients Hospitalized for Chronic Obstructive Pulmonary Disease. Arch Intern Med. 2010;170:1664-1670.

Nelson EA, Maruish ME, Axler JL. Effects of Discharge Planning and Compliance with Outpatient Appointments on Readmission Rates. Psychiatr Serv. 2000;51(7):885-889.

The measure does not count planned readmissions as readmissions. Planned readmissions would be identified in claims data using the CMS Planned Readmission Algorithm Version 3.0 that detects planned readmissions that may occur within 30 days of discharge from the hospital. Version 2.1 of the algorithm was finalized for use in the current Hospital IQR Program readmission measures in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50785 through 50787, 50790 through 50792 and 50794 through 50798). However, we proposed to update the algorithm to version 3.0, and details on the updates to this algorithm can be found in section IX.A.6.a. of the preamble of this final rule. The proposed CABG readmission measure uses the planned readmission algorithm tailored for CABG patients. We adapted the algorithm for this group of patients with input from CABG surgeons and other experts, narrowing the types of readmissions considered planned since planned readmissions following CABG are less common and less varied than among patients discharged from the hospital following a medical admission. More detailed information on how the CABG measure incorporates the Planned Readmission Algorithm Version 3.0 can be found on the CMS Web site at: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/HospitalQualityInits/Measure-Methodology.html. Once at the Web site, users should open the Coronary Artery Bypass Graft (CABG) Readmission ZIP file, then open the file labeled, “Version10_Readmission_CABG_Measure_Methodology_Report_3 19 2014” and refer to Section 2.3.3. For the CABG measure, unplanned readmissions that fall within the 30-day post-discharge timeframe from the index admission would not be counted as readmissions for the index admission if they were preceded by a planned readmission.
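
For illustration only, the following simplified sketch (written in Python, with hypothetical field names; it is not the measure code, and the posted methodology report remains the authoritative specification) shows the counting rule described above, under which the first readmission in the 30-day window determines the outcome and a planned readmission prevents later unplanned readmissions in that window from being counted:

from datetime import date

def counts_as_readmission(index_discharge_date, readmissions):
    # readmissions: list of dicts, each with an 'admit_date' (datetime.date) and a
    # 'planned' flag assigned by a planned-readmission algorithm (hypothetical fields).
    for r in sorted(readmissions, key=lambda r: r["admit_date"]):
        days_out = (r["admit_date"] - index_discharge_date).days
        if 0 <= days_out <= 30:
            # If the first readmission in the window is planned, the index admission
            # is not counted as having the readmission outcome; if it is unplanned,
            # the outcome is counted.
            return not r["planned"]
    return False

# Example: a planned readmission on day 10 precedes an unplanned one on day 20,
# so the index admission is not counted as having the readmission outcome.
print(counts_as_readmission(
    date(2014, 1, 1),
    [{"admit_date": date(2014, 1, 11), "planned": True},
     {"admit_date": date(2014, 1, 21), "planned": False}]))  # prints False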

(5) Cohort

The cohort includes patients aged 65 years and older who received a qualifying CABG procedure at an acute care facility. Patients are eligible for inclusion if they had a qualifying CABG procedure and continuous enrollment in Medicare FFS one year prior to the first day of the index hospital stay and through 30 days post-discharge. The index stay is the stay that triggers the 30-day measurement period.

In order to include a clinically-coherent set of patients in the measure, we sought input from clinical experts regarding the inclusion of other concomitant cardiac and non-cardiac procedures, such as valve replacement and carotid endarterectomy. Adverse clinical outcomes following such procedures are higher than those following “isolated” CABG procedures, that is, CABG procedures performed without concomitant high-risk cardiac and non-cardiac procedures. Limiting the measure cohort to “isolated” CABG patients is consistent with published reports of CABG outcomes; therefore, the measure cohort considers only patients undergoing isolated CABG as eligible for inclusion in the measure. We defined isolated CABG patients as those undergoing CABG procedures without concomitant valve or other major cardiac, vascular or thoracic procedures. In addition, our clinical experts, consultants, and Technical Expert Panel (TEP) members agreed that an isolated CABG cohort is a clinically coherent cohort for quality measurement. For detailed information on the cohort definition, we refer readers to the 2012 CABG Readmission Measure Methodology Report on the CMS Web site at: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/HospitalQualityInits/Measure-Methodology.html.

Hannan EL, Zhong Y, Lahey SJ, et al. 30-day readmissions after coronary artery bypass graft surgery in New York State. JACC Cardiovasc Interv. 2011;4(5):569-576.

(6) Inclusion and Exclusion Criteria

The measure includes hospitalizations for patients who are 65 years of age or older at the time of index admission and for whom there was a complete 12 months of Medicare FFS enrollment to allow for adequate risk adjustment. The measure excludes the following admissions from the measure cohort: (1) Admissions for patients who are discharged against medical advice (excluded because providers do not have the opportunity to deliver full care and prepare the patient for discharge); (2) admissions for patients who die during the initial hospitalization (these patients are not eligible for readmission); (3) admissions for patients with subsequent qualifying CABG procedures during the measurement period (a repeat CABG procedure during the measurement period very likely represents a complication of the original CABG procedure and is a clinically more complex and higher risk surgery, therefore we select the first CABG admission for inclusion in the measure and exclude subsequent CABG admissions from the cohort); and (4) admissions for patients without at least 30 days post-discharge enrollment in Medicare FFS (excluded because the 30-day readmission outcome cannot be assessed in this group).
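
For illustration only, the following sketch (written in Python, with hypothetical flags that are not the actual claims data elements) applies the four exclusions described above to an index admission record:

def in_cabg_readmission_cohort(admission, is_first_qualifying_cabg):
    # admission: hypothetical flags summarizing the index stay.
    excluded = (
        admission["discharged_against_medical_advice"]           # exclusion (1)
        or admission["died_during_index_stay"]                    # exclusion (2)
        or not is_first_qualifying_cabg                            # exclusion (3): keep only the first CABG admission
        or not admission["ffs_enrolled_30_days_post_discharge"]   # exclusion (4)
    )
    return not excluded

# Example: an index admission with none of the exclusions is retained in the cohort.
print(in_cabg_readmission_cohort(
    {"discharged_against_medical_advice": False,
     "died_during_index_stay": False,
     "ffs_enrolled_30_days_post_discharge": True},
    is_first_qualifying_cabg=True))  # prints True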

(7) Risk-Adjustment

The measure adjusts for differences across hospitals in how at risk their patients are for readmission relative to patients cared for by other hospitals. The measure uses claims data to identify patient clinical conditions and comorbidities to adjust patient risk for readmission across hospitals, but does not adjust for potential complications of care. We refer readers to section IV.H.4. of the preamble of this final rule for further discussion of risk adjustment for socioeconomic factors.

(8) Calculating the Risk-Standardized Readmission Ratio (RSRR)

The measure is calculated using hierarchical logistic modeling (HLM). This approach appropriately accounts for the types of patients a hospital treats (that is, hospital case mix), the number of patients it treats, and the quality of care it provides. The HLM is an appropriate statistical approach to measuring quality based on patient outcomes when the patients are clustered within hospitals (and therefore the patients' outcomes are not statistically independent) and the number of eligible patients for the measure varies from hospital to hospital. As noted above, the measure methodology defines hospital case mix based on the clinical diagnoses provided in the hospital claims for their patients' inpatient and outpatient visits for the 12 months prior to the CABG hospitalization, as well as those present in the claims for care at admission. The methodology, however, specifically does not account for diagnoses present in the index admission that may indicate complications rather than patient comorbidities.
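
Schematically, and only as a restatement of the general approach described above (the posted methodology report contains the authoritative specification), the hierarchical logistic model for patient i at hospital j can be written in LaTeX notation as:

\mathrm{logit}\{\Pr(Y_{ij}=1)\} = \alpha_j + \beta^{\top} Z_{ij}, \qquad \alpha_j \sim N(\mu, \tau^{2}),

where Y_{ij} indicates an unplanned readmission, Z_{ij} is the vector of patient risk-adjustment variables, and \alpha_j is a hospital-specific intercept. A hospital's “predicted” readmissions are computed using its own estimated intercept, while its “expected” readmissions are computed using the average intercept, in both cases holding the hospital's own patient case mix fixed.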

The RSRR is calculated as the ratio of the number of predicted readmissions to the number of expected readmissions, multiplied by the national unadjusted readmission rate. The ratio is greater than one for hospitals that have more readmissions than would be expected for an average hospital with similar cases, and less than one if the hospital has fewer readmissions than would be expected for an average hospital with similar cases. This approach is analogous to a ratio of an “observed” or “crude” rate to an “expected” or risk-adjusted rate used in other similar types of statistical analyses.
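
Expressed as an equation (our restatement of the calculation described above), the risk-standardized readmission rate for hospital j is:

\mathrm{RSRR}_{j} = \frac{\text{predicted readmissions}_{j}}{\text{expected readmissions}_{j}} \times \bar{y}_{\text{national}},

where \bar{y}_{\text{national}} is the national unadjusted readmission rate. A predicted-to-expected ratio above 1 yields an RSRR above the national rate (more readmissions than expected for an average hospital with a similar case mix), and a ratio below 1 yields an RSRR below the national rate.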

The RSRR is a point estimate—the best estimate of a hospital's readmission rate based on the hospital's case mix. For displaying the measure for the Hospital IQR Program, we computed an interval estimate, which is similar to the concept of a confidence interval, to characterize the level of uncertainty around the point estimate. We use the point estimate and interval estimate to determine hospital performance (for example, higher than expected, as expected, or lower than expected). For more detailed information on the calculation methodology, we refer readers to our Web site at: http://cms.gov/Medicare/QualityInitiativesPatientAssessmentInstruments/HospitalQualityInits/Measure-Methodology.html.

We invited public comment on this proposal.

Comment: Several commenters supported the inclusion of the CABG readmission measure in the Hospital IQR Program. One commenter specifically believed the CABG measure will lead to increased attention to care after discharge.

Response: We thank the commenters for their support.

Comment: Several commenters recommended that CMS focus on developing an electronically specified measure based on the ICD-10-CM/PCS coding system for future adoption instead of the current proposed measure.

Response: We thank the commenters and will take this suggestion into consideration as we move towards use of electronic clinical quality measures for CABG measures.

Comment: Several commenters did not support inclusion of the CABG readmission measure in the Hospital IQR Program because the measure is not NQF-endorsed.

Response: We proposed to include this non-NQF-endorsed measure under the Hospital IQR Program exception authority in section 1886(b)(3)(B)(IX)(bb) of the Act. Although the proposed measure is not currently NQF-endorsed, we considered available measures that have been endorsed or adopted by the NQF. We also are not aware of any other similar measures that have been endorsed or adopted by a consensus organization, and found no other feasible and practical measures on this topic. We refer readers to section IX.A.7. of the preamble of this final rule where we discuss other comments on our adoption of non-NQF-endorsed measures. On February 5, 2014, we submitted the Hospital-Level 30-Day All-Cause Unplanned Readmission Following Coronary Artery Bypass Graft (CABG) Surgery measure to NQF for endorsement.

Comment: Several commenters opposed the CABG readmission measure because it is not risk-adjusted for SES. Another commenter also suggested removing other readmission measures from the Hospital IQR Program until they are risk-adjusted for SES. Several commenters suggested following the recommendations of the NQF's expert panel, which propose that evidence be presented either in support of or against the inclusion of SES in the measure. A commenter requested that we risk adjust the measure for SES, stating that SES materially impacts the patient's likelihood of being readmitted and that the members of the NQF's panel examining adjustment for SES recommended adjusting for SES when appropriate. Another commenter stated that the lack of risk adjustment of this measure materially impacts the patient's likelihood of being readmitted.

Response: We appreciate the commenters' concerns and note that these concerns were addressed in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50653 through 50654, 50673 through 50674). As described in prior rulemaking, we do not currently risk adjust for SES in the Hospital IQR Program. However, we do risk adjust for comorbidities (that is, correlated illnesses) and other factors to ensure that hospitals are not penalized for serving populations that are sicker or have higher incidences of chronic disease.

We are aware that there are differing opinions regarding this approach. We appreciate the commenters' suggestions on the importance of addressing SES in the Hospital IQR Program. We have continued to consider and evaluate stakeholder concerns regarding the influence of patient socioeconomic status on clinical quality measures. We have received many comments regarding risk adjusting measures for SES in several quality programs. We refer readers to section IV.H.4. of the preamble of this final rule for further discussion of this issue.

Comment: Several commenters noted that there could be unintended consequences from adopting this measure. One commenter believed it is equally likely to result in hospitals avoiding complex cases in order to avoid a potential penalty. Another commenter requested that CMS carefully monitor CABG utilization in high-risk, older patients to ensure that hospitals are not avoiding performing the procedure on high-risk patients in order to appear to have lower readmission rates. If evidence is found that CABG surgeries are not being offered to high-risk patients, the commenters suggested that CMS may need to reconsider its risk adjustment methodology to mitigate this unintended consequence.

Response: We note that the measures are risk-adjusted to take into account clinically complicated conditions. We appreciate commenters' concerns for potential unintended consequences of the measure. We believe the measure is adequately risk-adjusted for high-risk patients and so will not create a disincentive to treat these patients, but we will consider monitoring for any shift in their care (for example, by evaluating the risk profile of Medicare patients undergoing surgery before and after commencement of public reporting). The proposed CABG readmission measure adjusts for differences across hospitals in the level of risk their patients have for readmission relative to patients cared for by other hospitals. The measure uses administrative claims data to identify patient clinical conditions and comorbidities to adjust patient risk for readmission across hospitals, but does not adjust for potential complications of care.

Comment: One commenter opposed the measure construction and risk-adjustment methodology, citing concerns that the low R-squared meant that the measure does not truly differentiate performance between hospitals.

Response: The commenter refers to the R-squared statistic, but this is not a statistic used to evaluate the CABG measures. Therefore, we are assuming the commenter's primary concern is with the c-statistic of the measure. The c-statistic evaluates the measure's ability to discriminate or differentiate among low- versus high-risk patients. For measures used to profile hospital performance the goal is not always to achieve the highest c-statistic possible. The role of risk-adjustment in hospital profiling models is to level the playing field for hospitals for measures that assess relative performance—that is, how well hospitals fare compared to others with a similar patient case-mix. The risk-adjustment variables should be only those that are inherent to the patient and present at the time of admission. Some variables that might increase predictive power, such as complications of care, would not be appropriate for inclusion in an outcome quality measure, even if they would lead to a higher c-statistic. The c-statistic of this CABG measure is similar to other measures that are NQF-endorsed and in use, such as the AMI/HF/PN readmission measures.

In addition, this measure's risk model has been validated using registry data from the Society of Thoracic Surgeons' (STS) Adult Cardiac Surgery Database, which produced nearly identical c-statistics in a matched set of patients with correlation coefficients between 0.92 and 0.96, depending upon the statistic used.

Suter L.G., Wang, C., Vellanky S., Potteiger J., Curtis J., Lin Z., Geary L.L., Krumholz H.M., Drye E.D. Hospital-level 30-day All-Cause Unplanned Readmission Following Coronary Artery Bypass Graft Surgery: Report prepared for the Centers for Medicare & Medicaid Services. 2012.

Comment: One commenter did not support the CABG readmission measure because it holds hospitals responsible for pre-existing underlying conditions. The commenter expected that risk adjustment would be unlikely to be sufficient.

Response: We believe that the measure should appropriately account for patient case mix, including pre-existing conditions. This measure's risk model includes a range of medical and surgical comorbidities predictive of complications and readmissions following CABG surgery. In addition, as noted above, the risk model has been validated against a clinical risk model using registry data from the STS' Adult Cardiac Surgery Database.

Suter L.G., Wang, C., Vellanky S., Potteiger J., Curtis J., Lin Z., Geary L.L., Krumholz H.M., Drye E.D. Hospital-level 30-day All-Cause Unplanned Readmission Following Coronary Artery Bypass Graft Surgery: Report prepared for the Centers for Medicare & Medicaid Services. 2012.

Comment: One commenter noted that the CABG readmission measure has poor discrimination ability.

Response: As outlined above, we believe the commenter's primary concern is with the c-statistic of the measure. Discrimination refers to the ability to distinguish high risk subjects from low risk. The c-statistic is one of the statistical tools used to assess discrimination. We would like to clarify the important difference between predictive models intended for patient-level risk-stratification versus models used to profile hospital performance. In a patient-level predictive model, the objective is to best predict patient outcomes; the risk-adjustment variables are a means to better predict these outcomes. As an example, a patient who has a serious complication of care may be at higher risk of mortality and readmission; therefore, complications might be useful to include in a model used for patient-level prediction.

By contrast, the role of risk-adjustment in a hospital profiling model is to level the playing field for hospitals for measures that assess relative performance—that is, how well hospitals are doing compared to others with similar patients. The risk-adjustment variables should be only those that are inherent to the patient and present at admission. Although risk adjusting for complications of care could increase the statistical power of a profiling model, it would not make sense to risk-adjust for complications here since it could lead hospitals with high rates of complications to appear to be performing better than hospitals with similar patients even though the quality of care is worse.

In addition, as noted above, this measure's risk model has been validated using registry data from the STS' Adult Cardiac Surgery Database and produced nearly identical c-statistics in a matched set of patients with correlation coefficients between 0.92 and 0.96, depending upon the statistic used.

Suter L.G., Wang, C., Vellanky S., Potteiger J., Curtis J., Lin Z., Geary L.L., Krumholz H.M., Drye E.D. Hospital-level 30-day All-Cause Unplanned Readmission Following Coronary Artery Bypass Graft Surgery: Report prepared for the Centers for Medicare & Medicaid Services. 2012.

Comment: One commenter requested that the measure differentiate between readmissions within and outside the control of the bypass surgeon.

Response: We interpret readmissions “within and outside the control of the bypass surgeon” to mean those that are only related to the CABG surgery. We proposed this measure for hospital-specific performance measurement, not for measurement of surgeon-level performance. The measure defines the outcome as “all-cause” unplanned readmissions rather than readmissions only related to the CABG surgery for several reasons. First, from the patient perspective, readmission for any reason is likely to be an undesirable outcome of care.

Second, there is no reliable way to determine whether a readmission is related to the previous hospitalization based on the documented cause of readmission. For example, a CABG patient with post-operative left ventricular dysfunction inadequately managed by the hospital performing the surgery may ultimately be readmitted for heart failure. It would be inappropriate to treat this readmission as unrelated to the care the patient received for their CABG surgery.

Third, the range of potentially avoidable readmissions also includes those not directly related to the index condition such as those resulting from medication reconciliation errors, poor communication at discharge, or inadequate follow-up post-discharge. Therefore, we believe that creating a comprehensive list of potentially avoidable readmissions related to the previous hospitalization's condition category would be arbitrary and, ultimately, challenging to implement.

Fourth, all existing CMS readmission measures report all-cause readmission, making this approach consistent with existing measures.

Fifth, research shows that readmission reduction interventions can reduce all-cause readmission, not only condition-specific readmission.

Finally, defining the outcome as all-cause readmissions may encourage hospitals to implement broader initiatives aimed at improving the overall care within the hospital and transitions from the hospital setting instead of limiting the focus to a narrow set of condition-specific approaches.

Comment: One commenter cited a study that concluded that readmission rates for the majority of hospitals were unreliable due to low case volume over the measurement period.

Shih and Dimick. Reliability of readmission rates as a hospital quality measure in cardiac surgery. Annals of Thoracic Surgery. April 2014; 97:1214-1219.

Response: The study cited uses a different approach to calculate hospital-specific risk-adjusted readmission rates, including a logistic regression model and distinct risk variables, than that used in our proposed measure. Our proposed measure uses a hierarchical logistic regression model to account for the clustering of patients within hospitals while risk-adjusting for differences in patient case-mix. Therefore, we do not believe that conclusions drawn from this study are generalizable to this measure. Reliability testing of this measure score using a split-sample approach, in which each hospital's patients are divided into two completely distinct groups and the measure score is calculated for each group and compared, produces an intraclass correlation coefficient of 0.33 on a three-year data sample (which is the equivalent of a year and a half of data for each comparison group).

One limitation of this split-sample approach is that the reliability is estimated under the assumption of only half the number of patients per hospital that would normally be used. The Spearman-Brown prophecy formula can be used to estimate the reliability of the measure if the entire three-year cohort were used (that is, if the number of items in a test increases by a factor of N, the new reliability ρ' can be estimated from the original reliability ρ). Validity for this measure has been documented by both: (1) face validity assessment by a Technical Expert Panel (TEP), in which 79 percent of TEP members agreed (71 percent moderately or strongly agreed) that the measure will provide an accurate reflection of quality; and (2) a formal validation study against clinical registry data that documented correlations in excess of 0.90 between clinical data and claims-based risk models.
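
For reference, the Spearman-Brown prophecy formula referred to above is, in its standard form:

\rho' = \frac{N\rho}{1 + (N-1)\rho}.

Purely as an arithmetic illustration (not a figure taken from the measure documentation), projecting the split-sample intraclass correlation of 0.33 to the full three-year cohort (N = 2) gives \rho' = (2)(0.33)/(1 + 0.33), or approximately 0.50.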

Traub, R. (1994). MMSS Reliability for the Social Sciences: Theory and Applications (p. 100). Newbury Park, CA: Sage Publications.

Suter L.G., Wang C., Araas M., Joyce E., Vellanky S., Potteiger J., Lin Z., Curtis J., Geary L.L., Krumholz H.M., Drye E.D. Hospital-level 30-day All-Cause Mortality Following Coronary Artery Bypass Graft Surgery: Report prepared for the Centers for Medicare & Medicaid Services. 2012.

To assess face validity, we surveyed the Technical Expert Panel and asked each member to rate the following statement using a six-point scale (1=Strongly Disagree, 2=Moderately Disagree, 3=Somewhat Disagree, 4=Somewhat Agree, 5=Moderately Agree, and 6=Strongly Agree): “The readmission rates obtained from the readmission measure as specified will provide an accurate reflection of quality.” Fourteen TEP members provided the following responses: Moderately Disagreed (2), Somewhat Disagreed (2), Somewhat Agreed (4), Moderately Agreed (5), and Strongly Agreed (1). Therefore, 71 percent of TEP members agreed (43 percent moderately or strongly agreed) that the measure will provide an accurate reflection of quality.

After consideration of the public comments we received, we are finalizing the Hospital 30-day, All-cause, Unplanned, Risk-Standardized Readmission Rate (RSRR) Following Coronary Artery Bypass Graft (CABG) Surgery measure as proposed.

b. Hospital 30-day, All-cause, Risk-Standardized Mortality Rate (RSMR) Following Coronary Artery Bypass Graft (CABG) Surgery.

Krumholz H. CABG Mortality Measure Methodology Report Section 1, Subtask 3.1, Deliverable #49a: Yale New Haven Systems Corporation; Center for Outcomes Research and Evaluation; 2012.

(1) Background

CABG is a priority area for outcomes measure development because it is a common procedure associated with considerable morbidity, mortality, and health care spending. In 2007, there were 114,028 hospitalizations for CABG surgery and 137,721 hospitalizations for combined surgeries for CABG and valve procedures (“CABG plus valve” surgeries) among Medicare FFS patients in the U.S.

Drye E, Krumholz H, Vellanky S, Wang Y. Probing New Conditions and Procedures for New Measure Development: Yale New Haven Systems Corporation; Center for Outcomes Research and Evaluation; 2009:7.

CABG surgeries are costly procedures that account for the majority of major cardiac surgeries performed nationally. In FY 2009, isolated CABG surgeries accounted for almost half (47.6 percent) of all cardiac surgery hospital admissions in Massachusetts. This provides an example of the frequency with which a CABG is performed for a patient admitted for cardiac surgery. In 2008, the average Medicare payment was $30,546 for CABG without valve and $47,669 for CABG plus valve surgeries.

Massachusetts Data Analysis Center. Adult Coronary Artery Bypass Graft Surgery in the Commonwealth of Massachusetts: Hospital and Surgeons Risk-Standardized 30-Day Mortality Rates. Boston, MA: Massachusetts Department of Public Health; 2009:77.

Pennsylvania Health Care Cost Containment Council. Cardiac Surgery in Pennsylvania 2008-2009. Harrisburg; 2011:60.

Mortality rates following CABG surgery are not insignificant and vary across hospitals. For example, in 2009 Medicare FFS data, the median hospital-level, risk-standardized mortality rate after CABG was 3.0 percent and ranged from 1.5 percent to 7.9 percent. Even within a single state, the observed in-hospital, 30-day, all-cause, hospital-level mortality rate was 1.81 percent and ranged from 0.0 percent to 5.6 percent among patients who were discharged after CABG surgery (without any other major heart surgery earlier in the hospital stay) in New York in 2008. The risk-adjusted mortality rate ranged from 0.0 percent to 8.2 percent.

Ibid.

New York State Department of Health. Adult Cardiac Surgery in New York State 2006-2008; 2010:54.

Variation in these rates suggests that there is room for improvement. An all-cause mortality measure for patients who undergo CABG surgery will provide hospitals with an incentive to reduce mortality through improved coordination of perioperative care and discharge planning. This is further supported by the success of registry-based mortality measures in reducing CABG mortality rates. For example, California reports that CABG mortality in that state has steadily declined from 2.9 percent in 2003, the first year of mandatory reporting of their state registry measure, to 2.2 percent in 2008.

California CABG Outcomes Reporting Program. The California Report on Coronary Artery Bypass Graft Surgery: 2007-2008 Hospital and Surgeon Data. 2011:119.

The specifics of the measure methodology are included in the measure methodology report we have posted on our Web site at: http://cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/HospitalQualityInits/Measure-Methodology.html. We refer readers to the report for further details on the risk-adjustment statistical model.

We proposed to include this non-NQF-endorsed measure in the Hospital IQR Program under the exception authority in section 1886(b)(3)(B)(IX)(bb) of the Act as previously discussed in section IX.A.7. of the preamble of this final rule. Although the proposed measure is not currently NQF-endorsed, we considered available measures that have been endorsed or adopted by the NQF, and we were unable to identify any measures that assess hospital 30-day, all-cause, risk-standardized mortality rate (RSMR) following coronary artery bypass graft (CABG) surgery. We also are not aware of any other 30-day, all-cause, RSMR measures that have been endorsed or adopted by a consensus organization, and found no other feasible and practical measures on this topic. The measure has been reviewed by the MAP and was conditionally supported pending NQF endorsement, as detailed in its Pre-Rulemaking 2014 MAP Recommendations Report, available at: https://www.qualityforum.org/Publications/2014/01/MAP_Pre-Rulemaking_Report__2014_Recommendations_on_Measures_for_More_than_20_Federal_Programs.aspx. This measure was submitted to NQF on March 17, 2014 and is currently under review.

(2) Overview of Measure

The CABG mortality measure assesses hospitals' 30-day, all-cause risk-standardized rate of mortality following admission for a CABG procedure. In general, the measure uses the same approach to risk adjustment and hierarchical logistic modeling (HLM) methodology that is specified for CMS' other mortality measures previously adopted for this program. Information on how the measure employs HLM can be found in the 2012 CABG Mortality Measure Methodology Report (available at: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/HospitalQualityInits/Measure-Methodology.html).

(3) Data Sources

The proposed measure is claims-based. It uses Medicare administrative data from hospitalizations for Medicare FFS beneficiaries hospitalized for a CABG procedure.

(4) Outcome

The outcome for this measure is 30-day, all-cause mortality, defined as death from any cause within 30 days of the index procedure date. We use a standard period of assessment so that the outcome for each patient is measured consistently. Without a standard period, variation in length of stay would have an undue influence on mortality rates, and institutions would have an incentive to adopt strategies to shift deaths out of the hospital without improving quality. The measure differs from the timeframe used in the other 30-day mortality measures in the Hospital IQR Program by starting the outcome window from the procedure date rather than the admission date. Data from 2009 Medicare FFS patients demonstrate that 25 percent of CABG procedures occurred more than 3 days after the admission date. Therefore, dating the measurement period from admission would potentially underestimate the period of risk for a substantial number of hospitals.
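To make the distinction between a procedure-anchored and an admission-anchored outcome window concrete, the following Python sketch (illustrative only; the dates and function names are hypothetical and are not drawn from the measure specification) shows how a death could fall inside a 30-day window measured from the procedure date yet outside a window measured from the admission date.

```python
from datetime import date, timedelta
from typing import Optional

OUTCOME_WINDOW = timedelta(days=30)

def died_within_window(procedure_date: date, death_date: Optional[date]) -> bool:
    """30-day all-cause mortality outcome: death from any cause within
    30 days of the index CABG procedure date (not the admission date)."""
    if death_date is None:
        return False
    return procedure_date <= death_date <= procedure_date + OUTCOME_WINDOW

# Hypothetical example: the procedure occurs on hospital day 5.
admission = date(2013, 1, 1)
procedure = date(2013, 1, 5)
death = date(2013, 2, 2)

print(died_within_window(procedure, death))   # True: 28 days after the procedure
print(death <= admission + OUTCOME_WINDOW)    # False: 32 days after admission
```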

We chose 30-day mortality because it is an outcome that can be strongly influenced by hospital care and the early transition to the outpatient setting. Clinical experts concur that a 30-day timeframe is clinically sensible for measuring outcomes following CABG surgery.

The measure assesses all-cause mortality rather than CABG-specific mortality for several reasons. First, limiting the measure to CABG-related mortalities may limit the focus of efforts to improve care to a narrow set of approaches as opposed to encouraging broader initiatives aimed at improving the overall in-hospital care. Second, cause of death may be unreliably recorded and it is often not possible to exclude quality issues and accountability based on the documented cause of mortality. Finally, from a patient perspective, death due to any cause is the outcome that matters.

(5) Cohort

The cohort includes patients aged 65 years and older who received a qualifying CABG procedure at an acute care facility. Patients are eligible for inclusion if they had a qualifying CABG procedure and continuous enrollment in Medicare FFS one year prior to the first day of the index hospital stay and through 30 days post-procedure.

In order to include a clinically-coherent set of patients in the measure, we sought input from clinical experts regarding the inclusion of other concomitant cardiac and non-cardiac procedures, such as valve replacement and carotid endarterectomy. Adverse clinical outcomes following such procedures are higher than those following “isolated” CABG procedures, that is, CABG procedures performed without concomitant high-risk cardiac and non-cardiac procedures. Limiting the measure cohort to “isolated” CABG patients is consistent with published reports of CABG outcomes; therefore, the measure cohort considers only patients undergoing isolated CABG as eligible for inclusion in the measure. We defined isolated CABG patients as those undergoing CABG procedures without concomitant valve or other major cardiac, vascular or thoracic procedures. In addition, our clinical experts, consultants, and Technical Expert Panel (TEP) members agreed that an isolated CABG cohort is a clinically coherent cohort for quality measurement. For detailed information on the cohort definition, we refer readers to the 2012 CABG Mortality Measure Methodology Report on the CMS Web site at: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/HospitalQualityInits/Measure-Methodology.html.

Hannan EL, Zhong Y, Lahey SJ, et al. 30-day readmissions after coronary artery bypass graft surgery in New York State. JACC Cardiovasc Interv. 2011;4(5):569-576.

Ibid.

(6) Inclusion and Exclusion Criteria

The measure includes hospitalizations for patients who are 65 years of age or older at the time of index admission and for whom there was a complete 12 months of Medicare FFS enrollment to allow for adequate risk adjustment. The measure excludes the following admissions from the measure cohort: (1) admissions for patients who leave the hospital against medical advice (excluded because providers do not have the opportunity to deliver full care and prepare the patient for discharge); and (2) admissions for patients with subsequent qualifying CABG procedures during the measurement period (a repeat CABG procedure during the measurement period very likely represents a complication of the original CABG procedure and is a clinically more complex and higher risk surgery; therefore, we select the first CABG admission for inclusion in the measure and exclude subsequent CABG admissions from the cohort).
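The following Python sketch is an illustrative rendering of these two exclusions only; the record layout and field names are hypothetical, and the actual measure applies its exclusions to Medicare claims as specified in the measure methodology report.

```python
# Hypothetical admission records; field names are illustrative only.
admissions = [
    {"patient_id": "A", "admit_date": "2012-03-01", "discharge_status": "home", "qualifying_cabg": True},
    {"patient_id": "A", "admit_date": "2012-08-15", "discharge_status": "home", "qualifying_cabg": True},
    {"patient_id": "B", "admit_date": "2012-05-10", "discharge_status": "left_ama", "qualifying_cabg": True},
]

def apply_cohort_exclusions(records):
    """Drop discharges against medical advice, then keep only each patient's
    first qualifying CABG admission in the measurement period."""
    kept, seen_patients = [], set()
    for adm in sorted(records, key=lambda a: a["admit_date"]):
        if not adm["qualifying_cabg"]:
            continue
        if adm["discharge_status"] == "left_ama":   # exclusion (1): left against medical advice
            continue
        if adm["patient_id"] in seen_patients:      # exclusion (2): subsequent CABG admission
            continue
        seen_patients.add(adm["patient_id"])
        kept.append(adm)
    return kept

print(len(apply_cohort_exclusions(admissions)))  # 1: only patient A's first CABG admission remains
```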

(7) Risk-Adjustment

The measure adjusts for differences across hospitals in how at risk their patients are for death relative to patients cared for by other hospitals. The measure uses claims data to identify patient clinical conditions and comorbidities to adjust patient risk for mortality across hospitals, but does not adjust for potential complications of care. We refer readers to section IV.H.4 of the preamble of this final rule for further discussion of risk-adjustment for socioeconomic factors.

(8) Calculating the Risk-Standardized Mortality Ratio (RSMR)

The measure is calculated using hierarchical logistic modeling (HLM). This approach appropriately accounts for the types of patients a hospital treats (that is, hospital case mix), the number of patients it treats, and the quality of care it provides. The HLM is an appropriate statistical approach to measuring quality based on patient outcomes when the patients are clustered within hospitals (and therefore the patients' outcomes are not statistically independent) and the number of eligible patients for the measure varies from hospital to hospital. As noted above, the measure methodology defines hospital case mix based on the clinical diagnoses provided in the hospital claims for their patients' inpatient and outpatient visits for the 12 months prior to the CABG hospitalization, as well as those present in the claims for care at admission. The methodology, however, specifically does not account for diagnoses present in the index admission that may indicate complications rather than patient comorbidities.

The RSMR is calculated as the ratio of the number of predicted deaths to the number of expected deaths and then the ratio is multiplied by the national unadjusted mortality rate. The ratio is greater than one for hospitals that have more deaths than would be expected for an average hospital with similar cases and less than one if the hospital has fewer deaths than would be expected for an average hospital with similar cases. This approach is analogous to a ratio of “observed” or “crude” rate to an “expected” or risk-adjusted rate used in other similar types of statistical analyses.
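As a minimal illustration of the ratio described above, assuming hypothetical per-patient probabilities of death computed with the hospital-specific intercept ("predicted") and with the average hospital intercept ("expected"), the RSMR could be computed as follows; the values are invented and are not measure results.

```python
import numpy as np

def risk_standardized_mortality_rate(predicted, expected, national_rate):
    """RSMR = (sum of predicted deaths / sum of expected deaths)
    multiplied by the national unadjusted mortality rate."""
    return (np.sum(predicted) / np.sum(expected)) * national_rate

# Invented per-patient probabilities for a single hospital.
predicted = np.array([0.04, 0.02, 0.06, 0.03])  # using the hospital-specific intercept
expected = np.array([0.03, 0.02, 0.05, 0.03])   # using the average hospital intercept
print(risk_standardized_mortality_rate(predicted, expected, national_rate=0.030))
# Ratio > 1, so this hospital's RSMR exceeds the 3.0 percent national rate.
```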

The RSMR is a point estimate—the best estimate of a hospital's mortality rate based on the hospital's case mix. For displaying the measure for the Hospital IQR Program, we computed an interval estimate, which is similar to the concept of a confidence interval, to characterize the level of uncertainty around the point estimate. We use the point estimate and interval estimate to determine hospital performance (for example, higher than expected, as expected, or lower than expected). For more detailed information on the calculation methodology, we refer readers to our Web site at: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/HospitalQualityInits/Measure-Methodology.html.
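A simplified sketch of how a point estimate and its interval estimate might be compared with the national rate to categorize performance is shown below; the actual categorization rules are described in the measure methodology report, and the interval values here are illustrative.

```python
def classify_performance(interval_lower, interval_upper, national_rate):
    """Simplified categorization: compare the interval estimate around the
    hospital's RSMR with the national unadjusted rate."""
    if interval_upper < national_rate:
        return "better than expected"
    if interval_lower > national_rate:
        return "worse than expected"
    return "no different than expected"

# Hypothetical interval estimate around a hospital's RSMR point estimate.
print(classify_performance(0.021, 0.028, national_rate=0.030))  # "better than expected"
```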

We invited public comment on this proposal.

Comment: Several commenters supported the inclusion of CABG mortality into the Hospital IQR Program.

Response: We thank the commenters for their support.

Comment: Several commenters opposed adoption of this measure because it is not NQF-endorsed.

Response: We proposed to include this non-NQF-endorsed measure under the Hospital IQR Program exception authority in section 1886(b)(3)(B)(IX)(bb) of the Act. This provision provides that, in the case of a specified area or medical topic determined appropriate by the Secretary for which a feasible and practical measure has not been endorsed by the entity with a contract under section 1890(a) of the Act, the Secretary may specify a measure that is not so endorsed as long as due consideration is given to measures that have been endorsed or adopted by a consensus organization identified by the Secretary. We refer readers to section IX.A.7. of the preamble of this final rule where we discuss other commenters' concerns regarding our use of non-NQF-endorsed measures.

Although the proposed measure is not currently NQF-endorsed, it is pending NQF endorsement. We considered available measures that have been endorsed or adopted by the NQF. We also are not aware of any other similar measures that have been endorsed or adopted by a consensus organization, and found no other feasible and practical measures on this topic. We refer readers to section IX.A.7. of the preamble of this final rule for a general discussion on adoption of non-NQF-endorsed measures. This measure was submitted to NQF for endorsement and is currently under review.

Comment: One commenter recommended that CMS focus on developing an electronically-specified measure based on ICD-10-CM/PCS for future adoption instead of the current proposed measure.

Response: We will take this suggestion into consideration as we move towards use of electronic clinical quality measures for CABG measures.

Comment: One commenter did not support the measure construction and risk-adjustment methodology, citing concerns that the low R-squared meant that the measure does not truly differentiate performance between hospitals.

Response: We refer readers to our discussion of this issue above in response to the same concern regarding our proposed Hospital 30-day, All-cause, Unplanned, Risk-Standardized Readmission Rate (RSRR) Following Coronary Artery Bypass Graft (CABG) Surgery measure.

Comment: One commenter expressed concerns about the reliability and validity of CMS' mortality measures. Several commenters opposed this measure because they believed that a more robust methodology is needed to appropriately hold hospitals accountable.

Response: We refer readers to our discussion of this issue above in response to the same concern expressed for reliability, validity, and robust methodology regarding our proposed Hospital 30-day, All-cause, Unplanned, Risk-Standardized Readmission Rate (RSRR) Following Coronary Artery Bypass Graft (CABG) Surgery measure.

We understand “robust” to mean having good reliability and validity, and we believe we demonstrated this in the response below, which is similar to the response for the CABG readmission measure.

Comment: Several commenters believed that the CABG mortality measure has poor discrimination ability. One commenter expressed concern regarding the ability of claims data to adequately adjust for mortality risk and suggested comparing results for this measure with results for the STS CABG mortality measure. Another commenter, expressing the same concern, suggested comparing results for this measure with results for the Risk-Adjusted Operative Mortality for CABG mortality measure.

Response: We thank the commenters for their feedback. The STS CABG measure provides a clinical model based upon registry data, while the CMS CABG mortality measure uses administrative claims data. These measures have similar but not identical mortality outcomes: the STS measure (NQF #0119) includes inpatient deaths beyond 30 days, while this measure (NQF #2558) excludes inpatient deaths beyond 30 days. For these reasons, we would not compare the results of these measures. We refer readers to our discussion of this issue in response to the same concern above regarding our proposed Hospital 30-day, All-cause, Unplanned, Risk-Standardized Readmission Rate (RSRR) Following Coronary Artery Bypass Graft (CABG) Surgery measure.

Comment: Several commenters did not support the CABG mortality measure because it does not risk adjust for SES. Commenters requested that CMS risk adjust the measure for SES, stating that SES materially impacts a patient's likelihood of death and that the members of the NQF panel convened to examine adjusting for SES recommended adjusting for SES when appropriate.

Response: We refer readers to our earlier responses in sections IX.A.6. and 7. of the preamble to this final rule under our Hospital IQR Program discussion. We also refer readers to our responses in section IV.H.4. of the preamble to this final rule for further discussion of this issue.

Comment: One commenter requested that CMS carefully monitor CABG utilization in high‐risk, older patients to ensure hospitals are not avoiding performing CABG surgery on high-risk patients in order to appear to have lower mortality rates. The commenter noted that if evidence is found that CABG surgeries are not being offered to high‐risk patients, CMS may need to reconsider its risk adjustment methodology to mitigate this unintended consequence.

Response: We refer readers to our discussion of this issue in response to the same concern regarding our proposed Hospital 30-day, All-cause, Unplanned, Risk-Standardized Readmission Rate (RSRR) Following Coronary Artery Bypass Graft (CABG) Surgery measure.

After consideration of the public comments we received, we are finalizing the Hospital 30-day, All-cause, Risk-standardized Mortality Rate (RSMR) Following Coronary Artery Bypass Graft (CABG) Surgery measure as proposed.

c. Hospital-level, Risk-standardized 30-day Episode-of-Care Payment Measure for Pneumonia

(1) Background

Providing high-value care is an essential part of our mission to provide better health care for individuals, better health for populations, and lower costs for health care. In order to incentivize innovation that promotes high-quality care at high value, it is critical to examine measures of payment and patient outcomes concurrently. There is evidence of variation in payments at hospitals for pneumonia patients; mean 30-day risk-standardized payment among Medicare FFS patients aged 65 or older hospitalized for pneumonia in 2008-2009 was $13,237, and ranged from $8,281 to $27,975 across 4,155 hospitals. However, high or low payments to hospitals are difficult to interpret in isolation. Some high payment hospitals may have better clinical outcomes when compared with low payment hospitals while other high payment hospitals may not have better outcomes. For this reason, the value of hospital care is more clearly assessed when pairing hospital payments with hospital quality. Therefore, we proposed to include this non-NQF-endorsed measure in the Hospital IQR Program under the exception authority in section 1886(b)(3)(B)(IX)(bb) of the Act as previously discussed in section IX.A.7. of the preamble of this final rule. Although the proposed measure is not currently NQF-endorsed, we considered available measures that have been endorsed or adopted by the NQF, and we were unable to identify any measures that assess hospital risk-standardized payment associated with a 30-day episode-of-care for pneumonia. We also are not aware of any other 30-day episode-of-care pneumonia measures that have been endorsed or adopted by a consensus organization, and found no other feasible and practical measures on this topic. The MAP supports this measure but reiterated the need for this measure to be submitted for NQF endorsement: https://www.qualityforum.org/Setting_Priorities/Partnership/MAP_Final_Reports.aspx. This measure was submitted to the NQF for endorsement on April 18, 2014.

We believe it is important to adopt this measure as pneumonia is one of the leading causes of hospitalization for Americans 65 and over, and pneumonia patients incur roughly $10 billion in aggregate health care costs. Furthermore, because 30-day all-cause mortality and readmission measures for pneumonia are already publicly reported, pneumonia serves as a model condition for assessing relative value for an episode of care that begins with an acute hospitalization. Including this measure in the Hospital IQR Program and publicly reporting it on Hospital Compare will allow stakeholders to assess information about a hospital's quality and cost of care for pneumonia. The measure reflects differences in the management of care for patients with pneumonia both during hospitalization and immediately post-discharge. By focusing on one specific condition, value assessments may provide actionable feedback to hospitals and incentivize targeted improvements in care.

Lindenauer PK, Lagu T, Shieh M, Pekow PS, Rothberg MB. Association of diagnostic coding with trends in hospitalizations and mortality of patients with pneumonia, 2003-2009. JAMA: The Journal of the American Medical Association. 2012;307(13):1405-1413.

(2) Overview of Measure and Rationale for Examining Payments for a 30-Day Episode-of-Care

The pneumonia payment measure assesses hospital risk-standardized payment associated with a 30-day episode-of-care for pneumonia for any hospital participating in the Hospital IQR Program. The measure includes Medicare FFS patients aged 65 or older admitted for pneumonia and calculates payments for these patients over a 30-day episode-of-care beginning with the index admission. In general, the measure uses the same approach to risk-adjustment as our 30-day outcome measures previously adopted for the Hospital IQR Program. We refer readers to our Web site at: http://cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/HospitalQualityInits/Measure-Methodology.html.

When examining variation in payments, consideration of the episode-of-care triggered by admission is meaningful for several reasons. First, hospitalizations represent a brief period of illness that requires ongoing management post-discharge and decisions made at the admitting hospital affect payments for care in the immediate post-discharge period. Second, attributing payments for a continuous episode-of-care to admitting hospitals may reveal practice variations in the full care of the illness that can result in increased payments. Third, a 30-day preset window provides a standard observation period by which to compare all hospitals. Lastly, the pneumonia payment measure is intended to be paired with our 30-day pneumonia mortality and readmission measures and capture payments for Medicare patients across care settings, services, and supplies, except for Medicare Part D (that is, inpatient, outpatient, skilled nursing facility, home health, hospice, physician/clinical laboratory/ambulance services, supplier Part B items, and durable medical equipment, prosthetics/orthotics, and supplies).

We have posted the measure methodology report on our Web site at: http://cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/HospitalQualityInits/Measure-Methodology.html. We refer readers to the report for further details on the risk adjustment statistical model as well as the model results.

(3) Data Sources

The proposed measure is claims-based and uses Medicare administrative data that contain hospitalizations and payments for Medicare FFS beneficiaries hospitalized with pneumonia.

(4) Outcome

The primary outcome of the pneumonia payment measure is the hospital-level risk-standardized payment for a pneumonia episode-of-care. The measure captures payments for Medicare patients across all care settings, services, and supplies, except Part D. By risk-standardizing the payment measure, we are able to adjust for case-mix at any given hospital and compare a specific hospital's pneumonia payment to other hospitals with the same case-mix. The analytic time frame for the pneumonia payment measure begins with the index admission for pneumonia and ends 30 days post-admission.

In order to isolate payment variation that reflects practice patterns rather than CMS payment adjustments, the pneumonia payment measure excludes policy and geography payment adjustments unrelated to clinical care decisions. We achieve this by “stripping” or “standardizing” payments for each care setting. Stripping refers to removing geographic differences and policy adjustments in payment rates for individual services from the total payment for that service. Standardizing refers to averaging payments across geographic areas for those services where geographic differences in payment cannot be stripped. Stripping and standardizing the payment amounts allows for a fair comparison across hospitals based solely on payments for decisions related to clinical care of pneumonia.
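The sketch below is a rough illustration of the stripping and standardizing concepts, assuming a simplified wage-index adjustment and a hypothetical labor-related share; it is not the actual IPPS payment formula or the measure's payment standardization algorithm.

```python
import statistics

def strip_geographic_adjustment(observed_payment, wage_index, labor_share=0.62):
    """Illustrative 'stripping': back out a wage-index adjustment so the payment
    reflects a national (wage index = 1.0) rate. The 0.62 labor-related share
    is an assumption for illustration, not the published IPPS figure."""
    return observed_payment / (labor_share * wage_index + (1 - labor_share))

def standardize_to_national_average(payments_for_service):
    """Illustrative 'standardizing': use the national average payment for a
    service when its geographic components cannot be stripped."""
    return statistics.mean(payments_for_service)

print(round(strip_geographic_adjustment(12_000, wage_index=1.20), 2))  # ~10676.16
print(standardize_to_national_average([900, 1_100, 1_000]))            # 1000
```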

(5) Cohort

We created the pneumonia payment measure cohort to be aligned with the publicly reported pneumonia mortality measure cohort. Consistent with these measures, the pneumonia payment measure includes hospitalizations with a principal hospital discharge diagnosis of pneumonia using the International Classification of Diseases, 9th Edition, Clinical Modification (ICD-9-CM). These measures will use data from July 2010-June 2013, which does not yet include the period for which ICD-10 codes are mandatory. We refer readers to our discussion of data collection for this measure during the transition period from ICD-9-CM codes to ICD-10-CM/PCS codes (79 FR 28232). A full list of ICD-9-CM codes included in the final cohort can be found in Appendix B of the technical report on our Web site at: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/HospitalQualityInits/Measure-Methodology.html. The measure includes only those hospitalizations from short-stay acute care hospitals in the index cohort and restricts the cohort to patients enrolled in FFS Medicare Parts A and B (with no Medicare Advantage coverage).

(6) Inclusion and Exclusion Criteria

The pneumonia payment measure includes hospitalizations for patients 65 years or older at the time of index admission and for whom there was a complete 12 months of FFS enrollment to allow for adequate risk adjustment. An index admission/hospitalization is the initial pneumonia admission that triggers the 30-day episode-of-care for this payment calculation. The measure excludes the following admissions from the measure cohort: (1) Admissions for patients with fewer than 30 days of post-admission enrollment in Medicare because this is necessary in order to identify the outcome (payments) in the sample over the analytic period; (2) admissions for patients having a principal diagnosis of pneumonia during the index hospitalization who were transferred from another acute care facility are excluded, because the hospital where the patient was initially admitted made the critical acute care decisions (including the decision to transfer and where to transfer); (3) admissions for pneumonia patients who were discharged on the same or next day as the index admission and did not die or get transferred are excluded, because it is unlikely these patients suffered a clinically significant pneumonia; (4) admissions for patients enrolled in the Medicare Hospice program any time in the 12 months prior to the index hospitalization, including the first date of the index admission are excluded, because it is likely that these patients are continuing to seek comfort care and their goal may not be survival; (5) admissions for patients who are discharged alive and against medical advice are excluded because providers did not have the opportunity to deliver full care and prepare the patient for discharge; (6) admissions for patients transferred to or from federal or Veterans Administration hospitals are excluded, because we do not have claims data for these hospitals; thus, including these patients would systematically underestimate payments; and (7) admissions without a DRG or DRG weight for the index hospitalization are excluded, because we cannot calculate a payment for these patients' index admission using the IPPS; this would underestimate payments for the entire episode-of-care. There are two portions of the DRG system that determine how much a provider is reimbursed. The first is the DRG itself which indicates the reason a patient was admitted. The second is the DRG weight which determines the severity of the admission. Without either of these, we were unable to calculate the payment for the index admission.
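As a simplified illustration of why both the DRG and the DRG weight are needed, the following sketch computes a stylized index admission payment as the DRG weight times a hypothetical base rate; the actual IPPS payment calculation includes wage-index, teaching, disproportionate share, and other adjustments.

```python
def index_admission_payment(drg_weight, base_rate):
    """Stylized index admission payment: DRG weight (resource intensity)
    times a hypothetical standardized base rate. Both the DRG and its weight
    must be present on the claim for the payment to be calculable."""
    if drg_weight is None:
        raise ValueError("Admission excluded: no DRG or DRG weight on the index claim")
    return drg_weight * base_rate

print(index_admission_payment(drg_weight=1.42, base_rate=5_500.0))  # 7810.0
```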

(7) Risk Adjustment

The measure adjusts for differences across hospitals in how payments are affected by patient comorbidities relative to patients cared for by other hospitals. We refer readers to section IV.H.4 of the preamble of this final rule for further discussion of risk-adjustment for socioeconomic factors.

(8) Calculating the Risk-Standardized Payment (RSP)

The measure is calculated using a hierarchical generalized linear model with a log link and a Poisson error distribution. This is a widely accepted statistical method that enables fair evaluation of relative hospital performance by taking into account patient risk factors as well as the number of patients that a hospital treats. This statistical model accounts for the structure of the data (patients clustered within hospitals) and calculates: (1) how much variation in hospital payment overall is accounted for by patients' individual risk factors (such as age and other medical conditions); and (2) how much variation is accounted for by hospital-specific performance. This approach appropriately models a positive, continuous, right-skewed outcome like payment and also accounts for the types of patients a hospital treats (that is, hospital case mix), the number of patients it treats, and the quality of care it provides. The hierarchical generalized linear model is an appropriate statistical approach to measuring quality based on patient outcomes when the patients are clustered within hospitals and sample sizes vary across hospitals. Clustered patients are within the same hospital, and the quality of care of the hospital affects all patients, so the outcomes for each hospital's patients are not fully independent (that is, completely unrelated) as is assumed by many statistical models. As noted above, the measure methodology defines hospital case mix based on the clinical diagnoses provided in the hospital claims for their patients' inpatient and outpatient visits for the 12 months prior to the pneumonia hospitalization, as well as those present in the claims for care at admission. This methodology specifically does not, however, account for diagnoses present in the index admission that may indicate complications rather than patient comorbidities.

The RSP is calculated as the ratio of predicted payments to expected payments and then the ratio is multiplied by the national unadjusted average payment for an episode of care. The ratio is greater than one for hospitals that have higher payments than would be expected for an average hospital with similar cases and less than one if the hospital has lower payments than would be expected for an average hospital with similar cases. This approach is analogous to a ratio of “observed” or “crude” rate to an “expected” or “risk-adjusted” rate used in other similar types of statistical analyses.
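The RSP ratio parallels the RSMR ratio sketched earlier, with payments in place of deaths. A minimal illustration, again with invented values, follows; the $13,237 national average is taken from the 2008-2009 figure cited above and is used here only as an example input.

```python
import numpy as np

def risk_standardized_payment(predicted, expected, national_avg_payment):
    """RSP = (sum of predicted payments / sum of expected payments)
    multiplied by the national unadjusted average episode payment."""
    return (np.sum(predicted) / np.sum(expected)) * national_avg_payment

predicted = np.array([14_800.0, 11_200.0, 16_500.0])  # with this hospital's effect
expected = np.array([13_900.0, 11_500.0, 15_800.0])   # with the average hospital effect
print(round(risk_standardized_payment(predicted, expected, national_avg_payment=13_237.0), 2))
```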

The RSP is a point estimate—the best estimate of a hospital's payment based on the hospital's case mix. To calculate the measure for the Hospital IQR Program, we computed an interval estimate, which is similar to the concept of a confidence interval, to characterize the level of uncertainty around the point estimate. We use the point estimate and interval estimate to determine hospital performance (for example, higher than expected, as expected, or lower than expected). The interval estimate indicates that the true value of the payment ratio lies between the lower limit and the upper limit of the interval. For more detailed information on the calculation methodology, we refer readers to our Web site at: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/HospitalQualityInits/Measure-Methodology.html.

This measure is meant to be paired with our 30-day pneumonia mortality and/or readmission measure in order for us to gain a better understanding of the value of care for a hospital's patients and the nation as a whole.

We invited public comment on this proposal.

Comment: One commenter did not support inclusion of the heart failure and pneumonia payment measures in the Hospital IQR Program because of concern that much of the variation in 30-day episode measures is attributable to factors outside of the hospital's control, most notably post-acute care (PAC) services. The commenter felt that measures of accountability should hold all entities accountable as opposed to focusing only on hospitals.

Response: We appreciate the commenter's suggestion and note that we addressed this question in the FY 2014 IPPS/LTCH PPS final rule. In that final rule (78 FR 50804), we stated that “when considering payments to hospitals, we attributed payments for a 30-day episode of care to the hospital since the episode is triggered by admission to an inpatient hospitalization. Hospitalizations represent a brief period of acute illness that requires ongoing management post-discharge and hospitals are often directly responsible for scheduling post-discharge follow-up. Therefore decisions made at the admitting hospital affect not only the hospitalization payments, but payments for care in the immediate post-discharge period.”

Comment: Several commenters generally supported and appreciated CMS' proposal to report 30-day risk-standardized episode of care payment measures for pneumonia and heart failure, as a way to optimally measure care for these patients. A commenter urged CMS to monitor measure results with respect to volume of procedures. A commenter supported condition-specific or more granular, episode-based payment measures over the Medicare Spending Per Beneficiary (MSPB) measure.

Response: We thank the commenters for their support. We will take their recommendations to monitor measure results with respect to volume of procedures and the request to add condition-specific or more granular, episode-based payment measures into consideration when planning future measure development.

Comment: Several commenters believed payment measures are necessary, but do not support payment measures that examine episodes of care beyond the inpatient admission due to variations in availability of PAC services.

Response: Because acute care providers make decisions that affect PAC spending, including scheduling follow-up care and others, we believe it is appropriate to attribute payments arising from the PAC setting to the acute care provider.

Comment: A commenter stated that CMS should adjust episode-based payment measures for outcome differences that accrue over clinically relevant time horizons.

Response: We thank the commenter for this suggestion and will consider it in the future. However, we believe that the proposed measure does account for outcome differences over clinically relevant time horizons because the measure captures payments for Medicare patients across all care settings, services, and supplies, except Part D.

Comment: Several commenters did not support the use of the Hospital-level, Risk-standardized 30-day Episode-of-Care Payment Measure for Pneumonia measure in the Hospital IQR Program because they believed that the proposed measure reflected actions of many health care entities that are beyond the hospital's control, such as cost variation in Medicare spending and notably PAC services. The commenters felt measures of accountability should hold all entities accountable as opposed to focusing only on hospitals. Commenters noted that hospitals are legally unable to direct patients toward high-quality, cost-efficient providers. Several commenters were concerned that the payment measures assume hospitals have more control over costs that occur post hospitalization than is realistic and that the measures reflect actions of many health care entities that are beyond the hospital's control. Therefore, the commenters stated that the pneumonia and heart failure payment measures were not appropriate for evaluating care exclusively in the inpatient setting.

Commenters suggested the measures would be more appropriate for physician, LTCH, home health, and PAC reporting programs. Several commenters believed measures should hold accountable all entities so that incentives are aligned across the continuum of care. A commenter noted that legal and regulatory challenges at the State and federal levels prevent hospitals from coordinating care as fully as possible and that episode of care measures holding only the hospital accountable create misaligned incentives, which could lead to unintended consequences.

Response: We appreciate the commenters' suggestions and note that we addressed many of these questions in the FY 2014 IPPS/LTCH PPS final rule. In that final rule (78 FR 50804), we stated that, “when considering payments to hospitals, we attributed payments for a 30-day episode of care to the hospital since the episode is triggered by admission to an inpatient hospitalization. Hospitalizations represent a brief period of acute illness that requires ongoing management post-discharge and hospitals are often directly responsible for scheduling post-discharge follow-up. Therefore decisions made at the admitting hospital affect not only the hospitalization payments, but payments for care in the immediate post-discharge period.” Finally, the objective of these episode of care payment measures is to encourage efficiencies gained by well-coordinated care across a patient's experience of illness.

We understand the commenters' concerns about differences among hospitals in the availability of post-acute services, such as LTCHs. We hope that the differences in episode payments revealed by these measures will catalyze hospitals, other providers, and communities to engage in an examination of local service availability to help ensure that efficient and sufficient services are available to all patients. Without the reporting of standardized episode payment measures, the knowledge of differences among hospitals' payment patterns would not be available to provide incentives for such efforts. Although hospitals alone are not responsible for all differences in episode payments, they are well-positioned to participate in such collaborations.

Comment: A commenter was disappointed that CMS continues to develop and adopt measures that examine episodes of care beyond the inpatient admission. The commenter stated that measures of accountability, such as the proposed episode measures, should hold accountable all entities so that the incentives are aligned across the continuum.

Another commenter opposed measures that reflect the broad spectrum of care inside and outside of the hospital. The commenter did not believe that measures that encompass a range of services from admission until 30 days post-discharge should be used as an indicator of hospital-specific care.

Response: We respectfully disagree with the commenters. We refer readers to our discussion of this issue in response to the same concern in the response above. In addition, we believe that these measures should reflect the broad spectrum of care inside a hospital as well as care transitions, which are important for hospitals' and for the health care system's efforts to reduce readmissions and prevent hospitals from being financially penalized. We believe measures that look beyond the discharge will encourage hospitals to communicate more effectively with their patients and their peers, thereby improving care, reducing costs, and improving the health of the nation.

Comment: A commenter did not support inclusion of the pneumonia or heart failure payment measures in in the Hospital IQR Program, because they do not exclude certain high-cost patients (patients with ESRD, cancer, or HIV/AIDS).

Response: We appreciate the concern about high-cost patients. The payment measures are intended to assess differences in payment associated with different practice patterns for the broad range of patients cared for by a hospital. We note that the episode of care measures account for the fact that some hospitals care for more patients with needs for high-cost care by risk adjusting for patients' conditions, such as cancer, rather than excluding such patients. In the course of selecting variables for risk-adjustment, high-cost chronic conditions such as cancer, end-stage renal disease, HIV/AIDS, and others are considered. Once the variables were considered, we determined whether each variable should be included in the measure. To be included in the measure, each risk variable must be found to be significantly and consistently related to the payment outcome in the risk-model selection process. We note that the Agency for Healthcare Research and Quality's Condition Categories for HIV/AIDS; Metastatic Cancer and Acute Leukemia; Lung, Upper Digestive Tract, and Other Severe Cancers; Lymphatic, Head and Neck, Brain, and Other Major Cancers; Dialysis Status; and Renal Failure are included in the final risk adjustment model for pneumonia payment. The Condition Categories for HIV/AIDS; Dialysis Status; and Renal Failure were also included in the final risk-adjustment model for HF payment. The HF measures' risk-adjustment was discussed at length by the NQF Cost and Resource Steering Committee. In its final vote, the NQF Cost and Resource Steering Committee recommended endorsement of the episode-of-care payment measure for heart failure.

Comment: A commenter was concerned that the measures unfairly disadvantage hospitals that treat sicker patients. For example, patients with heart failure who receive a defibrillator are sicker, however they are not excluded from the measures, so hospitals that perform this service appear less efficient.

Response: We appreciate the commenter's concern about complex patient factors that may contribute to the cost of care. The payment measures are risk-adjusted in order to account for differences in case-mix, or patient complexity, between hospitals. For each patient, the claims for the 12-months prior to the measured hospitalization are examined to identify additional clinical conditions that patients may have which could contribute to costs of care. These conditions are included in the risk-model for the measure to ensure that all providers are assessed fairly and avoid putting providers at risk of appearing to have patient costs that are higher than other hospitals due to the clinical complexity of their patients. Although we do not believe that the use of defibrillators is likely to substantially change hospitals' results, we appreciate this comment and plan to investigate the prevalence of defibrillators in the heart failure cohort and its effect on the payment outcome.

Comment: A commenter did not support the inclusion of the PN and HF payment measures in the Hospital IQR Program and recommended using a single hospital-wide payment measure instead of condition-specific payment measures to pool information for all patients to increase sample size and improve reliability.

Response: We believe the condition-specific payment measures are useful for several reasons. By focusing on one specific condition, payment measures may provide actionable feedback to hospitals and incentivize targeted improvements in care. Heart failure and pneumonia are both common conditions in the elderly with a substantial range in payments due to different practice patterns. Furthermore, because 30-day all-cause mortality and readmission measures for heart failure and pneumonia are already publicly reported, heart failure and pneumonia serve as model conditions for examining both payments for an episode-of-care and the quality of a hospital's care for the same patient population.

Comment: Commenters recommended that CMS pilot the PN and HF payment measures prior to implementation.

Response: We thank the commenters for their recommendation. We will consider this as we plan dry runs in the future. A dry run provides the opportunity for hospitals to review their measure results and ask questions about the measure methodology. The measure results used during a dry run are based on data outside of the performance period designated for a given fiscal year, and the measure results are made available to hospitals on a secure Web site and are not publicly reported. From our perspective, a dry run is a type of pilot in which hospitals become familiar with their measure results and the measure methodology.

Comment: A commenter requested that CMS transparently assess the reliability of the PN and HF payment measures prior to adoption into the Hospital IQR Program.

Response: We appreciate this feedback. We note that we have been transparent in assessing the reliability of the PN and HF payment measures, in that the measure methodologies for these measures contain the reliability testing results and have been posted at: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/HospitalQualityInits/Measure-Methodology.html in May 2015. We note that the Intraclass Correlation Coefficient (ICC) is a statistical method used to assess the reliability of measures. The ICC score can be used to determine the extent to which assessments of a hospital using different, but randomly selected, subsets of patients produce similar measures of hospital performance. To the extent that the calculated measures of these two subsets agree, we have evidence that the measure assesses an attribute of the hospital, not of the patients. The agreement between the two independent assessments of each hospital was 0.825 for the PN measure and 0.752 for the HF measure, which, according to the conventional interpretation, is “almost perfect” for the PN measure and “substantial” for the HF measure.
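For readers interested in how a split-half reliability statistic of this kind can be computed, the following Python sketch implements a standard one-way random-effects ICC on hypothetical per-hospital results from two randomly selected patient subsets; it is a generic illustration, not the exact computation used in the posted reliability testing.

```python
import numpy as np

def split_half_icc(scores_half_a, scores_half_b):
    """One-way random-effects ICC for two randomly split patient subsets per
    hospital; higher values indicate the measure reflects a hospital attribute
    rather than the particular patients sampled."""
    ratings = np.column_stack([scores_half_a, scores_half_b])
    n, k = ratings.shape
    grand_mean = ratings.mean()
    ms_between = k * np.sum((ratings.mean(axis=1) - grand_mean) ** 2) / (n - 1)
    ms_within = np.sum((ratings - ratings.mean(axis=1, keepdims=True)) ** 2) / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# Hypothetical per-hospital risk-standardized payments from two random halves.
half_a = np.array([12_900, 13_400, 14_800, 12_200, 15_100], dtype=float)
half_b = np.array([13_100, 13_200, 14_500, 12_600, 15_400], dtype=float)
print(round(split_half_icc(half_a, half_b), 3))  # high agreement for these invented values
```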

Landis JR, Koch GG. The measurement of observer agreement for categorical data. Biometrics. 1977:159-174.

Comment: A commenter was concerned about CMS measuring overuse, as there are patient scenarios that are not addressed by available evidence. The commenter stated that proper evaluation of validity and reliability is lacking; however, current registry-based measures are filling this gap. The commenter recommended halting the development and implementation of these measures.

Response: These measures are not specifically designed to identify overuse, which we interpret to mean using more resources than expected given how sick the patients are. Rather, the measures are designed to evaluate broad patterns of care, both within the inpatient environment and in the transition to the outpatient setting, that might lead to higher overall payments. As noted in another response above, the reliability and validity of these measures have been evaluated by both a Technical Expert Panel and the NQF Cost and Resource Use Standing Committee. We also analyzed the ICC score for these measures to help assess reliability. Although registry data offers some advantages, it is much more burdensome for hospitals to collect and is not uniformly available.

Comment: Several commenters requested that CMS adjust the payment measures for SES based on the NQF's expert panel recommendations.

Response: We refer readers to section IV.H.4. of the preamble to this final rule for further discussion of this issue.

After consideration of the public comments we received, we are finalizing the Hospital-Level, Risk-Standardized Payment Associated with a 30-Day Episode of Care for Pneumonia measure, as proposed.

d. Hospital-Level, Risk-Standardized 30-Day Episode-of-Care Payment Measure for Heart Failure

(1) Background

There is evidence of variation in payments at hospitals for heart failure patients; mean 30-day risk-standardized payment among Medicare FFS patients aged 65 or older hospitalized for heart failure in 2008-2009 was $13,922, and ranged from $9,630 to $20,646 across 3,714 hospitals. However, high or low payments to hospitals are difficult to interpret in isolation. Some high payment hospitals may have better clinical outcomes when compared with low payment hospitals while other high payment hospitals may not have better outcomes. For this reason, the value of hospital care is more clearly assessed when pairing hospital payments with hospital quality. Therefore, we proposed to include this non-NQF-endorsed measure, hospital risk-standardized payment associated with a 30-day episode-of-care for heart failure, in the Hospital IQR Program under the exception authority in section 1886(b)(3)(B)(IX)(bb) of the Act as previously discussed in section IX.A.7. of the preamble of this final rule. Although the proposed measure is not currently NQF-endorsed, we considered available measures that have been endorsed or adopted by the NQF, and we were unable to identify any measures that assess hospital risk-standardized payment associated with a 30-day episode-of-care for heart failure. We also are not aware of any other 30-day episode-of-care heart failure measures that have been endorsed or adopted by a consensus organization, and found no other feasible and practical measures on this topic. The MAP supports this measure but reiterated the need for this measure to be submitted for NQF endorsement: https://www.qualityforum.org/Setting_Priorities/Partnership/MAP_Final_Reports.aspx. The HF measure was submitted to the NQF and is currently under review as part of the cost and resource use project.

We believe it is important to adopt this measure as heart failure is one of the leading causes of hospitalization for Americans 65 and over and costs roughly $34 billion annually. Furthermore, because 30-day all-cause mortality and readmission measures for heart failure are already publicly reported, heart failure serves as a model condition for assessing relative value for an episode of care that begins with an acute hospitalization. Including this measure in the Hospital IQR Program and publicly reporting it on Hospital Compare will allow stakeholders to assess information about a hospital's quality and cost of care for heart failure. The measure reflects differences in the management of care for patients with heart failure both during hospitalization and immediately post-discharge. By focusing on one specific condition, value assessments may provide actionable feedback to hospitals and incentivize targeted improvements in care.

Russo CA, Elixhauser, A. Hospitalizations in the Elderly Population, 2003. Agency for Healthcare Research and Quality. 2006.

Heidenreich PA, Trogdon JG, Khavjou OA, Butler J, Dracup K, Ezekowitz MD, et al. Forecasting the future of cardiovascular disease in the United States: a policy statement from the American Heart Association. Circulation. 2011;123(8):933-44.

(2) Overview of Measure and Rationale for Examining Payments for a 30-Day Episode-of-Care

The heart failure payment measure assesses hospital risk-standardized payment associated with a 30-day episode-of-care for heart failure for any hospital participating in the Hospital IQR Program. The measure includes Medicare FFS patients aged 65 or older admitted for heart failure and calculates payments for these patients over a 30-day episode-of-care beginning with the index admission. In general, the measure uses the same approach to risk-adjustment as our 30-day outcome measures previously adopted for the Hospital IQR Program. We refer readers to the measure methodology report on our Web site at: http://cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/HospitalQualityInits/Measure-Methodology.html.

When examining variation in payments, consideration of the episode-of-care triggered by admission is meaningful for several reasons. First, hospitalizations represent brief periods of illness that require ongoing management post-discharge; and decisions made at the admitting hospital affect payments for care in the immediate post-discharge period. Second, attributing payments for a continuous episode-of-care to admitting hospitals may reveal practice variations in the full care of the illness that can result in increased payments. Third, a 30-day preset window provides a standard observation period by which to compare all hospitals. The term preset window means that every admission will be tracked 30 days post admission in order to apply a standardized measurement window. In order to compare payments across providers it is important that the comparison window is identical for each admission at each hospital. Lastly, the heart failure payment measure is intended to be paired with our 30-day heart failure mortality and readmission measures and capture payments for Medicare patients across all care settings, services, and supplies, except for Medicare Part D (that is, inpatient, outpatient, skilled nursing facility, home health, hospice, physician/clinical laboratory/ambulance services, supplier Part B items, and durable medical equipment, prosthetics/orthotics, and supplies).

We have posted the measure methodology report on our Web site at: http://cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/HospitalQualityInits/Measure-Methodology.html. We refer readers to the report for further details on the risk adjustment statistical model as well as the model results.

(3) Data Sources

The proposed measure is claims-based and uses Medicare administrative data that contain hospitalizations and payments for Medicare FFS beneficiaries hospitalized with heart failure.

(4) Outcome

The primary outcome of the heart failure payment measure is the hospital-level risk-standardized payment for a heart failure episode-of-care. The measure captures payments for Medicare patients across all care settings, services, and supplies, except Part D. By risk-standardizing the payment measure, we are able to adjust for case-mix at any given hospital and compare a specific hospital's heart failure payment to other hospitals with the same case-mix. The analytic time frame for the heart failure payment measure begins with the index admission for heart failure and ends 30 days post-admission. The index admission is any admission included in the measure calculation that begins the 30-day heart failure episode of care.

In order to isolate payment variation that reflects practice patterns rather than CMS payment adjustments, the heart failure payment measure excludes policy and geography payment adjustments unrelated to clinical care decisions. We achieve this by “stripping” or “standardizing” payments for each care setting. These concepts were also discussed previously in the proposed hospital-level, risk-standardized 30-day episode-of-care payment measure for pneumonia measure in section IX.A.7.c.(4) of the preamble of this final rule.

(5) Cohort

We created the heart failure payment measure cohort to be aligned with the publicly reported heart failure mortality measure cohort. Consistent with these measures, the heart failure payment measure includes hospitalizations with a principal hospital discharge diagnosis of heart failure using ICD-9-CM codes. A full list of the ICD-9-CM codes included in the final cohort can be found in Appendix B of the technical report on our Web site at: http://cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/HospitalQualityInits/Measure-Methodology.html. The measure will be using data from July 2010-June 2013, which does not yet include the period when ICD-10 codes are mandatory. We refer readers to our discussion of data collection for this measure during the transition period from ICD-9-CM codes to ICD-10-CM/PCS codes (79 FR 28234).

An index admission/hospitalization is the initial heart failure admission that triggers the 30-day episode-of-care for this payment calculation. The measure includes only those hospitalizations from short-stay acute care hospitals in the index cohort and restricts the cohort to patients enrolled in FFS Medicare Parts A and B (with no Medicare Advantage coverage). These hospitalizations are the admissions which were included in the measure after applying all inclusion/exclusion criteria.

(6) Inclusion and Exclusion Criteria

The heart failure payment measure includes hospitalizations for patients 65 years or older at the time of index admission and for whom there was a complete 12 months of FFS enrollment to allow for adequate risk adjustment. The measure excludes the following admissions from the measure cohort: (1) admissions for patients with fewer than 30 days of post-admission enrollment in Medicare because this is necessary in order to identify the outcome (payments) in the sample over the analytic period; (2) admissions for patients having a principal diagnosis of heart failure during the index hospitalization who were transferred from another acute care facility are excluded, because the hospital where the patient was initially admitted made the critical acute care decisions (including the decision to transfer and where to transfer); (3) admissions for heart failure patients who were discharged on the same or next day as the index admission and did not die or get transferred are excluded, because it is unlikely these patients suffered a clinically significant heart failure; (4) admissions for patients enrolled in the Medicare Hospice program any time in the 12 months prior to the index hospitalization, including the first date of the index admission are excluded, because it is likely that these patients are continuing to seek comfort care and their goal may not be survival; (5) admissions for patients who are discharged alive and against medical advice are excluded because providers did not have the opportunity to deliver full care and prepare the patient for discharge; (6) admissions for patients transferred to or from federal or Veterans Administration hospitals are excluded, because we do not have claims data for these hospitals; thus, including these patients would systematically underestimate payments; (7) admissions without a DRG or DRG weight for the index hospitalization are excluded, because we cannot calculate a payment for these patients' index admission using the IPPS; this would underestimate payments for the entire episode-of-care; and (8) admissions for patients who receive a heart transplant or LVAD during the index admissions or episode of care because these patients are clinically distinct, generally very high payment cases, and not representative of the typical heart failure patient that this measure aims to capture.

(7) Risk Adjustment

The measure adjusts for differences across hospitals in how payments are affected by patient comorbidities relative to patients cared for by other hospitals. We refer readers to section IV.H.4 of the preamble of this final rule for further discussion of risk-adjustment for socioeconomic factors.

(8) Calculating the Risk-Standardized Payment (RSP)

The measure is calculated using hierarchical generalized linear statistical models with a log link and a Gamma error distribution. This approach appropriately models a positive, continuous, right-skewed outcome like payment and also accounts for the types of patients a hospital treats (that is, hospital case-mix), the number of patients it treats, and the quality of care it provides. The hierarchical generalized linear model is an appropriate statistical approach to measuring quality based on patient outcomes when the patients are clustered within hospitals (and therefore the patients' outcomes are not statistically independent) and sample sizes vary across hospitals. As noted above, the measure methodology defines hospital case mix based on the clinical diagnoses provided in the hospital claims for their patients' inpatient and outpatient visits for the 12 months prior to the heart failure hospitalization, as well as those present in the claims for care at admission. This methodology specifically does not, however, account for diagnoses present in the index admission that may indicate complications rather than patient comorbidities.
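As a rough, non-hierarchical sketch of the model family described above, the following Python example fits a generalized linear model with a log link and a Gamma error distribution to simulated, right-skewed payment data using the statsmodels library; the finalized measure uses a hierarchical model with hospital-specific effects and claims-derived risk variables, which this simplified example omits.

```python
# A non-hierarchical simplification for illustration only; the finalized measure
# uses a hierarchical model with hospital-specific random effects.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
age = rng.integers(65, 95, size=n)
comorbidity_count = rng.poisson(2.0, size=n)
X = sm.add_constant(np.column_stack([age, comorbidity_count]).astype(float))

# Simulated positive, continuous, right-skewed episode payments.
true_mu = np.exp(8.8 + 0.005 * age + 0.08 * comorbidity_count)
payments = rng.gamma(shape=2.0, scale=true_mu / 2.0)

# Gamma family with a log link, as described for the heart failure payment measure
# (the pneumonia payment measure instead uses a Poisson error distribution).
model = sm.GLM(payments, X, family=sm.families.Gamma(link=sm.families.links.Log()))
result = model.fit()
print(result.params)          # intercept and coefficients on the log scale
print(result.predict(X)[:5])  # expected episode payments for the first five patients
```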

The RSP is calculated as the ratio of predicted payments to expected payments and then the ratio is multiplied by the national unadjusted average payment for an episode of care. The ratio is greater than one for hospitals that have higher payments than would be expected for an average hospital with similar cases and less than one if the hospital has lower payments than would be expected for an average hospital with similar cases. This approach is analogous to a ratio of “observed” or “crude” rate to an “expected” or “risk-adjusted” rate used in other similar types of statistical analyses.

The RSP is a point estimate—the best estimate of a hospital's payment based on the hospital's case mix. For displaying the measure for the Hospital IQR Program, we computed an interval estimate, which is similar to the concept of a confidence interval, to characterize the level of uncertainty around the point estimate. We use the point estimate and interval estimate to determine hospital performance (for example, higher than expected, as expected, or lower than expected). For more detailed information on the calculation methodology, we refer readers to our Web site at: http://cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/HospitalQualityInits/Measure-Methodology.html.

This measure is meant to be paired with our 30-day heart failure mortality and/or readmission measure in order for us to gain a better understanding of the value of care for a hospital's patients and the nation as a whole.

We invited public comment on this proposal.

Comment: Several commenters opposed the payment measures because they did not believe it is fair to hold a hospital responsible for payments that occur outside of its walls. The commenters recommended that these types of measures instead be adopted in the PQRS, as well as long-term care, PAC, home health, and other entities that participate in the patient's care.

One commenter cited a study that stated that 80 percent of the variability in the payment measures is driven by PAC and noted that areas with more LTCHs will likely have higher spending. Several commenters believed measures should hold accountable all entities so that incentives are aligned across the continuum of care. Another commenter noted that legal and regulatory challenges at the State and federal levels prevent hospitals from coordinating care as fully as possible and episode of care measures holding only the hospital accountable create misaligned incentives, which could lead to unintended consequences.

Response: We interpret the commenter's statement that, “these types of measures should instead be adopted in the PQRS, as well as long-term care, PAC, home health, and other entities that participate in the patient's care,” to mean the Long-Term Care Quality Reporting (LTCHQR) Program, PAC (all care provided after a patient is discharged from an index hospitalization), Home Health Quality Reporting Program and other CMS quality reporting programs applicable to entities that participate in the patient's care. As described above, because heart failure is one of the leading causes of hospitalization for Americans 65 and over, and its associated care costs roughly $34 billion annually, we believe it is appropriate to pair a measure of Medicare payments for heart failure with the existing quality measures on this topic. We intend to closely monitor the measure's effects on hospitals' and PAC providers' behavior.

We developed these measures in accordance with national guidelines and in consultation with clinical and measurement experts, key stakeholders, and the public. Furthermore, the AMI/HF measures were recommended for endorsement by the NQF Standing Committee for Cost and Resource Use, Phase 2. This information can be located in the following report: http://www.qualityforum.org/WorkArea/linkit.aspx?LinkIdentifier=id&ItemID=76905.

Krumholz HM, Brindis RG, Brush JE, et al. Standards for statistical models used for public reporting of health outcomes: an American Heart Association Scientific Statement from the Quality of Care and Outcomes Research Interdisciplinary Writing Group: cosponsored by the Council on Epidemiology and Prevention and the Stroke Council. Endorsed by the American College of Cardiology Foundation. Circulation. Jan 24 2006;113(3):456-462.

Krumholz HM, Keenan PS, Brush JE, Jr., et al. Standards for measures used for public reporting of efficiency in health care: a scientific statement from the American Heart Association Interdisciplinary Council on Quality of Care and Outcomes Research and the American College of Cardiology Foundation. Circulation. Oct 28 2008;118(18):1885-1893.

Comment: Several commenters expressed concern that the heart failure payment measure did not receive NQF endorsement, and specifically, that the Cardiovascular Technical Advisory Panel or the Cost and Resource Use Standing Committee did not endorse the measure. These commenters noted that the Cost and Resource Use Standing Committee felt the risk model did not properly account for differences in patient risk and it was not until CMS pressed for a third vote that it received endorsement (see http://www.qualityforum.org/WorkArea/linkit.aspx?LinkIdentifier=id&ItemID=76905). Consequently, the commenters believed the measure is premature and should not be implemented.

Response: We appreciate this comment. We note that the following sequence of events regarding the recommendation for endorsement of this measure confirms that consideration of this measure for implementation is not premature. Earlier this year, the measure was assessed by the Cost and Resource Use Standing Committee. During this part of the endorsement process, the Standing Committee did not reach consensus on a recommendation for endorsement, and the measure was submitted for public comment. After review of CMS' responses to the public comments, the Consensus Standards Approval Committee (CSAC) voted to recommend the Hospital-Level, Risk-Standardized 30-Day Episode-of-Care Payment Measure for Heart Failure for endorsement. The NQF Board is expected to review this measure in August 2014. We are actively seeking NQF endorsement for this measure. A Voting Draft Report of the Cost and Resource Use Standing Committee can be found at http://www.qualityforum.org/WorkArea/linkit.aspx?LinkIdentifier=id&ItemID=76905.

Comment: A commenter suggested the need for innovative solutions for providers in addition to outcome measures. This commenter believed that hospitals should: consider innovative ways to identify heart failure patients early in the admission; implement evidence-based clinical pathways to assure the patient moves efficiently through their stay with optimal outcomes; develop a tight network of post-acute providers; and implement an enhanced communication system to identify where the patient is at any point in time during the 30-day window.

Response: We agree with the commenter's points about the need for continued innovation to drive high-quality and efficient care. We believe the measures that we have selected will help drive hospitals to provide that care.

Comment: One commenter noted that much of the care expended during the first 30 days is aimed at increasing long-term survival and requested that CMS consider a measure with a longer outcome window to pair with the measure.

Response: We agree that it is possible that some of the variation in hospital payments will be due to patterns of care that are intended to improve longer-term outcomes. However, at this time, we are not aware of a publicly reported or NQF-endorsed quality metric that considers a longer-term outcome with which we can harmonize the payment measure. As part of ongoing measure reevaluation and surveillance, we will evaluate the relationship between payments and longer-term outcomes to assess whether the performance of hospitals differs when looking at a longer time frame. Our plan is to eventually compare 30-day payments with longer-term outcomes, such as 1-year mortality, to determine if high upfront payments have a longer-term benefit.

Comment: One commenter was concerned that the proposed payment measure will be used in isolation and not understood by practitioners and the public. The commenter recommended that CMS instead create a composite measure with both cost and quality.

Response: We will take into consideration the suggestion to create a single composite measure of cost and quality for future measure development. In order to ensure practitioners and the public appreciate our intent, which is to evaluate payment in the context of quality, we plan to report the payment measure alongside the outcomes measures on the Hospital Compare Web site.

Comment: A commenter noted that it will be difficult to determine value with the existing heart failure measures since mortality and readmission are inversely related and the process measures are almost “topped-out.”

Response: We appreciate this feedback. We intend this episode-of-care measure to be used in conjunction with the other heart failure outcome measures of readmission and mortality. We do not intend to use the heart failure outcome measures with the heart failure process measures because the outcome and process measure results would not provide useful and comparable information. Regarding the concern that the value of the heart failure episode-of-care measure cannot be determined because heart failure mortality and readmission rates are inversely related, we believe that there is value in the episode-of-care measure because a hospital's performance on the mortality and readmissions measures represents different aspects of quality. We also note that there does not appear to be a meaningful correlation between hospital risk-standardized mortality rates and readmission rates. Finally, we believe that this measure can determine value because it was specifically developed to align with the heart failure mortality and readmissions measures. A recent MedPAC report indicates that there may be an inverse correlation between readmission and mortality rates, but we note that this inverse relationship has been found to be modest (available at http://www.medpac.gov/documents/Jun13_EntireReport.pdf). We recognize the commenter's concern and will monitor changes in the strength of these inverse correlations over time.

Comment: Several commenters did not support adopting the heart failure payment measure for the Hospital IQR Program due to concerns regarding the measure's utility and its attribution specifications, which attribute all episode-of-care payments to the admitting hospital.

Response: We view the proposed measure of payments made for heart failure as an important component of quality improvement when paired with existing quality measures. We believe it is important for hospitals to be held accountable for care decisions made during acute care episodes, particularly when those decisions include, for example, scheduling post-discharge follow-up care. We believe the measure appropriately attributes spending during the heart failure episode to the admitting hospital, and we will closely monitor hospitals' performance on the measure, as well as possible unintended consequences for patient care. We do not fully understand the commenter's concern regarding “all episode-of-care payments to the admitting hospital,” but welcome the opportunity to address it upon clarification.

Comment: A commenter recommended performing multi-level testing to determine the appropriate level for use of this measure.

Response: The episode-of-care payment measures are hospital-level measures. They account for risk at the patient-level, but attribute payments to the hospital. We interpret “multi-level testing” to mean the influence of community-level variables, like patient income levels or rural or urban setting, on the payment outcomes. Although hospitals cannot fully control all payments during the episode of care, they are well positioned to influence the outcome or the total episode-of-care payment. We will take into consideration the recommendation to test multiple levels.

Comment: A commenter did not support this measure due to concerns that the measure's risk adjustment model does not properly account for differences in patient case mix and severity, which may lead to the misinterpretation of differences in episode cost performance.

Response: We believe that the measure properly accounts for differences in patient case mix and severity. We developed the measure in accordance with national guidelines and in consultation with clinical and measurement experts, key stakeholders, and the public. The measure is consistent with the technical approach to outcomes measurement set forth in the NQF guidance for outcomes measures (http://www.qualityforum.org/projects/Patient_Outcome_Measures_Phases1-2.aspx), CMS' Measure Management System (http://www.cms.gov/Medicare/Quality-Initiatives-PatientAssessmentInstruments/MMS/index.html?redirect=/MMS/19_MeasuresManagementSystemBlueprint.asp), and the guidance articulated in two American Heart Association scientific statements. Furthermore, this measure was vetted by the NQF Standing Committee for Cost and Resource Use, Phase 2, and was recommended for endorsement by that Standing Committee and the Consensus Standards Approval Committee in the third quarter of 2014. It is anticipated to be reviewed by the NQF Board in August 2014.

Krumholz HM, Brindis RG, Brush JE, et al. Standards for statistical models used for public reporting of health outcomes: an American Heart Association Scientific Statement from the Quality of Care and Outcomes Research Interdisciplinary Writing Group: cosponsored by the Council on Epidemiology and Prevention and the Stroke Council. Endorsed by the American College of Cardiology Foundation. Circulation. Jan 24 2006;113(3):456-462.

Krumholz HM, Keenan PS, Brush JE, Jr., et al. Standards for measures used for public reporting of efficiency in health care: a scientific statement from the American Heart Association Interdisciplinary Council on Quality of Care and Outcomes Research and the American College of Cardiology Foundation. Circulation. Oct 28 2008;118(18):1885-1893.

Comment: One commenter agreed with comments made by the NQF Cardiovascular TEP that accountability for heart failure payment outcomes should be attributed to primary care providers. The commenter believed that there is a wide range of heart failure severity, which determines the level of accountability and that patients with heart failure are often cared for by a range of providers who vary in level and skill.

Response: Although many providers contribute to the cost of care, we attribute payments for a 30-day episode of care to the hospital because the episode is triggered by an inpatient hospital admission. Inpatient hospitalizations represent a brief period of acute illness that requires ongoing management post-discharge, and hospitals are often directly responsible for scheduling post-discharge follow-up. Therefore, decisions made at the admitting hospital affect not only the hospitalization payments, but also payments for care in the immediate post-discharge period. Finally, the objective of this episode-of-care payment measure is to encourage efficiencies gained by well-coordinated care across a patient's experience of illness.

Comment: One commenter felt that the measure is counter to CMS' implementation of episode groupers since it would capture all costs associated with the patient instead of only the costs of medical and procedural services related to heart failure. The commenter recommended that CMS include episode groupers that assign specific services to certain episodes in the heart failure payment measure.

Response: Episode groupers are designed to capture episodes of care in the Medicare population. However, these groupers are used to evaluate physicians' resource use, while our measure is constructed to capture hospitals' resource use.

Comment: One commenter did not agree with a 30-day outcome timeframe because it does not align with heart failure disease progression and recommended more focus be placed on the ambulatory care environment with a longer time period focused on outpatient care.

Response: Although heart failure is a chronic condition, patients often suffer acute decompensation requiring hospital admission. Acute decompensation is an acute exacerbation that compromises the patient's cardiorespiratory status and requires admission. This measure focuses on this acutely decompensated cohort of heart failure patients, not on ambulatory patients. Heart failure admissions are associated with a substantial 30-day mortality rate as well as variation in costs. In addition, heart failure admissions have high rates of readmission, prompting heart failure to be targeted in current readmission reduction programs. For these reasons, we believe that heart failure is an appropriate focus for a hospital-based episode-of-care measure.

Xiao X, Li S-X, Normand SL, Kim N, Ott LS, Lagu T, Duan M, Kroch EA, Krumholz HM. “Phenotyping” Hospital Value of Care for Patients with Heart Failure. Health Services Research, Early View, article first published online: 28 Jun 2014.

Comment: Several commenters did not believe transfer patients should be attributed to the admitting hospital because the organization that initially admits a patient may not have as much control over the patient's course of care. Furthermore, the commenters were concerned that hospitals would have a stronger incentive to hold onto patients longer to avoid being held accountable for the costs of another facility.

Response: While we understand the commenters' concerns, attributing the outcome to the first admitting hospital makes the most sense given the focus of this particular payment measure, which is the hospital risk-standardized payment associated with a 30-day episode-of-care for heart failure. We define a transfer as any admission that requires acute inpatient care at two or more hospitals for the same episode of HF. We attribute total episode payments that involve a transfer for acute care of HF to the transferring hospital because:

  • The episode of care begins at the time of the index admission, which thereby provides a standard measure time frame for each hospital.
  • The transferring hospital is responsible for initial care decisions as well as the decision to transfer the patient, both of which can have a cascading effect on subsequent care decisions.
  • This method avoids incentivizing hospitals to transfer patients who are critically ill and at high risk of being very expensive to treat. As a result, we disagree with the commenter that hospitals would have a stronger incentive to hold onto patients longer to avoid being held accountable for the costs of another facility.
  • This method aligns with CMS' publicly reported measure for HF risk-standardized mortality.
  • The objective of this episode-of-care payment measure is to encourage efficiencies gained by well-coordinated care across a patient's experience of illness.

Comment: Several commenters believed the HF payment measure should exclude transplant or LVAD patients who underwent the procedure in the previous 12 months.

Response: We interpret this comment to mean that the measure should exclude patients who received any type of transplant or a left ventricular assist device (LVAD) within 12 months of the index admission for heart failure. We agree that these patients will likely cost more than other HF patients. Accordingly, we plan to evaluate the data to see if either a heart transplant or LVAD placement occurred within the 12 months prior to the HF admission and to exclude these patients from the measure beginning in FY 2016. We will then determine whether or not we should exclude patients with a history of LVAD or transplant from the measure.

Comment: One commenter was concerned that the measure may not adequately adjust for older and more frail patients who are at a higher risk for readmission.

Response: We note that the measure specifically adjusts for age and multiple indicators of patient frailty, such as malnutrition and dementia. The measure is risk-adjusted in order to account for differences in case mix, or patient complexity, between hospitals. For each patient, the claims for the 12 months prior to the measured hospitalization are examined to identify additional clinical conditions that could contribute to the costs of care. These conditions are included in the risk model for the measure to ensure that providers are: (1) compared on their performance; (2) not penalized for caring for sicker patients; and (3) not put at risk of being profiled as high-cost facilities due to the clinical complexity of their patients.
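
A simplified sketch of that 12-month lookback is shown below; the claim structure and condition groupings are hypothetical stand-ins, since the actual measure maps diagnosis codes to the condition categories defined in its specifications.

    from datetime import date, timedelta

    def risk_factors(patient_claims, index_admission_date):
        # Collect condition categories coded in the 12 months before the index admission.
        window_start = index_admission_date - timedelta(days=365)
        factors = set()
        for claim in patient_claims:
            if window_start <= claim["date"] < index_admission_date:
                factors.update(claim["condition_categories"])
        return factors

    # Example with made-up claims for one patient.
    claims = [
        {"date": date(2013, 11, 2), "condition_categories": {"diabetes", "malnutrition"}},
        {"date": date(2014, 3, 15), "condition_categories": {"dementia"}},
    ]
    print(risk_factors(claims, index_admission_date=date(2014, 6, 1)))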

Comment: Several commenters believed that the measure does not adequately adjust for patient risk and cited NQF concerns regarding R-square values of 0.03-0.05 in the development and validation samples.

Response: While we appreciate the concern that the measure does not adequately adjust for patient risk factors, we disagree for several reasons. First, the measure model was evaluated with a number of statistical methods in addition to the R-square. The results of these other diagnostic tests (over-fitting indices, the distribution of standardized Pearson residuals, and predictive ratios) all suggest that the model predicts payments well after adjustment for patient risk factors. These results consider the measure from a different perspective than the R-square. Second, we believe the focus on the R-square value for this measure is not appropriate because the statistical methods we used do not produce a traditional R-square value. To provide a conceptually similar number, we produced a quasi-R-square, the details of which can be found in our technical report (available at: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/HospitalQualityInits/Measure-Methodology.html). Third, this quasi-R-square is consistent with other patient-level risk-adjustment models for health care payment. Lastly, the R-square results suggest that factors other than clinical severity may be predictive of resource utilization that can increase payments, as discussed at length during the NQF proceedings. We note that, despite the concerns raised about the R-square value during the endorsement proceedings, in June 2014 the NQF Standing Committee for Cost and Resource Use Phase 2 recommended endorsement of the HF episode-of-care measure.
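
For context, one common pseudo-R-square for generalized linear models is deviance-based; we note that this is a generic illustration and not necessarily the exact quasi-R-square definition used in our technical report:

    R^{2}_{\text{quasi}} = 1 - \frac{D_{\text{model}}}{D_{\text{null}}}

where D_model is the deviance of the fitted risk-adjustment model and D_null is the deviance of an intercept-only model.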

Chen SI, Dharmarajan K, Kim N, et al. Procedure intensity and the cost of care. Circ Cardiovasc Qual Outcomes 2012;5:308-13.

Safavi KC, Dharmarajan K, Kim N, et al. Variation exists in rates of admission to intensive care units for heart failure patients across hospitals in the United States. Circulation 2013;127:923-9.

After consideration of the public comments we received, we are finalizing the Hospital-Level, Risk-Standardized Payment Associated with a 30-Day Episode of Care for Heart Failure measure as proposed.

e. Severe Sepsis and Septic Shock: Management Bundle Measure (NQF #0500)

(1) Background

Sepsis, severe sepsis, and septic shock can arise from a simple infection, such as pneumonia or a urinary tract infection. Although sepsis can affect anyone at any age, it is more common in infants, the elderly, and patients with chronic health conditions such as diabetes and the immunosuppressive disorders seen in transplant patients. Information for this measure comes from the NQF Measure Information-Composite for the Severe Sepsis and Septic Shock: Management Bundle (NQF #0500). More information on this issue is available from the Surviving Sepsis Campaign: International Guidelines for Management of Severe Sepsis and Septic Shock: 2012. Sepsis is associated with mortality rates ranging from 16 to 49 percent, which is more than 8 times higher than the mortality rate for other inpatient hospital stays. Findings from the National Hospital Discharge Survey indicate that the number of hospital stays for septicemia more than doubled between 2000 and 2008, and patients with this condition were more severely ill than patients hospitalized for other conditions. Severe sepsis and septic shock are frequent causes of re-hospitalization, especially during the first year after the initial hospitalization.

National Quality Forum (NQF). Measure Information-Composite. #500 Severe Sepsis and Septic Shock: Management Bundle. Updated 2014 Jan 2. NQF: Washington, DC http://www.qualityforum.org/Home.aspx.

Dellinger RP, Levy MM, Rhodes A, Annane D, et al. Surviving Sepsis Campaign: international guidelines for management of severe sepsis and septic shock: 2012. Crit Care Med. 2013 Feb; 41(2):580-637.

Based on national discharge data reported by the AHRQ, sepsis was the sixth most common principal reason for hospitalization in the United States in 2009, accounting for 836,000 hospital stays. There were an additional 829,500 stays with a secondary diagnosis of sepsis for a total of 1,665,400 inpatient stays and 258,000 deaths. From 1993 to 2009, sepsis-related hospital stays increased by 153 percent, with an average annual increase of 6 percent. Medicare was the predominant payer for sepsis-related hospital stays, covering 58.1 percent of patients. Sepsis cases and sepsis-related deaths are expected to continue to increase with the aging of the population.

In a landmark study, Rivers et al. showed that mortality from sepsis can be reduced by 16 percent in absolute terms and 30 percent in relative terms when aggressive care is provided within 6 hours of hospital arrival. Furthermore, a recent study of the 2008 Healthcare Cost and Utilization Project (HCUP) Nationwide Inpatient Sample determined that patients admitted through the emergency department had a 17 percent lower likelihood of dying from sepsis than patients who were directly admitted.

Rivers E, Nguyen B, Havstad S et al. Early goal-directed therapy in the treatment of severe sepsis and septic shock. N Engl J Med. 2001; 345: 1368-77.

HCUP Nationwide Inpatient Sample (NIS). Healthcare Cost and Utilization Project (HCUP). 2007-2009. Agency for Healthcare Research and Quality, Rockville, MD. http://www.hcup-us.ahrq.gov/nisoverview.jsp.

The Severe Sepsis and Septic Shock: Management Bundle measure (NQF #0500) was supported by the MAP for the Hospital IQR Program, contingent on NQF endorsement in its Pre-Rulemaking Report: 2013 Recommendations on Measures Under Consideration by HHS, available at: https://www.qualityforum.org/WorkArea/linkit.aspx?LinkIdentifier=id&ItemID=72738. The MAP noted the measure addresses an NQS priority not adequately addressed in the program measure set and that early detection and treatment of sepsis in the emergency department and inpatient settings is important (page 125). This measure was initially endorsed by the NQF in 2008 for the hospital/acute care facility setting and underwent maintenance review and updates in March 2013, June 2013, and May 2014.

The MAP conditionally supported this measure as a Meaningful Use measure in its Pre-Rulemaking Report: 2014 Recommendations on Measures Under Consideration by HHS, available at: http://www.qualityforum.org/Publications/2014/01/MAP_Pre-Rulemaking_Report_2014_Recommendations_on_Measures_for_More_than_20_Federal_Programs.aspx. The MAP stated, “Not ready for implementation; measure concept is promising but requires modification or further development.” In its Additional Findings the MAP stated that it, “noted the need for continued development of electronic specifications for NQF #0500 Severe Sepsis and Septic Shock: Management Bundle. While some workgroup members challenged the feasibility and evidence behind this measure, MAP deferred to the recent endorsement review of this measure and conditionally supported it for the Meaningful Use Program. Public comment from Edwards Lifesciences supports MAP's conclusion [page 168].” (In the proposed rule (79 FR 28236), we attributed all of the MAP's statements to its 2013 Pre-Rulemaking Report.)

(2) Overview of Measure

The purpose of the proposed Severe Sepsis and Septic Shock: Management Bundle measure is to support the efficient, effective, and timely delivery of high-quality sepsis care in support of the Institute of Medicine's (IOM) aims for quality improvement. This is consistent with the Department of Health and Human Services National Quality Strategy's priorities directed at one of the leading causes of mortality. Providing timely, patient-centered care and making sepsis care more affordable through early intervention can reduce resource use and complication rates. The severe sepsis and septic shock early management bundle provides a standard operating procedure for the early risk stratification and management of a patient with severe infection. Applying this standard operating procedure has been shown, for over 10 years, to produce clinically and statistically significant decreases in organ failure, mortality, and the utilization of health care resources. Additional information about this measure is available on the NQF's Web site at http://www.qualityforum.org/QPS/0500.

(3) Data Sources

The proposed measure uses chart-abstracted data on patients presenting with septic shock who received the treatment detailed in the Calculations section below.

(4) Outcome

The outcome criteria for this measure consist of the following: measurement of lactate; blood cultures; timely antibiotics; fluid resuscitation; lactate clearance; vasopressors, central venous pressure (CVP), and central venous oxygen saturation (ScvO2); and overall bundle compliance. These are discussed in more detail below.

  • Measure Lactate

Measurement of lactate levels is specifically associated with improved outcomes in sepsis, and an elevated lactate value identifies patients at higher risk for poor outcomes. Up to 10 percent of in-hospital cardiac arrests in the United States each year are secondary to sepsis (pneumonia). These patients are often misdiagnosed and sent to the medical floors, only to suffer acute hemodynamic deterioration. These outcomes could potentially be avoided if lactate measurement upon admission provided risk stratification that triggers alternative dispositions.

Levy et al. conducted an international, multisite “Surviving Sepsis Campaign” (SSC) initiative to determine the rate of change at which the sites reached the SSC guideline targets. In the first quarter of this initiative, only 61.0 percent of patients had lactate values measured consistent with guidelines. In addition, prior studies have shown that care prompted by measurement of lactate levels in sepsis patients reduced resource utilization and cost. This leads to a lower likelihood of hospital-acquired conditions. This performance measure has been previously used as a core component of multicenter and national quality improvement initiatives. Formalizing it as a national performance measure will provide direct targets for intervention that are closely linked with improvements in mortality and cost.

Levy MM, Dellinger RP, et al.; Surviving Sepsis Campaign. The Surviving Sepsis Campaign: results of an international guideline based performance improvement program targeting severe sepsis. Crit Care Med. 2010 Feb;38(2):367-74.

  • Blood Cultures

In the first quarter of the Levy et al. SSC initiative, only 64.5 percent of patients had blood cultures collected prior to antibiotic administration. Collecting blood cultures prior to antibiotic administration is specifically associated with improved outcomes in sepsis, and pathogens identified by blood cultures allow for customized therapy. As a result, blood cultures continue as a recommendation of the current Surviving Sepsis Guidelines.

By obtaining blood cultures, antibiotic regimens can be customized to treat the specific infecting organism. This will result in less unnecessary exposure to antibiotics, reducing complications associated with antibiotic use, including drug reactions, allergies and adverse events, the development of drug-resistant organisms, and the occurrence of Clostridium difficile colitis. The performance measure for collecting blood cultures for suspected sepsis has been previously used and continues as a core component of the SSC guidelines.

  • Timely Antibiotics

Kumar et al. found that the median time to appropriate antibiotics was 6 hours after the onset of shock. In the first quarter of the Levy et al. SSC initiative, only 60.4 percent of patients received timely antibiotics. Multiple studies, for example, have demonstrated that delays in administration of appropriate antibiotics in patients with sepsis and other severe infections are associated with longer lengths of stay, higher costs, and higher mortality. In septic shock, the Kumar et al. study demonstrated that every hour of delay in appropriate antibiotics was associated with a 7.6 percent higher mortality. The timely administration of broad-spectrum antibiotics was associated with significantly higher risk-adjusted survival. Based on a preponderance of data, the current recommendations in the international guidelines for the management of severe sepsis and septic shock include the administration of broad-spectrum antibiotic therapy within 1 hour of diagnosis of septic shock and severe sepsis.

Kumar A, Roberts D, Wood K, Light B, et al. Duration of Hypotension before Initiation of Effective Antimicrobial Therapy is the Critical Determinant of Survival in Human Septic Shock. Crit Care Med. 2006;34 (6):1589-96.

Levy MM, Dellinger RP, et al.; Surviving Sepsis Campaign. The Surviving Sepsis Campaign: results of an international guideline based performance improvement program targeting severe sepsis. Crit Care Med. 2010 Feb;38(2):367-74.

  • Fluid Resuscitation

Intravascular volume depletion is a common finding in patients with septic shock, which is manifested by low blood pressure and/or other signs of organ hypoperfusion, such as elevated serum lactate levels. The degree of the intravascular volume deficit in sepsis varies, yet nearly all patients require initial volume resuscitation, and many patients require continuing fluid resuscitation over the first 24 hours.

Early fluid resuscitation is associated with improved outcomes for patients with acute lung injury due to septic shock. International guidelines recommend that patients with suspected hypovolemia be initially treated with at least 30 mL/kg of crystalloid (for example, Ringer's solution) to determine clinical response. In the first quarter of the Levy et al. SSC initiative, only 59.8 percent of patients received fluid resuscitation consistent with guidelines. Timely fluid resuscitation avoids an error of omission in which indicated therapy is delayed or omitted. By improving outcomes, it reduces length of stay, which in turn leads to a lower likelihood of hospital-acquired conditions. This performance measure has been previously used as, and continues to be, a core component of the SSC guidelines. Formalizing it as a national performance measure will provide direct targets for intervention that are closely linked with improvements in mortality and cost.

Ibid.

  • Lactate Clearance

Elevated lactate levels prompt the consideration of specific care practices toward hemodynamic optimization guided by either central venous oxygen saturation or lactate clearance. International guidelines recommend that patients with sepsis and continued elevated lactate values have additional therapies until lactate levels are normalized. However, normal lactate levels can be seen in septic shock, especially in children.

  • Vasopressors, Central Venous Pressure (CVP), and Central Venous Oxygen Saturation (ScvO2)

Performance gaps in individual bundle elements can range from 79 percent (confidence interval (CI), 69-89 percent) for vasopressors, to 27 percent (CI, 18-36 percent) for central venous pressure (CVP) measurement, and as low as 15 percent (CI, 7-23 percent) for central venous oxygen saturation (ScvO2) in some community emergency departments. These numbers increase (50-75 percent) in larger hospital settings. CVP has been shown to have a significant association with mortality, and multiple studies and meta-analyses have shown a significant association between reaching an ScvO2 of 70 percent and improved mortality.

Varpula M, Tallgren M, Saukkonen K, Voipio-Pulkki LM, Pettila V. Hemodynamic variables related to outcome in septic shock. Intensive Care Med. Jun 23 2005;31:1066-1071.

  • Overall Bundle Compliance

Multiple initiatives promoting bundles of care for severe sepsis and septic shock were associated with improved guideline compliance and lower hospital mortality. Even with compliance rates of less than 30 percent, absolute reductions in mortality of 4-6 percent have been noted. Coba et al. found that when all bundle elements were completed within 18 hours and compared with patients who did not have bundle completion, the mortality difference was 10.2 percent. Thus, there is a direct association between bundle compliance and improved mortality. In addition, a continuous quality improvement (CQI) initiative can improve compliance rates. CQI is a quality management process that encourages continually assessing performance and whether improvements can be made. Multiple studies have shown that standardized order sets, enhanced bedside monitor displays, telemedicine, and comprehensive CQI feedback are feasible, modify clinician behavior, and are associated with decreased hospital mortality.

Coba V, Whitmill M, Mooney R, et al. Resuscitation Bundle Compliance in Severe Sepsis and Septic Shock: Improves Survival, Is Better Late than Never. J Intensive Care Med. Jan 10 2011.

Edwards PJ, et al. Maximizing your investment in EHR: Utilizing EHRs to inform continuous quality improvement. JHIM 2008;22(1):32-7.

(5) Cohort

This measure will focus on patients aged 18 years and older who present with symptoms of severe sepsis or septic shock. These patients will be eligible for the 3 hour (severe sepsis) and/or 6 hour (septic shock) early management measures.

(6) Inclusion and Exclusion Criteria

Numerator Statement: The numerator is patients from the denominator who received all of the following: Step 1, Step 2, and Step 3 within 3 hours of the time of presentation and, if septic shock is present (defined as either hypotension or lactate >= 4 mmol/L), who also received Step 4, Step 5, Step 6, and Step 7 within 6 hours of the time of presentation. The steps are described in detail below.

Step 1: Measure lactate level

Step 2: Obtain blood cultures prior to antibiotics

Step 3: Administer broad spectrum antibiotics

Step 4: Administer 30 ml/kg crystalloid for hypotension or lactate >= 4 mmol/L

Step 5: Apply vasopressors (for hypotension that does not respond to initial fluid resuscitation to maintain a mean arterial pressure >= 65)

Step 6: In the event of persistent arterial hypotension despite volume resuscitation (septic shock) or initial lactate >= 4 mmol/L (36 mg/dl), measure central venous pressure and central venous oxygen saturation

Step 7: Re-measure lactate if initial lactate is elevated

Denominator: The denominator is the number of patients presenting with severe sepsis or septic shock. The following patients presenting with severe sepsis or septic shock will be excluded from the denominator:

  • Patients with advanced directives for comfort care;
  • Patients with clinical conditions that preclude total measure completion;
  • Patients for whom a central line is clinically contraindicated;
  • Patients for whom a central line was attempted but could not be successfully inserted;
  • Patients (or their surrogate decision makers) who decline or are unwilling to consent to such therapies or central line placement; and
  • Patients who are transferred to an acute care facility from another acute care facility.

(7) Calculations

In calculating this measure, the denominator is the number of patients presenting with severe sepsis or septic shock. The numerator is the subset of patients from the denominator who had their lactate levels measured, had blood cultures obtained prior to receiving antibiotics, and received broad-spectrum antibiotics, all within 3 hours of presentation. If septic shock is present, patients also must have received 30 ml/kg of crystalloid for hypotension or lactate >= 4 mmol/L; had vasopressors applied (for hypotension that does not respond to initial fluid resuscitation, to maintain a mean arterial pressure >= 65); had central venous pressure and central venous oxygen saturation measured in the event of persistent arterial hypotension despite volume resuscitation (septic shock) or an initial lactate >= 4 mmol/L (36 mg/dl); and had their lactate level re-measured if the initial lactate level was elevated.
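
To illustrate how the numerator and denominator combine into a rate (the field names and the simplified timing logic below are hypothetical and are not a substitute for the official abstraction specifications), a sketch of the calculation might look like this:

    def bundle_compliant(p):
        # Hypothetical per-patient check of the 3-hour and 6-hour bundle steps.
        # Each *_hr value is hours elapsed from presentation to that action
        # (None if the action was not performed).
        def done_within(hours, limit):
            return hours is not None and hours <= limit

        three_hour = (
            done_within(p["lactate_measured_hr"], 3)
            and p["blood_cultures_before_antibiotics"]
            and done_within(p["antibiotics_hr"], 3)
        )
        if not p["septic_shock"]:
            return three_hour
        six_hour = (
            done_within(p["crystalloid_30ml_kg_hr"], 6)
            and (not p["persistent_hypotension"] or done_within(p["vasopressors_hr"], 6))
            and done_within(p["cvp_scvo2_measured_hr"], 6)
            and (not p["initial_lactate_elevated"] or done_within(p["lactate_remeasured_hr"], 6))
        )
        return three_hour and six_hour

    def measure_rate(patients):
        # Denominator: severe sepsis or septic shock patients not meeting an exclusion.
        denominator = [p for p in patients if p["severe_sepsis_or_septic_shock"] and not p["excluded"]]
        numerator = [p for p in denominator if bundle_compliant(p)]
        return len(numerator) / len(denominator) if denominator else None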

We invited public comment on this proposal.

Comment: Several commenters supported adopting this measure. Some commenters supported adopting this measure because it is NQF-endorsed. One commenter supported the addition of this measure and noted that it fills an important measure gap, and should positively impact patient care.

Another commenter strongly supported incorporating the sepsis/septic shock measure into the Hospital IQR Program beginning in the FY 2017 payment determination because of the association of sepsis with patient deaths, hospital admissions, and length of hospital stays. Further, the commenter stated that Medicare is the largest payer for sepsis-related hospital stays, accounting for close to 60 percent of all patients.

Response: We proposed adopting this measure because we believe this measure improves patient health outcomes.

Comment: Several commenters noted that there are two other trials that examine the risks/benefits of protocolized care of septic patients which have yet to be published. As this field is evolving, the commenters believed that it is not appropriate to set benchmarks that were not confirmed in the most recent, largest randomized controlled trial. Specifically, the commenters suggested that specific measure criteria should await the results of the Australian Resuscitation In Sepsis Evaluation Randomised Controlled Trial (ARISE) and The Protocolised Management in Sepsis Trial (ProMISe).

Response: We thank the commenters for this feedback. We acknowledge the importance of the results pending from the ARISE and ProMISe trials and will take those results and their potential impact into consideration when available. However, we believe that the care of patients with severe sepsis and septic shock is of paramount importance and that there is a significant performance gap within the Hospital IQR Program. The presence of this gap warrants the adoption of this clinical quality metric prior to the completion of the two pending trials referenced above. The severe sepsis/septic shock bundle measure is the only NQF-endorsed sepsis measure currently available to CMS.

Comment: Many commenters opposed CMS adopting this measure citing the recent Protocolized Care for Early Septic Shock (ProCESS) trial published after publication of the proposed rule. The ProCESS trial found no additional benefit in including measurement of central venous pressure (CVP) and central venous oxygen saturation.

Response: We thank the commenters for this feedback. We note from the measure steward that the Severe Sepsis and Septic Shock: Management Bundle (NQF #0500) measure “has undergone the rigorous NQF evaluation process for over 6-7 years based on over 13 years of confirmatory studies. These studies provided the framework which allowed the measure to navigate the validity and reliability metrics as a whole measure including the central venous catheter to measure central venous pressure and oxygen saturation (SCV O2).” We note that these two clinical parameters guide the administration of intravenous fluids, vasopressors, inotropes, and blood transfusions. Further, both parameters provide critical information about cardiac dysfunction, which when treated appropriately improves outcomes. The steward further notes that “As a result CVC placement has been shown to be one of the most important bundle elements and independently associated with a 9 percent reduction in mortality.”

NQF. Patient Safety Measure. Henry Ford Comments. June 2014. Available at: http://www.qualityforum.org/ProjectMaterials.aspx?projectID=73701.

Regarding the ProCESS trial, we note that this randomized trial focused on a different set of guidelines for septic shock patients and did not require patients to have a central venous catheter placed unless peripheral access was insufficient. The protocol-based standard therapy was the result of the ProCESS Investigators reviewing the literature and surveying emergency physicians and intensivists worldwide, with consensus feedback from investigators. The ProCESS trial protocol-based standard therapy also included administration of fluids and vasoactive agents to reach goals for systolic blood pressure and shock index (the ratio of heart rate to systolic blood pressure). The results of this trial were published in March 2014, and in June 2014 the NQF reviewed the Severe Sepsis and Septic Shock: Management Bundle (NQF #0500) measure and narrowly voted to remove the central venous catheter portion of the EGDT bundle. We note that the ProCESS trial was performed in 31 U.S. hospital emergency departments known to have a high volume of patients and that, over a 5-year period, it randomized 1,351 patients with septic shock, or on average 8 patients per site per year. The measure steward noted that a meta-analysis of 49 studies found the ProCESS trial population to account for 3 percent of the 41,064 patients in these studies, and that the 31 centers in the trial are not reflective of the community settings where the majority of patients are treated in the U.S., nor are the 31 centers a majority of the 4,500 hospitals in the U.S.

ProCESS Investigators. A Randomized trial of Protocol-Based Care for early septic shock. NEJM. 2014; 370:1683-1693.

Finally, during the NQF Patient Safety Measure Standing Committee meeting, the steward noted that the recommendation to remove the CVC portion of the Severe Sepsis and Septic Shock: Management Bundle (NQF #0500) measure had not been tested to assess if the measure would still be reliable and valid with this change to the measure, and that the recommendation was based on a single study's protocol-based standard therapy which was noted not to be identical to the EGDT treatment used in the Severe Sepsis and Septic Shock: Management Bundle (NQF #0500) measure.

In view of this background information, we believe the most logical next step is to gather more information from two other studies that will be completed in the near future, as well as to await further recommendations from the NQF Patient Safety Measures Project as the ProCESS investigators collaborate with the stewards of the Severe Sepsis and Septic Shock: Management Bundle (NQF #0500) measure to refine the measure. We believe that sepsis is an important medical condition with a high mortality rate, and that its treatment and outcomes have shown wide variation. We believe severe sepsis and septic shock should be monitored for improvements in mortality rates.

Comment: Commenters noted that CVP and central venous oxygen saturation monitoring and other processes were adopted in the Surviving Sepsis Campaign (SSC) after the results of a single-center trial published in 2001. Commenters also stated that the SSC moderated some of its recommendations based on the results of the ProCESS trial, citing the SSC's response to that trial.

Rivers E, Nguyen B, Havstad S et al. Early goal-directed therapy in the treatment of severe sepsis and septic shock. N Engl J Med.2001; 345: 1368-77.

Surviving Sepsis Campaign. Surviving Sepsis Campaign Responds to ProCESS Trial. May 19, 2014. Available at: http://www.survivingsepsis.org/SiteCollectionDocuments/SSC-Responds-Process-Trial.pdf.

Response: We thank the commenters for this feedback. We note that monitoring CVP and central venous oxygen saturation are important components of the sepsis bundle. The SSC recommendations note that mortality increases if the CVP target or an oxygen saturation of 70 percent or 65 percent, respectively, is not achieved with fluid resuscitation to the central venous pressure target. We acknowledge that CVP and central venous oxygen saturation monitoring and other processes were adopted by the SSC after the results of a single-center trial published in 2001. However, we would like to point out that the SSC recommendations have been updated since their initial publication and these updated recommendations are based on many different international studies. With regard to the comment that the SSC has moderated some of its recommendations based on the ProCESS trial, we note that in its response to the ProCESS trial dated May 19, 2014, the SSC recognizes that there are alternative ways to obtain these results and states that it will address ways to include these data in future versions of its quality improvement database.

Comment: One commenter stated that support for this measure was not lessened by the ProCESS trial questioning the level of support for element “F” (measurement of central venous pressure and central venous oxygen saturation) of this measure. The commenter noted that, while the NQF Patient Safety Steering Committee voted in favor of removing element “F,” final ratification is pending by the NQF Board of Directors.

The commenter noted that the NQF Patient Safety Steering Committee did not remove its endorsement of the full measure, and cited the Draft Report for Comment on the ad hoc review that stated that “usual care for severe sepsis and septic shock had changed dramatically in the past decade with dramatic improvements in sepsis-related morbidity and mortality with several elements of the NQF #0500 measure being key to this improvement in outcomes” (p. 20).

National Quality Forum. NQF-Endorsed Measures for Patient Safety. Draft Report for Comment. May 28, 2014. Available at: https://www.qualityforum.org/WorkArea/linkit.aspx?LinkIdentifier=id&ItemID=76698.

Response: We agree that support for this measure has not lessened as a result of the ProCESS trial. As part of its ongoing work, the NQF Patient Safety Standing Committee conducted an ad hoc review of the sepsis measure (NQF #0500) based on results from the ProCESS trial. While the NQF Patient Safety Steering Committee voted in favor of removing element “F,” it recommended retaining endorsement of the measure as a whole. At this time final ratification is pending by the NQF Board of Directors. We refer readers to the NQF Web site for complete information on this measure's review at: http://www.qualityforum.org/News_And_Resources/Press_Releases/2014/Statement_from_NQF_on_Review_of_Sepsis_Measure.aspx. We intend to closely monitor and incorporate new information as the evidence base improves.

Comment: A commenter asked CMS to invest additional resources in developing a stronger sepsis outcome measure. Another commenter asked CMS to consider adding non-NQF-endorsed measures that address early detection of sepsis.

Response: We thank the commenters for these suggestions and will take them into consideration in the future.

Comment: A commenter supported the severe sepsis/septic shock: management bundle measure provided the chart-abstracted measures that are proposed for removal in this rule are removed. The commenter noted that, if all existing chart-abstracted measures are left intact and the proposed mandatory electronic submission requirements for CY 2016 are added, it will be difficult for the commenter to find the resources to add the new measures.

Response: We are working to lessen the burden by removing several chart-abstracted measures.

Comment: One commenter was concerned that the measure, as defined, may have a high rate of false positives.

Response: We are unaware of any studies indicating the severe sepsis/septic shock measure, as defined, has a high rate of false positives. We would be interested in seeing any evidence of a high rate of false positives.

Comment: One commenter stated that the science of sepsis treatment is evolving and measurements of the incidence of sepsis and sepsis outcomes are likely inaccurate due to coding variances and payment incentives.

Response: We acknowledge that the science of sepsis treatment is evolving. We note that this is a common phenomenon in medicine, and this is why all measures undergo routine measure maintenance. We believe that the coding of sepsis is accurate because these codes are used for payment reimbursement. In addition, our payment reimbursement processes allow for review, correction, and appeals. The payment incentive in the Hospital IQR Program is for reporting; therefore, there is no financial incentive associated with actual sepsis/septic shock outcomes.

Comment: Many commenters suggested that this measure poses a possible risk to patients and would be a burden on hospitals to collect the data. Specifically, one commenter was concerned about whether emergency department staff would be required to complete elements of the bundle while they triage patients.

Response: We believe this measure will benefit consumers seeking information regarding the quality of health care outcomes. Sepsis is associated with patient deaths, hospital readmissions, and increased length of hospital stays. The measure fills an important measure gap, and will positively impact patient care. We believe that these benefits will outweigh data collection burdens. We also do not believe this measure will be more burdensome than other measures for hospitals because the measure data may be collected concurrently, retrospectively, or a combination of both.

Regarding the concern of the inability to complete the bundle elements in the emergency department during triage, we note that the measure allows for completion of elements A-C within 3 hours. Timeliness of accurate detection and treatment of sepsis has been associated with improved survival in numerous studies, for example.

Castellanos-Ortega A, Suberviola B, Garcia-Astudilllo LA, Holanda MS, et al. Impact of the Surviving Sepsis Campaign protocols on hospital length of stay and mortality in septic shock patients: Results of a three-year follow-up quasi-experimental study. Crit Care Med. 2010, 8(4):1036-1043.

Ferrer R, Artigas A, Suarez D, Palencia E, et al. Effectiveness of Treatments for Severe Sepsis. A Prospective, Multicenter, Observational Study. Am J Respir Crit Care Med. 2009, 180:861-866.

Comment: A commenter suggested that CMS defer the sepsis reporting requirements until 2016, when the next version of the Surviving Sepsis Guidelines (SSG) is published.

Response: We believe the measure is important and addresses a critical gap in measurement and therefore, should be adopted at this time. However, we intend to closely monitor and incorporate new information as the evidence base improves.

Comment: Several commenters requested that CMS consider alternative sepsis measures that are NQF-endorsed, reliable, accurate, feasible, evidence-based, streamlined, and can be collected consistently and reliably, with minimal burden.

Response: At the time of this publication, we note that there are no other NQF-endorsed severe sepsis/septic shock measures available.

Comment: A commenter asked for clarification as to which patients would be excluded from this measure. This commenter also wanted clarification on whether we are developing a sampling methodology for the sepsis measure. The commenter suggested that we define a minimum case threshold for publicly reporting this measure.

Response: The exclusions for this measure were outlined above, in the proposed rule (79 FR 28237), and at: http://www.qualityforum.org/QPS/0500. We intend to develop a sampling strategy for the sepsis measure. In addition, regarding a minimum case threshold for public reporting, we will follow our existing guidelines. We display a footnote on Hospital Compare when the number of cases/patients is too few to report, that is, fewer than 11 cases.

Comment: Many commenters also asked for changes to specific aspects of the measure. Components of the sepsis measure commenters would like to change include:

  • Allowing exclusions to the required fluid resuscitation amount of 30 ml/kg to take into account elderly, frail, and cardiac-compromised patients who are not able to handle this amount of fluid and may have fluid overload. For example, one exclusion could be 25 ml/kg for cardiac compromise, which the commenter stated the literature also supports in sepsis fluid resuscitation.
  • Requiring that the administration of 30 ml/kg crystalloid for hypotension or lactate >= 4 mmol/L occur within 3 hours of the time of presentation and not 6 hours, according to current guidelines.
  • Excluding patients from the blood culture before antibiotic measure if blood cultures are attempted without success, and excluding patients who present to the emergency department with an atypical sepsis presentation (cardiac arrest prior to arrival).

Many commenters opposed the inclusion of element F in the measure, specifically “In the event of persistent hypotension despite volume resuscitation (septic shock) or initial lactate >=4 mmol/L (36 mg/dl) measure central venous pressure and central venous oxygen saturation,” per the recommendation of the Patient Safety Measure Committee. A commenter cautioned that central lines have many complications and that this trial showed early goal-directed therapy without a central line to be equivalent to placing a central line for monitoring. Commenters also noted that central venous catheters should be used sparingly, as they can lead to infections and other complications.

A commenter stated that the measure specifications of care steps within six hours (required only for patients with septic shock) should not include steps five through seven because they are no longer considered the standard of care or high-quality sepsis resuscitation metrics and are outdated.

Response: We thank the commenters for their feedback. We are adopting this measure as developed by the measure steward, Henry Ford Hospital, and endorsed by the NQF. We suggest the commenters recommend any changes to this measure to the measure developer/steward so that those changes can go through the consensus development process.

Comment: Several commenters sought clarification on aspects of the sepsis measure, such as:

  • Clarification of the denominator for identification of septic shock patients. The commenter asked that we clarify whether the measure has specific ICD-9-CM diagnosis codes that would limit the review. If those are present, the commenter did not object to this measure. However, if they are not present, the commenter strongly objected to this measure based upon the significant burden of work that it imposes.
  • Clarification on whether the measure will be collected as aggregate data (Web-based) or if we will require the submission of patient-level data.
  • Clarification as to whether the measure specifications will be provided in the standard manual format and when those specifications will be released. At this time, the commenter noted, no algorithm, data elements, initial patient population, or sampling guidelines are available to begin programming this measure for collection. As this is a very complicated measure, the commenter noted that collecting it as a chart-abstracted measure will be a burden to hospitals.
  • Clarification and rationale as to why we want to collect this as a chart-abstracted measure and not as an electronic clinical quality measure. A commenter suggested that the Severe Sepsis and Septic Shock measure be introduced as an electronic clinical quality measure rather than as a chart-abstracted measure, and that its inclusion be timed to occur when an electronic measure specification is available to support it. Another commenter requested the same clarification and rationale.

Response: The denominator is the number of patients presenting with severe sepsis or septic shock. These types of patients have specific ICD-9-CM codes and the codes will be provided with the measure specifications. The measure is a composite patient safety measure, which will require submission of patient-level data.

The electronic specifications of the measure are not ready for implementation. We will consider adopting the electronic clinical quality measure version when it becomes fully electronically-specified.

Comment: A commenter requested that, pending approval of the Sepsis and Septic Shock: Management Bundle Measure (NQF #0500), CMS provide the measure specifications six months in advance of the abstraction period to provide hospitals with ample time to review and evaluate any necessary process changes before the data collection period begins. Another commenter requested clarification as to whether the measure specifications will be provided in the standard manual format and when those specifications will be released. At this time, the commenter noted, no algorithm, data elements, initial patient population, or sampling guidelines are available to begin programming this measure for collection. As this is a very complicated measure, the commenter noted that collecting it as a chart-abstracted measure will be a burden to hospitals.

Response: The measure specifications will be released in the standard format, in the Specifications Manual, which will contain the data elements and algorithm. Typically, our specifications manuals are posted on QualityNet in January for July-December discharges and July for January-June discharges. We also provide addendums each year after the finalization of the IPPS/LTCH PPS final rule. The release date of this addendum is to be determined.

Comment: One commenter stated that the quality improvement opportunities arise when a diagnosis is missed. The commenter asked if CMS will include possible diagnoses from an electronic health record problem list as a data source.

Further, the commenter stated that the first three elements for severe sepsis have best-practice times of three hours from presentation. The commenter asked if that is three hours from arrival to the facility, upon transfer between units, from presentation of symptoms, or all of the above. The commenter advised that three hours could also be very difficult to meet depending on emergency department volumes at any given time.

Response: We note that this is a chart-abstracted measure and hospitals can collect data from all available sources of medical records including EHRs.

Regarding the best-practice times for the measure, we refer the commenter to the Inclusion and Exclusion Criteria described above for a description of the steps to be completed within 3 hours of the patient's presentation. According to the measure steward, Henry Ford Hospital, the measure's intent is to use three hours following presentation/onset from one endpoint to another, be it facility transfer/arrival or unit transfer/arrival.

After consideration of the public comments we received, we are finalizing the Severe Sepsis and Septic Shock: Management Bundle Measure (NQF #0500) as proposed. We will closely monitor this measure as new clinical evidence becomes available, and will update the public via future rulemaking and/or operational guidance as necessary.

f. Electronic Health Record-Based Voluntary Measures

(1) Overview of New Electronic Health Record-Based Voluntary Measures

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28238 through 28239) we proposed four new voluntary electronic health record-based measures to be submitted as electronically specified measures: (1) Hearing Screening Prior to Hospital Discharge (NQF #1354); (2) PC-05 Exclusive Breast Milk Feeding and the subset measure PC-05a Exclusive Breast Milk Feeding Considering Mother's Choice (collectively referred to as NQF #0480); (3) Home Management Plan of Care (HMPC) Document Given to Patient/Caregiver; and (4) Healthy Term Newborn (NQF #0716). The four proposed electronic health record-based measures were included on a publicly available document entitled “List of Measures Under Consideration for December 1, 2012” in compliance with section 1890A(a)(2) of the Act, and they were reviewed by the MAP in its MAP Pre-Rulemaking Report: 2013 Recommendations on Measures Under Consideration by HHS. The final MAP report is available at: https://www.qualityforum.org/WorkArea/linkit.aspx?LinkIdentifier=id&ItemID=72746. We considered the input and recommendations provided by the MAP in selecting measures to propose for the Hospital IQR Program.

The specifications for the electronic clinical quality measures for eligible hospitals are found at: http://cms.gov/Regulations-and-Guidance/Legislation/EHRIncentivePrograms/eCQM_Library.html.

Many commenters raised similar concerns applicable across the proposed electronic clinical quality measures; we summarize and respond to these general comments first below before discussing the individual electronic clinical quality measures.

Comment: Many commenters opposed one or more of these voluntary electronic clinical quality measures for the following reasons:

  • A significant portion of the measures' populations are not covered by Medicare.
  • The proposed measures would not lead to improved hospital quality or offer insight on how to improve electronic clinical quality measures.
  • CMS did not propose to allow hospitals to submit chart-abstracted data on these measures in addition to the electronic clinical quality measures.

Response: We are concerned with improving the quality of care provided to all patients, not just Medicare patients. All of our non-claims-based measures include all-payer patients, meaning they include non-Medicare patients as well as Medicare beneficiaries.

We disagree that these measures would not lead to improved hospital quality of care. The measures address high-impact conditions not adequately addressed in the program measure set. We also disagree that these measures will not improve electronic clinical quality measures. Reporting clinical quality measures in their electronic form is a different mode of data collection that, as with any measure, will require refinement over time. We believe that implementing and using these measures will drive quality improvement by measuring quality through EHRs, and will provide nationally representative information to inform future electronic clinical quality measure refinements.

Finally, we believe these measures will give hospitals useful information that can be used to improve the quality of care for those patients in the measure population regardless of the mode of collection and submission. We are in the process of moving away from chart-abstracted measures. Therefore, in part to minimize hospitals' data collection burden, we intend to adopt the electronic versions of measures when electronic specifications are available. We proposed to adopt these measures as voluntary electronic clinical quality measures to align with the Medicare EHR Incentive Program and to provide hospitals flexibility in reporting. We note that the proposed measures are voluntary and a hospital may choose to not report one or more of the proposed measures.

Comment: One commenter was concerned that the complexity of the data currently in chart abstraction for these measures will make it difficult to ensure that this information will accurately be translated when submitting these measures electronically.

Response: These measures are already electronically-specified and as such, no translation is required. As previously stated in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50817 through 50818), we do not believe that the electronic clinical quality measures are substantively different from their chart-abstracted form.

Comment: Commenters recommended aligning CMS and TJC requirements for these measures in an effort to reduce the amount of resources that are spent when requirements are different or the timing of changes in requirements creates additional challenges.

Response: We intend to continue working with TJC and other stakeholders to reduce hospitals' quality reporting burden.

(2) Voluntary Electronically Specified Measure: Hearing Screening Prior to Hospital Discharge (NQF #1354)

The Hearing Screening Prior to Hospital Discharge (NQF #1354) measure assesses the proportion of all live births born at a hospital that have been screened for hearing loss before hospital discharge. The Joint Committee on Infant Hearing encourages early screening and intervention in infants with hearing loss to maximize linguistic competence and literacy development in children with hearing loss or who are hard of hearing. Early intervention improves developmental and social outcomes for children. The States and CDC have collected this measure as a population-based measure for more than 10 years.

This measure is NQF-endorsed and was supported by the MAP in its Pre-Rulemaking Report: 2013 Recommendations on Measures Under Consideration by HHS, available at: https://www.qualityforum.org/WorkArea/linkit.aspx?LinkIdentifier=id&ItemID=72738. The MAP noted that the measure addresses a high-impact condition not adequately addressed in the program measure set.

The numerator is all live births during the measurement period born at a facility and screened for hearing loss prior to discharge, or screened but still not discharged, or not screened due to medical reasons or a medical exclusion.

The denominator includes all live births during the measurement period born at a facility and discharged without being screened, or screened prior to discharge, or screened but still not discharged.

The measure excludes any patient who died prior to discharge without receiving a hearing screening.
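
For illustration only, the following sketch shows one way the numerator, denominator, and exclusion described above could be combined to compute a screening rate from patient-level records. It is not the official electronic specification; the record fields (screened, discharged, died_before_discharge, medical_exclusion) are hypothetical names chosen for this example.

```python
# Simplified illustration of the Hearing Screening Prior to Hospital Discharge
# (NQF #1354) calculation described above. Field names are hypothetical and do
# not reflect the official electronic specification.

def hearing_screening_rate(live_births):
    """live_births: list of dicts describing newborns born at the facility
    during the measurement period."""
    numerator = 0
    denominator = 0
    for baby in live_births:
        # Exclusion: patient died prior to discharge without a hearing screening.
        if baby["died_before_discharge"] and not baby["screened"]:
            continue
        # Denominator: discharged without screening, screened prior to
        # discharge, or screened but not yet discharged.
        in_denominator = (baby["discharged"] and not baby["screened"]) or baby["screened"]
        if not in_denominator:
            continue
        denominator += 1
        # Numerator: screened, or not screened for a documented medical reason.
        if baby["screened"] or baby["medical_exclusion"]:
            numerator += 1
    return numerator / denominator if denominator else None


# Example: three screened newborns and one unscreened discharge yield a rate of 0.75.
births = [
    {"screened": True, "discharged": True, "died_before_discharge": False, "medical_exclusion": False},
    {"screened": True, "discharged": True, "died_before_discharge": False, "medical_exclusion": False},
    {"screened": True, "discharged": False, "died_before_discharge": False, "medical_exclusion": False},
    {"screened": False, "discharged": True, "died_before_discharge": False, "medical_exclusion": False},
]
print(hearing_screening_rate(births))  # 0.75
```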

Comment: One commenter supported the hearing screening prior to hospital discharge measure.

Response: We thank the commenter for their support.

Comment: One commenter opposed the Hearing Screening Prior to Hospital Discharge measure, and expressed concern that it will encourage physicians to obtain other preventative screenings during the hospitalization that are unnecessary or unrelated to the cause of the patient's admission.

Response: This measure relates to hearing screening for newborns prior to discharge, not all patients. Newborns, as defined by this measure, are not in the same category as other admitted patients as they are born to an admitted patient. Early screening allows for early intervention in infants with hearing loss. We do not believe newborn preventive hearing screenings will encourage physicians to perform unneeded preventive screenings.

After consideration of the public comments we received, we are finalizing the adoption of the Hearing Screening Prior to Hospital Discharge measure for voluntary electronic reporting as proposed.

(3) Voluntary Measure: PC-05 Exclusive Breast Milk Feeding and the subset measure PC-05a Exclusive Breast Milk Feeding Considering Mother's Choice (collectively referred to as NQF #0480)

Exclusive breast milk feeding for the first 6 months of neonatal life has long been the expressed goal of the World Health Organization (WHO), HHS, the American Academy of Pediatrics (AAP), and the American College of Obstetricians and Gynecologists (ACOG).

The PC-05 Exclusive Breast Milk Feeding measure and the subset measure PC-05a Exclusive Breast Milk Feeding Considering Mother's Choice (NQF #0480) is endorsed by the NQF and supported by the MAP in its Pre-Rulemaking Report: 2013 Recommendations on Measures Under Consideration by HHS, available at: https://www.qualityforum.org/WorkArea/linkit.aspx?LinkIdentifier=id&ItemID=72738. The MAP noted that the measure addresses a high-impact condition not adequately addressed in the program measure set.

This measure assesses the number of newborns exclusively fed breast milk during the newborn's entire hospitalization; the subset measure includes only those newborns whose mothers chose to exclusively feed breast milk.

The numerator is the same for both the measure and subset measure—newborns that were fed breast milk only since birth. However, the denominators differ. For PC-05, the denominator is defined as single term liveborn newborns discharged alive from the hospital with ICD-9-CM Principal Diagnosis Code for single liveborn newborn. The denominator for the subset measure, PC-05a, is defined as single term newborns discharged alive from the hospital excluding those whose mothers chose not to breast feed with ICD-9-CM Principal Diagnosis Code for single liveborn newborn. The ICD-9-CM Principal Diagnosis Codes for single liveborn newborns are found in Appendix A, Table 11.20.1: Single Live Newborn in the Specifications Manual for Joint Commission National Quality Measures available at: http://manual.jointcommission.org/releases/TJC2013A/AppendixATJC.html.

Excluded populations:

  • Admitted to the Neonatal Intensive Care Unit (NICU) at this hospital during the hospitalization.
  • ICD-9-CM Other Diagnosis Codes for galactosemia as defined in Appendix A, Table 11.21 in the Specifications Manual for Joint Commission National Quality Measures found at: http://manual.jointcommission.org/releases/TJC2013A/AppendixATJC.html.
  • ICD-9-CM Principal Procedure Code or ICD-9-CM Other Procedure Codes for parenteral infusion as defined in Appendix A, Table 11.22 in the Specifications Manual for Joint Commission National Quality Measures found at: http://manual.jointcommission.org/releases/TJC2013A/AppendixATJC.html.
  • Experienced death.
  • Length of Stay >120 days.
  • Enrolled in clinical trials.
  • Patients transferred to another hospital.
  • ICD-9-CM Other Diagnosis Codes for premature newborns as defined in Appendix A, Table 11.23 in the Specifications Manual for Joint Commission National Quality Measures found at: http://manual.jointcommission.org/releases/TJC2013A/AppendixATJC.html.
  • Documented Reason for Not Exclusively Feeding Breast Milk.

The maternal reasons for not exclusively breastfeeding are limited to the following situations:

  • HIV infection;
  • Human T-lymphotropic virus type I or II;
  • Substance abuse and/or alcohol abuse;
  • Active, untreated tuberculosis;
  • Taking certain medications, that is, prescribed cancer chemotherapy, radioactive isotopes, antimetabolites, antiretroviral medications and other medications where the risk of morbidity outweighs the benefits of breast milk feeding;
  • Undergoing radiation therapy;
  • Active, untreated varicella;
  • Active herpes simplex virus with breast lesions; and
  • Admission to Intensive Care Unit (ICU) post-partum.
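
The relationship between the shared numerator and the two denominators described above can be summarized informally as follows. This sketch is not the TJC electronic specification; the patient-level fields (exclusive_breast_milk, mother_initial_plan_formula) are hypothetical, and the excluded populations listed above are assumed to have been removed before the calculation.

```python
# Informal sketch of the PC-05 / PC-05a (NQF #0480) logic described above.
# Field names are hypothetical; the authoritative logic is in the
# Specifications Manual for Joint Commission National Quality Measures.

def pc05_rates(newborns):
    """newborns: single term liveborn newborns discharged alive, with the
    excluded populations listed above already removed."""
    pc05_denom = pc05a_denom = numerator = 0
    for n in newborns:
        pc05_denom += 1
        # PC-05a removes newborns whose mothers' initial feeding plan was formula.
        if not n["mother_initial_plan_formula"]:
            pc05a_denom += 1
        # Numerator (same for both): fed breast milk only since birth.
        if n["exclusive_breast_milk"]:
            numerator += 1
    pc05 = numerator / pc05_denom if pc05_denom else None
    pc05a = numerator / pc05a_denom if pc05a_denom else None
    return pc05, pc05a


# A mother who initially planned to breast feed exclusively but later fed
# formula stays in both denominators and fails the numerator.
cohort = [
    {"exclusive_breast_milk": True, "mother_initial_plan_formula": False},
    {"exclusive_breast_milk": False, "mother_initial_plan_formula": True},   # removed from PC-05a only
    {"exclusive_breast_milk": False, "mother_initial_plan_formula": False},  # changed plan mid-stay
]
print(pc05_rates(cohort))  # (0.333..., 0.5)
```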

We invited public comments on this proposal.

Comment: One commenter strongly supported the adoption of PC-05: Exclusive Breast Milk Feeding and the Subset Measure PC-05a Exclusive Breast Milk Feeding Considering Mother's Choice (Collectively Referred to as NQF #0480). Another commenter urged CMS to make the exclusive breast milk feeding measure a mandatory measure no later than FY 2017. The commenter believed that exclusive electronic reporting of these measures could ultimately reduce the burden of collection and increase the potential for timely feedback to all stakeholders on the ever important area of maternity care. The commenter indicated that the health benefits of breastfeeding for mothers and for babies are well established and that the measure has the virtue of being included in TJC's core Perinatal Care measure set (PC-05), which hospitals with more than 1,100 births annually are now required to collect and report. The commenter indicated that the use of standardized measures helps avoid confusion among consumers and health professionals and reduces duplication of related measure concepts and burden of collection.

Response: We thank the commenters for their support. We will take into consideration their recommendations as we plan Hospital IQR Program policies in the future.

Comment: One commenter stated that it is unclear whether a case would be considered noncompliant with the breast milk feeding measure if the mother changed her mind at some point during the stay.

Response: For PC-05a only, if the mother's initial feeding plan was to exclusively feed breast milk and she diverges from that plan to feed formula later in the hospitalization, then the case will fail. A case is only excluded from the denominator if formula feeding is the initial stated feeding plan.

Comment: One commenter believed there should be exclusions for newborns' medical conditions that require supplemental feedings. This commenter did not support the measure because it is an electronic clinical quality measure only. The commenter would support the measure in its chart-abstracted form because it is NQF-endorsed and supported by the MAP.

Response: We suggest that any recommendations for changes to the measure be shared with the measure developer/steward, TJC. As is, the measure is NQF-endorsed and includes the electronic specification. In 2012, the MAP declined to support the electronic clinical quality measure because of an issue regarding patient choice. However, the measure developer has addressed this issue following the 2012 MAP recommendation. Patients who choose not to exclusively breast feed are excluded from the denominator. In 2013, the MAP supported the measure for adoption by the Hospital IQR Program, noting the measure addresses an NQS priority not adequately addressed in the program measure set.

Comment: Several commenters recommended the integration of technical assistance provided by TJC and the United States Breastfeeding Committee (USBC) to assist with implementation of the measure. The commenters pointed out that USBC has published an online toolkit to help hospitals implement the measure and suggested that we should inform hospitals of the availability of the toolkit.

Response: We thank the commenters for their suggestions and will consider them in the future.

After consideration of the public comments we received, we are finalizing the PC-05 Exclusive Breast Milk Feeding and the subset measure PC-05a Exclusive Breast Milk Feeding Considering Mother's Choice (collectively referred to as NQF #0480) measure as a voluntary electronic clinical quality measure as proposed.

(4) Voluntary Measure CAC-3: Home Management Plan of Care (HMPC) Document Given to Patient/Caregiver

Asthma is the most common chronic disease in children and a major cause of morbidity and health care costs nationally. For children, asthma is one of the most frequent reasons for admission to hospitals. There were approximately 157,000 admissions for childhood asthma in the United States in 2009. Under-treatment and/or inappropriate treatment of asthma are recognized as major contributors to asthma morbidity and mortality. Guidelines developed by the National Asthma Education and Prevention Program (NAEPP) of the National Heart, Lung and Blood Institute (NHLBI), as well as by the American Academy of Pediatrics (AAP), for the diagnosis and management of asthma in children recommend establishing a plan for maintaining control of asthma and for managing exacerbations.

The CAC-3: Home Management Plan of Care (HMPC) Document Given to Patient/Caregiver measure is no longer endorsed by the NQF and was not supported by the MAP in its Pre-Rulemaking Report: 2013 Recommendations on Measures Under Consideration by HHS available at: https://www.qualityforum.org/WorkArea/linkit.aspx?LinkIdentifier=id&ItemID=72738, because the measure no longer meets the NQF endorsement criteria. However, based on the prevalence of asthma among children, as well as the risks associated with under-treatment or over-treatment described above, we believe the measure is appropriate for voluntary collection. Because asthma is a serious and potentially life-threatening disease, we believe that it is important to allow hospitals to voluntarily report this data, which may help inform our policy.

This measure assesses the proportion of pediatric asthma patients (aged 2-17 years) discharged from an inpatient hospital stay with a HMPC document in place. The numerator is the number of pediatric asthma inpatients with documentation that they or their caregivers were given a written HMPC document that addresses: (1) Arrangements for follow-up care, (2) environmental control and control of other triggers, (3) method and timing of rescue actions, (4) use of controllers, and (5) use of relievers.

The denominator is the number of pediatric asthma inpatients (age 2 years through 17 years) discharged with a principal diagnosis of asthma.

The measure excludes: (1) Patients with an age less than 2 years or 18 years or greater; (2) patients who have a length of stay greater than 120 days; and (3) patients enrolled in clinical trials.
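
As an informal illustration of the numerator logic described above, the following sketch checks whether a discharge record documents an HMPC covering all five required components. The record layout and field names are hypothetical and are not drawn from the measure's electronic specification.

```python
# Minimal sketch of the CAC-3 numerator check described above.
# The record layout is hypothetical.

HMPC_COMPONENTS = (
    "follow_up_care",
    "environmental_control_and_triggers",
    "rescue_actions_method_and_timing",
    "controller_use",
    "reliever_use",
)


def in_cac3_numerator(record):
    """True if a pediatric asthma inpatient already in the denominator
    (age 2-17, principal diagnosis of asthma, LOS <= 120 days, not enrolled
    in a clinical trial) was given a written HMPC covering all five components."""
    plan = record.get("hmpc_document")
    return plan is not None and all(plan.get(c) for c in HMPC_COMPONENTS)


# Example: a record documenting all five components counts toward the numerator.
record = {"hmpc_document": {c: True for c in HMPC_COMPONENTS}}
print(in_cac3_numerator(record))  # True
```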

We invited public comments on this proposal.

Comment: A commenter supported the CAC-3 HMPC measure and noted that this plan of care supports patients' successful transition from the hospital to home.

Response: We thank the commenter for their support.

Comment: Some commenters opposed the measure's adoption as a voluntary electronic clinical quality measure because the NQF has removed its endorsement and the MAP has not recommended this measure. Another commenter requested that CMS provide additional information beyond what was stated in the proposed rule regarding our rationale for inclusion of the CAC-3 Home Management Plan of Care Document Given to Patient/Caregiver. The commenter noted that this measure's loss of NQF endorsement is cause for concern, but more importantly, the commenter did not feel this documentation measure appropriately contributes to evaluating the state of perinatal care in the U.S.

Response: This is a pediatric measure addressing children aged 2-17, not a perinatal care measure. Since it is a pediatric measure, CAC-3 fills a gap in the Hospital IQR Program measure set. We are moving away from chart-abstracted measures and, when electronic specifications are available, we intend to adopt the electronic clinical quality measure version of a new measure. We acknowledge that the MAP did not support the adoption of this measure because the NQF withdrew its endorsement. According to the NQF report, this was because the measure did not pass the criteria for the category "Importance to Measure and Report." The NQF stated that the evidence is not as strong for a care plan as for use of inhaled corticosteroids (ICS). The Committee noted the publication in JAMA by Morse on October 5, 2011 that found "Among children admitted to pediatric hospitals for asthma, there was high hospital-level compliance with CAC-1 and CAC-2 quality measures and moderate compliance with the CAC-3 measure but no association between CAC-3 compliance and subsequent ED visits and asthma-related readmissions" (http://jama.ama-assn.org/content/306/13/1454.abstract). The NQF also cited concerns over the lack of standardization of a quality care plan, how language is constructed, and health literacy issues. Despite these findings, however, the NQF still agreed that "patient education is clearly an essential component in successful asthma management." Our purpose for adopting this voluntary electronic clinical quality measure is to align with the Medicare EHR Incentive Program and to provide hospitals with flexibility in their quality reporting. We reiterate that the proposed measure is voluntary and a hospital may choose to not report this measure.

Furthermore, we proposed to include this non-NQF endorsed measure under the Hospital IQR Program exception authority as discussed in section IX.A.7. of the preamble of this final rule.

After consideration of the public comments we received, we are finalizing the Home Management Plan of Care (HMPC) Document Given to Patient/Caregiver measure as a voluntary electronic clinical quality measure as proposed.

(5) Voluntary Measure: Healthy Term Newborn (NQF #0716)

This measure assesses the optimal outcome of pregnancy and childbirth, specifically a healthy term newborn. It evaluates the impact of any changes in the management or intervention on the positive outcome for the newborn.

The measure is NQF-endorsed. The MAP recommended removal of this measure in its Pre-Rulemaking Report: 2013 Recommendations on Measures under Consideration by HHS available at: https://www.qualityforum.org/WorkArea/linkit.aspx?LinkIdentifier=id&ItemID=72738, because the measure required modification or further development. However, the MAP strongly supported the measure concept for inclusion once technical issues were resolved. Given its endorsement by NQF, as well as the MAP's strong support for the measure concept, we believe the measure is appropriate for voluntary reporting.

The result of the measure calculation is the percentage of term singleton live births (excluding those with diagnoses originating in the fetal period) that do not have significant complications during birth or nursery care.

National Quality Forum. National Voluntary Consensus Standards for Patient Outcomes 2009. Available at: http://www.qualityforum.org/WorkArea/linkit.aspx?LinkIdentifier=id&ItemID=67546.

The numerator of this measure is the absence of conditions or procedures reflecting morbidity that happened during birth and nursery care to an otherwise normal infant.

The denominator is composed of singleton, term (>=37 weeks), inborn, live births in their birth admission. The denominator further excludes fetal conditions likely to be present before labor. Maternal and obstetrical conditions (for example, hypertension, prior cesarean, malpresentation) are not excluded unless there is evidence of fetal effect prior to labor (for example, Intrauterine Growth Restriction (IUGR)/Small for Gestational Age (SGA)).

This measure excludes: (1) multiple gestations; (2) preterm births and congenital anomalies; and (3) fetuses affected by selected maternal conditions.

We invited public comments on this proposal.

Comment: Some commenters supported the adoption of this measure. One commenter noted the measure has recently been refined and renamed as “Unexpected Newborn Complications” and expressed the hope that CMS will adopt the updated version.

Further, one commenter recommended that CMS make the measure mandatory no later than FY 2017. The commenter believed that the exclusive electronic reporting of this measure could ultimately reduce the burden of collection and increase the potential for timely feedback to all stakeholders on the ever important area of maternity care.

Response: We will monitor the progress of the refined measure and consider adopting it after the measure completes the NQF-endorsement process. We will take into consideration the commenters' recommendations as we plan Hospital IQR Program policies in the future.

After consideration of the public comments we received, we are finalizing the Healthy Term Newborn (NQF #0716) measure as a voluntary electronic clinical quality measure as proposed.

g. Readoption of Measures As Voluntarily Reported Electronic Clinical Quality Measures

In order to align with the Medicare EHR Incentive Program for eligible hospitals (EHs) and critical access hospitals (CAHs), in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28239 through 28242) we proposed to re-adopt two measures previously removed from the Hospital IQR Program: (a) AMI-2 Aspirin Prescribed at Discharge for AMI (acute myocardial infarction) (NQF #0142) (electronic clinical quality measure); and (b) AMI-10 Statin Prescribed at Discharge (NQF #0639) (electronic clinical quality measure). In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28239) we proposed to add these measures to the list of voluntarily reported electronic clinical quality measures as described in section IX.A.7.f. of the preamble of this final rule. We believe we should continue aligning the Hospital IQR Program and the Medicare EHR Incentive Program in order to minimize reporting burden and continue the transition to reporting of electronic clinical quality measures, and we believe voluntary adoption of these measures will further that aim. Further, we believe that allowing hospitals the option to electronically report topped-out measures will provide hospitals with an opportunity to test the accuracy of their electronic health record reporting systems.

(1) Readoption of AMI-2 Aspirin Prescribed at Discharge (NQF #0142)

The AMI-2 Aspirin Prescribed at Discharge (NQF #0142) assesses the percentage of acute myocardial infarction (AMI) patients who are prescribed aspirin at hospital discharge.

The measure is NQF endorsed, but has been placed in reserve status, as the performance on this measure is “topped-out.” The MAP recommended the measure should be suspended and phased out in its Pre-Rulemaking Report: 2013 Recommendations on Measures under Consideration by HHS available at: https://www.qualityforum.org/WorkArea/linkit.aspx?LinkIdentifier=id&ItemID=72738. However, as stated above, we intend to continue aligning the Hospital IQR Program and Medicare EHR Incentive Program, and we believe collecting this measure on a voluntary basis enables us to continue collecting quality data on this topic while working to minimize reporting burden on participating hospitals. Further, allowing hospitals the option to electronically report topped-out measures will provide hospitals with an opportunity to test the accuracy of their electronic health record reporting systems.

The numerator includes AMI patients in the denominator who are prescribed aspirin at hospital discharge. The denominator includes patients with the following ICD-9-CM principal diagnosis codes of AMI: 410.00, 410.01, 410.10, 410.11, 410.20, 410.21, 410.30, 410.31, 410.40, 410.41, 410.50, 410.51, 410.60, 410.61, 410.70, 410.71, 410.80, 410.81, 410.90, and 410.91.

The following patients are excluded from this measure:

  • Patients less than 18 years of age;
  • Patients who have a length of stay greater than 120 days;
  • Patients enrolled in clinical trials;
  • Patients who were discharged to another hospital;
  • Patients who expired;
  • Patients who left the hospital against medical advice;
  • Patients who were discharged to home for hospice care;
  • Patients who were discharged to a health care facility for hospice care;
  • Patients with comfort measures only documented; and
  • Patients with a documented reason for no aspirin at discharge.
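
For illustration only, the following sketch shows how the denominator codes and exclusions listed above might be applied to patient-level discharge records to compute the AMI-2 rate; the readopted AMI-10 measure described below follows the same pattern with a statin-at-discharge flag and its additional exclusions. The record fields are hypothetical, and this is not the official electronic specification.

```python
# Informal sketch of the AMI-2 Aspirin Prescribed at Discharge (NQF #0142)
# population logic described above. Field names are hypothetical.

# ICD-9-CM principal diagnosis codes for AMI listed in the measure.
AMI_CODES = {
    "410.00", "410.01", "410.10", "410.11", "410.20", "410.21", "410.30",
    "410.31", "410.40", "410.41", "410.50", "410.51", "410.60", "410.61",
    "410.70", "410.71", "410.80", "410.81", "410.90", "410.91",
}

# Hypothetical flags standing in for the exclusions listed above.
EXCLUSION_FLAGS = (
    "enrolled_in_trial", "discharged_to_hospital", "expired", "left_ama",
    "hospice_home", "hospice_facility", "comfort_measures_only",
    "documented_reason_no_aspirin",
)


def ami2_rate(discharges):
    numerator = denominator = 0
    for d in discharges:
        # Denominator: AMI principal diagnosis, age >= 18, LOS <= 120 days,
        # and none of the listed exclusions.
        if d["principal_dx"] not in AMI_CODES:
            continue
        if d["age"] < 18 or d["los_days"] > 120:
            continue
        if any(d.get(flag) for flag in EXCLUSION_FLAGS):
            continue
        denominator += 1
        # Numerator: aspirin prescribed at hospital discharge.
        if d["aspirin_at_discharge"]:
            numerator += 1
    return numerator / denominator if denominator else None


# Example: one qualifying AMI discharge with aspirin prescribed yields 1.0.
discharges = [
    {"principal_dx": "410.71", "age": 64, "los_days": 3, "aspirin_at_discharge": True},
    {"principal_dx": "486", "age": 70, "los_days": 5, "aspirin_at_discharge": False},  # not AMI
]
print(ami2_rate(discharges))  # 1.0
```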

(2) Readoption of AMI-10 AMI-Statin Prescribed at Discharge (NQF #0639)

AMI-10 AMI-Statin Prescribed at Discharge (NQF #0639) assesses the percent of acute myocardial infarction (AMI) patients who are prescribed a statin at hospital discharge.

The measure is NQF endorsed. The MAP recommended phased removal in its Pre-Rulemaking Report: 2013 Recommendations on Measures under Consideration by HHS available at: https://www.qualityforum.org/WorkArea/linkit.aspx?LinkIdentifier=id&ItemID=72738 because the performance on this measure is likely “topped-out.” However, as stated above, we intend to continue aligning the Hospital IQR Program and Medicare EHR Incentive Program, and we believe collecting this measure on a voluntary basis enables us to continue collecting quality data on this topic while working to minimize reporting burden on participating hospitals. Further, allowing hospitals to electronically report topped-out measures on a voluntary basis will provide hospitals with an opportunity to test the accuracy of their electronic health record reporting systems.

The numerator includes AMI patients in the denominator who are prescribed a statin medication at hospital discharge. The denominator includes patients with the following ICD-9-CM principal diagnosis codes of AMI: 410.00, 410.01, 410.10, 410.11, 410.20, 410.21, 410.30, 410.31, 410.40, 410.41, 410.50, 410.51, 410.60, 410.61, 410.70, 410.71, 410.80, 410.81, 410.90, and 410.91.

The following patients are excluded from this measure:

  • Patients less than 18 years of age;
  • Patients who have a length of stay greater than 120 days;
  • Patients with comfort measures only documented;
  • Patients enrolled in clinical trials;
  • Patients who were discharged to another hospital;
  • Patients who left the hospital against medical advice;
  • Patients who expired;
  • Patients who were discharged to their home for hospice care;
  • Patients who were discharged to a health care facility for hospice care;
  • Patients with low-density lipoprotein less than 100 mg/dL within the first 24 hours after hospital arrival or 30 days prior to hospital arrival and not discharged on a statin; and
  • Patients with a reason for not prescribing statin medication at discharge.

We invited public comments on our proposal to readopt these two measures as electronic clinical quality measures.

Comment: Some commenters supported the inclusion of voluntary reporting for certain electronic clinical quality measures for the Hospital IQR Program, and noted that voluntary reporting allows hospitals to be better prepared for submitting new quality measures from EHRs and to correct any operational issues that arise. Several commenters supported adopting AMI-2 and AMI-10 as electronic clinical quality measures, because aligning the Hospital IQR Program with the Medicare EHR Incentive Program could reduce reporting burdens. The commenter hoped that CMS will continue to expand efforts to allow for electronic reporting to include registries, which are commonly used for data collection and reporting, in addition to EHRs.

Response: We thank these commenters for their support. We would like to clarify that at this time we do not allow registry reporting for these measures.

Comment: One commenter did not support CMS' proposal to readopt two topped-out measures for purposes of electronic reporting, arguing that topped-out measures, by definition, are removed because they are no longer an accurate measure of hospital performance. The commenter was concerned that these measures would not advance hospital quality or improve electronic reporting.

Other commenters opposed AMI-2 and AMI-10 as electronic clinical quality measures because they were topped-out and retired as chart-abstracted measures and they believed retaining them would not advance hospitals' understanding of how to submit electronic clinical quality measures or improve the quality of hospital care.

Response: As we explained in section IX.A.2.g.(2) of the preamble of this final rule in response to a similar comment, even though these measures are topped-out, we would still like to retain the electronically specified versions for the following reasons: (1) to align the Hospital IQR Program and the Medicare EHR Incentive Program, (2) to allow us to monitor the effectiveness of measure reporting by EHRs, and (3) to familiarize hospitals with reporting electronically specified measures. Topped-out status is also only one of many factors that we consider before determining whether a measure should be removed.

While these measures may be topped-out, they are still an accurate measure of performance. Continuing to report on these measures is a way to monitor for continued high performance. Electronic measure data will help us evaluate variations in data capture modes (chart-abstracted versus electronic clinical quality measures) in order to determine whether and what adjustments are necessary for the two different modes of collection. In addition, we believe that by allowing hospitals to voluntarily report these measures via electronic submission, we will provide hospitals needed flexibility in electronic clinical quality measure reporting, as requested by hospitals in their comments to the FY 2014 IPPS/LTCH PPS final rule (78 FR 50813 through 50814). As stated in the proposed rule (79 FR 28243), we intend to propose to require electronic clinical quality measure reporting in future rulemaking. We are providing this voluntary option to provide hospitals time to prepare for required electronic clinical quality measure reporting.

After consideration of the public comments we received, we are finalizing the readoption of both AMI-2: Aspirin Prescribed at Discharge (NQF #0142) and AMI-10: Statin Prescribed at Discharge (NQF #0639) as voluntary electronic clinical quality measures as proposed.

In summary, for FY 2017 payment determination and subsequent years, we are finalizing: (1) the adoption of 11 total measures—9 new measures (4 of which are voluntary electronic clinical quality measures) and 2 previously removed measures re-adopted as voluntary electronic clinical quality measures, and (2) the removal of 19 measures (4 of which were previously suspended), ten of which are being retained as voluntary electronic clinical quality measures. We are not finalizing the removal of one of the required chart-abstracted measures (SCIP-Inf-4). This gives a total of 63 measures (47 required and 16 voluntary electronic clinical quality measures) in the Hospital IQR Program measure set.

Set out below is a table showing both the previously adopted and the newly finalized quality measures for the FY 2017 payment determination and subsequent years. Please note that this table does not include suspended measures.

Previously Adopted Hospital IQR Program Measures and Measures Newly Finalized in This Final Rule for the FY 2017 Payment Determination and Subsequent Years

Short name | Measure name | NQF No. | Submission methods for FY 2017 payment determination | New for FY 2017 payment determination
AMI-7a | Fibrinolytic Therapy Received Within 30 Minutes of Hospital Arrival | NQF #0164 | Electronic clinical quality measure or chart-abstracted REQUIRED
SCIP-Inf-4 | Cardiac Surgery Patients With Controlled 6 A.M. Postoperative Blood Glucose | NQF #0300 | Chart-abstracted only REQUIRED
Sepsis | Severe sepsis and septic shock: management bundle | NQF #0500 | Chart-abstracted only REQUIRED | New for FY 2017.
Imm-2 | Influenza Immunization | NQF #1659 | Chart-abstracted only REQUIRED
Stroke-1 | Venous thromboembolism (VTE) prophylaxis | NQF #0434 | Chart-abstracted only REQUIRED
ED-1 | Median time from ED arrival to ED departure for admitted ED patients | NQF #0495 | Electronic clinical quality measure or chart-abstracted REQUIRED
ED-2 | Admit Decision Time to ED Departure Time for Admitted Patients | NQF #0497 | Electronic clinical quality measure or chart-abstracted REQUIRED
Stroke-4 | Thrombolytic therapy | NQF #0437 | Electronic clinical quality measure or chart-abstracted REQUIRED
Stroke-6 | Discharged on statin medication | NQF #0439 | Electronic clinical quality measure or chart-abstracted REQUIRED
Stroke-8 | Stroke education | N/A | Electronic clinical quality measure or chart-abstracted REQUIRED
VTE-1 | Venous thromboembolism prophylaxis | NQF #0371 | Electronic clinical quality measure or chart-abstracted REQUIRED
VTE-2 | Intensive care unit venous thromboembolism prophylaxis | NQF #0372 | Electronic clinical quality measure or chart-abstracted REQUIRED
VTE-5 | VTE discharge instructions | N/A | Electronic clinical quality measure or chart-abstracted REQUIRED
VTE-6 | Incidence of potentially preventable VTE | N/A | Electronic clinical quality measure or chart-abstracted REQUIRED
PC-01 | Elective delivery (Collected in aggregate, submitted via Web-based tool or electronic clinical quality measure) | NQF #0469 | Electronic clinical quality measure or chart-abstracted REQUIRED
CLABSI | National Healthcare Safety Network (NHSN) Central line-associated Bloodstream Infection (CLABSI) Outcome Measure | NQF #0139 | NHSN REQUIRED
SSI | American College of Surgeons—Centers for Disease Control and Prevention (ACS-CDC) Harmonized Procedure Specific Surgical Site Infection (SSI) Outcome Measure (Colon procedures; Hysterectomy procedures) | NQF #0753 | NHSN REQUIRED
CAUTI | National Healthcare Safety Network (NHSN) Catheter-associated Urinary Tract Infection (CAUTI) Outcome Measure | NQF #0138 | NHSN REQUIRED
MRSA | National Healthcare Safety Network (NHSN) Facility-wide Inpatient Hospital-onset Methicillin-resistant Staphylococcus aureus (MRSA) Bacteremia Outcome Measure | NQF #1716 | NHSN REQUIRED
CDI | National Healthcare Safety Network (NHSN) Facility-wide Inpatient Hospital-onset Clostridium difficile Infection (CDI) Outcome Measure | NQF #1717 | NHSN REQUIRED
HCP | Influenza vaccination coverage among healthcare personnel (HCP) | NQF #0431 | NHSN REQUIRED
MORT-30-AMI | Hospital 30-day, all-cause, risk-standardized mortality rate (RSMR) following acute myocardial infarction (AMI) hospitalization for patients 18 and older | NQF #0230 | Claims REQUIRED
MORT-30-HF | Hospital 30-day, all-cause, risk-standardized mortality rate (RSMR) following heart failure (HF) hospitalization for patients 18 and older | NQF #0229 | Claims REQUIRED
MORT-30-PN | Hospital 30-day, all-cause, risk-standardized mortality rate (RSMR) following pneumonia hospitalization | NQF #0468 | Claims REQUIRED
COPD Mortality | Hospital 30-Day, All-Cause, Risk-Standardized Mortality Rate (RSMR) following Chronic Obstructive Pulmonary Disease (COPD) Hospitalization | NQF #1893 | Claims REQUIRED
STK Mortality | Stroke 30-day mortality rate | N/A | Claims REQUIRED
CABG mortality | Hospital 30-day, all-cause, risk-standardized mortality rate (RSMR) following coronary artery bypass graft (CABG) surgery | N/A | Claims REQUIRED | New for FY 2017.
READM-30-AMI | Hospital 30-day all-cause risk-standardized readmission rate (RSRR) following acute myocardial infarction (AMI) hospitalization | NQF #0505 | Claims REQUIRED
READM-30-HF | Hospital 30-day, all-cause, risk-standardized readmission rate (RSRR) following heart failure hospitalization | NQF #0330 | Claims REQUIRED
READM-30-PN | Hospital 30-day, all-cause, risk-standardized readmission rate (RSRR) following pneumonia hospitalization | NQF #0506 | Claims REQUIRED
READM-30-TH/TKA | Hospital-level 30-day, all-cause risk-standardized readmission rate (RSRR) following elective primary total hip arthroplasty (THA) and/or total knee arthroplasty (TKA) | NQF #1551 | Claims REQUIRED
READM-30-HWR | Hospital-Wide All-Cause Unplanned Readmission (HWR) | NQF #1789 | Claims REQUIRED
COPD READMIT | Hospital 30-Day, All-Cause, Risk-Standardized Readmission Rate (RSRR) following Chronic Obstructive Pulmonary Disease (COPD) Hospitalization | NQF #1891 | Claims REQUIRED
STK READMIT | 30-day risk standardized readmission rate (RSMR) following Stroke hospitalization | N/A | Claims REQUIRED
CABG READMIT | Hospital 30-day, all-cause, unplanned, risk-standardized readmission rate (RSRR) following coronary artery bypass graft (CABG) surgery | N/A | Claims REQUIRED | New for FY 2017.
PSI 4 (PSI/NSI) | Death among surgical inpatients with serious, treatable complications | NQF #0351 | Claims REQUIRED
PSI 90 | Patient safety for selected indicators (composite) | NQF #0531 | Claims REQUIRED
MSPB | Payment-Standardized Medicare Spending Per Beneficiary (MSPB) | NQF #2158 | Claims REQUIRED
AMI payment | AMI Payment per Episode of Care | N/A | Claims REQUIRED
HF Payment | Hospital-level, risk-standardized 30-day episode-of-care payment measure for heart failure | N/A | Claims REQUIRED | New for FY 2017.
PN payment | Hospital-level, risk-standardized 30-day episode-of-care payment measure for pneumonia | N/A | Claims REQUIRED | New for FY 2017.
Hip/knee complications | Hospital-level risk-standardized complication rate (RSCR) following elective primary total hip arthroplasty (THA) and/or total knee arthroplasty (TKA) | NQF #1550 | Claims REQUIRED
Registry Nursing Sensitive Care | Participation in a Systematic Clinical Database Registry for Nursing Sensitive Care | N/A | Web-based REQUIRED
Registry for General Surgery | Participation in a Systematic Clinical Database Registry for General Surgery | N/A | Web-based REQUIRED
Safe Surgery Checklist | Safe Surgery Checklist Use | N/A | Web-based REQUIRED
HCAHPS | HCAHPS + CTM-3 | NQF #0166; NQF #0228 | Patient Survey REQUIRED
AMI-2 | Aspirin Prescribed at Discharge for AMI | NQF #0142 | Electronic clinical quality measure | Voluntary electronic clinical quality measure.
AMI-8a | Primary PCI Received Within 90 Minutes of Hospital Arrival | NQF #0163 | Electronic clinical quality measure | Voluntary electronic clinical quality measure.
AMI-10 | Statin Prescribed at Discharge | NQF #0639 | Electronic clinical quality measure | Voluntary electronic clinical quality measure.
SCIP-Inf-1a | Prophylactic Antibiotic Received Within One Hour Prior to Surgical Incision | NQF #0527 | Electronic clinical quality measure | Voluntary electronic clinical quality measure.
SCIP-Inf-2a | Prophylactic Antibiotic Selection for Surgical Patients | NQF #0528 | Electronic clinical quality measure | Voluntary electronic clinical quality measure.
SCIP-Inf-9 | Urinary catheter removed on Postoperative Day 1 (POD 1) or Postoperative Day 2 (POD 2) with day of surgery being day zero | NQF #0453 | Electronic clinical quality measure | Voluntary electronic clinical quality measure.
Stroke-2 | Discharged on antithrombotic therapy | NQF #0435 | Electronic clinical quality measure | Voluntary electronic clinical quality measure.
Stroke-3 | Anticoagulation therapy for atrial fibrillation/flutter | NQF #0436 | Electronic clinical quality measure | Voluntary electronic clinical quality measure.
Stroke-5 | Antithrombotic therapy by the end of hospital day two | NQF #0438 | Electronic clinical quality measure | Voluntary electronic clinical quality measure.
Stroke-10 | Assessed for rehabilitation | NQF #0441 | Electronic clinical quality measure | Voluntary electronic clinical quality measure.
VTE-3 | Venous thromboembolism patients with anticoagulation overlap therapy | NQF #0373 | Electronic clinical quality measure | Voluntary electronic clinical quality measure.
VTE-4 | Patients receiving un-fractionated Heparin with doses/labs monitored by protocol | N/A | Electronic clinical quality measure | Voluntary electronic clinical quality measure.
PC-05 | Exclusive Breast Milk Feeding and the subset measure PC-05a Exclusive Breast Milk Feeding Considering Mother's Choice | NQF #0480 | Electronic clinical quality measure | Voluntary electronic clinical quality measure.
EHDI-1a | Hearing Screening Prior to Hospital Discharge | NQF #1354 | Electronic clinical quality measure | Voluntary electronic clinical quality measure.
CAC-3 | Home Management Plan of Care (HMPC) Document Given to Patient/Caregiver | N/A | Electronic clinical quality measure | Voluntary electronic clinical quality measure.
HTN | Healthy Term Newborn | NQF #0716 | Electronic clinical quality measure | Voluntary electronic clinical quality measure.

h. Electronic Clinical Quality Measures

(1) Data Submission Requirements for Quality Measures That May Be Voluntarily Electronically Reported for the FY 2017 Payment Determination

We believe that collection and reporting of data through health information technology will greatly simplify and streamline reporting for many CMS quality reporting programs. Through electronic reporting, hospitals will be able to leverage EHRs to capture, calculate, and electronically submit quality data that is currently manually chart-abstracted and submitted to CMS for the Hospital IQR Program. As we noted in the FY 2012 IPPS/LTCH PPS final rule (76 FR 51614), we recognize the need to align and harmonize measures across CMS quality reporting programs to minimize the reporting burden imposed on hospitals. In the Medicare EHR Incentive Program Stage 2 final rule (77 FR 54083 through 54087), we finalized a total of 29 clinical quality measures from which hospitals must select at least 16 measures covering three National Quality Strategy (NQS) domains to report beginning in FY 2014. We anticipate that, as health information technology evolves and infrastructure is expanded, we will have the capacity to accept electronic reporting of many of the chart-abstracted measures that are currently part of the Hospital IQR Program.

In the FY 2014 IPPS/LTCH PPS final rule, for the STK (with the exception of STK-1), VTE, ED, and PC measure sets, we allowed hospitals to either: (1) electronically report at least one quarter of CY 2014 (Q1, Q2, or Q3) quality measure data for each measure in one or more of those four measure sets; or (2) continue reporting all measures in those four measure sets using chart-abstracted data for all four quarters of CY 2014 (78 FR 50818).

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28242 through 28243) for the FY 2017 payment determination, we proposed to expand this policy, such that providers may select to voluntarily report any 16 of the 28 Hospital IQR Program electronic clinical quality measures that align with the Medicare EHR Incentive Program as long as those 16 measures span three different NQS domains. The 28 measures are listed in the table below. Only 28 of the 29 measures adopted in the Medicare EHR Incentive Program are applicable for the Hospital IQR Program, because the measure ED-3 Median time from ED arrival to ED departure for discharged ED patients (NQF #0496) is an outpatient quality measure. We expect eligible hospitals to select measures that best apply to their patient mix.
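
As a rough illustration of the selection rule described above, the following sketch checks whether a hospital's chosen set of electronic clinical quality measures includes at least 16 measures spanning at least three NQS domains. The measure-to-domain mapping shown is a hypothetical placeholder and is not the actual mapping used by the programs.

```python
# Informal check of the voluntary reporting selection rule described above:
# at least 16 of the 28 aligned electronic clinical quality measures,
# spanning at least three NQS domains. The domain mapping below is a
# hypothetical placeholder, not the official assignment.

NQS_DOMAIN = {
    "ED-1": "Clinical Process/Effectiveness",
    "ED-2": "Clinical Process/Effectiveness",
    "VTE-1": "Patient Safety",
    "VTE-2": "Patient Safety",
    "PC-01": "Clinical Process/Effectiveness",
    # ... remaining measures omitted for brevity
}


def selection_is_valid(selected_measures):
    """True if the selection contains at least 16 measures and those measures
    cover at least three distinct NQS domains."""
    if len(selected_measures) < 16:
        return False
    domains = {NQS_DOMAIN[m] for m in selected_measures if m in NQS_DOMAIN}
    return len(domains) >= 3


# Fewer than 16 selections fails regardless of domain coverage.
print(selection_is_valid(["ED-1", "VTE-1"]))  # False
```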

For the FY 2017 payment determination, we also proposed to expand the reporting requirement of electronic clinical quality measures to require a full year's data collection and submission instead of a minimum of one quarter. In addition, for the FY 2017 payment determination, we proposed to require data submission within approximately 60 days after the end of a calendar year quarter. We have listed the proposed submission deadlines in the table below. We also refer readers to section IX.D.2. of the preamble of this final rule for a description of the electronic clinical quality measures data reporting periods and proposed submission deadlines.

CY 2015/FY 2017 Electronic Clinical Quality Measures Data Reporting Periods and Proposed Submission Deadlines

CY 2015 quarter | Reporting period (2015) | Proposed submission deadlines
1 | January 1-March 31 | May 30, 2015.
2 | April 1-June 30 | Aug 30, 2015.
3 | July 1-September 30 | Nov 30, 2015.
4 | October 1-December 31 | Feb 28, 2016.

As an incentive for hospitals to voluntarily submit electronically specified clinical quality measures, we proposed that, for the FY 2017 payment determination, hospitals successfully submitting electronic clinical quality measures according to our procedures will not be required to submit chart-abstracted data to validate the accuracy of the measure data submitted electronically.

By proposing these changes, we believe we would further align the Hospital IQR Program and the Medicare EHR Incentive Program and promote greater electronic clinical quality measure data reporting for hospitals. In addition, we believe that these changes would ease hospitals' administrative burden, as they will be able to report the same clinical quality measures once to partially satisfy both the Hospital IQR and Medicare EHR Incentive Programs' requirements.

We welcomed public comments on these proposals.

Commenters' opinions on these proposals focused on timing, substantive and nonsubstantive quality measure updates, our collaboration with hospitals and EHR vendors, certification requirements, and general concerns about electronic clinical quality measure reporting.

Comment: Some commenters were concerned that we were not allowing hospitals and EHR vendors enough time to transition to the new data submission deadlines. Some commenters supported the long-term goal of transitioning to EHR-enabled measurement and the general progression toward electronic clinical quality measures, noting that it will improve communication and documentation while reducing hospital resources now used for chart-abstracted measures. A commenter strongly supported CMS using electronic clinical quality measures and strongly objected to the implementation of chart-abstracted measures. The commenter stated that the current methodology of manual chart abstraction is resource intensive and inefficient for hospitals. By focusing on electronic clinical quality measures, hospitals can focus on performance improvement and target resources to implementing EHRs and processes to improve patient care.

Response: We thank commenters for their support. We plan to move away from chart-abstracted measures and move towards electronic clinical quality measures, as appropriate.

This voluntary option also simplifies alignment with the Medicare EHR Incentive Program and allows hospitals to partially satisfy requirements in both programs using a common set of measures. Since hospitals have a choice whether to submit voluntary electronic clinical quality measure data or chart-abstracted data, we recommend hospitals that are not yet prepared to submit electronically instead submit via chart-abstraction. We encourage hospitals to submit test data when they are ready.

We are actively working to reduce hospitals' reporting burden by offering the option to submit electronic clinical quality measures.

We received a number of comments regarding the timing of this proposal in relation to hospitals' readiness to submit electronic clinical quality measures.

Comment: Some commenters stated that healthcare providers and electronic health records systems are not yet ready to adopt electronic clinical quality measures, expressing concern about vendor problems related to meaningful use, including problems associated with submitting data to CMS, and about the accuracy and feasibility of electronically specified measures.

Other commenters suggested that CMS not specify a date for mandatory electronic reporting until significant levels of CEHRT adoption are achieved, and a validation process for electronic clinical quality measures is operational and yields evidence of measure reliability.

Response: We recently published a proposed rule (79 FR 29732 through 29738) proposing changes to the meaningful use stage timeline and changes to the requirements for the reporting of clinical quality measures for 2014. The comment period closed July 21, 2014. We hope the commenter was able to share their concerns regarding vendor problems related to meaningful use by responding to the proposed rule. We would like to clarify that this rule provides flexibility to hospitals and CAHs needing to update their EHR systems only for the most recent version of the CQMs, which is not a criterion for 2014 CEHRT. No changes to 2014 CEHRT criteria or timelines are being finalized in this rule. As we have previously mentioned, we are finalizing voluntary electronic clinical quality measure submission in order to give hospitals flexibility. Hospitals that are not yet ready to submit electronically can satisfy requirements for applicable measures as previously finalized and as finalized in this rule at section IX.A.2.g.(2) of the preamble of this final rule, that is, submit via chart abstraction. We encourage hospitals to work with vendors and encourage vendors to work with the various EHR-related and electronic clinical quality measure HHS working groups to become more informed about policies and standards. As participants in these groups, hospitals and vendors can share their concerns with CMS, ONC, and other measure stakeholders and help to improve processes. In addition, we suggest hospitals participate in our pilot electronic validation test to get free feedback on the accuracy of their data and have an opportunity to provide direct input regarding concerns. We refer readers to section IX.A.11.e. of the preamble of this final rule where this policy is discussed.

Comment: A commenter opposed the creation of voluntary electronic reported clinical quality measures.

Response: We respectfully disagree with the commenter that did not support voluntary electronic clinical quality measure reporting. This voluntary reporting provides hospitals the opportunity to test their submissions to prepare before electronic clinical quality measure reporting is required for this program.

Comment: Commenters urged CMS to begin a more robust dialogue with hospitals, EHR vendors, and other stakeholders regarding submitting electronic clinical quality measures so that there is a shared understanding of the opportunities and challenges that lie ahead, both from the hospital operational perspective as well as from our perspective.

Response: We have begun our education and outreach efforts with hospitals and vendors by holding educational webinars/sessions, uploading a number of resources to QualityNet, and creating a listserv for updates and announcements. Further, we have past recorded sessions discussing electronic clinical quality measures issues on our Web site at: https://www.qualitynet.org/dcs/ContentServer?c=Page&pagename=QnetPublic%2FPage%2FQnetTier3&cid=1228773852046. We also note that hospitals may submit test files or practice submissions at any time and encourage hospitals and vendors to begin submitting test files as soon as feasible.

Comment: One commenter recommended that CMS consider that certain measures currently improve quality of care for patients, but may not immediately lend themselves to e-specification.

Response: We will take the comment into consideration for future measures, and note that we have expanded our measures under consideration process in order to find measures from a greater number of sources.

Comment: Some commenters requested that CMS allow more time for implementing certification requirements and adopting measure specification updates. A commenter specifically suggested that CMS allow 18 months between the adoption of an updated specification and the federal fiscal year to which the specification updates apply. Another commenter recommended that CMS refrain from requiring certification of the revised measures and provide flexibility on the date by which the revisions must be fully implemented at provider sites. The commenter expressed concern that the current time frame, just prior to the October 1, 2014 start of the measure reporting year, is inadequate to accommodate the development, testing, and certification of the software by health IT vendors and the subsequent delivery and implementation of software at every customer site. This time constraint could lead providers to continue to attest to their electronic clinical quality measures in FY 2015, rather than submit their electronic clinical quality measures as CMS would prefer.

One commenter noted that there is insufficient time for vendors and certification test labs to obtain certification and subsequently deliver the certified product in time for hospitals to submit electronic clinical quality measures in CY 2015. The commenter therefore anticipated that hospitals will continue to attest to their clinical quality measure data in FY 2015. The commenter suggested that CMS allow hospitals that elect to attest to their clinical quality measures for the Medicare EHR Incentive Program in CY 2015 to submit data electronically for the Hospital IQR Program during CY 2015.

Response: We believe that, when discussing the “revised measures,” the commenter is referring to the annual April updates to the electronic clinical quality measures. For submission of CY 2015 data, we will only accept data consistent with the April 2014 measure specifications. Electronic clinical quality measure specifications are available in the CMS eCQM Library at: http://cms.gov/Regulations-and-Guidance/Legislation/EHRIncentivePrograms/eCQM_Library.html. The October 1, 2014 date is the beginning of the reporting period for the Medicare EHR Incentive Program only. However, we proposed that, in order to align the two programs' electronic clinical quality measure reporting and submission periods, both programs' reporting periods and submission deadlines would begin with Q1 CY 2015 discharges (79 FR 28245 through 28246).

However, after consideration of these comments regarding timing and hospitals' readiness, we are modifying our proposal so that hospitals that wish to participate in the voluntary reporting need only submit one CY 2015 quarter (Q1, Q2, or Q3) of electronic clinical quality measure data with a submission deadline of November 30, 2015. We hope that this modification will encourage more hospitals to submit electronic clinical quality measures rather than attest. The commenter is reminded that attesting is a Medicare EHR Incentive Program option only and would not apply to Hospital IQR Program requirements. In addition, if a hospital chooses not to voluntarily submit one quarter of electronic clinical quality measure data for the Hospital IQR Program, it must submit all four quarters of chart-abstracted data in CY 2015.

We believe that by modifying our proposal and reducing the data requirement to one quarter's worth of data, and by adopting the November 30th submission deadline, hospitals will have adequate time to update their EHR systems' ability to capture and report data.

In addition, measure certification falls under the ONC. ONC published a proposed rule on February 26, 2014, describing voluntary 2015 Edition Electronic Health Record (EHR) Certification Criteria; Interoperability Updates and Regulatory Improvements (79 FR 10880 through 10946). The proposed rule offered a potential “gap certification” solution, which may help address the commenter's concerns about the current timelines for development, testing, and certification of the software by health IT vendors. The final rule is expected to be published in the summer of 2014. With respect to CEHRT requirements, though 2014 CEHRT is required, eligible hospitals are not required to ensure that their CEHRT products are recertified to the most recent version of the electronic specifications for the clinical quality measures.

A hospital may submit electronic clinical quality measures for the Hospital IQR Program during CY 2015 even if it attests its aggregate measure numerators and denominators through the Medicare EHR Incentive Program. The hospital could submit as test data or production data. Test data submissions do not count toward program submission requirements; they are practice submissions. Production data submissions are considered final submissions meant to fulfill Program submission requirements.

Comment: One commenter noted that the proposed subregulatory process for annual updates is to incorporate “nonsubstantive” changes to measure specifications. However, the commenter believed that the annual updates include substantive changes. The commenter looks forward to working with CMS to further refine the definition of nonsubstantive changes and recommended that the annual updates be limited to changes that do not have a significant impact on clinicians, software, or recertification.

Response: We interpret the commenter's use of the term “annual updates” to be in reference to our publication of the measure specifications in the electronic clinical quality measure Library at: http://cms.gov/Regulations-and-Guidance/Legislation/EHRIncentivePrograms/eCQM_Library.html. We will work with stakeholders to help define substantive and nonsubstantive changes related specifically to electronic measure specifications, and will take suggestions regarding any recommended changes into consideration for future rulemaking.

Comment: A commenter explained that there is confusion as to whether vendors need to certify to the updated measures and whether hospitals must start their measure reporting year with the annual updates, and requested clear and consistent guidance. The commenter also noted that the Cypress tool is not yet available for testing of the new measures, and no information has been provided as to when Cypress may be available.

Response: Although 2014 CEHRT is required, eligible hospitals are not required to ensure that their CEHRT products are recertified to the most recent version of the electronic specifications for the clinical quality measures. Hospitals that choose to voluntarily submit electronic clinical quality measures in Q1, Q2, or Q3 of CY 2015 for FY 2017 payment determination must use the 2014 version of the measure specifications.

Cypress version 2.5 is expected to be available with the eligible hospital and eligible provider measure packages in September 2014. Cypress version 2.51 is expected to align with the CMS Implementation Guide released for publication in July 2014.

Comment: A commenter expressed concerns about electronic clinical quality measure specifications with regard to the “Medication, Order not done: Medical Reason” related to the STK, VTE, and future AMI, PN, and SCIP measures. The commenter pointed out that the Hospital IQR Program requires providers to document a medical reason for not prescribing a medication/device and the MU/electronic clinical quality measure requires providers to document “what medication of choice would you have prescribed if not for a medical reason.” The commenter disagreed with the requirement to answer these questions and suggested that providers may view the questions as inefficient “administrative only questions” and may avoid them entirely. The commenter also suggested that the questions may force the institution to intentionally fail a measure due to lack of a contraindication and that it is improper to use data related to medication orders for public reporting of quality of care and financial incentives because not all medication orders that count for the Hospital IQR Program also count for electronic clinical quality measures since they are not all included in the qualifying RxNorm document.

Response: We acknowledge that this is a known issue that is being addressed through HL7 and expected to be implemented in FY 2015. ONC has consolidated several JIRA comments into one issue. The commenter can follow the progress of the issue at http://jira.oncprojectracking.org/browse/CQM-970. We note that to date there are no consequences for measure failure and encourage the commenter to review our zero denominator clarification in section IX.D.5. of the preamble of this final rule.

Comment: A commenter expressed concern with the increasing number of measures for electronic clinical quality measure submission. The commenter advised that since electronic health records allow documentation to be placed in multiple places, chart review is required. The commenter stated that current medical record technology has not matured to restrict documentation input into only the field or fields designated for electronic data retrieval. Potential technological solutions contribute to alert fatigue. Further, the commenter believed that because there is not a common electronic medical record system for all staff to use regardless of the care setting, multiple inefficient documentation systems are created and customized to suit the needs of the individual hospital and facility. The commenter stated that resolving these problems will require a significant financial investment while reimbursement for services declines.

Response: We recognize that many hospitals struggle with proper data capture in the EHR. We encourage these hospitals to work with their vendors to reduce the burden of human intervention associated with chart abstraction. The electronically-specified clinical quality measures remain voluntary at this time to provide an opportunity for hospitals to improve upon accurate data capture.

Comment: A commenter specifically disagreed with CMS' statement that electronic clinical quality measures are more easily reported than chart-abstracted measures.

Response: We disagree with the commenter; we continue to believe that electronic clinical quality measures are more easily reported than chart-abstracted measures. Once data capture is possible within the EHR, the time and resources required should be significantly less than for manual abstraction. As data become more standardized, we expect reporting burden to decrease over time. For example, electronic clinical quality measure collection does not require hospital staff time to find and pull paper medical records and manually review them to abstract the data elements used in measure calculation. We acknowledge that there are costs, as well as benefits, to moving to electronic data capture. EHR user training is a cost, but one that will ultimately yield consistency through a common understanding and capture of common data definitions.

Comment: A commenter recommended that CMS develop and share a five-year roadmap for the future regarding the transition of all clinical quality measurement programs to electronic reporting so hospitals can strategically plan for workflows that support electronic reporting. The commenter further recommended that this guidance, as well as all electronic quality reporting subregulatory guidance and eMeasure specifications, should be located on a central Web site.

Response: We are working on a roadmap for both the Hospital IQR and Hospital VBP Programs, as well as a consolidated location for electronic clinical quality measure resources.

After consideration of the public comments we received, we are modifying our proposal to finalize that hospitals that choose to voluntarily report electronic measures should submit one quarter of electronic clinical quality measure data from Q1, Q2, or Q3 of CY 2015 for FY 2017 payment determination. Hospitals that choose to voluntarily submit electronic clinical quality measures must use the 2014 version of the measure specifications and submit 16 measures covering three NQS domains from the 28 available electronically specified measures. However, hospitals may voluntarily submit more than one quarter of data. We will not accept Q4 2015 data for CY 2015 as this would likely delay EHR Incentive Program payments. Policies for CY 2016/FY 2018 payment determination electronic clinical quality measure reporting and submission will be made in future rulemaking.

Because we are modifying our proposal to require only one quarter's worth of data from hospitals that wish to voluntarily submit electronically specified measures, we are also modifying the submission deadline to November 30, 2015, regardless of which quarter of data is submitted. We also refer readers to section X.2.h.1 for further discussion of submission of electronically specified measures.

The chart below provides a summary of the finalized reporting periods and electronic submission deadlines for the FY 2017 Hospital IQR Program:

FY 2017 Hospital IQR Program Electronic Reporting Periods and Submission Deadlines for Eligible Hospitals

CY 2015 Quarter | Discharge reporting period | Submission deadline
Q1 | January 1, 2015-March 31, 2015 | November 30, 2015
Q2 | April 1, 2015-June 30, 2015 | November 30, 2015
Q3 | July 1, 2015-September 30, 2015 | November 30, 2015
Q4 | October 1, 2015-December 31, 2015 | Not applicable

(2) Public Reporting of Electronic Clinical Quality Measures

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50813 through 50818), we adopted a policy under which we would only publicly report electronic clinical quality measure data under the Hospital IQR Program if we determined that the data are accurate enough to be reported. However, we noted that the majority of public commenters had opposed our proposal to withhold the electronically reported data from publication on Hospital Compare, and instead urged us to publicly display it (78 FR 50815). Therefore, for electronic clinical quality measure data submitted for the FY 2016 payment determination, we will publicly report the data as previously finalized. However, for the FY 2017 payment determination, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28243) we proposed to provide hospitals that voluntarily report one year of electronic clinical quality measure data (as proposed above) an option to have their data reported on Hospital Compare with a preview period prior to public reporting. We also proposed to add a footnote next to that publicly reported data indicating that it is a result of electronically-specified measures.

We welcomed public comments on these proposals.

Comment: Commenters suggested that hospitals reporting electronic clinical quality measure data should be provided a preview period for the FY 2016 payment determination. One commenter believed that public display of electronic measures for the FY 2016 payment determination should not occur because the accuracy of the data has not been validated, there would be inconsistencies in reporting time periods, and the displayed data may not provide accurate or valuable information to the public for decision making.

Commenters noted that display of the data may not provide accurate or valuable information to the public for decision making, specifically stated that there are no validity and reliability studies demonstrating the capture of equivalent data between chart-abstracted measures and electronically captured measures, and urged us to develop a data validation strategy before publicly posting this information. Commenters stated that measures submitted as electronic clinical quality measures should not be publicly reported until validation of electronic clinical quality measures demonstrates that they are comparable to values reached through chart-abstraction. A commenter expressed concern that the opportunity for a preview period before posting electronic clinical quality data on Hospital Compare will not offset the risks associated with reporting clinical quality measures electronically. One commenter asked that CMS wait until more research is conducted and there is an understanding of the limitations and opportunities of the electronic clinical quality measures. The commenter also asked that CMS wait until the preponderance of hospitals can report electronically and a data validation system for electronic measures is established.

However, other commenters stated that not reporting electronic clinical quality measures on Hospital Compare fails to provide the public with reliable data and requested that CMS communicate the criteria it will use to determine if the electronic clinical quality measure data are accurate enough to be publicly displayed.

Response: Regarding public reporting for electronically reported data submitted for the FY 2016 payment determination, we note that this policy is not subject to change in this rulemaking as it was previously finalized. However, consistent with our finalized policy, we will not post data that we determine are not accurate. We intend to use the results of our validation pilot to assist in determining criteria for identifying electronic clinical quality measure data accuracy. These criteria will be proposed in future rulemaking.

With respect to inconsistencies in reporting periods, historically we have publicly reported data on Hospital Compare as it becomes available. Therefore, it is not unusual for there to be inconsistencies in reporting periods. The current data collection periods for each measure are posted on Hospital Compare.

We appreciate the commenter's concerns about validation. As finalized in section IX.A.11.e. of the preamble of this final rule, we intend to conduct a validation strategy pilot test in FY 2015. We also intend to develop mandatory requirements for validation in the FY 2016 IPPS/LTCH PPS proposed rule, which will make assessment of validity possible prior to posting of data collected for the FY 2018 payment determination.

However, based on public comments received opposing public reporting for FY 2017, we are modifying our proposal to finalize that we will only publish the names of hospitals that successfully submit Q1, Q2, or Q3 CY 2015 electronic clinical quality measure data by November 30, 2015. We will not: (1) report actual data or performance rates for measures submitted as electronic clinical quality measures on Hospital Compare, (2) include a preview period, or (3) provide hospitals an option to suppress their participation.

Comment: One commenter encouraged CMS to use a strategy similar to the Star Ratings program with “topped-out” measures. The commenter suggested we allow voluntary submission of “topped-out” measures through all reporting mechanisms and publicly report them on Hospital Compare as “display measures.” This would allow for continued monitoring of performance and increase alignment with the Medicare Advantage and Part D plans. This concept of display measures could be used for introducing and testing new measures by first introducing the new measures on the display page.

Response: We thank the commenter for the suggestion and will consider the idea in the future.

After consideration of the public comments we received and as a logical outgrowth of our existing public reporting policy, we are finalizing our policy that we will only publicly report the names of those hospitals that successfully submit Q1, Q2, or Q3 CY 2015 electronic clinical quality measure data by the November 30, 2015 submission deadline. Hospitals will not have a preview period, nor will we allow hospitals to opt out of this public reporting. We will indicate these hospitals with a symbol on Hospital Compare to recognize their advanced ability to submit data electronically. We will not publicly report actual data or performance rates of electronic clinical quality measures at this time.

8. Possible New Quality Measures and Measure Topics for Future Years

a. Mandatory Electronic Clinical Quality Measure Reporting for FY 2018 Payment Determination

We anticipate that, as EHR technology changes and improves, hospitals will electronically report all clinical process-of-care and HAI measures that are currently part of the Hospital IQR Program or that have been proposed for adoption into the Program. As stated above, we intend for the future direction of electronic quality measure reporting to reduce significantly administrative burden on hospitals under the Hospital IQR Program. We will continue to work with measure stewards and developers to develop new measure concepts, and conduct pilot, reliability, and validity testing. We believe that this voluntary reporting option will provide hospitals and us with the ability to test systems in CY 2015 for future quality program proposals that, if finalized, will make electronic reporting a requirement instead of voluntary. We believe this will simplify measure collection and submission for the Hospital IQR Program, and will reduce the burden on hospitals to report chart-abstracted measures.

We intend to propose to require reporting of electronic clinical quality measures for the Hospital IQR Program beginning for the CY 2016 reporting period or FY 2018 payment determination. We considered proposing to require hospitals to electronically report some Hospital IQR Program quality measures in the FY 2014 IPPS/LTCH PPS proposed rule (78 FR 27695). After considering public comments, we made electronic reporting voluntary in CY 2014 in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50813 through 50814). However, after two years, we believe that hospitals are more prepared and should be required to report Hospital IQR Program measures as electronic clinical quality measures beginning in CY 2016. We intend to propose this policy in future rulemaking, but requested comments on this intention here.

Comment: A commenter asked for clarification on the proposal to electronically report all clinical process of care and HAI measures beginning in CY 2016, which are currently part of the Hospital IQR Program or which have been proposed for adoption for the FY 2018 payment determination and subsequent years. The commenter believed that CMS is moving away from the critical work of the Infection Preventionist and into a realm that is without professional judgment for identifying an HAI.

Response: We clarify that we did not propose electronic reporting of all clinical process of care and HAI measures in CY 2016. We do not intend to take away the professional judgment of Infection Preventionists.

Comment: One commenter was very concerned about the amount of resources that would be needed to analyze, validate, and ensure compliance with the electronically specified clinical quality measure specifications as well as the actual submission process. The commenter asked that CMS require the use of electronic submissions gradually instead of for all Hospital IQR Program measures in CY 2016. The commenter recommended a proposal that encourages voluntary submission of one or two measures that are not “topped-out” for CY 2016 with future gradual expansion of required electronic measures over a period of several years. The commenter stated that this would allow hospitals to become proficient in reporting measures electronically while curtailing the administrative burden that hospitals experience when implementing new electronic measures.

The commenter also urged CMS to allow hospitals to have flexibility in how measures are transmitted until all measure developers confirm that the measures can be e-specified within the timeframe. A commenter noted that any decisions to add electronic clinical quality measures should be dependent on the final decisions for Stage 2 and Stage 3 of meaningful use, given the current difficulties providers and vendors are experiencing with Stage 2 EHR implementation. In addition, the commenter recommended that CMS consider that certain measures currently improve quality of care for patients, but may not immediately lend themselves to e-specification.

Response: We believe we are providing a gradual approach to electronic clinical quality measure adoption and submission. This will be the second year that the Hospital IQR Program has provided a voluntary electronic reporting option. With respect to the commenter's request that we allow flexibility in how measures are submitted, we will strive to include a variety of measures in the Hospital IQR Program, such as claims-based, chart-abstracted, electronically specified, and structural aggregate measures. We recognize that many hospitals struggle with proper data capture in the EHR and we encourage these hospitals to work with their vendors to reduce burden associated with human intervention through chart abstraction. The electronic clinical quality measures remain voluntary at this time to provide an opportunity to improve upon accurate data capture. We continue to work with the Medicare EHR Incentive Program team to ensure measure alignment moving forward.

We agree that not all measures are appropriate for electronic specification.

Comment: A commenter stated that while ONC and others are working to ensure common data standards, it is unwise to dismiss inclusion of a measure that is currently not electronically specified, but which may improve the quality of care for patients.

Response: We will not remove a measure merely because it lacks an electronic specification. In the FY 2011 IPPS/LTCH PPS final rule (75 FR 50185), we outlined seven criteria for removing measures from the Hospital IQR Program. In section IX.A.2.a. of the preamble of this final rule, we are also finalizing updated criteria for determining “topped-out” status. Conversely, we will consider adopting a measure that does not have electronic specifications if the measure addresses a critical need and fills a measurement gap.

Comment: A commenter urged CMS to reconsider the proposal to begin requiring electronic clinical quality measures in CY 2016. The commenter stated that there is a lack of clear e-specifications and certification requirements, and that mandatory reporting should only begin when EHR systems are able to reliably generate this data.

Response: We will consider these suggestions as we develop policies on electronic reporting. Please note that we did not propose to require electronic clinical quality measures in CY 2016, but rather, we signaled an intent. We thank the commenters for providing this feedback, and will take it into account in the future.

b. Possible Future Electronic Clinical Quality Measures

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28244) we stated that we intend to continue to support the following measure domains in the Hospital IQR Program measure set: effective clinical care (for example, the AMI, PN, STK, and VTE measures), communication and care coordination (for example, the readmission measures), patient safety (for example, the HAI measures), person and caregiver-centered experience (for example, the HCAHPS measure), community/population health (for example, the global immunization measure), and efficiency and cost reduction (for example, the Medicare Spending per Beneficiary measure). This approach will enhance better patient care while aligning the Hospital IQR Program with our other established quality reporting and pay-for-performance programs, such as the Hospital VBP Program.

Based on the above approach, we stated our intent to propose to adopt the following electronic clinical quality measures with data collection beginning with October 1, 2016 discharges (or, as described further above, January 1, 2017, if the proposal to align reporting under the Hospital IQR Program and Medicare EHR Incentive Program is finalized) to coincide with Medicare EHR Incentive Program Stage 3 collection:

• Hepatitis B Vaccine Coverage Among All Live Newborn Infants Prior to Hospital or Birthing Facility Discharge (NQF #0475)

The Hepatitis B Vaccine Coverage Among All Live Newborn Infants Prior to Hospital or Birthing Facility Discharge (NQF #0475) measure is NQF-endorsed, supported by the MAP, and conditionally supported by the MAP as an electronic clinical quality measure for the EHR Incentive Program in its 2014 Recommendations on Measures for More Than 20 Federal Programs final report, available at: http://www.qualityforum.org/Publications/2014/01/MAP_Pre-Rulemaking_Report_2014_Recommendations_on_Measures_for_More_than_20_Federal_Programs.aspx. However, the MAP recommends a review of the electronic specifications of this measure through the NQF endorsement process.

This measure requires each hospital/birthing facility to measure its administration of a dose of hepatitis B vaccine to all infants born in that hospital/birthing facility prior to discharge for a specific time period (for example, one calendar year). Hospitals are required to assess infants whose parents refused vaccination for exclusion from the coverage estimate.
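For illustration only, the following minimal sketch shows one way the coverage calculation described above could be computed from per-newborn records. The field names are assumptions made for this illustration and are not part of the NQF #0475 electronic specification.

```python
# A minimal illustrative sketch, assuming simple per-newborn records; not the
# official NQF #0475 electronic specification.

def hepatitis_b_newborn_coverage(newborns):
    """Percent of live newborns receiving a hepatitis B vaccine dose before discharge,
    after excluding infants whose parents refused vaccination."""
    eligible = [n for n in newborns if not n.get("parental_refusal", False)]
    if not eligible:
        return None
    vaccinated = sum(1 for n in eligible if n.get("hepb_dose_before_discharge", False))
    return 100.0 * vaccinated / len(eligible)
```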

• PC-02 Cesarean Section (NQF #0471)

The PC-02 Cesarean Section (NQF #0471) is NQF-endorsed and supported by the MAP in its 2014 Recommendations on Measures for More Than 20 Federal Programs final report available at: http://www.qualityforum.org/Publications/2014/01/MAP_Pre-Rulemaking_Report_2014_Recommendations_on_Measures_for_More_than_20_Federal_Programs.aspx. The MAP noted that there is an important public education piece to the reporting of PC-02 and recommended that we work with others to ensure consumers understand what the results mean and why the measure is important.

This measure assesses the number of nulliparous women with a term, singleton baby in a vertex position delivered by cesarean section.
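For illustration only, the following minimal sketch shows one way the PC-02 concept described above could be computed from delivery records. The field names are assumptions made for this illustration and are not part of the NQF #0471 electronic specification.

```python
# A minimal illustrative sketch, assuming simple per-delivery records; not the
# official NQF #0471 electronic specification.

def pc02_cesarean_rate(deliveries):
    """Cesarean rate among nulliparous women delivering a term, singleton,
    vertex-presentation baby."""
    denominator = [d for d in deliveries
                   if d.get("nulliparous") and d.get("term_gestation")
                   and d.get("singleton") and d.get("vertex_presentation")]
    if not denominator:
        return None
    cesareans = sum(1 for d in denominator if d.get("cesarean_delivery"))
    return 100.0 * cesareans / len(denominator)
```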

• Adverse Drug Events—Hyperglycemia

Adverse Drug Events—Hyperglycemia is conditionally supported by the MAP in its 2014 Recommendations on Measures for More Than 20 Federal Programs final report available at: http://www.qualityforum.org/Publications/2014/01/MAP_Pre-Rulemaking_Report_2014_Recommendations_on_Measures_for_More_than_20_Federal_Programs.aspx. Use of this measure would address a very common condition. The MAP expressed concerns over the feasibility of using this measure in the Hospital IQR Program as it has been tested using electronic data and stated that the NQF endorsement process should resolve this issue.

This measure assesses the average percentage of hyperglycemic hospital days for individuals with a diagnosis of diabetes mellitus, anti-diabetic drugs (except metformin) administered, or at least one elevated glucose level during the hospital stay. The measure's numerator is the sum of the percentage of hospital days in hyperglycemia for all admissions in the denominator. The measure's denominator is the total number of admissions with a diagnosis of diabetes mellitus, at least one administration of insulin or any oral anti-diabetic medication except metformin, or at least one elevated blood glucose value (>200 mg/dL [11.1 mmol/L]) at any time during the entire hospital stay.

Exclusions include: (1) Admissions with a diagnosis of diabetic ketoacidosis (DKA) or hyperglycemic hyperosmolar syndrome (HHS); (2) admissions without any hospital days included in the analysis; (3) admissions with lengths of stay greater than 120 days.
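For illustration only, the following minimal sketch shows one way the numerator, denominator, and exclusions described above could be computed from simplified admission records. The field names, the data layout, and the use of one maximum glucose value per hospital day are assumptions made for this illustration and are not part of the official electronic specification.

```python
# A minimal illustrative sketch of the measure logic described above; field names
# such as "daily_max_glucose" and "has_dka_or_hhs" are assumptions.

HYPERGLYCEMIA_THRESHOLD = 200  # mg/dL; a day with any value above this is "hyperglycemic"

def percent_hyperglycemic_days(daily_max_glucose):
    """Percentage of analyzed hospital days whose highest glucose exceeds 200 mg/dL."""
    high_days = sum(1 for value in daily_max_glucose if value > HYPERGLYCEMIA_THRESHOLD)
    return 100.0 * high_days / len(daily_max_glucose)

def ade_hyperglycemia_measure(admissions):
    """Average, across qualifying admissions, of the percentage of hyperglycemic days."""
    per_admission_percentages = []
    for adm in admissions:
        glucose = adm.get("daily_max_glucose", [])
        # Denominator: diabetes diagnosis, insulin or an oral anti-diabetic drug other
        # than metformin administered, or at least one glucose value > 200 mg/dL.
        qualifies = (
            adm.get("has_diabetes_diagnosis", False)
            or adm.get("insulin_or_oral_antidiabetic_except_metformin", False)
            or any(value > HYPERGLYCEMIA_THRESHOLD for value in glucose)
        )
        # Exclusions: DKA or HHS diagnosis, no analyzable hospital days, stay > 120 days.
        excluded = (
            adm.get("has_dka_or_hhs", False)
            or not glucose
            or adm.get("length_of_stay_days", 0) > 120
        )
        if qualifies and not excluded:
            per_admission_percentages.append(percent_hyperglycemic_days(glucose))
    if not per_admission_percentages:
        return None
    # Numerator: sum of per-admission percentages; denominator: number of admissions.
    return sum(per_admission_percentages) / len(per_admission_percentages)
```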

• Adverse Drug Events—Hypoglycemia

Adverse Drug Events—Hypoglycemia is conditionally supported by the MAP in its 2014 Recommendations on Measures for More Than 20 Federal Programs final report, which is available at: http://www.qualityforum.org/Publications/2014/01/MAP_Pre-Rulemaking_Report_2014_Recommendations_on_Measures_for_More_than_20_Federal_Programs.aspx. Use of this measure would address a common condition that is very dangerous to patients. The MAP expressed concerns over the feasibility of using this measure in the Hospital IQR Program as it has been tested using electronic data and stated that the NQF endorsement process should resolve this issue.

This measure assesses the rate of hypoglycemic events following the administration of an anti-diabetic agent. The measure's numerator is the total number of hypoglycemic events (<40 mg/dL) that were preceded by administration of a short/rapid-acting insulin within 12 hours or an anti-diabetic agent other than a short/rapid-acting insulin within 24 hours, were not followed by another glucose value greater than 80 mg/dL within 5 minutes, and were at least 20 hours apart. The measure's denominator is total number of hospital days with at least one anti-diabetic agent administered. Exclusions include admissions with length of stay greater than 120 days.
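For illustration only, the following minimal sketch applies the lookback, confirmation, and spacing rules described above to timestamped glucose readings and medication administrations. The data layouts are assumptions made for this illustration and are not part of the official electronic specification.

```python
# A minimal illustrative sketch of the event-counting logic described above;
# the timestamped data structures are assumptions.

from datetime import timedelta

def count_hypoglycemic_events(glucose_readings, med_administrations):
    """glucose_readings: time-sorted list of (timestamp, mg_dl).
    med_administrations: list of (timestamp, is_short_or_rapid_acting_insulin)."""
    counted = []
    for i, (t, value) in enumerate(glucose_readings):
        if value >= 40:  # a hypoglycemic event is a value below 40 mg/dL
            continue
        # Preceded by a short/rapid-acting insulin within 12 hours, or by another
        # anti-diabetic agent within 24 hours.
        preceded = any(
            mt <= t and t - mt <= (timedelta(hours=12) if rapid else timedelta(hours=24))
            for mt, rapid in med_administrations
        )
        if not preceded:
            continue
        # Not followed by another glucose value > 80 mg/dL within 5 minutes.
        followed_by_normal = any(
            timedelta(0) < t2 - t <= timedelta(minutes=5) and v2 > 80
            for t2, v2 in glucose_readings[i + 1:]
        )
        if followed_by_normal:
            continue
        # Distinct events must be at least 20 hours apart.
        if counted and t - counted[-1] < timedelta(hours=20):
            continue
        counted.append(t)
    return len(counted)

def ade_hypoglycemia_rate(glucose_readings, med_administrations, antidiabetic_agent_days):
    """Events per hospital day on which at least one anti-diabetic agent was given."""
    if antidiabetic_agent_days == 0:
        return None
    return count_hypoglycemic_events(glucose_readings, med_administrations) / antidiabetic_agent_days
```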

We requested comments on these possible future measures.

Comment: One commenter strongly supported the proposed measure Adverse Drug Events—Hypoglycemia.

Response: We thank the commenter for their support.

Comment: One commenter strongly supported adding Adverse Drug Events—Hyperglycemia. Another commenter advised that measurement of Adverse Drug Events—Hyperglycemia via chart abstraction requires searching for discrete, out-of-range blood glucose lab values, which is resource intensive. The commenter stated that collection of this measure as an electronic clinical quality measure is the most efficient data collection mechanism and supports Meaningful Use of an electronic health record. The commenter believed that glucose testing results can be captured at the point-of-care or from the laboratory system and stored in the EHR as discrete data fields.

Response: We thank the commenters for their support and will address this measure in future policy making.

Comment: A commenter noted that electronically submitted data on Adverse Drug Events-Hyperglycemia would be highly unreliable. Further, that commenter stated that recommendations regarding levels of glucose control vary among patient populations and that there is limited information within CMS' proposal regarding what patient populations would be included in the sample.

Response: Adverse Drug Events—Hyperglycemia is conditionally supported by the MAP. The MAP expressed concerns over the feasibility of using this measure in the Hospital IQR Program as it has been tested using data from the EHR. Some hospitals and health systems are able to use the results of these electronic measures to address adverse events at the point of care and to track improvement over time. The data elements are still under development.

Comment: Several commenters supported the adoption of the Hepatitis B Vaccine measure. A commenter recommended that further attention be given to high-volume conditions and/or procedures, the goals of the three-part aim, and alignment between the Hospital IQR Program and other HHS programs.

Response: We thank the commenters for their support.

Comment: Several commenters supported the adoption of the Cesarean Section measure. One commenter believed that exclusive electronic reporting of this measure could ultimately reduce the burden of collection and increase the potential for timely feedback to all stakeholders on the ever important area of maternity care. Commenters also noted that the two leading obstetric professional societies, American College of Obstetricians and Gynecologists and Society for Maternal-Fetal Medicine, recently released a detailed set of consensus recommendations for safely reducing the rate of initial or primary cesarean sections, stating that this procedure is overused and that there are many safe ways to reduce the rate.

Response: We thank the commenters for their support.

We note that we received many public comments regarding other suggested future measures and policies addressing different operational aspects of the Hospital IQR Program such as public reporting and working with other stakeholders. We thank the commenters for their comments. Because we believe these comments are not within the scope of this current rulemaking, we are not addressing them in this final rule. However, we intend to consider all of these views for future rulemaking and Hospital IQR Program development.

9. Form, Manner, and Timing of Quality Data Submission

a. Background

Sections 1886(b)(3)(B)(viii)(I) and (II) of the Act state that the applicable percentage increase for FY 2007 and each subsequent fiscal year shall be reduced by 2.0 percentage points (or beginning with FY 2015, by one-quarter of such applicable percentage increase (determined without regard to sections 1886(b)(3)(B)(ix), (xi), or (xii) of the Act)) for any subsection (d) hospital that does not submit, to the Secretary in accordance with this clause and in a form and manner, and at a time, specified by the Secretary, data required to be submitted on measures selected under this clause with respect to such a fiscal year. We note that, in accordance with this section, the FY 2015 payment determination is the first for which the Hospital IQR Program will reduce the applicable percentage increase by one-quarter of that increase. In order to participate in the Hospital IQR Program, hospitals must meet specific procedural requirements.
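For illustration only, the following sketch works through the statutory reduction described above using an assumed applicable percentage increase; the figure shown is hypothetical and is not a published update factor.

```python
# A hypothetical worked example of the statutory reduction described above; the
# 2.9 percent applicable percentage increase is an assumed figure for illustration.

def reduced_update(applicable_percentage_increase, fiscal_year):
    """Update for a subsection (d) hospital that does not submit the required data."""
    if fiscal_year >= 2015:
        # Beginning with FY 2015: reduced by one-quarter of the applicable percentage increase.
        return applicable_percentage_increase * 0.75
    # FY 2007 through FY 2014: reduced by 2.0 percentage points.
    return applicable_percentage_increase - 2.0

print(reduced_update(2.9, 2015))  # 2.175 percent (assumed 2.9 percent increase, reduced by one-quarter)
```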

Hospitals choosing to participate in the Hospital IQR Program must also meet specific data collection, submission, and validation requirements. For each Hospital IQR Program year, we require that hospitals submit data on each measure in accordance with the measure's specifications for a particular period of time. The data submission requirements, Specifications Manual, and submission deadlines are posted on the QualityNet Web site at: http://www.QualityNet.org/. Hospitals submit quality data through the secure portion of the QualityNet Web site. There are safeguards in place in accordance with the HIPAA Security Rule to protect patient information submitted through this Web site.

b. Procedural Requirements for the FY 2017 Payment Determination and Subsequent Years

The Hospital IQR Program procedural requirements are codified in regulation at 42 CFR 412.140. We refer readers to the codified regulations for participation requirements, as further explained by the FY 2014 IPPS/LTCH PPS final rule (78 FR 50810 through 50811).

c. Data Submission Requirements for Chart-Abstracted Measures

We refer readers to the FY 2012 IPPS/LTCH PPS final rule (76 FR 51640 through 51641), the FY 2013 IPPS/LTCH PPS final rule (77 FR 53536 through 53537), and the FY 2014 IPPS/LTCH PPS final rule (78 FR 50811) for details on the Hospital IQR Program data submission requirements for chart-abstracted measures.

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28245) we did not propose any changes to data submission requirements for chart-abstracted measures.

d. Alignment of the Medicare EHR Incentive Program Reporting and Submission Timelines for Clinical Quality Measures with Hospital IQR Program Reporting and Submission Timelines

The Hospital IQR Program and the Medicare EHR Incentive Program have different reporting and submission periods for electronic clinical quality measures, with hospitals reporting data to the Hospital IQR Program based on calendar year deadlines while the Medicare EHR Incentive Program is based on fiscal year deadlines. In addition, the Hospital IQR Program generally requires quarterly reporting and submission of data for chart-abstracted measures while the Medicare EHR Incentive Program requires annual submission of clinical process of care measure data.

As a result of the different and incongruent Hospital IQR and Medicare EHR Incentive Programs' schedules, hospitals reporting and submitting measure data to both programs would have to do so multiple times in a calendar year. This discrepancy may create confusion and additional burden for hospitals attempting to report data to both programs. To alleviate this possible confusion and reduce provider burden, beginning with the CY 2015 reporting period/FY 2017 payment determination, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28245 through 28246) we proposed to align incrementally the data reporting and submission periods for clinical quality measures for the Medicare EHR Incentive Program and the Hospital IQR Program on a calendar year basis.

This proposed change would also move us closer to meeting our commitment to align quality measurement and reporting among our programs, as we described in the Electronic Health Record Incentive Program—Stage 2 final rule (77 FR 54049 through 54051), the FY 2013 IPPS/LTCH PPS final rule (77 FR 53502 and 53534), and the FY 2014 IPPS/LTCH PPS final rule (78 FR 50811 through 50819 and 78 FR 50903 through 50904).

In order to ease the transition and prevent the delay of Medicare EHR Incentive Program payments, we proposed to shift incrementally the Medicare EHR Incentive Program reporting and submission periods for clinical quality measures to align with those of the Hospital IQR Program. We refer readers to section IX.D.2. of the preamble of this final rule for a detailed discussion of this proposal in the Medicare EHR Incentive Program. Specifically, for CYs 2015 and 2016, we proposed in the Medicare EHR Incentive Program to require CY reporting, but only for the first three calendar quarters (that is, January through September). This proposal would allow us to align data reporting and submission periods without shifting the Medicare EHR incentive payments.

We note that for the Hospital IQR Program, for the FY 2017 payment determination, we proposed to change the November 30th submission deadline to require data submission within approximately 60 days of the close of a quarter. We refer readers to section IX.A.7.h.(1) of the preamble of this final rule where this proposal is discussed. We also proposed this change in the Medicare EHR Incentive Program in order to align the two programs. We refer readers to section IX.D.2. of the preamble of this final rule where this proposal is discussed. In summary, we proposed to align the reporting and submission periods of the Medicare EHR Incentive Program clinical quality measures with that of the Hospital IQR Program for CYs 2015 and 2016.

Proposed Reporting Timeline To Align the Medicare EHR Incentive Program With Proposed Hospital IQR Program Submission Periods

CY 2015 Reporting Period

Quarter | Medicare EHR Incentive Program reporting requirements * | Hospital IQR Program reporting requirements | Submission period **
Q1 | January 1-March 31, 2015 | January 1-March 31, 2015 | Data must be submitted by May 30, 2015
Q2 | April 1-June 30, 2015 | April 1-June 30, 2015 | Data must be submitted by August 30, 2015
Q3 | July 1-September 30, 2015 | July 1-September 30, 2015 | Data must be submitted by November 30, 2015
Q4 | N/A for Medicare EHR Incentive Program | October 1-December 31, 2015 | For the Hospital IQR Program, data must be submitted by February 28, 2016

CY 2016 Reporting Period

Quarter | Medicare EHR Incentive Program reporting requirements * | Hospital IQR Program reporting requirements | Submission period **
Q1 | January 1-March 31, 2016 | January 1-March 31, 2016 | Data must be submitted by May 30, 2016
Q2 | April 1-June 30, 2016 | April 1-June 30, 2016 | Data must be submitted by August 30, 2016
Q3 | July 1-September 30, 2016 | July 1-September 30, 2016 | Data must be submitted by November 30, 2016
Q4 | N/A for Medicare EHR Incentive Program | October 1-December 31, 2016 | For the Hospital IQR Program, data must be submitted by February 28, 2017
* Calendar year alignment and quarterly reporting for 2015 and 2016 would apply for electronically reported CQM data only.
** Proposed Medicare EHR Incentive Program and Hospital IQR submission period would allow data submission on an ongoing basis starting January 2 of the reporting year, and ending approximately 60 days after the end of the quarter.

We invited public comments on these proposals.

Comment: Many commenters supported CMS' proposal to align the Hospital IQR Program and the Medicare EHR Incentive Program. One commenter supported efforts aligning the Hospital IQR Program and the EHR Incentive Program and supported using the Hospital IQR Program as the foundation of the alignment.

Response: We thank these commenters for their support.

Comment: One commenter recommended that additional steps be taken to fully align the Hospital IQR and Medicare EHR Incentive Programs, and stated that it is currently not possible for a hospital to satisfy the meaningful use requirements with mandatory Hospital IQR Program measures only. This commenter observed that the Stage 2 list of electronic clinical measures includes some that have not been adopted for the Hospital IQR Program, and also some Hospital IQR Program measures that have been found to be “topped-out.” Two measures that previously were removed from the Hospital IQR Program remain as electronic clinical quality measures for demonstrating meaningful use of EHRs. The commenter recommended that CMS work to ensure hospitals could meet the meaningful use requirements by electronically reporting some mandatory Hospital IQR Program measures, without having to report additional measures that have not been determined to have value for public reporting or quality improvement purposes under the Hospital IQR Program.

Response: We are actively taking steps to align the list of available measures between the Hospital IQR and Medicare EHR Incentive Programs by proposing to adopt six new and retain 10 topped-out measures as electronic clinical quality measures (79 FR 28220 through 28242) so that 28 of the 29 Stage 2 measures are adopted by the Hospital IQR Program. As previously noted, ED-3 (Median Time from ED Arrival to ED Departure for Discharged ED Patients) is an outpatient quality measure. While 12 of the Stage 2 measures are required Hospital IQR Program measures, we believe that allowing hospitals the flexibility to select other measures that best fit their patient population is a benefit to the hospitals. This flexibility was requested by commenters in response to our restriction to 16 specific measures in CY 2014 (78 FR 50814-50815). As proposed and as finalized in this rule, hospitals can meet some meaningful use requirements by electronically reporting some mandatory Hospital IQR Program measures. We intend to continue working with hospitals to ensure that they are able to meet meaningful use requirements by reporting Hospital IQR Program measures electronically. We respectfully disagree with the commenter's implication that only the mandatory Hospital IQR Program measures have value. We believe that allowing hospitals the flexibility to choose which additional measures to report is a benefit to the hospital and its patient population. We refer readers to our response in section IX.A.2.a. of the preamble of this final rule regarding why we are retaining “topped-out” measures.

Comment: Some commenters opposed or expressed concerns with CMS' alignment proposal. One commenter stated that it is premature to require quarterly reporting of electronic clinical quality measures because of the implementation delays with 2014 CEHRT for meaningful use and the anticipated changes in the attestation requirements for meaningful use in 2014.

Response: We understand the commenters' concerns. We are not finalizing quarterly reporting of electronic clinical quality measures at this time. We refer readers to section IX.A.2.h.(1) of the preamble of this final rule where this is discussed in more detail.

Comment: One commenter appreciated CMS' goal to align the Hospital IQR Program and the Medicare EHR Incentive Program, but noted several concerns. Specifically, the commenter was concerned about the validity of the electronic clinical quality measures, noting that hospitals that are performing well under the chart-abstracted versions of measures are not the same hospitals that achieve high performance levels under the electronic clinical quality measure versions. Because of this concern, the commenter urged CMS to develop a methodology for validation and argued that chart-abstracted versions of measures should never be compared to electronic clinical quality measure versions.

Response: We are unaware of data showing that hospitals that are performing well under the chart-abstracted versions of measures are not the same hospitals that achieve high performance levels under the electronic clinical quality measure versions. To date, we have only heard anecdotal comments about actual performance level differences between the two modes of collection. We do not have sufficient data to be able to confirm these comments. We are conducting a small validation pilot and have proposed to conduct a larger pilot in CY 2015. More discussion of the electronic clinical quality measure validation pilot can be found in section IX.A.11.e. of the preamble of this final rule.

Comment: One commenter was concerned that CMS' data systems may not be prepared to routinely accept EHR-based measures.

Response: We would like to reassure the commenter that our data systems are prepared to accept EHR-based measures. The CMS database has been open to accept electronic clinical quality measure submissions since January 2, 2014.

Comment: One commenter was concerned that different deadlines (that is, for chart-abstracted measures versus electronic clinical quality measures) may lead to confusion and requested that CMS undertake a strong educational initiative using current educational resources for both programs and ensure that technical assistance is available for hospitals opting to submit data for both programs electronically.

Response: We routinely provide educational sessions and resources on the QualityNet Web site. After publication of the final rule, we will update the resources and offer additional educational sessions to assist reporting hospitals. We urge the commenter to sign up for our electronic mail distribution list available for pertinent updates and announcements of upcoming educational sessions. Further, we have recorded sessions available on electronic clinical quality measures on our Web site at: https://www.qualitynet.org/dcs/ContentServer?c=Page&pagename=QnetPublic%2FPage%2FQnetTier3&cid=1228773852046. We also refer readers to our response in section IX.A.2.h.(1) of the preamble of this final rule regarding education and outreach to hospitals and vendors.

Comment: Another commenter expressed concern that the methods to encourage participation in the voluntary electronic reporting option and to align clinical quality measure reporting in the Hospital IQR Program and the Medicare EHR Incentive Program undermine the goals of the Hospital IQR Program—namely, continuous hospital quality improvement. Rather than consider exceptions to Hospital IQR Program requirements, the commenter suggested that CMS leverage the data from the Medicare EHR Incentive Program for insight and development of a report on lessons learned to date from hospitals' experience with certified electronic health record technology (CEHRT), and its use for electronic clinical quality measures.

Response: We respectfully disagree with the commenter that our efforts to align reporting of electronic clinical quality measures between the Hospital IQR and Medicare EHR Incentive Programs undermine the goals of the Hospital IQR Program. We believe that clinical quality measure reporting, regardless of the mode of submission, will lead to continuous quality improvement.

We interpret the commenter's statements that we should “consider exceptions to Hospital IQR Program requirements” and that “CMS leverage the data from the Medicare EHR Incentive Program” as a request that we not introduce a voluntary electronic reporting option for the Hospital IQR Program. By allowing one submission to partially fulfill requirements for two programs, we believe we are alleviating the burden of reporting data to two programs. We disagree that leveraging data from the Medicare EHR Incentive Program would promote continuous quality improvement, since many hospitals have elected to attest the results of their electronic clinical quality measures.

Comment: One commenter cautioned that its EHR vendor prioritizes complying with federal government requirements over fixing critical errors in its system that could affect patient safety.

Response: Patient safety is the top priority, and we urge hospitals to work closely with their vendors to ensure that it remains the highest priority.

Comment: One commenter encouraged CMS to more clearly state that references to submission timelines in its proposal to align the Hospital IQR Program and the Medicare EHR Incentive Program do not impact chart-abstracted measures. Another commenter asked CMS to clarify whether the submission deadline for the first quarter of CY 2015 is May 30 or May 31.

Response: Our proposal to align the Hospital IQR Program and the Medicare EHR Incentive Program does not affect chart-abstracted measures' submission deadlines. The alignment applies to electronic clinical quality measures only.

In addition, as stated in section IX.A.2.h.(1) of the preamble of this final rule above, we are finalizing a modified version of our proposal. We will not require quarterly reporting at this time for the electronic clinical quality measures. As a result, we also modified the submission deadline for electronic clinical quality measures, which instead will be November 30, 2015. Policies for electronic clinical quality measure reporting in CY 2016/FY 2018 payment determination and subsequent years will be made in future rulemaking.

Comment: One commenter recommended that CMS finalize the zero denominator and case threshold changes as proposed.

Response: We refer readers to sections IX.D.5. and IX.D.6. of the preamble of this final rule for the discussion of zero denominators and the case threshold exemption in the EHR Incentive Program. We note that while this policy was clarified in the EHR Incentive Program, it also applies to electronic reporting for the Hospital IQR Program.

After consideration of the public comments we received, we are finalizing our proposal to align the EHR Incentive Program with the Hospital IQR Program, with modifications. We proposed to align the reporting period and submission deadlines of the Medicare EHR Incentive Program clinical quality measures with that of the Hospital IQR Program for CY 2015. While we are finalizing our proposal to align the reporting period and submission deadline of the Medicare EHR Incentive Program with those of the Hospital IQR Program on the calendar year for clinical quality measures that are reported electronically, we are not finalizing our proposal to require quarterly submission of clinical quality measure data in CY 2015.

Because we are modifying our proposal in the Hospital IQR Program to finalize that hospitals can voluntarily submit one calendar year (CY) quarter's data for CY 2015 Q1 (January 1-March 31, 2015), Q2 (April 1-June 30, 2015), or Q3 (July 1-September 30, 2015) by November 30, 2015, we are also applying these modifications to the alignment with the Medicare EHR Incentive Program. As a result, we are not incrementally shifting the Medicare EHR Incentive Program reporting period and submission deadlines for clinical quality measures as proposed. We plan to continue to align reporting periods and submission deadlines for CY 2016 and subsequent years through future rulemaking. We refer readers to section IX.E.2. of the preamble of this final rule for a detailed discussion of the final policy in the Medicare EHR Incentive Program.

ED-1, ED-2, Stroke-4, Stroke-6, Stroke-8, VTE-1, VTE-2, VTE-3, VTE-5, VTE-6, AMI-7a, and PC-01 are measures required under the Hospital IQR Program and may be submitted as chart-abstracted or as electronic clinical quality measures. If a hospital chooses to submit one calendar quarter (CY 2015 Q1, Q2, or Q3) of a measure as an electronic clinical quality measure by November 30, 2015, the hospital does not also need to submit chart-abstracted data for that measure.

The chart below provides a summary of the finalized reporting periods and electronic submission deadlines for the FY 2017 Hospital IQR Program:

FY 2017 Hospital IQR Program Electronic Reporting Periods and Submission Deadlines for Eligible Hospitals

Discharge reporting period         | Submission deadline
January 1, 2015-March 31, 2015     | November 30, 2015
April 1, 2015-June 30, 2015        | November 30, 2015
July 1, 2015-September 30, 2015    | November 30, 2015
October 1, 2015-December 31, 2015  | Not Applicable

e. Sampling and Case Thresholds for the FY 2017 Payment Determination and Subsequent Years

We refer readers to the FY 2011 IPPS/LTCH PPS final rule (75 FR 50230), the FY 2012 IPPS/LTCH PPS final rule (76 FR 51641), the FY 2013 IPPS/LTCH PPS final rule (77 FR 53537), and the FY 2014 IPPS/LTCH PPS final rule (78 FR 50819) for details on our sampling and case thresholds for the FY 2016 payment determination and subsequent years.

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28246) we did not propose any changes to sampling or case thresholds.

f. HCAHPS Requirements for the FY 2017 Payment Determination and Subsequent Years

We refer readers to the FY 2011 IPPS/LTCH PPS final rule (75 FR 50220), the FY 2012 IPPS/LTCH PPS final rule (76 FR 51641 through 51643), the FY 2013 IPPS/LTCH PPS final rule (77 FR 53537 through 53538), and the FY 2014 IPPS/LTCH PPS final rule (78 FR 50819 through 50820) for details on HCAHPS requirements.

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28246) we did not propose any changes to HCAHPS requirements.

Hospitals and HCAHPS survey vendors should, however, regularly check the official HCAHPS Web site at http://www.hcahpsonline.org for new information and program updates regarding the HCAHPS Survey, its administration, oversight and data adjustments.

g. Data Submission Requirements for Structural Measures for the FY 2017 Payment Determination and Subsequent Years

We refer readers to the FY 2012 IPPS/LTCH PPS final rule (76 FR 51643 through 51644), and the FY 2013 IPPS/LTCH PPS final rule (77 FR 53538 through 53539) for details on the data submission requirements for structural measures.

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28246) we did not propose any changes to data submission requirements for structural measures.

h. Data Submission and Reporting Requirements for Healthcare-Associated Infection (HAI) Measures Reported via NHSN

We refer readers to the FY 2012 IPPS/LTCH PPS final rule (76 FR 51631 through 51633; 51644 through 51645), the FY 2013 IPPS/LTCH PPS final rule (77 FR 53539), and the FY 2014 IPPS/LTCH PPS final rule (78 FR 50820 through 50822) for details on the data submission and reporting requirements for healthcare-associated infection (HAI) measures reported via the CDC's National Healthcare Safety Network (NHSN) Web site. The data submission deadlines are posted on the QualityNet Web site at: http://www.QualityNet.org/.

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28246) we did not propose any changes to data submission and reporting requirements for healthcare-associated infection measures reported via the NHSN.

10. Submission and Access of HAI Measures Data Through the CDC's NHSN Web Site

As finalized in the FY 2014 Hospital IPPS/LTCH PPS final rule (78 FR 50805 through 50807), the Hospital IQR Program requires hospitals to report data via the CDC's NHSN Web site for the following HAI measures: (1) CLABSI (NQF #0139); (2) CAUTI (NQF #0138); (3) SSI following colon surgery; (4) SSI following abdominal hysterectomy; (5) laboratory-identified MRSA bacteremia infection (NQF #1716); (6) laboratory-identified Clostridium difficile infection (NQF #1717); and (7) healthcare personnel vaccination (NQF #0413). In the FY 2012 IPPS/LTCH PPS final rule (76 FR 51644 through 51645), we adopted the data submission and reporting standard procedures that have been set forth by CDC for NHSN participation in general and for submission of specific HAI measures to NHSN.

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28246 through 28247), for the FY 2016 payment determination and subsequent years for the Hospital IQR Program, we clarified our data reporting and submission requirements for the above-stated HAI measures. By adopting the data reporting and submission procedures set forth by the CDC, we intended that hospitals report, through the existing NHSN process, any and all data elements at the patient level that are designated as "required" on NHSN forms (such as the "primary bloodstream infection" or "annual facility survey" forms). Examples of these "required" patient-level data elements include: patient identifier, date of birth, and gender; detailed event data, such as specific symptoms identified to meet case definitions and laboratory results; and risk factor data used to calculate the hospital-level measures. Hospitals may find a comprehensive list of required forms and data elements on the NHSN Web site (http://www.cdc.gov/nhsn/acute-care-hospital/index.html).

We further clarified that the NHSN required data collected by the CDC will be shared with CMS for Hospital IQR Program and Hospital VBP Program administration, monitoring and evaluation activities, including validation, appeals review, program impact evaluation, and development of quality measure specifications. We routinely use submitted quality measure data for these types of program administration, monitoring and evaluation activities.

In addition, we proposed that we will also receive access from the CDC to voluntarily submitted name and race identifying information with respect to Hospital IQR Program required measures. These data will also be used for Hospital IQR Program and Hospital VBP Program administration, monitoring, and evaluation activities, including validation, appeals review, program impact evaluation, and development of quality measure specifications. More specifically, for Hospital IQR Program validation, we proposed to use these data to ensure accurate matching of patient charts submitted for HAI validation that cannot be matched to NHSN events using Medicare beneficiary identification numbers. We also proposed to use these data as appropriate for program evaluation.

We invited public comment on this proposal.

Comment: Several commenters strongly supported the proposal for CMS to access NHSN patient-, system-, and aggregate-level data. Commenters stated that this access is necessary to evaluate the impact of the Hospital VBP and Hospital IQR Programs as required by the Affordable Care Act, as well as the HAC Reduction Program. Commenters stated that this information is also critical to inform quality improvement efforts and to ensure accurate data collection and will also increase the sampling power of the Hospital IQR Program validation process. A commenter also noted its trust that CMS will ensure data privacy and abide by all security and privacy requirements, as CMS has historically been an excellent steward to ensure data privacy and security in its quality programs.

Response: We thank the commenters for the support.

Comment: Many commenters opposed the proposed NHSN data access policy for validation purposes. Many of these commenters expressed the opinion that access to patient-level data was not needed for validation because CMS already has a validation process.

Response: We would like to clarify why we need access to these NHSN data for validation. Although commenters correctly point out that we already have an HAI validation process, the current validation process is inefficient, does not provide timely information for the validation-related appeals process, and does not give hospitals all the information that would be useful to them.

One example of how validation could improve in efficiency is by providing the CMS clinical data abstraction center (CDAC) contractor with access to data regarding which symptoms a patient experienced in order to meet the requirements for reporting a particular type of HAI event. In validating a single quarter of 2013 HAI data, CDAC encountered more than 30 episodes of care in which the hospital reported an event to NHSN but for which CDAC did not find a reportable infection during its medical record review. In these situations, CDAC employs quality controls to ensure that its staff have not overlooked or misinterpreted important documentation. However, HAI records selected for validation are, on average, more than 1,000 pages in length, with some records exceeding 60,000 pages. Having information about which symptoms CDAC should be looking for and on what dates catheters were inserted would greatly assist the CDAC in ensuring that a properly documented and reported HAI was not overlooked, and that the hospital was credited appropriately. Without this data access, we rely on hospitals to request an educational review or to appeal cases to identify any potential CDAC errors, thereby increasing inefficiencies and burden for hospitals.

Another reason we need direct access to patient-level NHSN data for Hospital IQR Program administration is to support the processing of validation-related appeals. A hospital may request from CMS at any time an educational review to better understand whether or not CDAC reached a correct conclusion during validation. However, a hospital which fails to meet Hospital IQR Program validation requirements has 30 days to appeal after this determination. Hospitals that fail to meet any Hospital IQR Program requirement, including validation, are ineligible for the Hospital VBP Program, and therefore, would not contribute to Hospital VBP Program performance standards. Because of the tight timeframe between the Hospital IQR Program payment determination and when Hospital VBP Program benchmarks must be posted, we must process Hospital IQR Program appeals very quickly, sometimes in 48 hours or less. Taking time at precisely this juncture to verify with CDC what was submitted to NHSN as the basis for the appeal is inefficient, and threatens timely payment determinations.

Lastly, under our current validation process, we are unable to provide patient-level, data element-level information on hospital-reported HAI data for mismatched validation cases. We believe that our proposal is in part responsive to comments on previous rules indicating that we needed to provide hospitals with more detailed HAI validation educational feedback (78 FR 50826 through 50827). We believe that this patient-level information is necessary to provide specific and actionable feedback so that hospitals can report more accurate HAI data for CMS programs. For example, if CDAC can explain to a hospital that a patient did have the infection symptoms that the hospital reported to NHSN, but that the symptoms (and therefore, the infection) first occurred too long after a catheter was removed, the hospital would have a clearer explanation of why an infection was reported incorrectly. Moreover, by accessing NHSN data at the patient level for every required reporting element, CDAC can review the accuracy of data reporting to NHSN at the data-element level and provide all of this feedback to hospitals. When CDAC validates clinical process-of-care measures, it reviews and provides feedback to hospitals for every data element submitted to the Hospital IQR Program. CDAC is unable to provide a comparable level of feedback to hospitals for HAI measures, because it does not have access to patient-level data at the element level.

Comment: Many commenters viewed the proposal to access patient name and race submitted on a voluntary basis as particularly objectionable. Some commenters questioned why patient name and race were needed for validation. A few commenters noted that this patient-identifiable information would not be particularly useful because it is not available for every patient. A few commenters wanted to know why CMS needed data on non-Medicare beneficiaries. A few commenters stated that CMS should observe whether the new requirement to link data using the Medicare Beneficiary ID for validation is helpful before instituting new policies. One commenter asked how frequently CMS failed to match validation cases on Medicare Beneficiary ID number.

Response: Our past validation experience indicates that accessing patient race and name data for validation will allow CDAC to match validation cases that lack Medicare beneficiary numbers with a higher level of confidence. If we cannot access these data, a hospital might have to request an educational review or appeal to determine that we made an inappropriate mismatch. We believe that this approach is much less efficient and more burdensome to hospitals than using the patient name and race data from NHSN, where available, to confirm the match. The policy requiring hospitals to report Medicare beneficiary identification numbers to NHSN is first effective for HAIs occurring in patients discharged in quarter 3, 2014 (78 FR 50822). We do not agree with commenters who suggested that we wait to assess the results of that policy, because many patients with HAIs are not Medicare beneficiaries and therefore would remain unmatched under it. We anticipate that there will be many cases that lack Medicare beneficiary identification numbers, because a large percentage of the 5 HAIs reportable to NHSN as part of the Hospital IQR Program occur among patients under 65 years of age. For infections reported in 2013, the percentage of events reported for patients under 65 years of age ranged from a low of 44 percent for laboratory-identified Clostridium difficile (CDI) events to a high of 64 percent for surgical site infections (SSI). In these instances without Medicare beneficiary numbers, it would be helpful to have other data, such as name and race where available, in order to more effectively match validation cases.

We agree that patient race and name data are not available for every patient. We believe that this information would be more useful if it were required rather than voluntary. We will discuss this with CDC and take the suggestion under consideration for future rulemaking, considering both the burden of added requirements and the potential benefits. For the present, we believe that the available patient race and name data will greatly assist in the identification of medical records required for CMS validation submission, and in CMS' matching of validation medical records to NHSN-reported infection events. Regarding data for non-Medicare beneficiaries, we remind commenters that the Hospital IQR Program requires quality data that encompass all-payer patients (both Medicare beneficiaries and those not participating in Medicare). Therefore, data from all patients must be validated.

Comment: Some commenters asked CMS to clarify: (1) how it intends to use patient-level data for program evaluation, and (2) why aggregate-level data cannot be used for this purpose. Some commenters stated that CMS should only use aggregate-level data for program evaluation. Several commenters observed that patient name and race would have limited usefulness because these data are submitted voluntarily and are not available for all submitted cases. One commenter wanted to know what CMS meant by the phrase “as appropriate for program evaluation.”

Response: We are collecting data from NHSN using our authority to collect these data for validation purposes. As with all data we collect for purposes of Hospital IQR Program administration, including validation and appeals, we intend to use these data more broadly to meet the goals of the Hospital IQR and Hospital VBP Programs, including measure and program evaluation. Measure and program evaluation are two key components of administering a public reporting program. We intend to use patient-level data for program evaluation to assess the impact of quality measures used in the Hospital IQR and Hospital VBP Programs and to determine whether either program may have unintended consequences, as we already do with other, non-NHSN measure data.

Aggregate-level data have limited effectiveness for program evaluation because they lack strong predictive power and do not allow for multivariate statistical modeling. Patient-level data, on the other hand, provide much greater predictive power and the capability to perform multivariate statistical modeling by matching these data across all quality measures, including HAI measures. For example, information from the same group of patients must be matched at the individual patient level for the SCIP process-of-care antibiotic administration, PSI-90 component claims, and HAI measures to assess correlation among measure results. Such analyses provide additional information about the validity and impact of individual measures included in the Hospital IQR Program, and also assist with assessing the relative impact of different types of measures on the distribution of Hospital VBP Program performance scores. These types of analyses provide actionable data to determine whether either program may have unintended consequences, including disproportionately penalizing hospitals serving the poor and vulnerable.

Patient-level data on race and Hispanic ethnicity are particularly important for evaluating any potential unintended consequences related to poor and vulnerable populations. Aggregate-level analyses have limited predictive power and lack the level of detail needed to evaluate whether program initiatives have had unintended consequences in contributing to disparities both within and across hospitals, as well as disparities associated with specific populations. We intend to use patient-level information, as well as race and Hispanic origin information where available, to improve the accuracy of categorizing safety net hospitals in our impact analysis. However, we agree with the concerns of some of the commenters that the patient race and ethnicity data may be of limited usefulness because they may be reported by too few hospitals. This is why we described the use of these voluntarily reported data as "as appropriate." In the routine course of analysis, we intend to evaluate the level of completeness of the voluntarily submitted patient race and Hispanic ethnicity data, and their appropriateness for the specific analyses designed to evaluate the impact of the Hospital VBP Program on safety net hospitals.

Comment: Several commenters expressed concern about specific uses for patient-level access to NHSN data. A few indicated that CMS should not access patient-level NHSN data to produce standardized infection ratios (SIRs) to post on Hospital Compare. CDC currently performs this role and these commenters believe that CDC should continue to do so. One commenter expressed concern that CMS would misuse or misinterpret data to reduce hospital payment rates.

Response: We agree with commenters that CDC effectively produces SIRs and should continue to provide these data to us to post on the Hospital Compare Web site, and that it would be duplicative for us to perform this work. We do not intend to perform these analyses and will not use the data in ways that reduce an individual hospital's payment rates.

Comment: Several commenters viewed CDC as housing the only “credible” experts on NHSN data collection and analysis, such that if CMS used these data to produce trends, evaluate and update NHSN measure specifications, or conduct data mining activities, the results might be incorrect, misleading, or not scientifically valid.

Response: We recognize that CDC is the measure steward for NHSN data, and uniquely understands the intricacies of NHSN data collection. We do not intend to independently update NHSN measure specifications and would only make changes in response to CDC updates. Such changes would be subject to our substantive and nonsubstantive changes policy (see 77 FR 53504). We also would not conduct data mining activities. The measure steward, CDC, is responsible for updating measure specifications. We would invite CDC to provide feedback on any NHSN quality trend data we produce for Hospital IQR or Hospital VBP Program evaluation purposes.

Comment: Several commenters viewed any analyses that CMS might conduct as potentially duplicative with CDC efforts and therefore, wasteful of resources. One commenter asserted that CDC should conduct validation instead of CMS.

Response: Our intention to access and use NHSN data does not constitute redundant or duplicative efforts with the CDC. CDC produces national and hospital level HAI SIRs for NHSN, and also provides CMS with hospital-specific data for reporting on Hospital Compare. We intend to continue using CDC reported HAI SIRs.

Further, the CDC does not validate these measures for purposes of the Hospital IQR Program. CMS has both the authority and the responsibility to conduct validation activities under section 1886(b)(3)(B)(viii)(XI) of the Act. We are statutorily responsible for auditing a number of hospitals to ensure the validity of the reporting program. Our validation process provides hospitals with a single, standardized, national process and provides hospitals in the validation sample with actionable, specific, and confidential feedback on mismatched patient-level validation results in order to improve accuracy.

We might consider contracting with CDC to conduct such validation in future years if we determine that CDC is interested in conducting validation for the Hospital IQR Program and could do so more efficiently than CMS. However, any validation process that CMS would undertake would have to be standardized nationally and employ quality assurance standards such as assessing inter-abstractor reliability. CDC's current validation strategy, which involves providing technical assistance to states conducting validation, is not nationally standardized. It therefore does not meet CMS' needs to ensure accuracy of HAI measure data using a standardized and nationwide process.

National Healthcare Safety Network (NHSN) Validation Guidance and Toolkit 2012. Validation for Central Line-Associated Bloodstream Infection (CLABSI) in ICUs. http://www.cdc.gov/nhsn/toolkit/validation-clabsi/, last accessed 7/29/2014.

Comment: Many commenters questioned whether CMS had rights to the data, stated that CMS access would violate the confidentiality agreement between hospitals and NHSN, or indicated that the data being required and accessed exceeded those needed to measure performance as posted on Hospital Compare. Several commenters indicated that CMS should justify its need for specific data elements, arguing that CMS should require and access only the minimum data needed to meet its goals.

Response: We believe that our responsibility encompasses more than merely measuring quality performance. As described above in this section, we have both the authority and the responsibility to conduct validation of the data for the Hospital IQR Program. Furthermore, we wish to clarify that this policy does not constitute an expansion of reporting requirements, because we would access data that hospitals are already required to submit to meet Hospital IQR Program requirements.

In addition, our policy to access data required by the Hospital IQR Program also does not violate the confidentiality agreement between facilities and the CDC. The confidentiality agreement signed by facilities specifically indicates that one purpose for the data submitted to NHSN is to “enable healthcare facilities to report HAI and prevention practice adherence data via NHSN to the U.S. Centers for Medicare and Medicaid Services (CMS) in fulfillment of CMS's quality measurement reporting requirements for those data.” We would only be accessing data reported to fulfill Hospital IQR Program requirements, and therefore, would not violate the confidentiality agreement.

We agree with commenters that data collection requirements should be kept to the minimum necessary to meet quality measurement goals, and that each data element collected should be justified.

Using our authority and responsibility to access data needed to administer the Hospital IQR Program, including by performing validation and appeals, we intend to access and store only the minimum data needed for any particular analysis of the types described above. However, given the varied analytical requirements for validation and appeals, and the frequency with which CDC makes small changes to NHSN specifications, we believe it would be impractical to provide a data element by data element rationale in a regulation.

Comment: Many commenters expressed concerns about security for this highly sensitive and confidential data. Many commenters wanted to know whether CMS or its contractors would access the data, and how CMS would prevent inadvertent disclosures or privacy breaches. One commenter specifically wanted to know how the data would be transferred and how long it would be stored.

Response: It is our intention that our staff as well as contractors would request access to data submitted via the NHSN for the purposes of administering the Hospital IQR Program. In accessing data submitted via the NHSN, we would uphold the same privacy and security standards we use for other quality measures data submitted directly to us. For example, we would comply with all applicable requirements of the Health Insurance Portability and Accountability Act of 1996 (HIPAA) Privacy and Security Rules to safeguard and limit the use and disclosure of the information we access and obtain through the NHSN, as well as require through HIPAA business associate agreements that our contractors do the same.

We have several options for securely transferring data. For example, the Secure Transfer Protocol on QualityNet has secure transfer capabilities that ensure encryption of both the data and the transmission process. We will collaborate closely with CDC to ensure that we minimize the number of requests made for data. We will store data according to the CMS Center for Clinical Standards and Quality standard operating procedure for retention of records, which calls for retention of data for 10 years.

Comment: A few of the commenters opposing CMS' proposed data access policy urged CMS to work with CDC to support activities that increase accuracy through education, validation, and widespread adoption of electronic health records with “infection decision and support software.”

Response: As described above and further below, we conduct data validation and would like to do more to educate hospitals about data accuracy. This would require better access to NHSN data as proposed in this policy. We will consider the recommendation regarding infection decision and support software for future policy development in concert with our other efforts and incentives to promote EHR adoption.

After considering public comments we received, we are finalizing the policy to access NHSN data as proposed.

11. Modifications to the Existing Processes for Validation of Chart-abstracted Hospital IQR Program Data

In the FY 2013 IPPS/LTCH PPS final rule (77 FR 53539 through 53553), we finalized the processes and procedures for validation of chart-abstracted measures in the Hospital IQR Program for the FY 2015 payment determination and subsequent years; this rule also contained a comprehensive summary of all procedures finalized in previous years and still in effect. Several modifications to these processes were finalized for the FY 2016 and FY 2017 payment determinations in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50822 through 50835). In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28219) for the FY 2017 payment determination and subsequent years, we proposed additional modifications to these processes. Proposed changes fall into the following categories: (a) Eligibility criteria for hospitals selected for validation; (b) number of charts to be submitted per hospital for validation; (c) combining scores for HAI and clinical process-of-care measures; (d) processes to submit medical records for chart-abstracted measures; and (e) plans to validate electronic clinical quality measure data.

a. Eligibility Criteria for Hospitals Selected for Validation

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50833 through 50834), for the FY 2016 payment determination and subsequent years, we finalized our process to draw a random sample of 400 hospitals and an additional sample of up to 200 hospitals meeting specific targeting criteria for purposes of validation. For the FY 2017 payment determination and subsequent years, we proposed one minor change to this process. In the FY 2011 IPPS/LTCH PPS final rule (75 FR 50227), we defined hospitals eligible for validation as the subset of subsection (d) hospitals that successfully submitted “at least one case for the third calendar quarter of the year two years prior to the year to which validation applies.”

For the FY 2017 payment determination and subsequent years, we proposed to change the definition of validation-eligible hospitals to be the subset of subsection (d) hospitals that successfully submitted at least one case to the Hospital IQR Clinical Data Warehouse during the quarter containing the most recently available data. The quarter containing the most recently available data will be defined based on when the random sample is drawn. For example, for the FY 2017 payment determination, we intend to draw this sample in November or December of 2014. The second quarter (Q2) of 2014 ends in June 2014, but hospitals participating in the Hospital IQR Program may submit quality data from this quarter until November 15, 2014 (see www.qualitynet.org for submission deadlines). If CMS draws its sample early in November 2014, before all the second quarter hospital data are submitted and processed by the Clinical Data Warehouse, the “quarter containing the most recently available data” will be first quarter (Q1) of 2014. On the other hand, if CMS draws its sample late November or early December 2014 after the second quarter 2014 hospital data are processed, the second quarter of 2014 will contain the most recently available data.
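
To illustrate the decision rule described above, the following is a minimal sketch. Only the November 15, 2014 submission deadline for Q2 2014 comes from the example in the text; the Q1 2014 processing date and the helper function are illustrative assumptions, not CMS operational logic.

```python
# Hypothetical sketch of the "quarter containing the most recently available data" rule
# described above. Only the November 15, 2014 deadline for Q2 2014 comes from the text;
# the Q1 2014 date is an illustrative assumption.
from datetime import date

def most_recent_available_quarter(sample_draw_date: date) -> str:
    # (quarter label, assumed date by which that quarter's data are submitted and processed)
    processing_complete = [
        ("Q1 2014", date(2014, 8, 15)),   # assumed for illustration
        ("Q2 2014", date(2014, 11, 15)),  # submission deadline cited in the example above
    ]
    available = [label for label, done in processing_complete if sample_draw_date > done]
    return available[-1] if available else "no processed quarter available"

print(most_recent_available_quarter(date(2014, 11, 3)))  # early November -> 'Q1 2014'
print(most_recent_available_quarter(date(2014, 12, 1)))  # early December -> 'Q2 2014'
```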

We proposed this change because, in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50822 through 50825), for the FY 2017 annual payment determination and subsequent years, we changed the timing of quarters for validation of HAI measures, as illustrated in the three graphs (78 FR 50824). To align with this change for HAI measures and to give hospitals more time to complete HAI validation template requirements once selected, we intend to draw the validation sample several months sooner than we have historically drawn it. Historically, we drew the sample early in each calendar year. This proposal provides us with greater flexibility for when we can sample hospital data and allows CMS to use the most recent data available to select hospitals.

We invited public comment on this proposal.

Comment: Several commenters supported CMS' proposal to change the definition of validation-eligible hospitals because it allows more flexibility in the timing to draw the sample, allows alignment of the HAI and chart-abstracted validation timeframes, and provides hospitals with more time to submit HAI validation templates.

Response: We appreciate the commenters' support.

After consideration of the public comments we received, we are finalizing this policy as proposed.

b. Number of Charts To Be Submitted per Hospital for Validation

(1) Background

In the sections that follow, we discuss our proposals to: (1) Change the number of charts hospitals must submit for validation; (2) change the measure-specific sample sizes for HAI validation; and (3) change the topic areas and sample design for clinical process-of-care measures. We proposed these changes because section 1886(o) of the Act requires the Hospital VBP Program to use a subset of Hospital IQR Program measures, and there is a declining number of measures and chart-abstracted measure topic areas available to the Hospital VBP Program. Our proposals also will direct more resources to measures and topic areas that overlap with the Hospital VBP Program. Finally, our proposals will ensure that all chart-abstracted measure topic areas containing required measures within the Hospital IQR Program are included in validation. A more detailed rationale accompanies each proposal.

As described in the FY 2013 IPPS/LTCH PPS final rule (77 FR 53539 through 53553), the Hospital IQR Program validates chart-abstracted data submitted to two different systems: clinical process-of-care data submitted to the Hospital IQR Program Clinical Data Warehouse and HAI data submitted to the NHSN. Different validation approaches are used for the data submitted to each of the systems. The process for selecting and validating HAI data was first introduced in the FY 2012 IPPS/LTCH PPS final rule (76 FR 51646 through 51648) and has evolved annually in each successive IPPS/LTCH PPS rule. In contrast, validation of the clinical process of care measures, which involves separate samples for each topic area, has not substantively changed since it was first finalized for the FY 2012 payment determination in the FY 2010 IPPS/LTCH PPS final rule (74 FR 43884 through 43889).

(2) Number of Charts To Be Submitted for Validation

(A) Total Number of Charts Required for Validation

Our current policy requires hospitals to submit 96 charts for validation (60 charts for clinical process-of-care measures and 36 charts for HAIs) (78 FR 50825 through 50834). In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28248) for the FY 2017 payment determination and subsequent years, we proposed to require hospitals selected for Hospital IQR Program validation to submit 18 patient charts per quarter, for a total of 72 charts per year. A sample size of 72 charts is statistically estimated to be the number of charts needed to determine whether an individual hospital clearly passed validation and to assess hospital performance across both types of measures (HAIs and clinical process-of-care) combined. As finalized in the FY 2013 IPPS/LTCH PPS final rule (77 FR 53551), hospitals may fall into three validation categories: (1) Hospitals pass validation if the lower bound of the confidence interval is greater than or equal to 75 percent; (2) hospitals fail validation if the upper bound of the confidence interval is lower than 75 percent; and (3) hospitals neither pass nor fail validation if the 90 percent confidence interval includes values both above and below 75 percent. Hospitals in the third category that neither pass nor fail validation receive their annual payment update, but may be randomly sampled for inclusion in the targeted validation in the following year.
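
As an illustration of these three categories, the following is a minimal sketch. The rule states the 75 percent standard and the 90 percent confidence interval but not the exact interval formula, so a simple normal-approximation interval is assumed here for demonstration only.

```python
# Illustrative sketch of the three validation categories described above. The exact
# confidence-interval formula is not specified in this excerpt, so a simple
# normal-approximation (Wald) interval is assumed for demonstration purposes only.
import math

def classify_validation(matches: int, total: int,
                        standard: float = 0.75, z: float = 1.645) -> str:
    p = matches / total                                   # observed reliability
    half_width = z * math.sqrt(p * (1 - p) / total)       # two-sided 90 percent interval
    lower, upper = p - half_width, p + half_width
    if lower >= standard:
        return "pass"     # lower bound at or above 75 percent
    if upper < standard:
        return "fail"     # upper bound below 75 percent
    return "neither"      # interval straddles 75 percent; eligible for targeted sampling

print(classify_validation(matches=66, total=72))          # about 92 percent reliability -> 'pass'
```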

We estimate that a sample of 72 charts will be sufficient to estimate a reliability of 75 percent +/- 10 percent with 90 percent confidence, assuming a design effect no greater than 1.4. Historical data suggest that most hospitals in the Hospital IQR Program pass validation and that validated data have a high level of accuracy. For example, for the FY 2013 payment determination, approximately 95 percent of hospitals validated had data reliability of 85 percent or higher. With a sample of 72 charts and an expected mean data reliability well above 85 percent, we should be able to identify most hospitals that pass validation. For the remaining hospitals, we will use the same conservative approach to identify hospitals failing validation that we have used since the inception of the Hospital IQR Program.
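
The figures above are consistent with the standard sample-size arithmetic for estimating a proportion, sketched below; the exact method CMS used is not described in this excerpt, so this is an illustrative reconstruction only.

```python
# Minimal sketch of the standard sample-size arithmetic consistent with the figures above
# (75 percent +/- 10 percent at 90 percent confidence, design effect of 1.4); the exact
# method CMS used is not described in this excerpt.
import math

z = 1.645            # two-sided 90 percent confidence
p = 0.75             # reliability level at the passing standard
margin = 0.10        # +/- 10 percent
design_effect = 1.4

n_simple = (z ** 2) * p * (1 - p) / margin ** 2      # simple random sample: about 50.7 charts
n_charts = math.ceil(n_simple * design_effect)       # inflate for the design effect: 72 charts
print(round(n_simple, 1), n_charts)                  # 50.7 72
```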

Comment: Many commenters supported the decrease in the number of charts required for validation.

Response: We thank the commenters for their support.

Comment: Several commenters opposed CMS' proposed changes to the chart-abstracted data validation process. The commenters were concerned that hospitals were more likely to fail as a consequence of the policy. One commenter suggested a two-stage process, under which the initial sample size for clinical process of care charts would be small, but a hospital failing validation would be invited to submit additional charts. The validation score for the combined larger pool of charts then would be used for determining whether the hospital has failed validation. Since only a small number of hospitals fail validation, this would be an efficient strategy. Some commenters also said that hospitals needed more feedback on these chart-abstracted measures.

Response: We disagree that the proposed policy to decrease sample size will increase the likelihood that one or more individual hospitals will fail validation. As explained in the proposal above, a hospital fails validation when the upper bound for its two-tailed 90 percent confidence interval is less than 75 percent. For any hospital that submits data at a given level of reliability, the only two factors that would affect how likely the upper bound of the confidence interval is to be less than that reliability standard are (1) the level of reliability selected for the standard, and (2) the confidence level. We did not propose to change either the standard level of reliability (currently 75 percent) or the confidence level of the upper bound (currently 90 percent) (77 FR 53551). Therefore, the likelihood that hospitals fail validation will not increase by decreasing sample size.

Currently, a high percentage of hospitals pass, and we anticipate that the same percentages of hospitals would continue to pass, but acknowledge that the width of the confidence interval would increase due to decreased sample size. As stated in our proposal, we anticipate that additional hospitals would be eligible for targeted selection for validation in the following year. This targeting process is quite similar to the recommendation made by the commenter. We will take into consideration for future rulemaking the remainder of the commenter's recommendations to combine scores across the first and second samples to produce a final passing or failing score.

We also appreciate that the commenters would like more data on these chart-abstracted, clinical process-of-care cases. However, our proposal reflects our best efforts to balance the cost and burden against the desire for more detailed feedback. Moreover, some of the measures that have been in the program for a long time are reported very accurately. For these measures, only minimal feedback is needed. We intend to summarize national validation results and provide educational training for hospitals to incorporate the lessons learned to address the most frequently occurring validation mismatches. We believe it would be wasteful to increase resources simply to verify the accuracy of the measures that are already being reported well.

After consideration of the public comments we received, we are finalizing our proposal to require hospitals selected for Hospital IQR Program validation to submit 18 patient charts per quarter for a total of 72 charts per year as proposed.

(B) Number of Charts Required for HAI and Clinical Process-of-Care Measures

As finalized in the FY 2014 IPPS/LTCH PPS final rule, for the FY 2017 payment determination and future years, we require hospitals to submit 9 charts for HAI measures per quarter (78 FR 50831), and for the FY 2016 payment determination and future years, we require hospitals to submit 15 charts for clinical process-of-care measures per quarter for validation (78 FR 50830). In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28248) for the FY 2017 payment determination and subsequent years, we proposed that, of the 18 charts proposed to be submitted per quarter (above), 10 charts would be submitted to validate HAI measures and 8 charts would be submitted to validate clinical process-of-care measures. This would equal 72 charts per year, with a mix of 40 HAI and 32 clinical process-of-care measure charts. We proposed to require more HAI charts than clinical process-of-care measure charts because HAI measures now, as proposed, have a greater impact on the Hospital VBP and the HAC Reduction Programs. Considering only the relative importance of HAI and clinical process-of-care charts to the Hospital VBP Program, which is about 4 times as great for HAI charts, CMS might choose a ratio larger than 10 HAI charts for every 8 clinical process-of-care charts. However, we estimate that we spend about 4 times as much money per chart to validate HAIs as clinical process-of-care measures. Moreover, the clinical process-of-care measures are still a critical part of the Hospital IQR Program.

Therefore, we proposed this mix of 40 HAI and 32 clinical process of care charts per year because we believe it to be optimal after considering both the relative importance of the two types of charts to the Hospital IQR Program and related payment incentive programs and the relative cost of validation for the two types of charts.

We invited public comment on these proposals.

Comment: Many commenters supported this proposal. Most commenters supported the proposed mix of HAI and clinical process of care cases.

Response: We appreciate the commenters' support.

After consideration of the public comments we received, we are finalizing as proposed our policy that, of the 18 charts to be submitted per quarter, 10 charts will be submitted to validate HAI measures and 8 charts will be submitted to validate clinical process-of-care measures.

(3) HAI Validation: Measures and Measure-Specific Sample Sizes

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50828 through 50832) for the FY 2016 payment determination and subsequent years, we finalized the HAI measures to be included in validation, the processes for completing validation, and the specific sample sizes for each. To validate HAI data, hospitals must use Validation Templates to provide supplemental data to CMS. These supplemental data provide CMS with a set of candidate infections for each HAI. As finalized previously, hospitals sampled for validation will be randomly assigned to provide two Validation Templates, either: (1) CLABSI and CAUTI, or (2) MRSA and CDI. Consequently, up to 300 hospitals will provide data on each of these 4 measures. We also previously finalized a decision to validate a smaller number of patient charts for SSI from twice as many hospitals because of the smaller number of candidate SSIs expected per hospital per quarter. In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28248) we did not propose to change the process for validating individual measures.

However, as described above in this section, we proposed to increase the total HAI sample size by 1 chart per quarter, for a total of 4 more charts per year. As explained below in this section, HAI measures have greater relative scoring weights in the Hospital VBP and HAC Reduction Programs than clinical process-of-care measures. Therefore, in order to align the Hospital IQR Program with the Hospital VBP and HAC Reduction Programs, we proposed to increase measure-specific sample size targets to support this 1 chart per quarter increase in the Hospital IQR Program for the FY 2017 payment determination and subsequent years. Specifically, the total number of charts per hospital for CLABSI, CAUTI, MRSA, and CDI would each increase by 1, from 15 to 16, and the total number of charts per hospital for SSI would increase by 2, from 6 to 8. The previously finalized and proposed measure-specific sample sizes are detailed in the tables below.

Previously Finalized Number of Charts Required for HAI Validation for the FY 2017 Payment Determination and Subsequent Years

HAI                                                      | Number of hospitals | Number of quarters | Charts/quarter/hospital | Number of charts per hospital
Previously Finalized:
Central line-associated bloodstream infections (CLABSI)  | Up to 300 | 4 | 3.75* | 15
Catheter-associated urinary tract infections (CAUTI)     | Up to 300 | 4 | 3.75* | 15
MRSA                                                     | Up to 300 | 4 | 3.75* | 15
CDI                                                      | Up to 300 | 4 | 3.75* | 15
SSI                                                      | Up to 600 | 4 | 1.5*  | 6

* As previously finalized, within each hospital, quarterly targets are 3, 3, and 1, respectively, for CLABSI, CAUTI, and SSI, and 3, 3, and 1, respectively, for MRSA, CDI, and SSI. As finalized, 2 additional charts per quarter per hospital were to be randomized to meet the fractional case targets on average.

Proposed Number of Charts To Be Submitted for HAI Validation for the FY 2017 Payment Determination and Subsequent Years

HAI                                                      | Number of hospitals | Number of quarters | Charts/quarter/hospital | Number of charts per hospital
Proposed:
Central line-associated bloodstream infections (CLABSI)  | Up to 300 | 4 | 4 | 16
Catheter-associated urinary tract infections (CAUTI)     | Up to 300 | 4 | 4 | 16
MRSA                                                     | Up to 300 | 4 | 4 | 16
CDI                                                      | Up to 300 | 4 | 4 | 16
SSI                                                      | Up to 600 | 4 | 2 | 8
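
The per-hospital totals in the proposed table are consistent with the 40 HAI charts per year (10 per quarter) discussed earlier in this section, as the short check below illustrates; it simply restates the table's arithmetic.

```python
# Quick arithmetic check consistent with the proposed table above: a hospital assigned
# either Validation Template pair (CLABSI/CAUTI or MRSA/CDI) also submits SSI charts,
# for 16 + 16 + 8 = 40 HAI charts per year, i.e., 10 HAI charts per quarter.
template_pairs = {"CLABSI and CAUTI": (16, 16), "MRSA and CDI": (16, 16)}
ssi_charts_per_year = 8

for pair, (first, second) in template_pairs.items():
    yearly = first + second + ssi_charts_per_year
    print(f"{pair} + SSI: {yearly} charts/year = {yearly // 4} charts/quarter")
```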

We invited public comment on this proposal.

Comment: Many commenters expressed general support for validation provisions.

Response: We thank commenters for their support.

Comment: One commenter asked CMS to provide a rationale as to why CDAC assesses over-reporting of CLABSI and CAUTI events to NHSN. The commenter further wanted to know whether the purpose of validation is “to determine if the hospital knows how to read and understand the measure specifications and report accordingly or to assist the hospitals in identifying processes and procedures needed to reduce the rates and improve quality of care.”

Response: We have both the authority and the responsibility to conduct validation activities under section 1886(b)(3)(B)(viii)(XI) of the Act. We are statutorily responsible for auditing a number of hospitals to ensure the accuracy of the reported data. This includes verifying the accuracy of data reported to NHSN. We look to confirm that all events that should have been reported were reported and that all events that should not have been reported were not.

An important factor for increasing accuracy is ensuring that hospitals know how to read and understand measure specifications and report them accordingly. Because hospitals have a financial disincentive to erroneously report more infections than actually occurred in their hospitals, education and feedback about these types of errors can benefit hospitals.

After consideration of the public comments we received, we are finalizing our proposal to increase measure-specific sample size targets by 1 chart per quarter for the FY 2017 payment determination and subsequent years as proposed.

(4) Clinical Process of Care Measures: Topic Areas and Sample Design

As discussed above in this section, we proposed to sample 8 total patient charts for clinical process-of-care measures per quarter per hospital included in validation for the Hospital IQR Program for the FY 2017 payment determination and subsequent years. Those 8 charts are discussed in greater detail below.

As shown in the table below, two Hospital IQR Program clinical process-of-care topic areas other than immunization overlap with measures proposed for inclusion in the FY 2017 Hospital VBP Program. Nevertheless, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28249 through 28250), we did not propose to target those topic areas, for the following reasons. One of these measures, PC-01, Elective delivery prior to 39 completed weeks of gestation, is reported in aggregate. We cannot use the same mechanism to validate PC-01 as we use for measures reported at the patient level, but we hope to include it in our validation program in the future should reporting PC-01 as an electronic clinical quality measure become a requirement. The second measure is AMI-7a, which describes a process of care performed only in small rural hospitals. Of the approximately 3,300 hospitals participating in the Hospital IQR Program for the FY 2015 payment determination, only 113 submitted cases for this measure in the first two quarters of CY 2013. Therefore, targeting hospitals that report the AMI-7a measure for validation would unduly single out the small rural hospitals that disproportionately report relatively high AMI-7a measure denominator counts, and would be inequitable.

Number of Chart-Abstracted Clinical Process-of-Care Measures per Topic Area Proposed To Be Reported in the Hospital IQR Program in the CY 2014 and CY 2015 Discharge Periods*

Topic area                                 | Required measures reported in CY 2014 (FY 2016 Hospital IQR Program) | Required measures proposed for CY 2015 (FY 2017 Hospital IQR Program) | Proposed for inclusion in the FY 2017 Hospital VBP Program
Acute Myocardial Infarction (AMI)          | 2 | 1 | Yes
Heart Failure (HF)                         | 1 | 0 | No
Pneumonia (PN)                             | 1 | 0 | No
Surgical Care Improvement Project (SCIP)   | 7 | 0 | No
Venous thromboembolism (VTE)               | 6 | 5 | No
Stroke (STK)                               | 8 | 4 | No
Emergency department throughput (ED)       | 2 | 2 | No
Prevention—global immunization (IMM)       | 1 | 1 | Yes
Sepsis                                     | 0 | 1 | No
Perinatal Care (PC)**                      | 1 | 1 | Yes

* Data validated for the FY 2017 payment determination are Quarter 3, CY 2014; Quarter 4, CY 2014; Quarter 1, CY 2015; and Quarter 2, CY 2015 (78 FR 50824).
** Not reported at the patient level and not proposed for inclusion in validation.

For the FY 2017 payment determination and subsequent years, we proposed that the remaining 5 of the 8 clinical process-of-care charts be drawn from a systematic random sample of charts across all topic areas containing required measures aside from those in the immunization and perinatal care topic areas. Under this proposal, the pool of clinical process-of-care topic areas sampled for validation will include STK, VTE, ED, and sepsis, as well as all other Hospital IQR Program-required topic areas, such as AMI. We received many comments on the FY 2014 IPPS/LTCH PPS final rule (78 FR 50807 through 50810; 78 FR 50825) regarding the importance of validating the VTE, STK, and ED measures, which were not included in validation for the FY 2016 payment determination. With this proposal, the STK, VTE, ED, and sepsis measures would be included in the pool of clinical process-of-care measures for validation. The systematic random sample of topic areas from this pool would ensure that charts are sampled in proportion to the number of charts submitted for each topic, so that a sample of 20 charts per year would not be limited to only one topic area by random occurrence. In addition, across all hospitals included in validation, we believe this approach will ensure that adequate numbers of patient charts are sampled for each topic area.

We used data submitted to the Clinical Data Warehouse for the Hospital IQR Program from quarters 1 and 2 of 2013 to estimate that at least 400 cases per topic area would be validated per year (across all hospitals).

This proposal simultaneously simplifies the sampling plan for clinical process-of-care measures and gives us the flexibility to introduce new topic areas into, or remove topic areas from, validation each year without having to redesign and propose a new sampling strategy. Using a random sample ensures that new topic areas are not excluded from the validation sample, and we can more easily adjust as the topic areas change over the years. If this proposal is finalized, every time a new required topic area is added to the Hospital IQR Program, it will automatically be added to validation, and every time a topic area is removed from the Hospital IQR Program, it will automatically be excluded from validation.
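
To illustrate the proposed approach, the following is a minimal sketch of a systematic random sample drawn in proportion to submission volume across the "other" topic areas; the topic names and case counts are illustrative placeholders, not actual submission volumes or CMS sampling code.

```python
# Minimal sketch of a systematic random sample drawn across topic areas in proportion to
# the number of charts submitted for each topic, as described above. Topic names and
# submission counts are illustrative placeholders, not actual volumes.
import random

def systematic_sample(cases_by_topic: dict, n_charts: int) -> list:
    # Group the sampling frame by topic so the fixed sampling interval spreads the
    # selected charts across topics in proportion to their submission volume.
    frame = [case for topic in sorted(cases_by_topic) for case in cases_by_topic[topic]]
    interval = len(frame) / n_charts
    start = random.random() * interval                  # random start within the first interval
    return [frame[int(start + i * interval)] for i in range(n_charts)]

cases = {
    "STK": [f"STK-{i}" for i in range(40)],
    "VTE": [f"VTE-{i}" for i in range(50)],
    "ED": [f"ED-{i}" for i in range(20)],
    "Sepsis": [f"SEP-{i}" for i in range(10)],
}
print(systematic_sample(cases, n_charts=8))             # one quarter's 8 "other" charts
```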

We invited public comment on these proposals.

Because of the close relationship between this proposal and the one immediately below, we provide one consolidated set of comments and final policy for the two sections together at the end of the next proposal.

(5) Immunization Measure Validation

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28250) we proposed for the Hospital IQR Program for the FY 2017 payment determination and subsequent years, that 3 of the 8 total patient charts each quarter be targeted from the Immunization topic area. Currently, this topic area only includes the Immunization for Influenza (NQF #1659) measure, which overlaps with the Hospital VBP Program. We want to ensure that every hospital included in validation is validated for this topic area because of the overlap.

Comment: Many commenters supported the proposed policies to drop the topped-out measures from the validation process, and to divide the quarterly clinical process-of-care sample of 8 charts per hospital into a systematic random sample of 5 charts drawn from all required topic areas other than immunization and perinatal care and a second sample of 3 immunization charts, because of the importance of immunization to the Hospital VBP Program.

Response: We thank these commenters for their support.

Comment: A few commenters opposed the proposed policy to have 3 charts dedicated to immunization each quarter. These commenters observed that the IMM-2 measure only has meaning in the months of October through March, when hospitals are expected to immunize patients. Therefore, in 2 of the 4 quarters, the only data element available to validate would be the patient's discharge date. Because discharge date is not a measure of care quality, validating this element alone would not yield meaningful results. These commenters requested further clarification on the proposed methodology. For example, the commenters asked if ED Throughput (EDT) would be validated on those charts instead. A commenter asked if CMS will validate EDT on every IMM chart, since hospitals use the same population to sample cases for both measures.

Response: When we proposed this policy, we had not considered the seasonal nature of this measure or the fact that very limited data would be available for 2 of the 4 quarters included in validation for this measure. We agree that it would be wasteful to validate 6 cases per year (3 cases per quarter for 2 quarters) per hospital during a time period that we know will not contain any meaningful data.

We will address this concern by finalizing a modified version of our proposal, as follows. We will not sample any records for the IMM topic area in the quarters when the IMM-2 measure does not yield meaningful data, and we will increase the number of IMM records sampled in the quarters during which this measure does yield meaningful data. In other words, for quarters 4 and 1, for each hospital included in validation, we will draw a quarterly random sample of 5 charts for validation for the IMM topic area and a quarterly systematic random sample of 3 charts in the "other" category. In quarters 2 and 3, when the IMM-2 measure does not apply, we will draw a systematic random sample of 8 charts from the "other" category. As established in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50822 through 50825), the quarters to be included in validation for the FY 2017 payment determination are quarters 3 and 4, 2014, and quarters 1 and 2, 2015. In quarters 3 and 4, 2014, the topic areas that will be in the "other" stratum are: AMI, ED, HF, PN, SCIP, STK, and VTE. In quarters 1 and 2, 2015, the topic areas that will be in the "other" category are: AMI, ED, STK, VTE, and sepsis.

We did not propose to validate the same cases for EDT and IMM because the EDT measures are not also finalized for the Hospital VBP Program. We disagree that a one-for-one substitution of EDT cases for IMM cases would be appropriate because, unlike the IMM measure set, which contains a measure finalized in the Hospital VBP Program, the EDT topic area is not inherently more important than any other required topic area in the Hospital IQR Program.

Comment: A few commenters expressed concern that CMS intended to drop some required measure sets from validation in its “other” systematic random sample, and/or suggested that CMS continue validating chart-abstracted data for all measures sets that are part of the Hospital IQR and Hospital VBP Programs. In addition, several commenters noted that the “other” sample would include the ED, STK, VTE, and sepsis measures in validation.

Response: We agree that all required measure sets should be included in validation to the extent that this is operationally feasible. In our weighting proposal below in this section, we identified the topic areas containing required measures other than Immunization and Perinatal Care only for quarters 1 and 2 of CY 2015, and inadvertently omitted HF, PN, and SCIP. However, we explicitly proposed to include a generic category so that we would not be required to revise our validation strategy every time a new topic area was added to or deleted from the Hospital IQR Program measure set. Further, we included all CY 2014 Hospital IQR Program measures, including HF, PN, and SCIP, in our table above in this section, "Number of Chart-Abstracted Clinical Process-of-Care Measures per Topic Area Proposed To Be Reported in the Hospital IQR Program in the CY 2014 and CY 2015 Discharge Periods." We thank the commenters for the opportunity to clarify this ambiguity: we had no intention of dropping these measures from validation; in fact, as reflected in the table above, in quarters 3 and 4, 2014, these topic areas meet the definition of the "other" category because they contain Hospital IQR Program required measures other than immunization and perinatal care.

We appreciate the commenter's concern and wish to reiterate that we proposed to draw a systematic random sample of records from “topic areas containing required measures aside from those in the immunization and perinatal care topic areas.” For example, the HF, PN, and SCIP topic areas include measures that are required for the Hospital IQR Program in quarters 3 and 4, 2014, which are part of validation for the FY 2017 payment determination. Therefore, the HF, PN, and SCIP measure sets would fall into the “other” category in these quarters. However, these topics are not included in the Hospital IQR Program in 2015 because they met “topped-out” criteria and therefore, they would not fall into the “other” category in quarters 1 and 2, 2015.

Comment: A few other commenters opposed validation of the VTE, STK, or sepsis measures. These commenters opposed validation of the VTE or STK measures because they believed that the measure specifications were of poor quality. These commenters wanted to know how CMS would ensure the clarity of TJC-developed specifications. Those commenters opposing validation of the sepsis measure observed that because it was new, hospitals were inexperienced with reporting it.

Response: Although we appreciate that the measure specifications could be clearer for the VTE and STK measures and that the sepsis measure is new, all of these measures are NQF-endorsed and are finalized in the Hospital IQR Program. Any lack of clarity regarding the meaning of VTE and STK measure specifications and the inexperience of hospitals with the sepsis measures appear to be good reasons to provide hospitals with education and feedback on the data quality of these measures.

We believe that the potential adverse impact on any individual hospital of validating measures in the VTE, STK, and Sepsis topic areas is very small. In contrast, we believe that combining the validation data in these topic areas across all hospitals will provide the Hospital IQR Program and hospitals with rich information about the quality of data and about needs for education and improved specifications.

After consideration of the public comments we received, we are modifying both of our proposals: the proposal that the remaining 5 of the 8 clinical process-of-care charts be drawn from a systematic random sample of charts across all topic areas containing required measures aside from those in the immunization and perinatal care topic areas, and the proposal that 3 of the 8 total patient charts each quarter be targeted from the Immunization topic area. The modification takes into consideration the seasonal nature of the IMM measure set and is otherwise consistent with our proposals to sample 8 clinical process-of-care charts per quarter and to validate the IMM topic area separately from other topic areas because of its importance to the Hospital VBP Program.

We are finalizing a modified policy as follows. In quarters 4 and 1, for each hospital included in validation, we will draw a quarterly random sample of 5 charts for validation of the IMM topic area and a quarterly systematic random sample of 3 charts in the “other” category. In quarters 2 and 3, for each hospital included in validation, we will draw a quarterly systematic random sample of 8 charts from all topic areas containing required measures other than immunization and perinatal care.
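To illustrate the finalized quarterly allocation, the following is a minimal sketch in Python. It is not CMS's operational sampling procedure; the chart identifiers are hypothetical placeholders, and the systematic sampling step is simplified to a fixed-interval draw with a random start.

```python
import random

IN_SEASON_QUARTERS = {"Q4", "Q1"}  # quarters in which the IMM-2 measure yields meaningful data

def systematic_sample(charts, n):
    """Systematic random sample: random start, then every k-th chart (simplified)."""
    if n <= 0 or not charts:
        return []
    k = max(len(charts) // n, 1)
    start = random.randrange(k)
    return charts[start::k][:n]

def quarterly_validation_sample(quarter, imm_charts, other_charts):
    """Return the 8 clinical process-of-care charts to validate for one hospital-quarter.

    In-season quarters (Q4, Q1): 5 IMM charts (simple random) + 3 "other" charts (systematic).
    Off-season quarters (Q2, Q3): 8 "other" charts (systematic).
    """
    if quarter in IN_SEASON_QUARTERS:
        imm = random.sample(imm_charts, min(5, len(imm_charts)))
        other = systematic_sample(other_charts, 3)
    else:
        imm = []
        other = systematic_sample(other_charts, 8)
    return imm + other

# Hypothetical example: chart identifiers are placeholders.
print(quarterly_validation_sample("Q2", ["imm-%d" % i for i in range(40)],
                                  ["oth-%d" % i for i in range(200)]))
```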

c. Combining Scores for HAI and Clinical Process of Care Topic Areas

We refer readers to the FY 2010 IPPS/LTCH PPS final rule (74 FR 43885) for the process of scoring clinical process-of-care measures, the FY 2014 IPPS/LTCH PPS final rule (78 FR 50832 through 50833) for the process of scoring HAI measures, and the FY 2014 IPPS/LTCH PPS final rule (78 FR 50833) for the process to be used to compute the confidence interval. In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28250), we did not propose any changes to those established policies.

However, for the FY 2017 payment determination and subsequent years, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28250 through 28251), we proposed to modify our approach to weighting the scores for the HAI, IMM, and “other” topic areas through two proposals.

In the FY 2011 IPPS/LTCH PPS final rule (75 FR 50226) and the FY 2013 IPPS/LTCH PPS final rule (77 FR 53548 through 53553), we established a process to combine the HAI and clinical process-of-care measure scores by weighting them proportionate to the number of measures included in validation. For example, in section IX.A.11.b.(4) of the preamble of this final rule, our proposal to validate all clinical process-of-care measures required by the Hospital IQR Program for the FY 2017 payment determination would yield 14 clinical process-of-care measures in validation in CY 2015 and only 5 HAI measures in validation. Using the previously finalized weights, the clinical process-of-care measures score would contribute 14/19 and the HAI score would contribute only 5/19 to the combined score. This weighting reflects neither the relative importance of HAIs compared to clinical process-of-care measures in the Hospital VBP Program nor the resources we proposed to devote to their validation.
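As a worked illustration of the count-proportional approach described above (the hat notation is ours, with each score denoting a topic group's validation score), the previously finalized weights for CY 2015 would be:

```latex
\[
w_{\text{clinical}} = \frac{14}{14+5} = \frac{14}{19} \approx 0.737, \qquad
w_{\text{HAI}} = \frac{5}{14+5} = \frac{5}{19} \approx 0.263,
\]
\[
\text{combined score} = w_{\text{clinical}}\,\hat{p}_{\text{clinical}} + w_{\text{HAI}}\,\hat{p}_{\text{HAI}}.
\]
```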

In sections IV.I. and IV.J. of the preamble of this final rule (the Hospital VBP Program and the HAC Reduction Program, respectively), we discuss our proposals to weight the patient safety domain (of which the HAI measures are part) more heavily in the Hospital VBP Program (20 percent for the patient safety domain versus 5 percent for the clinical process of care measures) and to use the HAI measures for the HAC Reduction Program.

In this section, we discuss our proposal to weight the HAI measures more heavily than the clinical process-of-care scores to align with these proposals in sections IV.I. and IV.J. In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28250 through 28251), for the FY 2017 payment determination and subsequent years, we proposed to weight the HAI score as 66.7 percent (or 2/3) of the total score and the clinical process-of-care score as 33.3 percent (or 1/3) of the total score. Further justification is provided after the second proposal.

In addition, we proposed to weight the IMM measures more heavily than other chart-abstracted clinical process-of-care measures validated in the Hospital IQR Program to align with the Hospital VBP Program. We are changing the process currently established to calculate the clinical process-of-care score, which is based on application of the formulas for the variance of a stratified single-stage cluster sample with unequal cluster sizes and the variance of a proportion in a stratified random sample (see reference to Cochran's “Sampling Techniques” at 75 FR 50226 and 78 FR 53550). We have previously applied this formula without consideration for the relative importance of different measures. When so applied, each topic area is weighted proportionate to the amount of data submitted to the warehouse for that topic area.

However, we proposed to modify the formulas as previously applied to weight the IMM topic area more heavily because of the overlap with the Hospital VBP Program. For the FY 2017 payment determination and subsequent years, we proposed to weight the “IMM” clinical topic area as 66.7 percent (2/3) and all other topic areas combined as 33.3 percent (1/3) of the clinical process-of-care score. The weights reflect our policy preference to place greater relative weight on measures included in the Hospital VBP Program to better ensure accurate scores and payment. At the same time, continuing to validate the remaining chart-abstracted clinical process-of-care measures addresses the need to validate Hospital IQR Program data not currently included in the Hospital VBP Program, both for public reporting and for validation feedback to hospitals.

The table below shows the effect of the two proposals combined (the first to weight the HAI score more heavily than the clinical process-of-care score and the second to weight IMM data more heavily than other clinical process-of-care topic areas). The HAI topic area will count 3 times as much as the IMM topic area and 6 times as much as all other topic areas combined.

Proposed Weighting To Combine Scores Across Chart-Abstracted Topic Areas Included in Validation for the FY 2017 Payment Determination and Subsequent Years

Topic Area | Weight (percent)
Healthcare-associated infection (HAI) | 66.7
Immunization (IMM) | 22.2
Other (all clinical process-of-care topic areas containing required measures other than IMM and Perinatal Care) | 11.1
Total | 100
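Read as the product of the two finalized weighting steps (the notation below is our illustration, not regulatory text), the table's percentages follow directly from the two 2/3-versus-1/3 splits:

```latex
\[
w_{\text{HAI}} = \tfrac{2}{3} \approx 66.7\%, \qquad
w_{\text{IMM}} = \tfrac{1}{3}\times\tfrac{2}{3} = \tfrac{2}{9} \approx 22.2\%, \qquad
w_{\text{other}} = \tfrac{1}{3}\times\tfrac{1}{3} = \tfrac{1}{9} \approx 11.1\%,
\]
\[
\text{total validation score} = \tfrac{2}{3}\,\hat{p}_{\text{HAI}} + \tfrac{2}{9}\,\hat{p}_{\text{IMM}} + \tfrac{1}{9}\,\hat{p}_{\text{other}},
\]
```

so the HAI weight is 3 times the IMM weight and 6 times the combined “other” weight, consistent with the statement above.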

Previously, the clinical process-of-care measures accounted for 20 percent of the Hospital VBP Program score, whereas the HAI measures, a subset of the outcome measures, were weighted at 30 percent (FR 53605 through 53606). The proposed relative weights for the HAI (66.7 percent) and IMM (22.2 percent) topic areas better reflect the strong emphasis we have proposed for the HAI measures.

These proposals will require adjustments to the formulas applied to compute the confidence intervals. As we have done in the past, we intend to post the specific formulas used to compute the confidence interval on the QualityNet Web site at least one year prior to final computation (https://www.qualitynet.org/dcs/ContentServer?c=Page&pagename=QnetPublic%2FPage%2FQnetTier2&cid=1138115987129). These formulas will continue to account appropriately for the manner in which patient charts were sampled and data were abstracted.
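The specific formulas will be posted on QualityNet. As a simplified, textbook-style sketch only (assuming the topic-area score estimates are approximately independent, and ignoring the finite-population and cluster-design corrections that the posted formulas account for), a fixed-weight total score and its approximate 95 percent confidence interval could take the form:

```latex
\[
T = \sum_{s \in \{\text{HAI},\,\text{IMM},\,\text{other}\}} w_s\,\hat{p}_s, \qquad
\widehat{\mathrm{Var}}(T) = \sum_{s} w_s^{2}\,\widehat{\mathrm{Var}}(\hat{p}_s), \qquad
T \pm 1.96\sqrt{\widehat{\mathrm{Var}}(T)}.
\]
```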

We invited public comment on these proposals.

Comment: Many commenters supported the proposals to increase the weight of HAI measures, to decrease the weight of process of care measures, and to weight the immunization measure more heavily than other clinical process of care measures when computing a hospital's validation score.

Response: We thank the commenters for their support.

Comment: A few commenters opposed the proposal to weight the IMM measure more heavily than other clinical process-of-care measures because of the seasonal nature of the measure. These commenters suggested that weighting the EDT measure more heavily might be appropriate.

Response: We have addressed the commenters' concern by finalizing a policy to validate IMM data only in-season (we refer readers to section IX.A.11.b.(5) of the preamble of this final rule, above). Having made this policy adjustment above in this section, we believe that weighting IMM more heavily than other clinical process-of-care measures because of its importance to the Hospital VBP Program is still appropriate. The EDT measure is not included in the Hospital VBP Program, so it should not count more than any other process-of-care measure.

After consideration of the public comments we received, we are finalizing, as proposed, our policy to weight the “IMM” clinical topic area as 66.7 percent (2/3) and all other topic areas combined as 33.3 percent (1/3) of the clinical process-of-care score, resulting in final combined weights of 66.7 percent for HAI, 22.2 percent for IMM, and 11.1 percent for all topic areas containing required clinical process-of-care measures other than IMM and perinatal care.

d. Processes To Submit Patient Medical Records for Chart-Abstracted Measures

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50834 through 50835), we finalized a process for the FY 2016 payment determination and subsequent years that allows hospitals to submit patient charts for validation via: (1) paper patient medical records; or (2) secure transmission of electronic versions of patient information. The process previously finalized restricts electronic submission of patient information to digital images of patient medical records submitted using encrypted CD-ROMs, DVDs, or flash drives.

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28251), we proposed, for the FY 2017 payment determination and subsequent years, to expand the options for secure transmission of electronic versions of patient medical records. Specifically, we proposed to allow hospitals to submit digital images (PDFs) of patient charts using a Secure File Transfer Portal on the QualityNet Web site. This portal would allow hospitals to transfer files through either a Web-based portal or directly from a client application using a secure file transfer protocol. The system provides a mechanism for securely exchanging documents containing sensitive information such as Protected Health Information (PHI) or Personally Identifiable Information (PII). Detailed instructions on how to use this system are available in the Secure File Transfer 1.0 User Manual available on QualityNet at: http://www.qualitynet.org/dcs/ContentServer?c=Page&pagename=QnetPublic%2FPage%2FQnetBasic&cid=1228773343598. After July 2014, hospitals can submit all Hospital IQR Program validation data using this portal. This proposal responds to many commenters from the FY 2014 IPPS/LTCH PPS rulemaking who were concerned that encrypted CD-ROMs were cumbersome and requested viable alternatives. We believe that the burden associated with using this portal will be similar to or less than that involved with submitting patient medical records via portable electronic media (that is, encrypted CD-ROMs, DVDs, or flash drives). Therefore, we intend to reimburse hospitals according to the rate established for submitting patient medical records via portable electronic media (78 FR 50956).
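For hospitals that script their uploads through the client-application route, the transfer is an ordinary secure file transfer. The sketch below is purely illustrative and is not part of the QualityNet documentation; it uses the third-party paramiko library, and the host name, credentials, and remote path are hypothetical placeholders. Hospitals should follow the Secure File Transfer 1.0 User Manual for the actual connection details.

```python
import paramiko

# Hypothetical connection details -- substitute the values supplied in the
# Secure File Transfer 1.0 User Manual; these placeholders are not real.
HOST = "sftp.example-portal.org"
USERNAME = "hospital_user"
PASSWORD = "********"

def upload_chart_pdf(local_path, remote_path):
    """Upload one digital image (PDF) of a patient chart over SFTP."""
    client = paramiko.SSHClient()
    # In practice, verify the server's host key rather than auto-adding it.
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(HOST, username=USERNAME, password=PASSWORD)
    try:
        sftp = client.open_sftp()
        sftp.put(local_path, remote_path)  # transfer the chart PDF
        sftp.close()
    finally:
        client.close()

# upload_chart_pdf("chart_example.pdf", "/inbound/chart_example.pdf")
```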

We invited public comment on this proposal.

Comment: A few commenters strongly supported the proposal to expand the transmission options for patient medical records, specifically the option to submit PDFs via the QualityNet Web site. The commenters believed this action would streamline the validation process and reduce the burden on hospitals.

Response: We thank the commenters for their support.

After consideration of the public comments we received, we are finalizing our policy to allow hospitals to submit digital images (PDFs) of patient charts using a Secure File Transfer Portal on the QualityNet Web site as proposed.

e. Plans To Validate Electronic Clinical Quality Measure Data

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50807 through 50810), we finalized a voluntary process allowing hospitals to partially meet Hospital IQR Program requirements for the FY 2014 payment determination by submitting electronic clinical quality measure data via certified electronic health record technology. Many commenters expressed concern that we did not have an adequate methodology to validate these data.

To respond to these concerns as well as to ensure that Hospital IQR Program data are accurate and reliable, we conducted an environmental scan, including review of prior public comments to CMS proposed rules and requests for information, review of the technical and academic literatures, numerous listening sessions, and interviews with nine hospitals. From these activities, we identified three key categories of threats to data accuracy: (1) the design of the EHR product, including both the manufacturer-provided EHR product and the hospital's customizations of that EHR product to support the hospital's specific workflows and processes, (2) hospital and provider documentation practice, and (3) EHR and electronic clinical quality measure standards and specifications. We understand the potential threats to validity in each of these categories. To respond to these concerns, we are currently conducting a small scale test of a remote real-time validation strategy for electronic clinical quality measures in approximately 9 hospitals.

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28251 through 28253), we did not propose any requirements for validation of electronic clinical quality measures for the FY 2017 payment determination. However, we stated that we intend to conduct a larger scale pilot test of validation activities in FY 2015. The pilot test will engage up to 100 volunteer hospitals in a highly interactive test abstraction of their EHR systems using a secure remote access, real-time abstraction technology. Hospitals that volunteer to participate must meet the Medicare EHR Incentive Program Stage 2 criteria (77 FR 53968 through 54162) and be able to produce QRDA Category 1 Revision 2 extracted data (individual patient data) for at least 6 of the 16 measures in the STK, VTE, ED, and PC topic areas. The Office of the National Coordinator for Health Information Technology (ONC) adopted QRDA as the standard to support both QRDA Category I (individual patient) and QRDA Category III (aggregate) data submission approaches for meaningful use Stage 2 in the Health Information Technology: Standards, Implementation Specifications, and Certification Criteria for Electronic Health Record Technology, 2014 Edition; Revisions to the Permanent Certification Program for Health Information Technology rule (77 FR 54163 through 54292). Interested hospitals will be invited to attend a 30-minute pre-briefing session where they will be provided with detailed instructions about the process and a demonstration explaining how to install the needed software, and where any concerns about security or systems requirements will be addressed. The software to be installed, Bomgar, is approved by CMS and meets our security requirements; it allows CDAC to remotely view isolated records in real time under hospital supervision, compare all abstracted data with QRDA Category 1 file data, and summarize the results after the real-time session.

We implemented Bomgar software, a commercial product, in a CMS data center to allow for the review of medical records securely over the Internet. The product will allow the CDAC staff and hospital medical record staff to easily set up remote support sessions for reviewing Hospital IQR Program-related EHR records under hospital supervision. The software was tested and passed our strict security standards. The electronic sessions do not require changes to a hospital's firewall or network because both the CDAC computer and the hospital computer connect to the product through secure outbound connections. The product will log and record every session, and all session data will be safeguarded by federal government-approved encryption.

While CDAC has limited, remote viewing access, hospitals will be asked to:

  • Generate separate lists of patients eligible for measures in each of the four topic areas (STK, VTE, ED, and PC);
  • Generate QRDA Category 1 files extracted automatically from an EHR for all applicable measures for up to 3 records within each of the 4 topic areas (for a total of 12 records) as selected by CDAC; and
  • Show selected records, such as laboratory records and patient medical history, navigating through the EHR system as directed by CDAC.

During this remote real-time session, CDAC will:

  • Follow the specifications for the electronic measure to abstract relevant information related to each data element from up to 10 different sources (for example, medication administration records, laboratory reports, and patient history), including structured and unstructured fields, within each patient medical record.

After concluding the real-time session with a hospital, CDAC will:

  • Compare all abstracted data with QRDA Category 1 file data; and
  • Summarize results identifying patterns of concern.

Based on these results, we will, with our contractors:

  • Work with measure stewards to refine measure specifications based on conflicting findings;
  • Share conflicting findings with individual hospitals to support improvement;
  • Publicize de-identified patterns of conflicting findings that allow vendors to develop automated checks;
  • Determine reliability (agreement) between QRDA Category 1 extracted and abstracted data (an illustrative calculation is sketched after this list); and
  • Produce descriptive statistics to estimate sample size requirements for future validation.
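As a minimal sketch of the kind of agreement calculation referenced in the list above (illustrative only; the data-element names and values are hypothetical placeholders, and the actual CDAC procedures and statistics may differ):

```python
from collections import defaultdict

def element_agreement(abstracted, extracted):
    """Percent agreement per data element between CDAC-abstracted values and
    values extracted from QRDA Category 1 files.

    Each argument is a list of dicts, one per patient record, mapping a
    data-element name to its value (the element names here are hypothetical).
    """
    matches = defaultdict(int)
    totals = defaultdict(int)
    for abs_rec, ext_rec in zip(abstracted, extracted):
        for element in abs_rec:
            totals[element] += 1
            if abs_rec.get(element) == ext_rec.get(element):
                matches[element] += 1
    return {element: matches[element] / totals[element] for element in totals}

# Hypothetical example with two records and two data elements.
abstracted = [{"vte_prophylaxis_given": "yes", "discharge_anticoagulant": "warfarin"},
              {"vte_prophylaxis_given": "no", "discharge_anticoagulant": None}]
extracted = [{"vte_prophylaxis_given": "yes", "discharge_anticoagulant": "warfarin"},
             {"vte_prophylaxis_given": "yes", "discharge_anticoagulant": None}]
print(element_agreement(abstracted, extracted))
# {'vte_prophylaxis_given': 0.5, 'discharge_anticoagulant': 1.0}
```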

To address the burden associated with this test, we intend to reimburse hospitals for the burden associated with their participation. Details about reimbursement are included in section XIII.B.6. of the preamble of this final rule. During the public comment period, we posted on QualityNet, at https://www.qualitynet.org under “Data Validation Resources,” a detailed draft of the operational procedures that volunteer hospitals will be expected to follow. We developed this process to attempt to meet all of our goals for validity, as further explained in the table below.

Electronic Clinical Quality Measure Validation Strategy Summary for the Hospital IQR Program

Desired Attributes of Validation Strategy
• Assesses accuracy including reliability and population representativeness.
• Employs a standardized process conducted by an objective third party.
• Minimizes burden to hospitals.
• Minimizes costs to CMS by being performed at a central location.
• Leverages the dynamic qualities of an EHR, including query functions.
• May ultimately integrate with validation of other IQR measures.
Goals of Test
• Assess the accuracy and completeness of electronic clinical quality measure data.
• Assess Hospital IQR Program readiness for electronic clinical quality measure reporting requirements.
• Identify the needs for and implement updates to measure specifications and standards.
• Plan future validation requirements, including detailed operational instructions and sample size.
Planned Process Overview
Hospitals will:
• Allow CMS' Clinical Data Abstraction Contractor (CDAC) to remotely view records in real-time.
• Generate separate lists of patients eligible for measures to be validated.
• Generate QRDA Category 1 extract files for all applicable measures for up to 12 records selected by CDAC.
• Show selected records, navigating through the EHR system as directed by CDAC.
CDAC will:
• Abstract data following the specifications for the electronic measure and relevant information related to each data element from up to 10 different sources (including structured and unstructured fields) within each medical record.
• Compare all abstracted data with QRDA Category 1 file data.
• Assess and refine operational processes.
CMS and its contractors will:
• Determine reliability (agreement) between extracted and abstracted measures.
• Work with measure stewards to refine measure specifications based on conflicting findings.
• Share conflicting findings with individual hospitals to support improvement.
• Publicize de-identified common patterns of conflicting findings that allow vendors to develop automated checks.
• Produce descriptive statistics to estimate sample size requirements for future validation.
• Reimburse hospitals for burden associated with participation in test.

We invited public comment on this voluntary pilot test for validation.

Because of the close relationship of comments for this policy and the request for information that follows, we respond to comments for both after the next paragraph.

We also considered other validation approaches, including one that supplements the current procedures and compares quality data manually abstracted by the hospitals with QRDA Category 1 extracts from their EHRs. Although we are making no specific proposals related to these alternatives at this time, we invited comments on whether we should develop or identify existing computerized applications to assist hospitals in self-validation and on the specific functionalities that may be useful for self-validation. For example, as part of the validation process, should we develop or identify an existing application that would use natural language processing to identify potential threats to validity that human abstractors might then review more closely? An example of such an application might be one that searches the unstructured fields for contraindications to VTE prophylaxis, even if such contraindications were not noted in a structured field within an EHR. We also invited comments on any other types of applications that would be useful for self-validation.
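As a purely illustrative sketch of the VTE-prophylaxis example above (a simple keyword scan, far less sophisticated than a true natural language processing application; the term list and note text are hypothetical):

```python
import re

# Hypothetical, non-exhaustive terms suggesting a documented contraindication to
# VTE prophylaxis; a real application would use clinical natural language
# processing, negation handling, and vetted terminology rather than this scan.
CONTRAINDICATION_PATTERNS = [
    r"active (gi|gastrointestinal) bleed",
    r"intracranial hemorrhage",
    r"heparin[- ]induced thrombocytopenia",
    r"patient refused (vte )?prophylaxis",
]

def flag_possible_contraindications(note_text):
    """Return the patterns found in an unstructured note, for human review."""
    hits = []
    for pattern in CONTRAINDICATION_PATTERNS:
        if re.search(pattern, note_text, flags=re.IGNORECASE):
            hits.append(pattern)
    return hits

note = "Pharmacologic prophylaxis held: active GI bleed noted on admission."
print(flag_possible_contraindications(note))
# ['active (gi|gastrointestinal) bleed']
```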

Comment: Many commenters expressed concern that there is no proposed validation process for the electronic clinical quality measures, or that the validation strategy that CMS proposed is still in its initial stages. These commenters opposed CMS' use of electronic clinical quality measure data that have not been validated and proven to be reliable for public reporting or pay-for-performance. Some commenters were pleased that CMS has taken steps to validate electronic clinical quality measure data, but believed that all measures used in public reporting and pay-for-performance programs should be subject to data validation, and noted that failure to do so would eliminate any benefit of electronic clinical quality measures.

Response: We understand commenters' concerns regarding use of electronic clinical quality measure data that have not been validated and proven reliable for public reporting or pay-for-performance. We note that although we have signaled target dates for electronic clinical quality measure validation requirements in the Hospital IQR Program, we have not proposed, nor are we finalizing, any formal requirements to report electronically specified measures at this time. We recognize that validation is a major concern for many stakeholders interested in electronic clinical quality measure reporting and will take these comments into consideration for future rulemaking.

Comment: Many commenters expressed general support for CMS' efforts towards a voluntary pilot test for EHR validation. Some commenters encouraged CMS to complete this process quickly. Several of these commenters encouraged CMS to publicly report the results of the pilot to allow hospitals and vendors to implement processes to support electronic clinical quality measure validation.

Response: We intend to complete pilot activities in CY 2015. We also intend to publicly report aggregated results from the pilot, while protecting the confidentiality of individual providers and patients.

Comment: Many commenters advocated for greater collaboration in the electronic clinical quality measure validation process. Several commenters suggested that CMS work with other federal agencies and private sector experts to develop the protocols and testing environments needed to begin validation of electronic clinical quality measures. Other commenters emphasized the important role of the vendor in the validation pilot. A few commenters specifically observed that the validation plan does not “reflect the significant role of EHR vendors in this effort,” and/or that vendors need to be engaged so that hospitals are prepared to participate in the pilot, including being prepared to produce QRDA-1 files on demand in real-time. One commenter specifically recommended that the pilot should only include functional requirements that are required in Stage 2, 2014 Edition certification.

Response: We recognize the importance of engaging vendors, federal partners, and other private sector experts in the validation process, and we intend to do so going forward. We intend to reach out to vendors prior to implementation of the pilot to compare current product capabilities relative to pilot requirements. As described in our proposed policy, the only requirements for participation are that hospitals must meet the Medicare EHR Incentive Program Stage 2 criteria (77 FR 53968 through 54162) and be able to produce QRDA Category 1 Release 2 extracted data (individual patient data) for at least 6 of the 16 measures in the STK, VTE, ED, and PC topic areas. We realize that this may limit participation of hospitals that qualify for meaningful use based on reporting of other measures. Our proposed policy does not require that hospitals be able to produce QRDA-1 files in real time, only that they be able to produce these files.

We have instructed the CDAC contractor to be very flexible, so that if a hospital cannot produce QRDA-1 files or the measures of interest in real time but can provide them later, our contractor will accept them during the pilot project data collection period. Similarly, we have directed CDAC to work out a flexible process if some hospitals are not able to generate patient lists for the ED, STK, VTE, or PC measure sets in real time. We intend to revise our pilot data collection materials to reflect that flexibility. We are not aware of any other specific functional requirements in the proposed pilot materials that are not part of Stage 2, 2014 Edition certification. We believe that we can complete outreach and collaborative activities before and after the validation pilot within the framework of the policy we have proposed.

During the pilot itself, we will allow CDAC to engage with the vendor with the hospital's permission, and can do so within the confines of the policy as proposed. However, we will not reimburse vendors. As we describe in the burden section XIII.B.6. of the preamble of this final rule, we will limit reimbursement to hospitals for the costs associated with one staff person participating for up to 16 hours and costs associated with providing medical records. We believe this is reasonable as it is in the business interests of vendors to support hospitals that need QRDA Category-1 files.

Comment: Most commenters believed that the validation pilot should accommodate a comparison of chart-abstracted and electronic clinical quality measure outcomes for the same measures, and/or that CMS should clarify whether it will evaluate whether the intent of the chart-abstracted and electronic clinical quality measures is the same.

In contrast, several commenters specifically noted that one should not expect the same result from a manual process (which allows for differences in documentation practices, judgment, and error and accommodates data from multiple sources) as from an electronic process which extracts data from a “defined specific data element location,” or that the processes for electronic clinical quality measure validation should be “independent” from validation of chart-abstracted measures. One of these commenters also advised that CMS acknowledge the role of customization in creating variability in records and that data can be unstructured in the EHR and that provider documentation can vary and still support the intent of the measure.

Response: We understand that although the purpose of validation for chart-abstracted and electronic clinical quality measures is the same, the outcomes of validation may be different for many reasons, including those described by the commenters above. We agree with commenters that our validation process should also include a comparison of chart-abstracted and electronic clinical quality measure outcomes for the same measures and will add this to our electronic clinical quality measure validation pilot as finalized below.

We also are aware that, in addition to problems that may arise because of misalignment or errors in standards or specifications, hospitals may customize software in ways that create reporting errors, and individual providers may create errors by using the software in a manner other than that intended by the manufacturer. We understand, from a scoring perspective, that we can only hold vendors and hospitals accountable for achieving an outcome that should be generated based on existing standards and specifications. We did not include a proposal for scoring individual hospitals, because we are aware that vendors have no choice but to code to existing specifications and standards. We intend to partner with stakeholders to assist in interpreting results and help develop a validation strategy that addresses these issues.

We also understand that provider documentation may vary, be located in unstructured fields, and still support the measure. We intend that our validation pilot will be able to distinguish among these many different threats to accuracy as well as identify times when variability in documentation does not threaten accuracy. We further believe that the pilot will be a rich source of information about all of these scenarios.

Comment: A few commenters raised issues related to the questions included in the detailed pilot participation materials posted on QualityNet describing the EHR walkthrough process. One commenter recommended that CMS include vendors as a source of information for many of the questions in the interview document that CMS posted on the QualityNet Web site to document the methodology proposed for the validation pilot. This commenter also requested additional guidance on the purpose of these questions and their relation to the outcome of the pilot and encouraged us to develop a final process that minimizes burden to providers and the health system. One commenter recommended that the electronic clinical quality measure data validation pilot exclude assessment of EHR features, focusing instead on the health data in the EHR.

Response: We assume that when the first commenter was requesting guidance on the Electronically Specified Clinical Quality Measures Program Walk-through and Interview Document, the commenter was referring to the questions related to “acceptability of remote technology for validation,” as the other questions have a very clear relationship to the range of technical issues that this commenter raised in relation to electronic clinical quality measure validation generally. This section on “acceptability of remote technology for validation” includes the questions assessing EHR features that one commenter suggested we remove. We agree that vendors may be a better source of information for these questions and, therefore, intend to remove questions 9-12 based on the comments received. The purpose of the remaining questions in this section is to gauge the general level of acceptability of the approach that we are piloting, and to judge how many hours of staff time hospitals would be willing and able to dedicate to validation activities that support reliable electronic clinical quality measure data. We intend to retain questions 6-8 and 13-15, because we would value hospitals' opinions about these ideas.

Comment: Several commenters urged CMS to implement the recommendations of a March 2014 GAO report to “develop a comprehensive strategy for ensuring that data collected and reported using certified EHR technology are reliable, including testing for and mitigation of reliability issues arising from variance in certified EHR systems tested to different CQM specifications.”

Electronic Health Record Programs: Participation Has Increased, but Action Needed to Achieve Goals, Including Improved Quality of Care: Report to Congressional Committees. (GAO Publication No. GAO-14-207). Retrieved from U.S. Government Accountability Office: http://www.gao.gov/assets/670/661399.pdf.

Response: We agree that reliability of data collected and reported using certified EHR technology is critical. As proposed, our validation pilot is intended to develop a methodology that achieves that goal. We intend to address problems arising from the fact that certified EHR systems may have been tested to specifications issued in different years by including in the pilot only those data certified to 2014 specifications. We will take the recommendations of the GAO report as a whole into consideration in future policy-making.

Comment: One commenter specifically wanted to know how the validation pilot would align with Meaningful Use specifications when the QRDA does not take into account any information from scanned documents, text, and documentation added at a later time.

Response: We understand that Meaningful Use specifications require that QRDA files extract data only from structured fields; therefore, the QRDA does not take into account data from scanned documents, free text, or documentation added at a later time. Our proposed validation strategy was developed in recognition that, for this reason, even a perfect EHR system could produce validation results that are clinically meaningless in contrast to chart-abstracted validation. In addition, as described above in this section, many commenters have observed errors in standards and specifications. By employing CDAC to look at the entire content of the record during our validation process, as we have described in our proposal, we hope to be able to identify those situations in which the calculated measure does not produce results consistent with the intent of the measure. We recognize that our validation pilot test may uncover problems that are not the fault of the provider, hospital, or developer, which is one of the goals of this pilot. We note that we have not proposed a process for scoring hospitals based on validation findings.

Comment: One commenter requested further details regarding what controls will be put into place to allow CDAC to remotely view records in real-time.

Response: The process that we intend to use to access medical records remotely contains several important controls to prevent unauthorized access to hospital systems. We clarify that access would be pursuant to a request by CDAC for the minimum necessary access to such records that includes an assertion of CDAC's legal authority (including the applicable basis(-es) under HIPAA) for such access. The Bomgar software that we intend to use is installed on a secure CMS-owned system that has safeguards in place in accordance with the HIPAA Security Rule to protect sensitive patient data. The Bomgar software is configured to transmit all information exchanged during the medical record review through CMS-owned hardware at a secure facility. All information needed to access hospital systems remotely is guarded by strong HTTPS secure socket layer (SSL) encryption, which protects the information as it is transmitted from the hospital to the CDAC. This hardware and software, which CDAC will use to access medical records remotely, will not store any information about the medical records themselves. Only a limited number of CDAC personnel, authorized by CMS, will have access to the Bomgar device. For more information, see the video at http://www.bomgar.com/products/security.

In addition, CDAC contractors employ security controls to protect medical record information as follows: (1) all screen captures saved and QRDA files received by CDAC contractors are controlled and monitored according to security standards established by the National Institute of Standards and Technology (NIST); (2) all Protected Health Information (PHI) is encrypted on all CDAC servers; and (3) firewalls and servers are monitored by CMS security contractors. Only a limited number of CDAC personnel have been granted access to view any PHI. These CDAC personnel undergo background checks and undergo privacy and security training prior to being issued passwords to view records containing PHI. All of these security controls are audited in compliance with CMS Security Standards.

http://csrc.nist.gov/, last accessed 7/17/2014.

Comment: One commenter who supported CMS' validation plan for electronic clinical quality measures also agreed “that the development or identification of existing computerized applications that can assist hospitals in self-validation and functionalities will be useful in self-validation of eCQMs.” The commenter believed this process could take the place of the current internal inter-rater reliability (IRR) efforts (on abstracted data) and ensure accurate data capture practices.

Response: We will consider this suggestion to develop tools to replace more labor-intensive quality control efforts such as inter-rater reliability efforts (that is, comparing chart-abstracted results from two different abstractors) in development of future policies.

We thank the commenters for their comments. We will consider them as we develop plans to validate electronic clinical quality measure data.

After consideration of public comments on our proposal to conduct a validation pilot test for electronically specified measures in FY 2015, we are finalizing the policy as proposed with a few minor modifications.

We will compare results generated from QRDA-1 files with data from up to 10 sources identified through chart abstraction, as proposed. In addition, we will compare measure outcomes abstracted according to electronic clinical quality measure specifications to those abstracted according to chart-abstracted specifications. Also, we plan to remove the questions related to EHR functionality from the “acceptability of remote technology for validation” section of the “Electronically Specified Clinical Quality Measures Program Walk-through and Interview” document, and to reflect our intended flexible approach to accommodate hospitals that cannot produce patient lists or QRDA-1 files in real time, as long as submissions can occur during the data collection period for the pilot project. We also intend to reach out to stakeholders to collaborate in preparing for the pilot and interpreting results after the pilot.

f. Data Submission Requirements for Quality Measures That May Be Voluntarily Electronically Reported for the FY 2017 Payment Determination

We believe that collection and reporting of data through health information technology will greatly simplify and streamline reporting for many CMS quality reporting programs. Through electronic reporting, hospitals will be able to leverage EHRs to capture, calculate, and electronically submit quality data that is currently manually chart-abstracted and submitted to CMS for the Hospital IQR Program. As we noted in the FY 2012 IPPS/LTCH PPS final rule (76 FR 51614), we recognize the need to align and harmonize measures across CMS quality reporting programs to minimize the reporting burden imposed on hospitals. In the Medicare EHR Incentive Program Stage 2 final rule (77 FR 54083 through 54087), we finalized a total of 29 clinical quality measures from which hospitals must select at least 16 measures covering three National Quality Strategy (NQS) domains to report beginning in FY 2014. We anticipate that, as health information technology evolves and infrastructure is expanded, we will have the capacity to accept electronic reporting of many of the chart-abstracted measures that are currently part of the Hospital IQR Program.

In the FY 2014 IPPS/LTCH PPS final rule, for the STK (with the exception of STK-1), VTE, ED, and PC measure sets, we allowed hospitals to either: (1) electronically report at least one quarter of CY 2014 (Q1, Q2, or Q3) quality measure data for each measure in one or more of those four measure sets; or (2) continue reporting all measures in those four measure sets using chart-abstracted data for all four quarters of CY 2014 (78 FR 50818).

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28242 through 28243), for the FY 2017 payment determination, we proposed to expand this policy such that providers may elect to voluntarily report any 16 of the 28 Hospital IQR Program electronic clinical quality measures that align with the Medicare EHR Incentive Program, as long as those 16 measures span three different NQS domains. The 28 measures are listed in the table below. Only 28 of the 29 measures adopted in the Medicare EHR Incentive Program are applicable for the Hospital IQR Program, because the measure ED-3, Median time from ED arrival to ED departure for discharged ED patients (NQF #0496), is an outpatient setting measure. We expect eligible hospitals to select measures that best apply to their patient mix.

Short name | Measure name | NQF number | NQS domain | Available data submission modes
ED-1 | Median time from ED arrival to ED departure for admitted ED patients | NQF #0495 | Patient and Family Engagement | Electronic clinical quality measure or chart-abstracted.
ED-2 | Admit Decision Time to ED Departure Time for Admitted Patients | NQF #0497 | Patient and Family Engagement | Electronic clinical quality measure or chart-abstracted.
PC-01 | Elective delivery (Collected in aggregate, submitted via Web-based tool or electronic clinical quality measure) | NQF #0469 | Clinical Process/Effectiveness | Electronic clinical quality measure or chart-abstracted aggregated with Web-based submission.
Stroke-2 | Discharged on antithrombotic therapy | NQF #0435 | Clinical Process/Effectiveness | Voluntary electronic clinical quality measure.
Stroke-3 | Anticoagulation therapy for atrial fibrillation/flutter | NQF #0436 | Clinical Process/Effectiveness | Voluntary electronic clinical quality measure.
Stroke-4 | Thrombolytic therapy | NQF #0437 | Clinical Process/Effectiveness | Electronic clinical quality measure or chart-abstracted.
Stroke-5 | Antithrombotic therapy by the end of hospital day two | NQF #0438 | Clinical Process/Effectiveness | Voluntary electronic clinical quality measure.
Stroke-6 | Discharged on statin medication | NQF #0439 | Clinical Process/Effectiveness | Electronic clinical quality measure or chart-abstracted.
Stroke-8 | Stroke education | N/A | Patient and Family Engagement | Electronic clinical quality measure or chart-abstracted.
Stroke-10 | Assessed for rehabilitation | NQF #0441 | Care Coordination | Voluntary electronic clinical quality measure.
VTE-1 | Venous thromboembolism prophylaxis | NQF #0371 | Patient Safety | Electronic clinical quality measure or chart-abstracted.
VTE-2 | Intensive care unit venous thromboembolism prophylaxis | NQF #0372 | Patient Safety | Electronic clinical quality measure or chart-abstracted.
VTE-3 | Venous thromboembolism patients with anticoagulation overlap therapy | NQF #0373 | Clinical Process/Effectiveness | Electronic clinical quality measure or chart-abstracted.
VTE-4 | Patients receiving un-fractionated Heparin with doses/labs monitored by protocol | N/A | Clinical Process/Effectiveness | Voluntary electronic clinical quality measure.
VTE-5 | VTE discharge instructions | N/A | Patient and Family Engagement | Electronic clinical quality measure or chart-abstracted.
VTE-6 | Incidence of potentially preventable VTE | N/A | Patient Safety | Electronic clinical quality measure or chart-abstracted.
AMI-2 | Aspirin Prescribed at Discharge for AMI | NQF #0142 | Clinical Process/Effectiveness | Voluntary electronic clinical quality measure.
AMI-7a | Fibrinolytic Therapy Received Within 30 minutes of Hospital Arrival | NQF #0164 | Clinical Process/Effectiveness | Electronic clinical quality measure or chart-abstracted.
AMI-8a | Primary PCI Received Within 90 Minutes of Hospital Arrival | NQF #0163 | Clinical Process/Effectiveness | Voluntary electronic clinical quality measure.
AMI-10 | Statin Prescribed at Discharge | NQF #0639 | Clinical Process/Effectiveness | Voluntary electronic clinical quality measure.
PN-6 | Initial Antibiotic Selection for community-acquired pneumonia (CAP) in Immunocompetent Patients | NQF #0147 | Efficient Use of Healthcare Resources | Voluntary electronic clinical quality measure.
SCIP-Inf-1a | Prophylactic Antibiotic Received Within One Hour Prior to Surgical Incision | NQF #0527 | Patient Safety | Voluntary electronic clinical quality measure.
SCIP-Inf-2a | Prophylactic Antibiotic Selection for Surgical Patients | NQF #0528 | Efficient Use of Healthcare Resources | Voluntary electronic clinical quality measure.
SCIP-Inf-9 | Urinary catheter removed on Postoperative Day 1 (POD 1) or Postoperative Day 2 (POD 2) with day of surgery being day zero | NQF #0453 | Patient Safety | Voluntary electronic clinical quality measure.
PC-05 | Exclusive Breast Milk Feeding and the subset measure PC-05a Exclusive Breast Milk Feeding Considering Mother's Choice | NQF #0480 | Clinical Process/Effectiveness | Voluntary electronic clinical quality measure.
EHDI-1a | Hearing Screening Prior to Hospital Discharge | NQF #1354 | Clinical Process/Effectiveness | Voluntary electronic clinical quality measure.
CAC-3 | Home Management Plan of Care (HMPC) Document Given to Patient/Caregiver | N/A | Patient and Family Engagement | Voluntary electronic clinical quality measure.
HTN | Healthy Term Newborn | NQF #0716 | Patient Safety | Voluntary electronic clinical quality measure.
Medicare EHR Incentive Program Stage 2 final rule (77 FR 54083 through 54087).
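To make the voluntary reporting rule described before the table concrete (16 of the 28 eligible measures, spanning at least three NQS domains), the following is a minimal illustrative check. Only a few rows of the table are encoded here, and the function name and structure are our own, not part of any CMS submission tooling.

```python
# Illustrative check of the voluntary reporting rule: a hospital's selection must
# include 16 of the 28 eligible measures and span at least 3 NQS domains. Only a
# few rows of the table above are encoded; a full mapping would cover all 28.
MEASURE_DOMAINS = {
    "ED-1": "Patient and Family Engagement",
    "ED-2": "Patient and Family Engagement",
    "PC-01": "Clinical Process/Effectiveness",
    "Stroke-10": "Care Coordination",
    "VTE-1": "Patient Safety",
    "VTE-3": "Clinical Process/Effectiveness",
    "PN-6": "Efficient Use of Healthcare Resources",
    # ... remaining measures from the table above ...
}

def selection_meets_requirement(selected_measures, domain_map=MEASURE_DOMAINS):
    """Return True if the selection has 16 eligible measures across 3+ domains."""
    eligible = [m for m in selected_measures if m in domain_map]
    domains = {domain_map[m] for m in eligible}
    return len(eligible) >= 16 and len(domains) >= 3

print(selection_meets_requirement(["ED-1", "ED-2", "PC-01"]))  # False: fewer than 16 measures
```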

Comment: A commenter requested clarification regarding whether hospitals are required to report on the ED-1 and ED-2 measures for FY 2015. If hospitals are required to report on these measures, the commenter would like clarification regarding whether the data must be submitted electronically as opposed to chart-abstracted.

Response: ED-1 and ED-2 are shown as voluntary electronic clinical quality measures in the table on 79 FR 28242, but are not identified as voluntary measures in the table on 79 FR 28241. We would like to clarify that both ED-1 and ED-2 are required measures that can be submitted either as chart-abstracted measures or as electronic clinical quality measures under the voluntary reporting option.

For the FY 2017 payment determination, we also proposed to expand the reporting requirement of electronic clinical quality measures to require a full year's data collection and submission instead of a minimum of one quarter. In addition, for the FY 2017 payment determination, we proposed to require data submission within approximately 60 days after the end of a calendar year quarter. We have listed the proposed submission deadlines in the table below. We also refer readers to section IX.D.2. of the preamble of this final rule for a description of the electronic clinical quality measures data reporting periods and proposed submission deadlines.

CY 2015/FY 2017 Electronic Clinical Quality Measures Data Reporting Periods and Proposed Submission Deadlines

CY 2015 quarter | Reporting period (2015) | Proposed submission deadline
1 | January 1-March 31 | May 30, 2015
2 | April 1-June 30 | August 30, 2015
3 | July 1-September 30 | November 30, 2015
4 | October 1-December 31 | February 28, 2016
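As a quick illustrative check (not regulatory text) that the proposed deadlines in the table sit roughly 60 days after the end of each calendar quarter:

```python
from datetime import date

quarter_ends = {1: date(2015, 3, 31), 2: date(2015, 6, 30),
                3: date(2015, 9, 30), 4: date(2015, 12, 31)}
proposed_deadlines = {1: date(2015, 5, 30), 2: date(2015, 8, 30),
                      3: date(2015, 11, 30), 4: date(2016, 2, 28)}

for quarter, quarter_end in quarter_ends.items():
    lag = (proposed_deadlines[quarter] - quarter_end).days
    print("Q%d: %d days after quarter end" % (quarter, lag))  # 59-61 days in each case
```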

As an incentive for hospitals to voluntarily submit electronic clinical quality measures, we proposed that for the FY 2017 payment determination, hospitals successfully submitting electronic clinical quality measures according to our procedures will not have to validate those measures by submitting chart-abstracted data.

By proposing these changes, we would further align the Hospital IQR Program and the Medicare EHR Incentive Program and promote greater electronic clinical quality measure data reporting for hospitals. In addition, we believe that these changes would ease hospitals' administrative burden, as they will be able to report the same clinical quality measures once to partially satisfy both the Hospital IQR and Medicare EHR Incentive Programs' requirements.

We invited public comment on this proposal.

Comment: One commenter requested that CMS allow hospitals to electronically report data for one calendar quarter instead of an entire CY.

Response: We refer readers to section IX.A.2.h.(1) of the preamble of this final rule where we are finalizing a modification of our proposal, which will only require one CY quarter of electronic clinical quality measure data for those hospitals that elect to participate in the voluntary electronic clinical quality measures reporting option.

Comment: A commenter supported the inclusion of voluntary reporting for certain electronic clinical quality measures for the Hospital IQR Program, and noted that voluntary reporting allows hospitals to be better prepared for submitting new quality measures from EHRs and to correct any operational issues that arise. Another commenter supported the long-term goal of using EHRs to streamline and reduce the burden of quality reporting while increasing access to real-time information to improve care and patient outcomes. One commenter supported the proposal that hospitals electronically report a full year of data on the 12 Hospital IQR Program measures that overlap with the 2014 Medicare EHR Incentive Program measures.

Response: We thank the commenters for their support and, as noted in the response above, we have modified in our finalized policy the number of quarters of data to be reported by those hospitals that elect to participate in the voluntary electronic clinical quality measures reporting option.

Comment: One commenter requested clarification on whether the deadlines for submitting chart-abstracted measures remain the same given the proposal for requiring data submission within 60 days after the calendar year quarter ends for electronic clinical quality measures. Commenters also stated that the shortened time frame for reporting measure data poses a burden on facilities and increases the possibility of errors, which could affect measure scores and, therefore, payment.

Response: We would like to clarify that we did not propose any changes to the submission requirements for chart-abstracted measures (79 FR 28245). We retained the 4½-month quarterly submission deadline (78 FR 50811). In addition, we are not finalizing the 60-day quarterly submission deadline for electronic clinical quality measures. We refer readers to section IX.A.2.h.(1) of the preamble of this final rule where this is discussed in more detail.

Comment: Some commenters supported the alignment of measures, reporting requirements, and timelines across quality reporting and incentive programs, specifically noting that this alignment would reduce hospitals' administrative burden and confusion, use the later Hospital IQR Program deadlines, reduce the number of quarters required until the transition is complete, and not delay incentive payments. Some commenters argued that CMS' timeline for alignment is aggressive and requested that CMS give hospitals time to comply with this requirement. Commenters noted that EHRs are not ready for year two of Stage 1 meaningful use criteria or for Stage 2 meaningful use criteria.

Other commenters opposed CMS' proposal to require Q4 2014 and Q1 2015 data submission by May 15, 2015, stating that it does not provide enough time for data submission, particularly for hospitals that conduct manual chart abstraction.

Response: We thank the commenters for supporting our proposal to align reporting between the Hospital IQR Program and the Medicare EHR Incentive Program. We proposed to begin aligning the reporting periods between the two programs beginning with the CY 2015 reporting period. We believe some commenters may have confused the proposed electronic clinical quality measure requirements of the Medicare EHR Incentive Program with the proposed electronic clinical quality measure submission requirements for the Hospital IQR Program.

We proposed for the Hospital IQR Program, that hospitals choosing to submit electronic clinical quality measures would need to submit all four quarters of CY 2015, whereas the Medicare EHR Incentive Program proposed to require only the first three quarters of CY 2015 (79 FR 28245 through 28246). However, we are not finalizing our proposal for hospitals to submit electronic clinical quality measures for all four quarters for the Hospital IQR Program and are instead finalizing a modified policy. We refer readers to section IX.A.2.h.(1) of the preamble of this final rule where this is discussed in more detail. We recognize that many hospitals have faced challenges moving to the latest CQM versions, which is why electronic clinical quality measure reporting remains voluntary at this time.

Comment: One commenter noted that they appreciated the opportunity to gain experience with voluntary electronic reporting for the Hospital IQR Program before such reporting is made mandatory. The commenter also asked that CMS provide further explanation on the set of voluntary electronic clinical quality measures within the Hospital IQR Program.

Response: We refer the commenter to the table above listing the 28 possible electronic clinical quality measures. If a hospital chooses to submit electronic clinical quality measures, the hospital must submit 16 of the 28 possible measures covering three NQS domains. Please note that 12 of the 28 measures are measures required in the Hospital IQR Program. These 12 measures do cover three NQS domains. We would like to clarify that if a hospital chooses to submit electronic clinical quality measures, chart-abstraction of those submitted measures is not necessary.

Comment: One commenter urged CMS to be mindful of safety net hospitals' limited resources when proposing new requirements for reporting measure data electronically. The commenter advised that electronic reporting of quality data requires significant work to obtain, validate, and report and that it also requires information technology and quality management resources. The commenter stated that many hospitals are struggling to meet the current electronic data reporting requirements and that additional requirements will increase hospital expenses for labor, data analysis, and validation.

Response: We note that reporting electronic clinical quality measure data remains voluntary for CY 2015 reporting/FY 2017 payment determination. We believe that our voluntary electronic clinical quality measure reporting option is not unduly burdensome to hospitals and will allow hospitals an opportunity to prepare for electronic reporting of quality measure data. As data become more standardized, we expect that provider burden will decrease over time. In addition, we have modified our proposal for CY 2015 so that, for those hospitals choosing to submit electronic clinical quality measures, only one quarter of data submission is necessary to meet the Hospital IQR Program requirement. We want to clarify that we have not made proposals for CY 2016 electronic clinical quality measure reporting/FY 2018 payment determination. These will be addressed in future rulemaking.

Comment: One commenter raised concern that participation in the voluntary electronic clinical quality measure program under the Hospital IQR Program would be low and would therefore not provide the data needed to inform future policy direction. In order to make the voluntary electronic clinical quality measure program more attractive to hospitals, the commenter recommended that CMS work with payers and quality assurance organizations to further align measure sets, provide electronic clinical quality measure specifications at least nine months before each relevant reporting period, allow providers to either post or omit electronically generated electronic clinical quality measure data on Hospital Compare, and either require only one quarter of electronic clinical quality measure data in order to fulfill EHR meaningful use and Hospital IQR Program requirements, or incorporate a robust logic model to monitor and evaluate the burdens and benefits associated with more frequent reporting.

Response: We are actively working with measure developers/stewards to align measure sets and revise measure specifications, as needed. Issues identified by measure stakeholders should be reported to ONC's JIRA tool at: http://jira.oncprojectracking.org/browse/CQM, where all stakeholders can comment and follow the progress of the issue. Electronic clinical quality measure specifications are published and updated annually at: http://cms.gov/Regulations-and-Guidance/Legislation/EHRIncentivePrograms/eCQM_Library.html. Also, we are modifying our proposal so that, for those hospitals choosing to submit electronic clinical quality measures, only one quarter of data submission is necessary to meet the Hospital IQR Program requirements. We refer readers to section IX.A.2.h.(1) of the preamble of this final rule where this is discussed in more detail.

After consideration of the public comments we received, we are finalizing our proposals with some modifications. We are finalizing our policy that hospitals choosing to participate in the voluntary electronic reporting option in CY 2015 must report any 16 of the 28 measures across three NQS domains, as proposed. We are also finalizing that we will only accept the April 2014 version of the measure specifications for CY 2015 reporting/FY 2017 payment determination. Policies for electronic clinical quality measure reporting in CY 2016 and subsequent years will be addressed in future rulemaking. We are finalizing a modified version of our proposal that would have expanded the reporting requirement for electronic clinical quality measures to a full year's data collection; instead, we are requiring only one quarter's worth of data. In addition, we are finalizing a modified version of our proposal that would have required data submission within approximately 60 days after the end of each calendar year quarter; instead, we are requiring submission of CY 2015 Q1, Q2, or Q3 data by November 30, 2015. We refer readers to section IX.A.2.h.(1) of the preamble of this final rule for a more detailed discussion.

We note that hospitals choosing to report at least one quarter of quality measure data electronically are not required, but are encouraged, to also submit the same data via chart-abstraction. We understand that many hospitals may already be submitting chart-abstracted quality measure data to The Joint Commission (TJC), so the reporting burden would not increase. Hospitals will gain experience in understanding the differences in the submission methods.

Hospitals voluntarily submitting electronically specified clinical quality measures will utilize their existing QualityNet account to submit electronic quality measure data.

12. Data Accuracy and Completeness Acknowledgement Requirements for the FY 2017 Payment Determination and Subsequent Years

We refer readers to the FY 2013 IPPS/LTCH PPS final rule (77 FR 53554) for details on the DACA requirements for the FY 2017 payment determination and subsequent years.

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28253) we did not propose any changes to DACA form requirements.

We did not invite public comment regarding DACA requirements, but received one comment that we are addressing below.

Comment: One commenter expressed concerns that the Data Accuracy and Completeness Acknowledgement statement for hospitals does not provide a means for hospitals to indicate to CMS any errors they have discovered in their quality reporting throughout the year. The commenter observed that a hospital may discover in the fourth quarter an error in the data that was submitted in the first quarter of the year, but the DACA only permits a "Yes" or "No" response regarding whether all of the data were complete and accurate to the best of the hospital's knowledge at the time of submission, which does not provide a means for fixing any errors. The commenter observed that there also should be a process for fixing such errors from prior years.

Response: We currently provide a review and correction process for Hospital IQR Program process of care, HAI, and HCAHPS data during the submission period. Hospitals can review their measure rates before the submission deadline, and can review patient-level data to correct any identified errors on previously submitted data. We strongly encourage hospitals to closely review their Hospital IQR Program measure and patient feedback reports to detect these errors before the submission deadline. We do not allow patient-level data correction after the submission deadline or for previous years. We must set a deadline to ensure timely computation of measure rates, Hospital VBP performance scores, and payment adjustment factors.

13. Public Display Requirements for the FY 2017 Payment Determination and Subsequent Years

We refer readers to the FY 2008 IPPS final rule (72 FR 47360), the FY 2011 IPPS/LTCH PPS final rule (75 FR 50230), the FY 2012 IPPS/LTCH PPS final rule (76 FR 51650), the FY 2013 IPPS/LTCH PPS final rule (77 FR 53554), and the FY 2014 IPPS/LTCH PPS final rule (78 FR 50836) for details on public display requirements for the FY 2017 payment determination and subsequent years.

The Hospital IQR Program quality measures are typically reported on the Hospital Compare Web site at: http://www.medicare.gov/hospitalcompare, but on occasion are reported on other CMS Web sites such as http://www.cms.gov and/or https://data.medicare.gov.

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28253) we did not propose any changes to public display requirements.

14. Reconsideration and Appeal Procedures for the FY 2017 Payment Determination and Subsequent Years

We refer readers to the FY 2012 IPPS/LTCH PPS final rule (76 FR 51650 through 51651), the FY 2014 IPPS/LTCH PPS final rule (78 FR 50836), and 42 CFR 412.140(e) for details on reconsideration and appeal procedures for the FY 2017 payment determination and subsequent years.

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28253) we did not propose any changes to the reconsideration and appeals procedures.

15. Hospital IQR Program Extraordinary Circumstances Extensions or Exemptions

We refer readers to the FY 2012 IPPS/LTCH PPS final rule (76 FR 51651 through 51652), the FY 2014 IPPS/LTCH PPS final rule (78 FR 50836 through 50837), and 42 CFR 412.140(c)(2) for details on the Hospital IQR Program extraordinary circumstances extensions or waivers. In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28253) we did not propose any substantive changes to these policies or the processes. However, in the future, we will refer to the process as the Extraordinary Circumstances Extensions or Exemptions process. We are currently in the process of revising the Extraordinary Circumstances/Disaster Extension or Waiver Request form, previously approved under OMB control number 0938-1171.

In addition, we proposed to make a conforming change from the phrase “extension or waiver” to the phrase “extension or exemption” in 42 CFR 412.140(c)(2). Section 412.140(c)(2) currently states that upon request by a hospital, CMS may grant an extension or waiver of one or more data submission deadlines in the event of extraordinary circumstances beyond the control of the hospital. Specific requirements for submission of a request for an extension or waiver are available on QualityNet.org. We proposed to revise this language to state that upon request by a hospital, CMS may grant an extension or exemption of one or more data submission deadlines in the event of extraordinary circumstances beyond the control of the hospital. Specific requirements for submission of a request for an extension or exemption are available on QualityNet.org.

We did not receive any public comments on this proposal and we are finalizing this policy as proposed.

B. PPS-Exempt Cancer Hospital Quality Reporting (PCHQR) Program

1. Statutory Authority

Section 3005 of the Affordable Care Act added new sections 1866(a)(1)(W) and (k) to the Act. Section 1866(k) of the Act establishes a quality reporting program for hospitals described in section 1886(d)(1)(B)(v) of the Act (referred to as “PPS-Exempt Cancer Hospitals” or “PCHs”). Section 1866(k)(1) of the Act states that, for FY 2014 and each subsequent fiscal year, a PCH must submit data to the Secretary in accordance with section 1866(k)(2) of the Act with respect to such a fiscal year. Section 1866(k)(2) of the Act provides that, for FY 2014 and each subsequent fiscal year, each hospital described in section 1886(d)(1)(B)(v) of the Act must submit data to the Secretary on quality measures specified under section 1866(k)(3) of the Act in a form and manner, and at a time, specified by the Secretary.

Section 1866(k)(3)(A) of the Act requires that any measure specified by the Secretary must have been endorsed by the entity with a contract under section 1890(a) of the Act, unless an exception under section 1866(k)(3)(B) of the Act applies. The National Quality Forum (NQF) currently holds this contract. The NQF is a voluntary, consensus-based, standard-setting organization with a diverse representation of consumer, purchaser, provider, academic, clinical, and other health care stakeholder organizations. The NQF was established to standardize health care quality measurement and reporting through its consensus development processes. We have generally adopted NQF-endorsed measures in our reporting programs.

However, section 1866(k)(3)(B) of the Act provides an exception. Specifically, it provides that, in the case of a specified area or medical topic determined appropriate by the Secretary for which a feasible and practical measure has not been endorsed by the entity with a contract under section 1890(a) of the Act, the Secretary may specify a measure that is not so endorsed as long as due consideration is given to measures that have been endorsed or adopted by a consensus organization identified by the Secretary.

Under section 1866(k)(3)(C) of the Act, the Secretary was required to publish the measures selected for PCHs no later than October 1, 2012, with respect to FY 2014.

Section 1866(k)(4) of the Act requires the Secretary to establish procedures for making public the data submitted by PCHs under the PPS-Exempt Cancer Hospital Quality Reporting (PCHQR) Program. Such procedures must ensure that a PCH has had the opportunity to review the data that are to be made public with respect to the PCH prior to such data being made public. The Secretary must report measures of processes, structural measures, measures of outcomes, patients' perspective on care, efficiency, and costs of care that relate to services furnished by PCHs on the CMS Web site.

2. Covered Entities

Section 1886(d)(1)(B)(v) of the Act excludes particular cancer hospitals from payment under the IPPS. This final rule covers only those PPS-excluded cancer hospitals meeting eligibility criteria specified in 42 CFR 412.23(f).

3. Previously Finalized PCHQR Program Quality Measures

In the FY 2013 IPPS/LTCH PPS final rule (77 FR 53556 through 53561), we finalized five quality measures for the FY 2014 program and subsequent years. Specifically, we finalized two of the CDC NHSN-based HAI quality measures (outcome measures): (1) CLABSI; and (2) CAUTI. We also finalized three cancer-specific process of care measures: (1) Adjuvant chemotherapy is considered or administered within 4 months (120 days) of diagnosis to patients under the age of 80 with the American Joint Committee on Cancer (AJCC) III (lymph node positive) colon cancer; (2) Combination chemotherapy is considered or administered within 4 months (120 days) of diagnosis for women under 70 with AJCC T1c, or Stage II or III hormone receptor negative breast cancer; and (3) Adjuvant hormonal therapy. We also discussed the collection requirements and submission timeframes for these measures in the FY 2013 IPPS/LTCH PPS final rule (77 FR 53563 through 53566).

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50838 through 50840), we finalized one new quality measure for the FY 2015 program and subsequent years. Specifically, we finalized the CDC's NHSN HAI measure of Surgical Site Infection (SSI). We did not remove or replace any of the previously finalized measures from the PCHQR Program for the FY 2015 program and subsequent years.

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50840 through 50846), we finalized 12 new quality measures for the FY 2016 program and subsequent years. Specifically, we finalized six new SCIP measures, five new clinical process/oncology care measures, and the HCAHPS Survey for reporting beginning with the FY 2016 program and subsequent years. We did not remove or replace any of the previously finalized measures from the PCHQR Program for the FY 2016 program and subsequent years. We also discussed the collection requirements and submission timeframes for these measures in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50850 through 50853).

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28254), we did not propose to remove or replace any of the previously finalized measures from the PCHQR Program for the FY 2017 program and subsequent years.

4. Update to the Clinical Process/Oncology Care Measures Beginning With the FY 2016 Program

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28254), beginning with the FY 2016 program, we proposed to update the specifications for each of the five clinical process/oncology care measures so that, for each measure, PCHs must report all-patient data. We believe that the delivery of high quality care in the PCH setting is critically important and that collecting data on all patients will enable us to ensure that high quality care is delivered to Medicare beneficiaries in this setting. In addition, all-patient data increase transparency in the health care system and align with State and Federal initiatives. Our proposal to require PCHs to collect all-patient data provides us with the data necessary to inform the public accurately about the quality of care and patient outcomes in the PCH setting. In addition, this proposal will align the specifications of the clinical process/oncology care measures with those of the SCIP PCHQR measures, for which all-patient data are required for submission.

All-Payer Claims Database (APCD) Fact Sheet; available at: http://www.apcdcouncil.org/issue-briefs-and-fact-sheets.

We welcomed public comments on this proposal for the clinical process/oncology care measures for the FY 2016 program and subsequent years.

Comment: One commenter supported the proposal to require PCHs to report all-patient data for the five clinical process/oncology care measures, noting that it is consistent with reporting requirements in other CMS quality reporting programs.

Response: We thank the commenter for its support.

After consideration of the public comments we received, we are finalizing our proposal requiring PCHs to submit all-patient data for the five clinical process/oncology care measures beginning with the FY 2016 program.

5. New Quality Measure Beginning With the FY 2017 Program

a. Considerations in the Selection of Quality Measures

In the FY 2013 IPPS/LTCH PPS final rule (77 FR 53556) and in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50837 through 50838), we indicated that we have taken a number of principles into consideration when developing and selecting measures for the PCHQR Program, and that many of these principles are modeled on those we use for measure development and selection under the Hospital IQR Program:

  • Public reporting should rely on a mix of standards, outcomes, process of care measures, and patient experience of care measures, including measures of care transitions and changes in patient functional status.
  • The measure set should evolve so that it includes a focused core set of measures appropriate to cancer hospitals that reflects the level of care and the most important areas of service furnished by those hospitals. The measures should address gaps in the quality of cancer care.
  • We also consider input solicited from the public through rulemaking and public listening sessions.
  • We consider suggestions and input from a PCH Technical Expert Panel (TEP), convened by a CMS measure development contractor, which rated potential PCH quality measures for importance, scientific soundness, usability, and feasibility. The TEP membership includes health care providers specializing in the treatment of cancer, cancer researchers, consumer and patient advocates, disparities experts, and representatives from payer organizations.

Like the Hospital IQR Program, the PCHQR Program supports the National Quality Strategy (NQS), national priorities, HHS Strategic Plans and Initiatives, the CMS Quality Strategy, and strives to reduce the burden on participating PCHs whenever possible. The PCHQR Program also takes into consideration the recommendations of the Measure Applications Partnership (MAP). The MAP is a multi-stakeholder body convened by the NQF for the purpose of providing input to HHS on the selection of measures.

b. New Quality Measure Beginning With the FY 2017 Program

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28254 through 28456), we proposed to adopt one new clinical effectiveness measure for the FY 2017 program and subsequent years: External Beam Radiotherapy for Bone Metastases (NQF #1822). The proposed clinical effectiveness measure was included on a publicly available document entitled “List of Measures under Consideration for December 1, 2013,” a list of quality and efficiency measures being considered for use in various Medicare programs. The proposed measure was submitted to the MAP Hospital Workgroup for review. The MAP supported the inclusion of this measure in the PCHQR Program. The MAP's conclusions may be found in the “MAP Pre-Rulemaking Report: 2014 Recommendations on Measures Under Consideration by HHS,” which is available at: https://www.qualityforum.org/Publications/2014/01/MAP_Pre-Rulemaking_Report__2014_Recommendations_on_Measures_for_More_than_20_Federal_Programs.aspx. We considered the MAP's input and recommendations for this proposed measure for the PCHQR Program, and specifically, we note that the proposed measure addresses the MAP priority of palliative care for cancer patients. In addition, the proposed measure addresses the NQS domain of effective clinical care.

We believe that this NQF-endorsed measure, developed by the American Society for Radiation Oncology (ASTRO), meets the requirement under section 1866(k)(3)(A) of the Act that measures specified for the PCHQR generally be endorsed by the entity with a contract under section 1890(a) of the Act (currently the NQF). This measure assesses the percentage of patients (both Medicare and non-Medicare) with painful bone metastases and no history of previous radiation who receive EBRT with an acceptable dosing schedule. The measure numerator includes all patients with painful bone metastases and no previous radiation to the same site who receive EBRT with any of the following recommended fractionation schemes: 30Gy/10fxns; 24Gy/6fxns; 20Gy/5fxns; or 8Gy/1fxn. The measure denominator includes all patients with painful bone metastases and no previous radiation to the same site who receive EBRT. The following patients are excluded from the denominator: patients who have had previous radiation to the same site; patients with femoral axis cortical involvement greater than 3 cm in length; patients who have undergone a surgical stabilization procedure; and patients with spinal cord compression, cauda equina compression, or radicular pain. For the reasons explained more fully below, we believe that this measure will reduce the overuse of EBRT services, support our commitment to promoting patient safety, and support the NQS domains.
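For illustration only, the following is a minimal sketch, in Python, of how the numerator, denominator, and exclusion logic summarized above could be applied to patient-level records to produce a measure rate. The record field names are hypothetical; this is not the measure steward's specification or code.

```python
# Minimal sketch (illustrative only, not the measure steward's specification):
# applying the EBRT for Bone Metastases (NQF #1822) numerator, denominator, and
# exclusion logic summarized above to patient-level records with hypothetical fields.

RECOMMENDED_SCHEMES = {"30Gy/10fxns", "24Gy/6fxns", "20Gy/5fxns", "8Gy/1fxn"}

def ebrt_measure_rate(patients):
    numerator = denominator = 0
    for p in patients:
        # Denominator: painful bone metastases, no prior radiation to the same
        # site, treated with EBRT, and none of the stated exclusions.
        excluded = (p["prior_radiation_same_site"]
                    or p["femoral_axis_cortical_involvement_cm"] > 3
                    or p["surgical_stabilization_procedure"]
                    or p["cord_or_cauda_compression_or_radicular_pain"])
        if not (p["painful_bone_metastases"] and p["received_ebrt"]) or excluded:
            continue
        denominator += 1
        # Numerator: EBRT delivered with a recommended fractionation scheme.
        if p["fractionation_scheme"] in RECOMMENDED_SCHEMES:
            numerator += 1
    return None if denominator == 0 else numerator / denominator
```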

Bone metastases are a common manifestation of malignancy. Some cancer types have a bone metastasis prevalence as high as 70 to 95 percent. EBRT can provide significant pain relief in 50 to 80 percent of patients with painful bone metastases.

Coleman RE. Metastatic bone disease: clinical features, pathophysiology and treatment strategies. Cancer Treat Rev. 2001;27:165-176.

Lutz S, Berk L, Chang E, et al. Palliative radiotherapy for bone metastases: An ASTRO evidence-based guideline. Int J Radiat Oncol Biol Phys. 2011;79(4):965-976.

In October 2009, ASTRO organized a Task Force to perform an assessment of existing recommendations in order to address a lack of palliative radiotherapy guidelines. Based on a review of the literature, the Task Force recommended the following EBRT dosing schedules for patients with previously unirradiated painful bone metastases: 30 Gy over the course of 10 fractions; 24 Gy over the course of 6 fractions; 20 Gy over the course of 5 fractions; and a single 8 Gy fraction. Despite the recommendations, the actual doses applied for EBRT continue to include dosing schedules as high as 25 fractions. Other studies support the conclusion that shorter EBRT schedules produce similar pain relief outcomes when compared to longer EBRT schedules, and that patients prefer shorter EBRT schedules because of their convenience, increased tolerability, and reduced side effects.

Ibid.

Ibid.

In addition, the ASTRO Task Force found that the frequency and severity of side effects associated with a single fraction were the same or less than those associated with multiple fraction regimens, indicating that shorter treatment schedules may be preferable. The proposed External Beam Radiotherapy for Bone Metastases measure seeks to address the performance gap in treatment variation, ensure appropriate use of EBRT, and prevent the overuse of radiation therapy. We believe that this measure is necessary to support patient preferences for shorter EBRT schedules as well as to ensure patient safety, given that shorter treatment courses show similar or fewer side effects while producing similar clinical outcomes.

Lutz S, Berk L, Chang E, et al. Palliative radiotherapy for bone metastases: An ASTRO evidence-based guideline. Int J Radiat Oncol Biol Phys. 2011;79(4):965-976.

We believe the proposed measure is applicable to the PCH setting because it addresses cancer care associated with radiation therapy. The adoption of measures that apply to multiple health care settings is one of our objectives in promoting quality care consistently across all health care settings. Detailed specifications for this proposed measure may be found at: http://www.qualityforum.org/WorkArea/linkit.aspx?LinkIdentifier=id&ItemID=70374.

In summary, in addition to the 18 measures that we have previously finalized for the PCHQR Program, we proposed one new measure for reporting beginning with the FY 2017 program. The proposed policies regarding the form, manner, and timing of data collection for this measure are discussed in later sections. We welcomed public comment on this proposal.

Comment: Several commenters supported the proposed EBRT for bone metastases measure, noting that it aims to address the variation in practice patterns for using radiation therapy for palliative care and promotes improved quality of inpatient care provided to Medicare beneficiaries. However, the commenters encouraged CMS to conduct a performance gap analysis of radiation therapy practice that is specific to the PCH setting.

Response: We thank the commenters for their support. Radiation therapy is a common treatment modality for some cancers, and the ASTRO Task Force (2009) found that the literature demonstrates widespread variation in palliative radiation dose fractionation schedules. Because of this variation, we believe it is important to protect patient safety in the PCH setting by addressing potentially unnecessary and harmful radiation doses. We understand that PCHs, specifically, provide EBRT services, and we believe that the ASTRO Task Force findings demonstrate that the EBRT for bone metastases measure is relevant and appropriate for the PCH setting.

We agree with the commenters' suggestions that we conduct a “performance gap analysis” to assess the appropriateness of the EBRT measure in the PCH setting. We intend to conduct that analysis when we have collected data beginning with the FY 2017 PCHQR Program.

Comment: One commenter supported the adoption of the EBRT for bone metastases measure but recommended that CMS revise the measure to include a broader population of patients receiving radiation therapy.

Response: We appreciate the commenter's feedback. The measure is NQF-endorsed for the population described in the specifications. We will continue to work closely with ASTRO to assess the current clinical evidence base for the broader PCH population. We will consider incorporating any future measure updates supported by clinical evidence.

After consideration of the public comments we received, we are finalizing the EBRT for bone metastases measure for the FY 2017 program and subsequent years.

The table below lists all previously adopted measures as well as the finalized measure for the PCHQR Program for the FY 2017 program and subsequent years.

PCHQR Program Measures for the FY 2017 Program and Subsequent Years

[Including measure finalized in this final rule]

Topic
Safety and Healthcare-Associated Infection—HAI:
• (NQF #0139) NHSN Central Line-Associated Bloodstream Infection (CLABSI) Outcome Measure*
• (NQF #0138) NHSN Catheter-Associated Urinary Tract Infections (CAUTI) Outcome Measure*
• (NQF #0753) Harmonized Procedure Specific Surgical Site Infection (SSI) Outcome Measure (currently includes SSIs following Colon Surgery and Abdominal Hysterectomy Surgery)*
Clinical Process/Cancer-Specific Treatments:
• (NQF #0223) Adjuvant Chemotherapy is Considered or Administered Within 4 Months (120 days) of Diagnosis to Patients Under the Age of 80 with AJCC III (lymph node positive) Colon Cancer*
• (NQF #0559) Combination Chemotherapy is Considered or Administered Within 4 Months (120 days) of Diagnosis for Women Under 70 with AJCC T1c, or Stage II or III Hormone Receptor Negative Breast Cancer*
• (NQF #0220) Adjuvant Hormonal Therapy*
SCIP:
• (NQF #0218) Surgery Patients who Received Appropriate VTE Prophylaxis within 24 Hrs Prior to Surgery to 24 Hrs After Surgery End Time*
• (NQF #0453) Urinary Catheter Removed on Post-Operative Day 1 or Post-Operative Day 2 with Day of Surgery Being Day Zero*
• (NQF #0527) Prophylactic Antibiotic Received Within 1 Hr Prior to Surgical Incision*
• (NQF #0528) Prophylactic Antibiotic Selection for Surgical Patients*
• (NQF #0529) Prophylactic Antibiotic Discontinued Within 24 Hrs After Surgery End Time*
• (NQF #0284) Surgery Patients on Beta Blocker Therapy Prior to Admission who Received a Beta Blocker During the Perioperative Period*
Clinical Process/Oncology Care Measures:
• (NQF #0382) Oncology-Radiation Dose Limits to Normal Tissues*
• (NQF #0383) Oncology: Plan of Care for Pain*
• (NQF #0384) Oncology: Pain Intensity Quantified*
• (NQF #0390) Prostate Cancer-Adjuvant Hormonal Therapy for High-Risk Patients*
• (NQF #0389) Prostate Cancer-Avoidance of Overuse Measure-Bone Scan for Staging Low-Risk Patients*
Patient Engagement/Experience of Care:
• (NQF #0166) HCAHPS*
Clinical Effectiveness Measure:
• (NQF #1822) External Beam Radiotherapy for Bone Metastases**
* Previously finalized measures.
** Finalized for the FY 2017 program and subsequent years in this final rule.

6. Possible New Quality Measure Topics for Future Years

We seek to develop a comprehensive set of quality measures for widespread use for informed decision-making and quality improvement in the PCH setting. Therefore, in future rulemaking, we intend to propose to adopt new or updated measures, such as measures that assess the safety and efficiency of the diagnosis and treatment of cancer, measures that take into account novel diagnostic and treatment modalities, measures that assess symptoms and functional status, and measures of appropriate disease management. Additional measure topics we intend to consider include patient-centered care planning and care coordination, shared decision-making, measures of quality of life outcomes, and measures of admissions for complications of cancer and treatment for cancer. We believe that such measures will help us further our goal of achieving better health care and improved health for Medicare beneficiaries who obtain cancer services through the widespread dissemination and use of quality of care information.

We welcomed public comments and specific suggestions for measure topics for the following measure domains: outcomes; quality of life; clinical quality of care; care coordination; patient safety; patient and caregiver experience of care; population/community health; and efficiency. These domains align with those of the NQS, and we believe that selecting measures to address these domains will promote better cancer care while aligning the PCHQR Program with other established quality reporting and pay-for-performance programs such as the Hospital IQR Program, the Hospital OQR Program, and the Hospital VBP Program.

Comment: Several commenters supported the types of measures that CMS stated its intent to adopt for the PCHQR Program, specifically measures that take into account the use of novel treatments and diagnostic tests, noting that CMS' approach will ensure that cancer patients have appropriate access to new treatments.

Response: We thank the commenters for their support and will consider this feedback for future rulemaking.

Comment: Several commenters suggested measure topics that CMS should consider for future years. They recommended that CMS: (1) develop and adopt measures on topics including benign and malignant hematology; (2) consider measures that address non-small cell lung cancer (NSCLC) treatment; (3) develop measures of risk-adjusted, stage-specific survival rates for various types of cancer; (4) adopt validated outcomes measures over process-based measures; (5) emphasize the importance of the HCAHPS survey; and (6) consider palliative care measures.

Response: We appreciate the commenters' suggestions and will consider this feedback for future rulemaking. We note that, in the FY 2014 IPPS/LTCH PPS final rule, we adopted the HCAHPS survey for use in the PCHQR Program measure set beginning with the FY 2016 program (78 FR 50844 through 50845).

Comment: One commenter supported the measure topics proposed for consideration for the PCHQR Program in future years. Several commenters also described the importance of ensuring that measures adopted for the PCHQR Program be supported by the MAP, tested for their applicability, and assessed for potential unintended consequences that may result from their use in specific patient populations. Another commenter recommended that CMS continue to align measures adopted for the PCHQR Program with those in other IPPS quality reporting programs.

Response: We thank the commenters for their support and comments. We will consider this feedback in future rulemaking.

Comment: One commenter commended CMS for focusing attention on addressing high priority measure gaps such as outcomes, quality of life measures, safety, and overuse of care to be considered for future use in the PCHQR Program. The commenter encouraged CMS to coordinate with partners in addressing the following challenges: measures that require multiple data sources; research that demonstrates gaps in care; and the need to develop a “core” set of measures for a population with varied diagnoses.

Response: We thank the commenter for its support and will strive continually to collaborate with external stakeholders.

Generally, we retain measures from the previous years' PCHQR Program measure sets for subsequent years. However, in future years, we will consider developing criteria to determine whether or not to remove or replace measures from the PCHQR Program measure set. In developing removal criteria, we will consider those criteria used by other CMS quality reporting programs in order to align the PCHQR Program with those programs.

We also welcomed public comments on the criteria for removal or replacement of measures from the PCHQR Program.

Comment: Several commenters noted that, in the FY 2015 IPPS/LTCH PPS proposed rule, CMS proposed to remove nearly all of the SCIP measures from the Hospital IQR Program and recommended that CMS consider removing the six SCIP measures from the PCHQR Program. Commenters also recommended that CMS adopt criteria for determining “topped out” measures and measure removal in future years.

Response: At this time, we do not have sufficient data to determine whether these SCIP measures are “topped-out” in the PCH setting. We recognize that the PCHQR patient population is exclusively comprised of cancer patients, unlike the patient population at acute care hospitals that are included in the Hospital IQR Program.

As a result, we will retain the PCH SCIP measures until we have adopted a "topped-out" policy and have sufficient data to conduct "topped-out" analyses in future years, and we will continue to monitor and evaluate the PCHQR SCIP measures. As noted above, we will consider adopting "topped-out" and other measure removal criteria similar to those adopted by other quality reporting programs, including the Hospital IQR Program, in future rulemaking.

In an effort to reduce the reporting burden for PCHs, in future years, we will consider proposing to require PCHs to report electronically-specified clinical quality measures for the PCHQR Program. We believe that the collection and reporting of data through health information technology would greatly simplify and streamline reporting for many CMS quality reporting programs, including the PCHQR Program. Through electronic reporting, PCHs would be able to leverage EHRs to capture, calculate, and electronically submit quality data that are currently manually chart-abstracted and submitted to CMS for the PCHQR Program. In developing future proposals for electronic clinical quality measures adoption, we will consider the need to align and harmonize measures across various quality reporting programs to minimize the reporting burden imposed on PCHs.

We welcomed public comments on the development of electronic clinical quality measure reporting criteria for future years.

Comment: Several commenters supported CMS' proposal to develop electronic clinical quality measure reporting criteria for future years and recommended that CMS consider the content validity and clinical appropriateness of any measures adopted for the PCHQR Program.

Response: We thank the commenters for their support and will consider this feedback in future rulemaking.

7. Maintenance of Technical Specifications for Quality Measures

We maintain technical specifications for the PCHQR Program measures, and we periodically update those specifications. The specifications may be found on the QualityNet Web site at: https://qualitynet.org/dcs/ContentServer?cid=1228772356060&pagename=QnetPublic%2FPage%2FQnetTier2&c=Page.

In the FY 2013 IPPS/LTCH PPS final rule (77 FR 53504 through 53505), we finalized a policy under which we use a subregulatory process to make nonsubstantive updates to measures used for the Hospital IQR Program. We also adopted this process for all measures adopted for the PCHQR Program. With respect to what constitutes substantive versus nonsubstantive changes, we expect to make this determination on a case-by-case basis. Examples of nonsubstantive changes to measures might include updated diagnosis or procedure codes, medication updates for categories of medications, broadening of age ranges, and exclusions for a measure. We believe that nonsubstantive changes may include updates to measures based upon changes to guidelines on which the measures are based.

We will continue to use rulemaking to adopt substantive updates to the measures we have adopted for the PCHQR Program. Examples of changes that we might consider to be substantive would be those in which the changes are so significant that the measure is no longer the same measure, or when a standard of performance assessed by a measure becomes more stringent (for example, changes in acceptable timing of medication, procedure/process, or test administration). Another example of a substantive change would be where the NQF has extended its endorsement of a previously endorsed measure to a new setting, such as extending a measure from the inpatient setting to hospice. We also note that, to the extent a PCHQR measure is endorsed by the NQF, the NQF measure maintenance process incorporates an opportunity for public comment and engagement.

We believe the endorsement processes, as well as our treatment of substantive versus nonsubstantive measure changes, adequately balance our need to incorporate updates to PCHQR Program measures in the most expeditious manner possible against preserving the public's ability to comment on updates that so fundamentally change an endorsed measure that it is no longer the same measure that we originally adopted.

8. Public Display Requirements Beginning with the FY 2014 Program

Section 1866(k)(4) of the Act requires the Secretary to establish procedures for making the data submitted under the PCHQR Program available to the public. Such procedures must ensure that a PCH has the opportunity to review the data that are to be made public with respect to the PCH prior to such data being made public. Section 1866(k)(4) of the Act also provides that the Secretary must report quality measures of process, structure, outcome, patients' perspective on care, efficiency, and costs of care that relate to services furnished in such hospital on the CMS Web site.

In order to meet these requirements, in the FY 2013 IPPS/LTCH PPS final rule (77 FR 53562 through 53563), we finalized our policy to display publicly PCHQR Program data on the Hospital Compare Web site (http://www.hospitalcompare.hhs.gov/) and established a preview period of 30 days prior to making such data public.

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50847 through 50848), we finalized our proposal to display publicly in 2014 and subsequent years the data for the measures listed below:

  • Adjuvant Chemotherapy is considered or administered within 4 months (120 days) of diagnosis to patients under the age of 80 with AJCC III (lymph node positive) colon cancer (NQF #0223); and
  • Combination Chemotherapy is considered or administered within 4 months (120 days) of diagnosis for women under 70 with AJCC T1c, or Stage II or III hormone receptor negative breast cancer (NQF #0559).

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28257), we proposed to display publicly in 2015 and subsequent years the data for the Adjuvant Hormonal Therapy measure (NQF #0220).

We also proposed to display publicly no later than 2017 and for subsequent years the data for the measures listed below:

  • NHSN Catheter-Associated Urinary Tract Infections (CAUTI) Outcome Measure (NQF #0138); and
  • NHSN Central Line-Associated Bloodstream Infection (CLABSI) Outcome Measure (NQF #0139).

At present, all PCHs are reporting CLABSI and CAUTI data to the NHSN under the PCHQR Program. However, because the small number of facilities has produced and reported only a low volume of data over fewer than 2 years, the CDC is unable to calculate reasonable and reliable baseline estimates, or expected rates, which are needed for the purpose of calculating these measure rates. Therefore, we estimate that the first public posting of the CLABSI and CAUTI PCHQR Program data reported to the NHSN from the PCHs will occur no later than 2017.

We invited public comment on these proposals.

Comment: Commenters recommended that CMS revise the CLABSI, CAUTI, and SSI measures to account for cancer-specific risks and consider the variation in the cancer patient population case-mix, especially regarding the percentage of patients discharged to palliative or hospice care, when assessing performance on these measures for public display. Commenters also recommended that CMS display publicly ICU versus non-ICU rates for the CLABSI and CAUTI data.

Response: We appreciate the commenters' feedback and will consider it for future years. We note that the CDC is the measure steward and is responsible for maintaining the measure specifications for the CLABSI, CAUTI, and SSI measures. The CDC works closely with external partners and subject-matter experts to develop and maintain NHSN definitions and criteria that are both standardized and clinically relevant. A concerted effort is made to take into account the heterogeneous patient populations that are monitored and tracked using NHSN, cancer patients being one of many such populations. However, the CDC recognizes that the HAI definitions may not account for all heterogeneity and variation among the patient populations and will continue to work with subject-matter experts to gain input and insight on additional criteria that are needed to better represent specific populations where possible. In addition, now that we have received data specifically from PCHs, those data can be reviewed, along with all other NHSN data, when the standardized infection ratios (SIRs) are recalculated to determine baselines based on the FY 2014 program year. If strong variations are found, we will consider revising the calculation for PCHs.
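For context, an NHSN standardized infection ratio is conventionally the number of observed infections divided by the number predicted from baseline data, and a ratio is generally not reported when the predicted count is too small to be reliable. The following minimal sketch illustrates that arithmetic with hypothetical counts and a hypothetical reporting threshold; it is not CDC or NHSN code.

```python
# Minimal sketch (illustrative only): the conventional form of a standardized
# infection ratio (SIR) -- observed infections divided by the number predicted
# from baseline data -- with a hypothetical minimum-predicted threshold to
# reflect the reliability concern discussed above.

def standardized_infection_ratio(observed, predicted, min_predicted=1.0):
    if predicted < min_predicted:
        return None  # too few predicted events to report a reliable ratio
    return observed / predicted

# Hypothetical example: standardized_infection_ratio(3, 4.2) -> about 0.71
```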

Comment: One commenter supported CMS' proposal to delay the display of both NHSN CAUTI and CLABSI until no later than 2017 in order to ensure that reliable expected rates can be calculated, and recommended that CMS evaluate the NHSN SSI data under the same standard.

Response: We thank the commenter for its support. The main purposes of the PCHQR Program are to report publicly quality of care information that consumers can use to make decisions about their health care and to encourage PCHs to improve their quality of care. Accordingly, we will delay public reporting of CLABSI and CAUTI data until no later than 2017 so that reliable baseline estimates and expected rates can be determined. We believe this delay is necessary in order to make meaningful and reliable data available to consumers for informed health care decisions. After considering the comment, we agree that this same standard should apply to the SSI measure.

After consideration of the public comments we received, we are finalizing the proposal to display publicly beginning in 2015 the data for the Adjuvant Hormonal Therapy measure (NQF #0220), and to display publicly the CLABSI and CAUTI data no later than 2017.

9. Form, Manner, and Timing of Data Submission Beginning With the FY 2017 Program

a. Background

Section 1866(k)(2) of the Act requires that, beginning with the FY 2014 PCHQR Program, each PCH must submit to the Secretary data on quality measures specified under section 1866(k)(3) of the Act in a form and manner, and at a time, as specified by the Secretary.

Data submission requirements and deadlines for the PCHQR Program are generally posted on the QualityNet Web site at: http://www.qualitynet.org/dcs/ContentServer?c=Page&pagename=QnetPublic%2FPage%2FQnetTier3&cid=1228772864228.

b. Reporting Requirements for the New Measure: External Beam Radiotherapy for Bone Metastases (NQF #1822) Beginning With the FY 2017 Program

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28257 through 28258), we proposed that PCHs report the External Beam Radiotherapy for Bone Metastases (NQF #1822) measure beginning with January 1, 2015 discharges and for subsequent years. We proposed that PCHs would report this measure to us via a CMS Web-based Measures Tool on an annual basis (July 1 through August 15 of each respective year). This approach is consistent with the data submission deadlines finalized for the clinical process/oncology care measures (78 FR 50850 through 50851), and PCHs are already preparing to begin submitting PCHQR data using this timeline. We also believe that annual data submission (as opposed to quarterly data submission four times per year) will reduce PCHs' costs and burden. We believe that these proposed dates will provide enough advance notice for PCHs to prepare to report the measure.

We proposed to collect the EBRT for Bone Metastases measure rates for the FY 2017 program and subsequent years using all-patient (both Medicare and non-Medicare) data from the four quarters (Q1, Q2, Q3, and Q4) of CY 2015, and that PCHs must submit aggregate data for the measure for each of these quarters during a data submission window that would be open from July 1 through August 15, 2016. For the FY 2017 program and subsequent years, we refer readers to the reporting periods and data submission window outlined in the table below in this section.

For data collection, we proposed that PCHs submit aggregate-level data through the CMS Web-based Measures Tool or submit an aggregate data file through a vendor (via QualityNet infrastructure). We refer readers to the FY 2014 IPPS/LTCH PPS final rule (78 FR 50850 through 50851) for more information on the CMS Web-based Measures tool.

We welcomed public comment on the proposed reporting periods, data submission timeframes, and data collection methods/modes for the proposed measure for the FY 2017 program and subsequent years.

Comment: One commenter requested that CMS provide clarification on whether a sampling methodology (including population and sampling guidelines) will be permitted for the EBRT for bone metastases measure because this approach will lessen the burden on PCHs.

Response: We agree that an all-patient EBRT sampling methodology would provide the public with quality measure data that represents the entire patient population of PCHs. We believe that this approach would facilitate PCH education through a consistent sampling methodology across PCHQR measures. Accordingly, we are finalizing a sampling methodology for the EBRT measure in this final rule that is consistent with the sampling methodology standards finalized for the clinical process/oncology care and SCIP measures. We will incorporate this EBRT sampling methodology in the next feasible regularly scheduled semiannual update of the PCHQR specifications manual.

Comment: One commenter recommended that CMS adopt the same reporting requirements proposed for the clinical process/oncology care, clinical process/cancer specific treatment, and SCIP measures for the new EBRT for bone metastases measure.

Response: The EBRT for bone metastases reporting proposals (79 FR 28257) are consistent with the clinical process/oncology care proposals (79 FR 28258). Both allow two options for submitting aggregate data: via a CMS Web-based Measures Tool or via an aggregate data file.

After consideration of the public comments we received, we are finalizing a sampling methodology that allows PCHs to use, for the EBRT measure, the same sampling approach that we are finalizing for the clinical process/oncology care measures (we refer readers to the sampling table found in section IX.B.9.d. of the preamble of this final rule (New Sampling Methodology for the Clinical Process/Oncology Care Measures Beginning with the FY 2016 Program)). In addition, we are finalizing our proposed reporting requirements for the EBRT measure, beginning with the FY 2017 PCHQR Program. The table below outlines the finalized reporting periods and submission timeframes for FY 2017, FY 2018, and subsequent years for the EBRT for bone metastases measure.

Finalized External Beam Radiotherapy for Bone Metastases (NQF #1822) Measure-Reporting Periods and Submission Timeframes for the FY 2017 Program and Subsequent Years

Program year (FY)Reporting periods (CY)Data submission deadlines
2017Q1 2015 discharges (January 1, 2015-March 31, 2015)July 1, 2016-August 15, 2016.
Q2 2015 discharges (April 1, 2015-June 30, 2015)
Q3 2015 discharges (July 1, 2015-September 30, 2015)
Q4 2015 discharges (October 1, 2015-December 31, 2015)
2018Q1 2016 discharges (January 1, 2016-March 31, 2016)July 1, 2017-August 15, 2017.
Q2 2016 discharges (April 1, 2016-June 30, 2016)
Q3 2016 discharges (July 1, 2016-September 30, 2016)
Q4 2016 discharges (October 1, 2016-December 31, 2016)
Subsequent YearsQ1 discharges (January 1-March 31 of each year 2 years before the program year)July 1-August 15 of each year before the program year.
Q2 discharges (April 1-June 30 of each year 2 years before the program year)
Q3 discharges (July 1-September 30 of each year 2 years before the program year)
Q4 discharges (October 1-December 31 of each year 2 years before the program year)
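For illustration only, the following minimal sketch derives the reporting period and submission window implied by the pattern in the table above (discharges from the calendar year two years before the program year, submitted July 1 through August 15 of the year before the program year). The function name is hypothetical, and the sketch is not part of any CMS submission tool.

```python
# Minimal sketch (illustrative only): the reporting-period pattern from the table
# above -- discharges from the calendar year two years before the program year,
# submitted July 1 through August 15 of the year before the program year.
from datetime import date

def ebrt_reporting_schedule(program_year: int) -> dict:
    reporting_cy = program_year - 2
    return {
        "reporting_period": (date(reporting_cy, 1, 1), date(reporting_cy, 12, 31)),
        "submission_window": (date(program_year - 1, 7, 1),
                              date(program_year - 1, 8, 15)),
    }

# Example: ebrt_reporting_schedule(2017) yields CY 2015 discharges with a
# July 1, 2016-August 15, 2016 submission window, matching the FY 2017 row above.
```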

c. Reporting Options for the Clinical Process/Cancer Specific Treatment Measures Beginning With the FY 2015 Program and the SCIP and Clinical Process/Oncology Care Measures Beginning With the FY 2016 Program

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28258 through 28259), we proposed to modify the data submission requirements for the three clinical process/cancer specific treatment measures that we adopted in the FY 2013 IPPS/LTCH PPS final rule (77 FR 53564), and the six SCIP measures and five clinical process/oncology care measures that we adopted in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50846). Under those requirements, PCHs submit aggregate-level clinical process/cancer specific treatment measure data to a CMS contractor, aggregate-level clinical process/oncology care measure data through the CMS Web-based Measures Tool, and patient-level SCIP measure data through the QualityNet infrastructure. We proposed to allow PCHs to report the clinical process/cancer specific treatment, SCIP, and clinical process/oncology care data to CMS using one of two mechanisms. Under the first option, which was newly proposed for the SCIP and clinical process/oncology care measure sets, PCHs or their authorized vendors may enter aggregate numerator and denominator data into a CMS Web page located on the secure part of the CMS QualityNet infrastructure. Under the second option, which was newly proposed for the clinical process/cancer specific treatment, SCIP, and clinical process/oncology care measures, PCHs or their authorized vendors may submit an aggregate data file through a CMS secure QualityNet file exchange process. We proposed these options in order to decrease the reporting burden for PCHs.

We believe that the newly proposed submission option, which is described further below for the SCIP measures, will result in a considerable burden reduction for PCHs, as it includes annual, rather than quarterly, submission deadlines and submission of aggregate data rather than patient-level data for the SCIP measures.

In addition, we proposed a second option, allowing PCHs to submit an annual aggregate data file stratified by four quarters for each of the SCIP measures. We stated that we believed this additional option would provide the public with sufficiently reliable quality measure information while reducing PCH burden through providing two data collection options. We also stated that we would provide detailed technical file format specifications on the public QualityNet Web site (www.qualitynet.org) following publication of this final rule. We outlined the new submission deadlines for the SCIP measures in the table below.

We stated that these requirements would replace, for the purposes of the PCHQR Program, the update to the SCIP timeline and IT infrastructure that we finalized for the PCHQR Program in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50851 through 50852).

We invited public comment on the proposed new reporting mechanism that would apply to the three clinical process/cancer specific treatment measures, five clinical process/oncology care treatment measures, and six SCIP measures.

Comment: One commenter supported the proposal to allow two reporting options for the clinical process/cancer specific treatment, clinical process/oncology care, and SCIP measures. One commenter supported the proposal to update the reporting periods and submission timelines for the six SCIP measures, noting that the proposal simplifies the PCHQR Program's data reporting process.

Response: We appreciate the commenters' support of this proposal. Our intent is to reduce burden and effort and to align CMS infrastructure where appropriate by offering alternative options for PCHs to submit measure data. We are finalizing the two reporting options for the clinical process/cancer specific treatment and clinical process/oncology care measures as proposed. However, for the six PCHQR SCIP measures, we are only finalizing the second proposed option, under which PCHs or their vendors may submit an annual aggregate data file stratified by the four quarters via the CMS QualityNet portal.

We are not finalizing the first proposed option, which would have allowed PCHs to submit aggregate numerator and denominator data into a CMS Web-Based Measures Tool for the SCIP measures, because we were recently informed by our IT developers that the proposed CMS Web page could not be modified to collect aggregate SCIP data by the previously finalized January 2015 initial discharge date. As a result, we are retaining as a second option for these measures the data submission process that is currently in place, under which PCHs may submit patient-level data to CMS through the QualityNet infrastructure.

Comment: One commenter expressed concern that the proposed submission options for the clinical process/cancer specific treatment measures, which allow for data submission through a mechanism other than the CMS contractor (77 FR 53564) that uses the Commission on Cancer Rapid Quality Reporting System (RQRS), could result in declining patient outcomes and less PCH accountability.

Response: We appreciate the commenter's feedback. We strongly believe that the vendor submission approach, which allows vendors to submit aggregate data files on behalf of the respective hospital facilities, is consistent across all PCHQR measures and with other CMS quality reporting programs. In addition, we believe this approach will greatly reduce reporting burden, minimize duplication of effort, and increase efficiency because vendors commonly submit more than one measure set at the same time (for example, annually or quarterly) on behalf of the facilities.

Comment: One commenter recommended that CMS update NQF #0383 (Oncology: Plan of Care for Pain) to include a minimum threshold for pain in the denominator and to provide a more specific definition for “visit” that includes oncology visits (for example, for palliative care). The commenter also recommended all changes to the measure specifications of the clinical process/oncology care measures be communicated to NQF and PCHs.

Response: We appreciate the commenter's feedback and will consider it in future rulemaking.

Comment: Several commenters asked CMS to consider whether the SCIP measures have been adequately tested in the PCH patient population, noting that the measures may inadvertently encourage care that is not applicable to the PCH setting. For example, one commenter noted that SCIP-Inf-3 requires that prophylactic antibiotics be discontinued within 24 hours after surgery end time, but that this approach may not be well-suited for oncologic patient populations.

Response: We note that we have considered the appropriateness of these measures for the PCH setting, as the inclusion and exclusion criteria for the SCIP measures adopted for the PCHQR Program exclude patients from the measure denominator when the care does not apply. For example, the SCIP-Inf-3 measure specifications include an exclusion criterion for patients with a Reason to Extend Antibiotics. We believe it is important to note that the SCIP measures include all cancer surgeries (and are not limited to orthopedic surgeries) performed by both PCHs and many acute care hospitals. We will continue to collaborate with PCHs that have questions about the SCIP measures, and to incorporate nonsubstantive updates into the PCHQR specifications manual.

After consideration of the public comments we received, we are finalizing the proposed reporting requirements for the clinical process/cancer specific treatment and clinical process/oncology care measures, beginning with the FY 2015 and FY 2016 program years, respectively, with one modification. We are not finalizing the CMS Web-Based Measures Tool (aggregate-level data) for the SCIP measures because we are able to leverage the existing patient-level CMS SCIP IT collection infrastructure. PCHs may submit the SCIP measures using two options: (1) Authorized vendor submission of an aggregate data file to CMS via the secure CMS QualityNet portal; or (2) submission of patient-level data via the secure CMS QualityNet portal. This finalized policy aligns our existing reporting infrastructure across the PCHQR Program and other CMS quality improvement programs and provides an additional vendor option to report SCIP data to CMS.

The reporting periods and submission timeframes for the clinical process/cancer specific treatment and clinical process/oncology care measures are outlined in the FY 2013 and FY 2014 IPPS/LTCH PPS final rules (77 FR 53564 and 78 FR 50851, respectively). The table below outlines the finalized aggregate data file reporting periods and submission timeframes for FY 2016, FY 2017, and subsequent years for the SCIP measures. Patient-level SCIP reporting period and data submission timeframes are available on the QualityNet Web site (https://www.qualitynet.org/dcs/ContentServer?c=Page&pagename=QnetPublic%2FPage%2FQnetTier3&cid=1228773716091).

Finalized Six SCIP Measures-Aggregate Data File Reporting Periods and Submission Timeframes for the FY 2016 Program and Subsequent Years

Program year (FY) | Reporting periods (CY) | Data submission deadlines
2016 | Q1 2015 discharges (January 1, 2015-March 31, 2015) | July 1, 2015-August 15, 2015.
2017 | Q2 2015 discharges (April 1, 2015-June 30, 2015) | July 1, 2016-August 15, 2016.
2017 | Q3 2015 discharges (July 1, 2015-September 30, 2015) | July 1, 2016-August 15, 2016.
2017 | Q4 2015 discharges (October 1, 2015-December 31, 2015) | July 1, 2016-August 15, 2016.
Subsequent Years | Q1 discharges (January 1-March 31 of each year 2 years before the program year) | July 1-August 15 of each year before the program year.
Subsequent Years | Q2 discharges (April 1-June 30 of each year 2 years before the program year) | July 1-August 15 of each year before the program year.
Subsequent Years | Q3 discharges (July 1-September 30 of each year 2 years before the program year) | July 1-August 15 of each year before the program year.
Subsequent Years | Q4 discharges (October 1-December 31 of each year 2 years before the program year) | July 1-August 15 of each year before the program year.
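For readers who find it helpful, the following is a minimal illustrative sketch, in Python, of the "Subsequent Years" rule in the table above: for a given program year, the aggregate data file covers the four quarters of discharges from the calendar year two years before the program year and is submitted July 1 through August 15 of the year before the program year. The function name is ours and is purely hypothetical; the FY 2016 and FY 2017 rows follow the transitional periods shown in the table rather than this rule.

```python
from datetime import date

def scip_aggregate_schedule(program_year: int):
    """Illustrative sketch only: derive the SCIP aggregate-file reporting period
    and submission window for a program year under the 'Subsequent Years' rule
    (Q1-Q4 discharges of the calendar year two years before the program year;
    submission July 1-August 15 of the year before the program year)."""
    discharge_year = program_year - 2
    submission_year = program_year - 1
    reporting_period = (date(discharge_year, 1, 1), date(discharge_year, 12, 31))
    submission_window = (date(submission_year, 7, 1), date(submission_year, 8, 15))
    return reporting_period, submission_window

# Example: under this reading of the table, the FY 2018 program year would cover
# CY 2016 discharges, with submission July 1-August 15, 2017.
print(scip_aggregate_schedule(2018))
```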

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28259), we did not propose any changes to the previously finalized procedural requirements, Notice of Participation (NOP) requirements, or Data Accuracy and Completeness Acknowledgement (DACA) requirements. We refer readers to the FY 2013 IPPS/LTCH PPS final rule (77 FR 53563 through 53567) for more information on these requirements.

d. New Sampling Methodology for the Clinical Process/Oncology Care Measures Beginning With the FY 2016 Program

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50842), we adopted a policy under which PCHs could report the five clinical process/oncology care measures finalized for the FY 2016 program and subsequent years using the same sampling methodology that we allow for the reporting of those measures under the PQRS. In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28259), we proposed to replace the previously adopted sampling methodology with a sampling methodology similar to the one we have allowed hospitals to use to report the SCIP measures under the Hospital IQR Program. The sampling methodology specified in the PQRS Specifications Manual is specific to the physician office setting. We believe that the methodology we proposed is more applicable to PCHs because it was developed for hospital-level reporting.

The proposed methodology will allow for different numbers of cases to be reported based on each PCH's cancer patient population size. This is necessary for the PCHQR Program because bed size varies among PCHs from 20 to >250 beds. The sampling methodology for the clinical process/oncology care measures is shown below, and we believe it will decrease the reporting burden on PCHs while producing reliable measure rates.

American Hospital Directory: http://www.ahd.com/freesearch.php.

Average quarterly initial population size "N" | Minimum required sample size "N"
>125 | 25.
51-125 | 20 percent of the initial patient population.
10-50 | 10.
<10 | No sampling; 100 percent of the initial patient population.
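The following is a minimal sketch, in Python, of how the sampling table above might be applied to an average quarterly initial patient population. The function name is hypothetical, and the rounding of the 20 percent tier to the next whole case is our assumption; the table itself does not specify a rounding rule.

```python
import math

def minimum_required_sample(initial_population: int) -> int:
    """Illustrative sketch only: map an average quarterly initial patient
    population size to the minimum required sample size from the table above.
    Populations under 10 are not sampled; all cases are reported."""
    if initial_population > 125:
        return 25
    if 51 <= initial_population <= 125:
        # Assumption: round the 20 percent tier up to the next whole case.
        return math.ceil(0.20 * initial_population)
    if 10 <= initial_population <= 50:
        return 10
    return initial_population  # <10: no sampling; 100 percent of the population

# Example: a PCH averaging 80 eligible patients per quarter would sample at least 16.
print(minimum_required_sample(80))
```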

We also proposed that PCHs report population and sample size counts (by measure) for Medicare and non-Medicare discharges by quarter for the five clinical process/oncology care measures for the FY 2016 program and subsequent years.

We proposed these requirements in order to support our effort to align with existing reporting requirements used in other CMS quality reporting programs, such as the Hospital IQR Program, which requires participating hospitals to submit population and sample size counts for certain measures in addition to the all-payer data needed to calculate measure rates. We view it as vital that PCHs accurately determine their aggregate population and appropriate sample size data so that we can assess PCHs' data reporting accuracy and completeness for their total population of cases, including both Medicare and non-Medicare patients.

We welcomed public comments on the proposed sampling guidelines and proposed population and sample size reporting requirements for the clinical process/oncology care measures for the FY 2016 program and subsequent years.

Comment: Several commenters supported the proposal to replace the PQRS physician-level sampling methodology with the proposed new sampling methodology. However, one commenter requested clarification on whether the proposal to collect all-patient data for the clinical process/oncology care measures conflicts with the proposed sampling methodology and also on whether the sampling methodology is based on the number of patients applicable for each measure, or on bed size (that is, hospital-level sample size determination).

Response: We thank the commenters for their support. The term “all-patient data” refers to data regarding both Medicare and non-Medicare patients. Consistent with the sampling methodology standards that we adopted for these measures under the Hospital IQR Program, when PCHs identify the initial patient population, they will use “all-patient data” to determine the population of patients meeting the measure criteria prior to the application of individual measure denominator exclusions. Thus, the sample will include both patients who are included in and patients who are excluded from the measure denominator. We believe that this sampling methodology reduces potential bias in measure rates because the sample is drawn from all patients included in the measure's initial patient population. This initial patient population is usually defined by groups of ICD-9-CM principal procedure or diagnosis codes, which PCHs may readily identify using computer billing records common to Medicare and non-Medicare health insurance payers. The PCHs will subsequently identify the sample size based on the patient population (“all-patient data”). This sampling process applies to each clinical process/oncology care measure.

Comment: One commenter requested that CMS provide the specifications for the clinical process/oncology care measures and their new sampling method.

Response: We appreciate the commenter's feedback. We have partnered closely with all 11 PCHs and will provide training and education materials on all measures, including the clinical process/oncology care measures and the applicable proposed sampling methodology. These materials will be available on our QualityNet Web site (http://www.qualitynet.org).

Comment: One commenter recommended that CMS revise the proposed reporting requirements for the clinical process/oncology care measures in order to require that reporting of population and sample size counts be based on electronically available data only.

Response: We appreciate the commenter's feedback. We interpret the comment to recommend that data be submitted via the CMS Web-based Measures Tool only. However, we believe it is most appropriate and feasible at this time to provide PCHs with data submission options. We also understand from past discussions with PCHs that the 11 PCHs vary in their implementation of EHRs. We will consider future available data collection options for PCHs, including electronic Clinical Quality Measures (eCQMs). We also believe that requiring population and sample size count reporting based on electronic data might adversely burden PCHs that do not yet have the means to collect electronic data.

Comment: One commenter recommended that benchmarks for the clinical process/oncology care measures be based on statistically significant aggregate calculations only.

Response: We thank the commenter for its feedback. Currently, we do not have a policy to develop benchmarks. In our effort to monitor and evaluate program growth and sustainability, we will be observing the clinical process/oncology care measures baselines and expected rates.

10. Exceptions From Program Requirements

In our experience with other quality reporting and performance programs, we have noted occasions when providers have been unable to submit required quality data due to extraordinary circumstances that are not within their control (for example, natural disasters). We do not wish to unduly increase their burden during these times. Therefore, in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50848), we finalized our policy that, for the FY 2014 program and subsequent years, PCHs may request and we may grant exceptions (formerly referred to as waivers) with respect to the reporting of required quality data when extraordinary circumstances beyond the control of the PCH warrant. When exceptions are granted, we will notify the respective PCH. We are in the process of revising the Extraordinary Circumstances/Disaster Extension or Waiver Request form (CMS-10432), approved under OMB control number 0938-1171.

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28259), we did not propose any substantive changes to this PCHQR exception process.

C. Long-Term Care Hospital Quality Reporting (LTCHQR) Program

1. Background

In accordance with section 1886(m)(5) of the Act, as added by section 3004(a) of the Affordable Care Act, the Secretary established the Long-Term Care Hospital Quality Reporting (LTCHQR) Program. Under section 1886(m)(5)(A)(i) of the Act, for the rate year 2014 and each subsequent rate year, in the case of an LTCH that does not submit data to the Secretary in accordance with section 1886(m)(5)(C) of the Act with respect to such a rate year, any annual update (which we also refer to as a “payment determination”) to a standard Federal rate for discharges for the hospital during the rate year, and after application of section 1886(m)(3) of the Act, shall be reduced by two percentage points. As we discussed in the FY 2012 IPPS/LTCH PPS final rule (76 FR 51743 through 51744), for the purposes of the LTCH PPS, the term “rate year” and the term “fiscal year” both refer to the time period beginning October 1 and ending September 30. In order to avoid any possible confusion, we will use the term “fiscal year” rather than “rate year” in our discussion of the LTCHQR Program.

Under section 1886(m)(5)(D)(i) of the Act, the quality measures for the LTCHQR Program are measures selected by the Secretary that have been endorsed by an entity that holds a contract with the Secretary under section 1890(a) of the Act, unless section 1886(m)(5)(D)(ii) of the Act applies. This contract is currently held by the National Quality Forum (NQF). Additional information regarding the NQF and its measure review processes is available at: http://www.qualityforum.org/Measuring_Performance/Measuring_Performance.aspx.

While as a general matter the Secretary must select endorsed measures for the LTCHQR Program, section 1886(m)(5)(D)(ii) of the Act provides that an exception may be made in the case of a specified area or medical topic determined appropriate by the Secretary for which a feasible and practical measure has not been endorsed by the entity that holds a contract with the Secretary under section 1890(a) of the Act. In such a case, section 1886(m)(5)(D)(ii) of the Act authorizes the Secretary to specify a measure that is not so endorsed, as long as due consideration is given to measures that have been endorsed or adopted by a consensus organization identified by the Secretary.

The LTCHQR Program was implemented in the FY 2012 IPPS/LTCH PPS final rule (76 FR 51743 through 51756).

2. General Considerations Used for Selection of Quality Measures for the LTCHQR Program

We seek to promote higher quality and more efficient health care for the beneficiaries we serve. Quality reporting programs, including public reporting of quality information, advance such quality improvement efforts. Quality measurement remains the key tool to the success of these programs. Therefore, the selection of only the highest caliber of measures is a priority for CMS.

We seek to adopt measures for the LTCHQR Program that promote better, safer, and more efficient care. Our measure development and selection activities for the LTCHQR Program take into account national priorities, such as those established by the National Priorities Partnership (http://www.qualityforum.org/Setting_Priorities/NPP/National_Priorities_Partnership.aspx), the HHS Strategic Plan (http://www.hhs.gov/secretary/about/priorities/priorities.html), the National Quality Strategy (NQS) (http://www.ahrq.gov/workingforquality/nqs/nqs2011annlrpt.htm), and the CMS Quality Strategy (http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/QualityInitiativesGenInfo/CMS-Quality-Strategy.html).

We also must consider input from the NQF Measures Application Partnership (MAP) when selecting measures under the LTCHQR Program. The MAP is composed of multi-stakeholder groups convened by the NQF, our current contractor under section 1890 of the Act. The NQF must convene these stakeholders and provide us with the stakeholders' input on the selection of certain categories of quality and efficiency measures as part of a pre-rulemaking process described in section 1890A of the Act. We, in turn, must take this input into consideration in selecting those categories of measures. The NQF MAP met in December 2013 and January 2014 and provided input to CMS as required under section 1890A(a)(3) of the Act. This input appears in the MAP's January 2014 Pre-Rulemaking Report available for download at: http://www.qualityforum.org/Publications/2014/01/MAP_Pre-Rulemaking_Report__2014_Recommendations_on_Measures_for_More_than_20_Federal_Programs.aspx. Measures proposed for the LTCHQR Program in this final rule are measures CMS included under the List of Measures under Consideration (MUC List) for December 1, 2013, a list that the Secretary must make available to the public by December 1 of each year, as part of the pre-rulemaking process, as described in section 1890A(a)(2) of the Act. The measures we proposed in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28263 through 28268) for the LTCHQR Program are discussed in the MAP Pre-Rulemaking Report (pp. 192-193). The MAP reviewed each measure proposed in this rule. We refer readers to the following sections of the preamble of this final rule for more information on the MAP's recommendations: IX.C.7.a.(1), Functional Status Quality Measure: Percent of Long-Term Care Hospital Patients with an Admission and Discharge Functional Assessment and a Care Plan That Addresses Function; IX.C.7.a.(2), Functional Status Quality Measure: Functional Outcome Measure: Change in Mobility among Long-Term Care Hospital Patients Requiring Ventilator Support; and IX.C.7.b., Quality Measure: National Healthcare Safety Network (NHSN) Ventilator-Associated Event (VAE) Outcome Measure.

After giving due consideration to any measures that may have been endorsed or adopted by a consensus organization, including the NQF, for the LTCH setting, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28259 through 28278) we proposed measures that are either supported by the MAP for the LTCHQR Program or that we believe most closely align with the national priorities discussed in this section of the proposed rule. Where the MAP did not offer full support, in some cases we proposed measures for which the MAP expressed conditional support and that meet the exception criteria in section 1886(m)(5)(D)(ii) of the Act. Further discussion of why each measure is a high priority in the LTCH setting is included below.

3. Policy for Retention of LTCHQR Program Measures Adopted for Previous Payment Determinations

In the FY 2013 IPPS/LTCH PPS final rule (77 FR 53614 through 53615), for the LTCHQR Program, we adopted a policy that once a quality measure is adopted, it will be retained for use in subsequent years, unless otherwise stated. For the purpose of streamlining the rulemaking process, when we initially adopt a measure for the LTCHQR Program for a payment determination, this measure will be automatically adopted for all subsequent years or until we propose to remove, suspend, or replace the measure. For further information on how measures are considered for removal, suspension, or replacement, we refer readers to the FY 2013 IPPS/LTCH PPS final rule.

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28260), we did not propose any changes to this policy for retaining LTCHQR Program measures adopted for previous payment determinations.

4. Policy for Adopting Changes to LTCHQR Program Measures

In the FY 2013 IPPS/LTCH PPS final rule (77 FR 53615 through 53616), we adopted our policy that if the NQF updates an endorsed measure that we have adopted for the LTCHQR Program in a manner that we consider to not substantively change the nature of the measure, we will use a subregulatory process to incorporate those updates to the measure specifications that apply to the LTCHQR Program. With respect to what constitutes a substantive versus a nonsubstantive change, we expect to make this determination on a measure-by-measure basis. Examples of such nonsubstantive changes might include updated diagnosis or procedure codes, medication updates for categories of medications, broadening of age ranges, and changes to exclusions for a measure. The subregulatory process for nonsubstantive changes will include revision of the LTCHQR Program Manual and posting of updates on our Web site at: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/LTCH-Quality-Reporting/index.html. Examples of changes that we might consider to be substantive would be those in which the changes are so significant that the measure is no longer the same measure, or when a standard of performance assessed by a measure becomes more stringent, such as changes in acceptable timing of medication, procedure/process, test administration, or expansion of the measure to a new setting.

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28260 through 28261), we did not propose any changes to this policy for adopting changes to LTCHQR Program measures.

5. Previously Adopted Quality Measures

a. Previously Adopted Quality Measures for the FY 2015 and FY 2016 Payment Determinations and Subsequent Years

In the FY 2013 IPPS/LTCH PPS final rule (77 FR 53624 through 53636), we retained the application of Percent of Residents with Pressure Ulcers That Are New or Worsened (Short-Stay) (NQF #0678) to the LTCH setting (initially adopted in the FY 2012 IPPS/LTCH PPS final rule (76 FR 51745 through 51750)) for the FY 2015 payment determination and subsequent years, and adopted updated versions of National Health Safety Network (NHSN) Catheter-Associated Urinary Tract Infection (CAUTI) Outcome Measure (NQF #0138) and NHSN Central Line-Associated Blood Stream Infection (CLABSI) Outcome Measure (NQF #0139), for the FY 2014 payment determination and subsequent years. We also adopted two new quality measures for the LTCHQR Program for the FY 2016 payment determination and subsequent years, in addition to the three previously adopted measures (the CAUTI measure, CLABSI measure, and Pressure Ulcer measure): (1) Percent of Residents or Patients Who Were Assessed and Appropriately Given the Seasonal Influenza Vaccine (Short-Stay) (NQF #0680); and (2) Influenza Vaccination Coverage among Healthcare Personnel (NQF #0431) (77 FR 53624 through 53636).

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50861 through 50863), we adopted the NQF-endorsed version of the Pressure Ulcer measure, Percent of Residents or Patients with Pressure Ulcers That Are New or Worsened (Short-Stay) (NQF #0678), for the LTCHQR Program for the FY 2015 payment determination and subsequent years.

Set out below are the quality measures, both previously adopted measures retained in the LTCHQR Program and measures adopted in the FY 2013 and FY 2014 IPPS/LTCH PPS final rules, for the FY 2015 and FY 2016 payment determinations and subsequent years.

LTCHQR Program Quality Measures Adopted for the FY 2015 and FY 2016 Payment Determinations and Subsequent Years

NQF Measure ID | Measure title | Payment determination
NQF #0138 | National Health Safety Network (NHSN) Catheter-Associated Urinary Tract Infection (CAUTI) Outcome Measure | FY 2015 and Subsequent FYs.
NQF #0139 | National Health Safety Network (NHSN) Central Line-Associated Blood Stream Infection (CLABSI) Outcome Measure | FY 2015 and Subsequent FYs.
NQF #0678 | Percent of Residents or Patients with Pressure Ulcers That Are New or Worsened (Short-Stay) | FY 2015 and Subsequent FYs.
NQF #0680 | Percent of Residents or Patients Who Were Assessed and Appropriately Given the Seasonal Influenza Vaccine (Short-Stay) | FY 2016 and Subsequent FYs.
NQF #0431 | Influenza Vaccination Coverage among Healthcare Personnel | FY 2016 and Subsequent FYs.

While we did not propose any changes in the FY 2015 IPPS/LTCH PPS proposed rule to measure specifications for NQF #0678, Percent of Residents or Patients with Pressure Ulcers That Are New or Worsened (Short-Stay), we received input from several commenters on this measure.

Comment: A few commenters suggested CMS consider adding a “present on admission” (POA) indicator in the LTCH Continuity Assessment Record and Evaluation (CARE) Data Set. These commenters noted that a POA indicator is critical to aid in the determination of whether a pressure ulcer was developed as a result of care provided by an LTCH.

Response: We recognize the importance of determining which pressure ulcers are “present on admission” and taking this into account when assessing new or worsened pressure ulcers in the LTCH setting. The quality measure Percent of Residents or Patients with Pressure Ulcers That Are New or Worsened (Short-Stay) (NQF #0678) is designed to account for pressure ulcers that are present on admission. Items M0800A, M0800B, and M0800C on the LTCH CARE Data Set discharge assessment capture patient-specific data to identify Stage 2, Stage 3, and Stage 4 pressure ulcers that are “new” or “worsened” since the time of the admission assessment, thus identifying only those Stage 2, Stage 3, and Stage 4 pressure ulcers that either were not present on the admission assessment or were present at a lower stage on the admission assessment. We refer readers to the measure specifications for the Pressure Ulcer measure, which are available for download at: https://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/LTCH-Quality-Reporting/ and at www.qualityforum.org/QPS/0678.
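The following is a minimal, simplified sketch, in Python, of the new-or-worsened comparison that the M0800 discharge items are described as capturing. It is not the measure specification; the function and its inputs are hypothetical, and the authoritative logic is in the NQF #0678 and LTCH CARE Data Set specifications referenced above.

```python
def counts_as_new_or_worsened(stage_at_admission, stage_at_discharge) -> bool:
    """Illustrative simplification: a Stage 2, 3, or 4 pressure ulcer present at
    discharge counts as new or worsened only if it was not present at admission
    (stage_at_admission is None) or was present at a lower stage at admission."""
    if stage_at_discharge not in (2, 3, 4):
        return False
    if stage_at_admission is None:
        return True                                   # not present on admission: new
    return stage_at_discharge > stage_at_admission    # present at a lower stage: worsened

# Example: an ulcer documented at Stage 1 on admission and Stage 3 at discharge counts.
print(counts_as_new_or_worsened(1, 3))
```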

Comment: A commenter recommended that “behavioral patients” be excluded from the Pressure Ulcer measure. The commenter noted that the inclusion of all inpatients, regardless of age or any other criteria, has a significant impact on the reporting burden for LTCHs, and that excluding behavioral patients would lessen the burden on LTCHs because these patients do not significantly contribute to the Pressure Ulcer measure.

Response: We appreciate the commenter's input on this previously finalized measure. Pressure ulcers are serious medical conditions that can lead to life-threatening infections, can substantially increase the cost of care, and are an important measure of quality. As a result, we believe that all patients, regardless of their cognitive or behavioral health status, should be assessed for pressure ulcer risk, and that appropriate pressure ulcer monitoring, prevention, and management should be implemented for all patients in an LTCH. We refer the commenter to the current measure specifications for NQF #0678, including patient exclusions and inclusions, available at www.qualityforum.org/QPS/0678.

b. Previously Adopted Quality Measures for the FY 2017 and FY 2018 Payment Determinations and Subsequent Years

In the FY 2014 IPPS/LTCH PPS final rule, we adopted three additional measures for the FY 2017 payment determination and subsequent years (78 FR 50863 through 50874) and one additional measure for the FY 2018 payment determination and subsequent years (78 FR 50874 through 50877). These measures are set out in the table below.

LTCHQR Program Quality Measures Previously Adopted for the FY 2017 and FY 2018 Payment Determinations and Subsequent Years

NQF Measure ID | Measure title | Payment determination
NQF #1716 | National Healthcare Safety Network (NHSN) Facility-Wide Inpatient Hospital-Onset Methicillin-Resistant Staphylococcus aureus (MRSA) Bacteremia Outcome Measure | FY 2017 and Subsequent Years
NQF #1717 | National Healthcare Safety Network (NHSN) Facility-Wide Inpatient Hospital-Onset Clostridium difficile Infection (CDI) Outcome Measure | FY 2017 and Subsequent Years
NQF #2512 (Under Review at NQF*) | All-Cause Unplanned Readmission Measure for 30 Days Post-Discharge from Long-Term Care Hospitals | FY 2017 and Subsequent Years
Application of NQF #0674 | Percent of Residents Experiencing One or More Falls with Major Injury (Long-Stay) | FY 2018 and Subsequent Years
* Not NQF endorsed, currently under review at NQF, please see: http://www.qualityforum.org/All-Cause_Admissions_and_Readmissions_Measures.aspx

While we did not propose any changes in the FY 2015 IPPS/LTCH PPS proposed rule to measures previously adopted for the FY 2017 payment determination and subsequent years, we received input from a few commenters regarding three previously finalized measures: NHSN Facility-Wide Inpatient Hospital-Onset Methicillin-Resistant Staphylococcus aureus (MRSA) Bacteremia Outcome Measure (NQF #1716), NHSN Facility-Wide Inpatient Hospital-Onset Clostridium difficile Infection (CDI) Outcome Measure (NQF #1717), and All-Cause Unplanned Readmission Measure for 30 Days Post-Discharge from Long-Term Care Hospitals (NQF #2512, under review at NQF). While we greatly appreciate the commenters' views on these previously finalized measures, we did not make any proposals relating to them in the FY 2015 IPPS/LTCH PPS proposed rule. Therefore, we will not summarize and address all of these comments in detail in this final rule. However, we will consider all of these comments in future rulemaking and program development.

Comment: A commenter supported the inclusion of the MRSA and CDI measures in the LTCHQR Program, noting that LTCH patients arrive after receiving several weeks of therapy for infections that are difficult to treat and therefore have high levels of exposure to antibiotics. Another commenter also supported these two measures and indicated support for the readmission measure. This commenter urged CMS to adopt outcome measures more quickly and suggested that the three aforementioned measures finalized for FY 2017 be implemented for FY 2016.

Response: We appreciate the commenter's recommendation to adopt the measures more quickly than the previously finalized timeline. However, in order to ensure adequate time to support successful measure implementation across the LTCHs, we believe the previously finalized data collection period and submission deadlines are appropriate. We refer readers to the FY 2014 IPPS/LTCH PPS final rule (78 FR 50880 through 50882) for more information.

Comment: A commenter urged CMS to share, on a monthly basis, claims data with LTCHs for any patients readmitted within 30 days of the LTCH discharge. The commenter noted that providing these data would (1) notify LTCHs of readmissions that will affect their quality reporting data, and (2) enable LTCHs to identify potential systemic problems and implement corrective action plans focused on improving quality of care and reducing preventable readmissions.

Response: We appreciate the commenter's support for this previously finalized quality measure. This commenter recommended a quality improvement process that is separate from the purpose of the readmissions measure. The readmissions measure is intended to report statistically robust estimates of standardized readmission rates over a particular time, while the commenter recommends an intensive quality control process with real time data on specific patients. We will consider these quality improvement process recommendations as we move forward with the LTCHQR Program and future measure development and reporting efforts. To facilitate reduction in readmissions, we encourage all LTCHs to conduct appropriate discharge planning and follow up with their patients to monitor and ensure high-quality care and improved outcomes.

6. Revisions to Data Collection Period and Submission Deadlines for Previously Adopted Quality Measures

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28262 through 28263), we proposed, for the FY 2016 payment determination and subsequent years, to revise the data collection period and submission deadlines for a measure that we previously adopted for the LTCHQR Program: Percent of Residents or Patients Who Were Assessed and Appropriately Given the Seasonal Influenza Vaccine (Short-Stay) (NQF #0680). We also proposed, for the FY 2018 payment determination only, revised data collection period and submission deadlines for the application of the Percent of Residents Experiencing One or More Falls with Major Injury (Long-Stay) (NQF #0674) measure. For the FY 2019 payment determination and subsequent years, data collection for this measure would begin on January 1 and continue through December 31.

a. Revisions to Data Collection Period and Submission Deadlines for Percent of Residents or Patients Who Were Assessed and Appropriately Given the Seasonal Influenza Vaccine (Short Stay) (NQF #0680)

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50858 through 50861), we revised the Percent of Residents or Patients Who Were Assessed and Appropriately Given the Seasonal Influenza Vaccine (Short Stay) (NQF #0680) measure for the FY 2016 payment determination and subsequent years. Specifically, we finalized that for the FY 2016 payment determination, LTCHs must collect data for any patient admitted or discharged during the influenza vaccination season, from October 1, 2014, through April 30, 2015, and submit data for these patients by May 15, 2015.

We sought to better align the data collection period and submission deadlines of the Percent of Residents or Patients Who Were Assessed and Appropriately Given the Seasonal Influenza Vaccine (Short Stay) (NQF #0680) measure with the data collection period and submission deadlines of the Percent of Residents or Patients with Pressure Ulcers That Are New or Worsened (Short Stay) (NQF #0678) measure because both measures are reported using the same data collection instrument, the LTCH CARE Data Set. Therefore, for the FY 2016 payment determination and subsequent years, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28262), we proposed to revise the data collection period and submission deadlines for the Percent of Residents or Patients Who Were Assessed and Appropriately Given the Seasonal Influenza Vaccine (Short Stay) (NQF #0680) measure. Specifically, we proposed that the first data collection period would take place during the fourth quarter of the CY preceding the applicable FY (for example, October 2014 through December 2014 for the FY 2016 payment determination), with data submission by February 15, 2015, and the second data collection period would take place during the first quarter of the subsequent CY (for example, January 2015 through March 2015 for the FY 2016 payment determination), with data submission by May 15, 2015.

The changes are illustrated below for the FY 2016 and FY 2017 payment determinations only, but similar collection period and submission deadlines would also apply to subsequent years. By taking into account the influenza vaccination season, these changes would align data collection and submission for this measure (NQF #0680) with the rest of the LTCH CARE Data Set.

Data Collection Period and Submission Deadlines for LTCHQR Program Quality Data for the FY 2016 and FY 2017 Payment Determinations: Percent of Residents or Patients Who Were Assessed and Appropriately Given the Seasonal Influenza Vaccine (Short-Stay) (NQF #0680)

Data collection period | Submission deadlines | Payment determination
October 1, 2014-December 31, 2014 | February 15, 2015 | FY 2016.
January 1, 2015-March 31, 2015 | May 15, 2015 | FY 2016.
October 1, 2015-December 31, 2015 | February 15, 2016 | FY 2017.
January 1, 2016-March 31, 2016 | May 15, 2016 | FY 2017.

We noted that these changes would only apply to the Percent of Residents or Patients Who Were Assessed and Appropriately Given the Seasonal Influenza Vaccine (Short Stay) (NQF #0680) for the LTCHQR Program, and would not be applicable to any other LTCHQR Program measures, proposed or adopted, unless explicitly stated.

We invited public comments on our proposal to revise the data collection period and submission deadlines for this patient influenza vaccination measure (NQF #0680) for the FY 2016 payment determination and subsequent years. We refer readers to section IX.9.c. of the preamble of this final rule for our responses to comments on this proposal, as well as our final policy on this proposal.

b. Revisions to Data Collection Period and Submission Deadlines for the Application of Percent of Residents Experiencing One or More Falls With Major Injury (Long-Stay) (NQF #0674)

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50874 through 50877), we adopted the Application of Percent of Residents Experiencing One or More Falls with Major Injury (Long-Stay) (NQF #0674) for the FY 2018 payment determination. We further finalized that LTCHs should begin to collect and submit data on this measure using the LTCH CARE Data Set starting January 1, 2016.

To ensure the successful implementation of new and updated versions of the LTCH CARE Data Set, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28262 through 28263), we noted that we will be following an implementation cycle beginning April 1, 2016, which will allow for a predictable future release schedule. We believe that adherence to a predictable release schedule that takes into account both the changes that must be made to the LTCH CARE Data Set and the steps LTCHs must take to accommodate such changes will help ensure successful implementation. Therefore, we will be adhering to a date of April 1 of any given year when releasing future iterations of the LTCH CARE Data Set. This change will effectively delay the implementation of the January 1, 2016, release by three months, allowing LTCHs additional time to become familiar with and participate in trainings related to the revised LTCH CARE Data Set, as well as time to incorporate the changes into their existing IT infrastructure.

Therefore, we proposed that for the FY 2018 payment determination, data collection for this measure would begin on April 1, 2016. For all subsequent years, data collection for this measure would begin on January 1 and continue through December 31. The changes are illustrated below for the FY 2018 and FY 2019 payment determinations.

Data Collection Period and Submission Deadlines for LTCHQR Program Quality Data for the FY 2018 and FY 2019 Payment Determinations: Application of Percent of Residents Experiencing One or More Falls With Major Injury (Long-Stay) (NQF #0674)

Data collection period | Submission deadlines | Payment determination
April 1, 2016-June 30, 2016 | August 15, 2016 | FY 2018.
July 1, 2016-September 30, 2016 | November 15, 2016 | FY 2018.
October 1, 2016-December 31, 2016 | February 15, 2017 | FY 2018.
January 1, 2017-March 31, 2017 | May 15, 2017 | FY 2019.
April 1, 2017-June 30, 2017 | August 15, 2017 | FY 2019.
July 1, 2017-September 30, 2017 | November 15, 2017 | FY 2019.
October 1, 2017-December 31, 2017 | February 15, 2018 | FY 2019.

We noted that these proposed changes would be applicable only to the application of Percent of Residents Experiencing One or More Falls with Major Injury (Long-Stay) (NQF #0674) measure, and not applicable to any other LTCHQR Program measures, proposed or adopted, unless specifically proposed for such measures.

We invited public comments on these proposals. We refer readers to section IX.9.f. of the preamble of this final rule for our responses to comments on these proposals, as well as our final policy on this proposal.

7. New LTCHQR Program Quality Measures for the FY 2018 Payment Determination and Subsequent Years

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28263 through 28268), we proposed three new quality measures for the FY 2018 payment determination and subsequent years. Two of these are related to functional status, and one measure is related to ventilator-associated events (VAE). One of the proposed functional status quality measures is Percent of Long-Term Care Hospital Patients with an Admission and Discharge Functional Assessment and a Care Plan That Addresses Function. The second proposed functional status quality measure is Functional Outcome Measure: Change in Mobility among Long-Term Care Hospital Patients Requiring Ventilator Support. The quality measures are described in more detail below.

a. New LTCHQR Program Functional Status Quality Measures for the FY 2018 Payment Determination and Subsequent Years

Patients in LTCHs present with clinically complex conditions. In addition to having complex medical care needs for an extended period of time, LTCH patients often have functional limitations due to the nature of their conditions, as well as deconditioning due to prolonged bed rest and treatment requirements (for example, ventilator use). These patients are therefore at high risk for functional decline during the LTCH stay that is both condition-related and iatrogenic.

The National Committee on Vital and Health Statistics, Subcommittee on Health, noted: “[i]nformation on functional status is becoming increasingly essential for fostering healthy people and a healthy population. Achieving optimal health and well-being for Americans requires an understanding across the life span of the effects of people's health conditions on their ability to do basic activities and participate in life situations, in other words, their functional status.”

Subcommittee on Health National Committee on Vital and Health Statistics, “Classifying and Reporting Functional Status” (2001).

The functional assessment items included in the two functional status quality measures were originally developed and tested as part of the Post-Acute Care Payment Reform Demonstration version of the CARE Tool, which was designed to standardize assessment of patients' status across acute and post-acute settings, including LTCHs, inpatient rehabilitation facilities (IRFs), skilled nursing facilities (SNFs), and home health agencies (HHAs). The functional status items on the CARE Tool are daily activities that clinicians typically assess at the time of admission and/or discharge in order to determine patients' needs, evaluate patient progress and prepare patients and families for a transition to home or to another setting.

In the FY 2015 IPPS/LTCH PPS proposed rule, we used the terms “CARE Item Set” and “CARE Tool” interchangeably. For the purpose of consistency and standardization of terminology, we have revised the language to “CARE Tool” throughout this FY 2015 IPPS/LTCH PPS final rule. However, we have retained the term “CARE Item Set” when citing existing reports.

The development of the CARE Tool and a description and rationale for each item is described in a report entitled “The Development and Testing of the Continuity Assessment Record and Evaluation (CARE) Item Set: Final Report on the Development of the CARE Item Set: Volume 1 of 3.” Reliability and validity testing were conducted as part of CMS' Post-Acute Care Payment Reform Demonstration, and we concluded that the functional status items have acceptable reliability and validity. A description of the testing methodology and results are available in several reports, including the report entitled “The Development and Testing of the Continuity Assessment Record And Evaluation (CARE) Item Set: Final Report On Reliability Testing: Volume 2 of 3” and the report entitled “The Development and Testing of The Continuity Assessment Record And Evaluation (CARE) Item Set: Final Report on Care Item Set and Current Assessment Comparisons: Volume 3 of 3.” These reports are available at: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/Post-Acute-Care-Quality-Initiatives/CARE-Item-Set-and-B-CARE.html.

Barbara Gage et al., “The Development and Testing of the Continuity Assessment Record and Evaluation (CARE) Item Set: Final Report on the Development of the CARE Item Set ” (RTI International, 2012).

Ibid.

Ibid.

(1) Functional Status Quality Measure: Percent of Long-Term Care Hospital Patients with an Admission and Discharge Functional Assessment and a Care Plan That Addresses Function

The first functional status quality measure we proposed for the FY 2018 payment determination and subsequent years is a process quality measure entitled Percent of Long-Term Care Hospital Patients with an Admission and Discharge Functional Assessment and a Care Plan That Addresses Function. This quality measure reports the percent of LTCH patients with both an admission and a discharge functional assessment and a care plan that addresses function.

This process measure requires the collection of admission and discharge functional status data by trained clinicians using standardized clinical assessment items, or data elements, that assess specific functional activities (that is, self-care, mobility, cognition, communication, and bladder continence). The self-care and mobility function items are coded using a 6-level rating scale that indicates the patient's level of independence with the activity; higher scores indicate more independence. The number of available response options for coding the cognition, communication, and bladder items ranges from 2 to 7. For this quality measure, inclusion of function in the patient's care plan is determined based on whether a functional goal is recorded at admission for at least one of the standardized self-care or mobility function items using the 6-level rating scale.
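As an illustration of the numerator logic just described, the following is a minimal Python sketch under stated assumptions: a patient record counts toward the measure when it has an admission functional assessment, a discharge functional assessment, and at least one self-care or mobility discharge goal recorded at admission on the 6-level rating scale. The field names are hypothetical; the authoritative logic is in the CMS measure specifications.

```python
def meets_functional_assessment_numerator(record: dict) -> bool:
    """Illustrative sketch only: count a record toward the numerator when it
    contains (1) an admission functional assessment, (2) a discharge functional
    assessment, and (3) at least one admission-recorded self-care or mobility
    goal coded on the 6-level rating scale. Field names are hypothetical."""
    has_admission_assessment = bool(record.get("admission_assessment_complete"))
    has_discharge_assessment = bool(record.get("discharge_assessment_complete"))
    goals = record.get("admission_function_goals", {})   # e.g. {"self_care_eating": 4}
    has_function_goal = any(score in range(1, 7) for score in goals.values())
    return has_admission_assessment and has_discharge_assessment and has_function_goal

# Example: a record with both assessments and one self-care goal coded 4 would count.
example = {"admission_assessment_complete": True,
           "discharge_assessment_complete": True,
           "admission_function_goals": {"self_care_eating": 4}}
print(meets_functional_assessment_numerator(example))
```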

An increasing body of reported evidence has supported the safety and feasibility of early mobilization and rehabilitation of critically ill but stable patients, with minimal adverse events and risk to the patient. Early mobility and rehabilitation in these settings have been associated with improved patient outcomes. Therefore, this quality measure addresses the importance of: (1) Conducting a functional assessment at the time of admission addressing self-care, mobility, cognition, communication, and bladder continence; (2) incorporating the functional assessment findings made at the time of admission into the patient's care plan and setting at least one discharge self-care or mobility functional status goal; and (3) conducting a functional assessment at the time of discharge addressing self-care, mobility, cognition, communication, and bladder continence.

J. Adler and D. Malone, “Early mobilization in the intensive care unit: A systematic review,” Cardiopulm Phys Ther J 23, no. 1 (2012).

J. P. Kress, “Clinical trials of early mobilization of critically ill patients,” Crit Care Med 37, no. 10 Suppl (2009).

W. D. Schweickert and J. P. Kress, “Implementing early mobilization interventions in mechanically ventilated patients in the ICU,” Chest 140, no. 6 (2011).

W. D. Schweickert et al., “Early physical and occupational therapy in mechanically ventilated, critically ill patients: a randomised controlled trial,” Lancet 373, no. 9678 (2009).

J. M. Zanni et al., “Rehabilitation therapy and outcomes in acute respiratory failure: an observational pilot project,” J Crit Care 25, no. 2 (2010).

A. Drolet et al., “Move to improve: the feasibility of using an early mobility protocol to increase ambulation in the intensive and intermediate care settings,” Phys Ther 93, no. 2 (2013).

Functional limitations following critical illness are becoming increasingly prevalent as a result of improving critical care medicine and survival rates. Short-term and long-term adverse consequences among critically ill and chronically critically ill patients in LTCH and Intensive Care Unit (ICU) settings include severe weakness, muscle atrophy, connective-tissue shortening, loss of bone mass, increased risk for blood clots, increased risk for pressure ulcers, deconditioning, deficits in self-care and ambulation, functional impairment, and fatigue, as well as cognitive impairment, including profound and persistent deficits in memory, attention/concentration, and executive function, and the inability to return to work one year after hospital discharge. Cognitive impairment in survivors of critical illness has been associated with anxiety and depression, inability to return to work, and inability of older persons to return home. To mitigate these adverse consequences, traditional practices of bed rest and immobility have been challenged in recent years, and early mobility and rehabilitation have been increasingly recognized as important to improving patients' long-term functional outcomes, with recovery of function described as both desirable and possible. The lack of early mobility initiation in ICU settings has also been described as a strong predictor of patient outcomes.

Adler and Malone, “Early mobilization in the intensive care unit: a systematic review.”

Ibid.

S. L. Dang, “ABCDEs of ICU: Early mobility,” Crit Care Nurs Q 36, no. 2 (2013).

E. H. Skinner et al., “Development of a physical function outcome measure (PFIT) and a pilot exercise training protocol for use in intensive care,” Crit Care Resusc 11, no. 2 (2009).

Centre for Clinical Practice at NICE (UK), “Rehabilitation after critical illness [Internet].” National Institute for Health and Clinical Excellence (NICE), http://www.nice.org.uk/nicemedia/live/12137/43564/43564.pdf.

Zanni et al., “Rehabilitation therapy and outcomes in acute respiratory failure: an observational pilot project.”

Ibid.

Dang, “ABCDEs of ICU: Early mobility.”

Ibid.

Ibid.

Schweickert and Kress, “Implementing early mobilization interventions in mechanically ventilated patients in the ICU.”

Zanni et al., “Rehabilitation therapy and outcomes in acute respiratory failure: an observational pilot project.”

Adler and Malone, “Early mobilization in the intensive care unit: a systematic review.”

Skinner et al., “Development of a physical function outcome measure (PFIT) and a pilot exercise training protocol for use in intensive care.”

Centre for Clinical Practice at NICE (UK), “Rehabilitation after critical illness [Internet].”

Ibid.

M. E. Wilcox et al., “Cognitive dysfunction in ICU patients: risk factors, predictors, and rehabilitation interventions,” Crit Care Med 41, no. 9 Suppl 1 (2013).

N. E. Brummel et al., “A combined early cognitive and physical rehabilitation program for people who are critically ill: the activity and cognitive therapy in the intensive care unit (ACT-ICU) trial,” Phys Ther 92, no. 12 (2012).

Dang, “ABCDEs of ICU: Early mobility.”

H. J. Engel et al., “ICU early mobilization: from recommendation to implementation at three medical centers,” Crit Care Med 41, no. 9 Suppl 1 (2013).

Wilcox et al., “Cognitive dysfunction in ICU patients: Risk factors, predictors, and rehabilitation interventions.”

Drolet et al., “Move to improve: The feasibility of using an early mobility protocol to increase ambulation in the intensive and intermediate care settings.”

Dang, “ABCDEs of ICU: Early mobility.”

Z. Li et al., “Active mobilization for mechanically ventilated patients: A systematic review,” Arch Phys Med Rehabil 94, no. 3 (2013).

C. L. Rochester, “Rehabilitation in the intensive care unit,” Semin Respir Crit Care Med 30, no. 6 (2009).

Dang, “ABCDEs of ICU: Early mobility.”

The clinical practice guideline Rehabilitation after Critical Illness from the National Institute for Health and Clinical Excellence (NICE) recommends performing clinical assessment to determine the patient's risk of developing physical and nonphysical morbidity during the critical care stay as early as clinically possible, identifying current rehabilitation needs for patients at risk of morbidity, establishing short-term and medium-term rehabilitation goals based on the clinical assessment, starting an individualized structured rehabilitation program as early as possible, and performing clinical reassessment before discharge.

Centre for Clinical Practice at NICE (UK), “Rehabilitation after critical illness [Internet].”

The importance of standardized functional assessment in LTCH settings is also supported by the high prevalence of therapy services provided in this setting, as well as the need for care coordination for patients returning home and receiving follow-up care in the community and patients receiving additional institutional healthcare services after discharge from an LTCH. A study of 1,419 ventilator-dependent patients from 23 LTCHs reported that physical, occupational, and speech therapy were the most commonly provided services among a comprehensive list of 34 procedures, services, and treatments provided during the LTCH stay. The high frequency of physical (84.8 percent), occupational (81.5 percent), and speech (79.7 percent) therapy reflects use of the rehabilitative model of care adopted by many post-ICU ventilator weaning programs, which is important in restoration of function. This high utilization of therapy services supports the need for standardized functional assessment at admission to document functional status, identify the need for therapy, set functional status goals and assist with discharge planning and care coordination.

D. J. Scheinhorn et al., “Post-ICU mechanical ventilation at 23 long-term care hospitals: a multicenter outcomes study,” Chest 131, no. 1 (2007).

Ibid.

Whether an LTCH patient is discharged home or to another care setting for continuing health care, functional status is an important aspect of a person's health status to document at the time of transition. The study also reported that 28.8 percent of patients were discharged directly home or to assisted living, further supporting the importance of functional assessment and early rehabilitation to facilitate discharge planning and home discharge, when possible.

Ibid.

Reported benefits of early mobility and rehabilitation include: (1) Improved strength and functional status; (2) earlier achievement of mobilization milestones, such as out-of-bed mobilization; (3) improvement in mobility and self-care function scores from admission to discharge; (4) greater incidence of return to functional baseline in mobility and self-care, greater unassisted walking and walking distances, and improved self-reported physical function scores at hospital discharge compared with persons not participating in early mobility and rehabilitation; (5) enhanced recovery of functional exercise capacity; (6) improved self-perceived functional status; and (7) reduced physiological and cognitive complications and improved cognitive function. Early mobility and rehabilitation have also been associated with reduced ICU and hospital length of stay; reduced incidence of delirium and improved patient awareness; increased ventilator-free days and improved weaning outcomes; greater incidence of discharge home directly after hospitalization compared with patients not receiving early mobilization; and reduced hospital readmission or death in the year following hospitalization.

Schweickert and Kress, “Implementing early mobilization interventions in mechanically ventilated patients in the ICU.”

Dang, “ABCDEs of ICU: Early mobility.”

Li et al., “Active mobilization for mechanically ventilated patients: A systematic review.”

Adler and Malone, “Early mobilization in the intensive care unit: A systematic review.”

Schweickert et al., “Early physical and occupational therapy in mechanically ventilated, critically ill patients: A randomised controlled trial.”

Li et al., “Active mobilization for mechanically ventilated patients: a systematic review.”

Adler and Malone, “Early mobilization in the intensive care unit: A systematic review.”

P. E. Morris, “Moving our critically ill patients: Mobility barriers and benefits,” Crit Care Clin 23, no. 1 (2007).

Li et al., “Active mobilization for mechanically ventilated patients: A systematic review.”

Scheinhorn et al., “Post-ICU mechanical ventilation at 23 long-term care hospitals: A multicenter outcomes study.”

Adler and Malone, “Early mobilization in the intensive care unit: A systematic review.”

Dang, “ABCDEs of ICU: Early mobility.”

Ibid.

Ibid.

Li et al., “Active mobilization for mechanically ventilated patients: A systematic review.”

Adler and Malone, “Early mobilization in the intensive care unit: A systematic review.”

Kress, “Clinical trials of early mobilization of critically ill patients.”

Schweickert et al., “Early physical and occupational therapy in mechanically ventilated, critically ill patients: A randomised controlled trial.”

Dang, “ABCDEs of ICU: Early mobility.”

Engel et al., “ICU early mobilization: From recommendation to implementation at three medical centers.”

Li et al., “Active mobilization for mechanically ventilated patients: A systematic review.”

Adler and Malone, “Early mobilization in the intensive care unit: A systematic review.”

Schweickert et al., “Early physical and occupational therapy in mechanically ventilated, critically ill patients: A randomised controlled trial.”

Adler and Malone, “Early mobilization in the intensive care unit: A systematic review.”

Dang, “ABCDEs of ICU: Early mobility.”

Li et al., “Active mobilization for mechanically ventilated patients: A systematic review.”

Schweickert et al., “Early physical and occupational therapy in mechanically ventilated, critically ill patients: A randomised controlled trial.”

Engel et al., “ICU early mobilization: From recommendation to implementation at three medical centers.”

Adler and Malone, “Early mobilization in the intensive care unit: A systematic review.”

Li et al., “Active mobilization for mechanically ventilated patients: A systematic review.”

Short-term and long-term cognitive impairment are very frequent complications of critical illness, and negatively influence survivors' abilities to function independently. Delirium during hospitalization is highly prevalent in critically ill patients and has been associated with longer lengths of stay, increased duration of mechanical ventilation, and higher risk of death. A longer duration of delirium has been associated with worse short- and long-term cognition and executive function. Given these adverse consequences, the importance of early assessment of cognitive function, including possible delirium, and early initiation of cognitive rehabilitation in critical care settings, is being increasingly recognized. Also, given the positive effects of physical exercise on cognitive function in other populations, the potential positive influence of exercise on cognitive function in the critically ill population is being examined by researchers.

Wilcox et al., “Cognitive dysfunction in ICU patients: Risk factors, predictors, and rehabilitation interventions.”

Brummel et al., “A combined early cognitive and physical rehabilitation program for people who are critically ill: The activity and cognitive therapy in the intensive care unit (ACT-ICU) trial.”

P. P. Pandharipande, T. D. Girard, and E. W. Ely, “Long-term cognitive impairment after critical illness,” N Engl J Med 370, no. 2 (2014).

Wilcox et al., “Cognitive dysfunction in ICU patients: Risk factors, predictors, and rehabilitation interventions.”

Ibid.

Pandharipande, Girard, and Ely, “Long-term cognitive impairment after critical illness.”

Brummel et al., “A combined early cognitive and physical rehabilitation program for people who are critically ill: The activity and cognitive therapy in the intensive care unit (ACT-ICU) trial.”

R. S. Miller et al., “Outcomes of trauma patients who survive prolonged lengths of stay in the intensive care unit,” J Trauma 48, no. 2 (2000).

Brummel et al., “A combined early cognitive and physical rehabilitation program for people who are critically ill: The activity and cognitive therapy in the intensive care unit (ACT-ICU) trial.”

A technical expert panel (TEP) convened by our measure development contractor provided input on the technical specifications of this quality measure, including the items included in the quality measure and the inclusion and exclusion criteria. We also solicited public comment on the draft specifications of this quality measure on the CMS Quality Measures Public Comment Page (http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/MMS/CallforPublicComment.html) between February 21, 2014, and March 14, 2014, and received 22 responses from stakeholders with comments and suggestions. Additional information regarding these comments may be found on our Web site at: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/LTCH-Quality-Reporting/.

Based on the evidence discussed above, we proposed to adopt for the LTCHQR Program for the FY 2018 payment determination and subsequent years the quality measure entitled Percent of Long-Term Care Hospital Patients with an Admission and Discharge Functional Assessment and a Care Plan That Addresses Function. This quality measure was developed by CMS, and we plan to submit the quality measure to the NQF for review. The MAP met in December 2013 and January 2014, and provided input to CMS as required under section 1890A(a)(3) of the Act. In its January 2014 Pre-Rulemaking Report, the MAP conditionally supported this measure and stated that the measure concept is promising, but requires modification or further development, and that functional status is a critical area of measurement.

Since the time of the MAP meeting, we have continued further development of the measure with input from technical experts, including empirical data analysis. Subsequently, we released draft specifications for the functional status quality measures, and requested public comment between February 21, 2014 and March 14, 2014. We received 22 responses from stakeholders with comments and suggestions during the public comment period, and have updated the quality measures specifications based on these comments and suggestions. The updated specifications are available for review at the LTCHQR Program Web site at: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/LTCH-Quality-Reporting/index.html?redirect=/LTCH-Quality-Reporting/. We refer readers to section IX.C.2. of the preamble of this final rule for more information on the MAP.

In section 1886(m)(5)(D)(ii) of the Act, the exception authority provides that “[i]n the case of a specified area or medical topic determined appropriate by the Secretary for which a feasible and practical measure has not been endorsed by the entity with a contract under section 1890(a) of the Act, the Secretary may specify a measure that is not so endorsed as long as due consideration is given to measures that have been endorsed or adopted by a consensus organization identified by the Secretary.” We reviewed the NQF's consensus endorsed measures and were unable to identify any NQF-endorsed quality measures focused on assessment of function for patients in the LTCH setting. We are unaware of any other quality measures for functional assessment that have been endorsed or adopted by another consensus organization for the LTCH setting. Therefore, we proposed to adopt this functional assessment measure for use in the LTCHQR Program for the FY 2018 payment determination and subsequent years under the Secretary's authority to select non-NQF-endorsed measures.

Additional information regarding the quality measure may be found on our Web site at: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/LTCH-Quality-Reporting/.

We proposed that data for the quality measure be collected through the LTCH CARE Data Set, with the submission through the Quality Improvement and Evaluation System (QIES) Assessment Submission and Processing (ASAP) system. For more information on LTCHQR Program reporting using the QIES ASAP system, we refer readers to our Web site at: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/LTCH-Quality-Reporting/LTCHTechnicalInformation.html. We noted our intention to revise the LTCH CARE Data Set to include new items that assess functional status, should this measure be adopted. These items, which assess specific functional activities (that is, self-care, mobility, cognition, communication, and bladder continence), would be based on functional items included in the Post-Acute Care Payment Reform Demonstration version of the CARE Tool. The items have been carefully developed and tested for reliability and validity.
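For readers' convenience, the proposed process measure reduces to a facility-level proportion. The sketch below is illustrative only; the draft specifications posted on the LTCHQR Program Web site govern the exact numerator, denominator, and exclusion criteria:

\[ \text{Measure score (percent)} = \frac{\text{LTCH patient stays with both an admission and a discharge functional assessment and a care plan that addresses function}}{\text{LTCH patient stays meeting the measure's inclusion criteria}} \times 100 \]

In other words, the measure rewards completeness of functional assessment and care planning rather than any particular level of functional performance.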

We invited public comments on our proposal to adopt the quality measure entitled Percent of Long-Term Care Hospital Patients with an Admission and Discharge Functional Assessment and a Care Plan That Addresses Function for the LTCHQR Program, with data collection starting on April 1, 2016, for the FY 2018 payment determination and subsequent years. We refer readers to section IX.C.9.c. of the preamble of this final rule for more information on the data collection period and submission deadline for this quality measure. Our responses to public comments on these quality measures are discussed below in this section of the final rule.

Comment: Several commenters expressed support for the quality measure entitled Percent of Long-Term Care Hospital Patients with an Admission and Discharge Functional Assessment and a Care Plan That Addresses Function because functional improvement is an important patient-centered outcome. A few commenters noted that such improvements reduce the likelihood of infection, morbidity, mortality, and cost and significantly improve quality of life in this vulnerable population. A commenter emphasized the importance of improved functional status and improved, earlier mobility for patients who are ventilated. Commenters agreed that functional status is an important measurement gap for LTCHs and supported CMS for proposing measures that address this measurement gap area. A commenter noted support for the use of the CARE Tool to streamline reporting across acute and post-acute care settings.

Response: We appreciate the support for functional status quality measures in the LTCH setting and the support for the use of cross-setting standardized assessment items.

Comment: A commenter supported a measure of function in patient care, but suggested that an outcome measure examining progress toward a functional goal would be preferable to a process measure. The commenter also suggested that CMS consider having a process measure that would address having a plan in place that addresses functional deficits at discharge.

Response: We thank the commenter for their support. We interpret the commenter's comment to suggest that we should include an outcome measure pertaining to the attainment of functional goals. We believe that patient attainment of functional goals is important and appreciate the commenter's suggested inclusion of such an outcome measure. We will consider this measurement concept as we further develop the LTCHQR Program. We further interpret the commenter's comment to suggest that we consider adopting a process measure related to functional deficits at discharge so as to ensure care coordination. We agree that such a measure concept is important in that the domain of “care coordination” is a priority to CMS. Therefore, we will consider this concept in future measure development for the LTCHQR Program.

Comment: Many commenters expressed concern that the measure is not NQF-endorsed. Commenters underscored the importance of the NQF review process, emphasizing that NQF endorsement provides assurance that the measure has been tested, can reliably and accurately collect data, is feasible to implement and is usable. For these reasons, commenters encouraged CMS to refrain from adopting measures into the LTCHQR Program until they have been endorsed by NQF for use in the LTCH setting.

Response: We agree that the NQF-endorsement process is an important part of measure development, and we have generally adopted NQF-endorsed measures whenever feasible. However, as discussed above, where such measures do not exist for the LTCH setting, we may adopt measures that are not NQF-endorsed under the Secretary's exception authority set out in section 1886(m)(5)(D)(ii) of the Act.

Comment: A commenter indicated that the quality measure uses the FIM® rating scale.

Response: We interpret this commenter to assert that we are using the FIM® rating scale. We would like to clarify that we are not using the FIM® rating scale; rather, we are using a 6-level rating scale developed and tested as part of the Post-Acute Care Payment Reform Demonstration (PAC-PRD).

We reviewed the NQF's consensus endorsed measures and were unable to identify any NQF-endorsed quality measures that focused on assessment of function for patients in the LTCH setting. We are unaware of any other quality measures for functional assessment that have been endorsed or adopted by another consensus organization for the LTCH setting. Therefore, we proposed to adopt this functional assessment measure for use in the LTCHQR Program under the Secretary's authority to select non-NQF-endorsed measures. Further, CMS and its measure development contractor are planning to submit this measure for NQF review in the fall of 2014.

Comment: Several commenters agreed with the MAP's recommendation to adopt functional status measures as part of the LTCHQR Program. While most commenters agreed this was an important measure area for the LTCH population and some commenters noted that it addresses a measure area gap identified by the MAP, many commenters expressed concern that the measure is not yet fully developed and received only conditional support from the MAP. The commenters noted the MAP's conclusions that while the measure concept is promising, the measure is not ready for implementation and requires further modification and development. Commenters encouraged CMS to refrain from adopting any measures not fully supported by the MAP.

Response: We note that this quality measure has been under development for more than 3 years. The steps we undertook as part of the measure development process have included an environmental scan, technical expert panel review, and posting of specifications to solicit public input. As part of the environmental scan, we reviewed the NQF's consensus-endorsed measures and were unable to identify any NQF-endorsed quality measures that focused on assessment of function for patients in the LTCH setting. A TEP focused on functional status quality measures was convened by our measure development contractor and met in person and by phone in the fall of 2013. A report summarizing these TEP meetings titled “Summary of Feedback from TEP on the Development of Cross-Setting Functional Status Quality Measures” is available at: https://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/LTCH-Quality-Reporting/.

Since the MAP meeting, we have continued further development of this measure, including posting of the TEP report. As discussed above, we also released draft specifications for the functional status quality measures and solicited public comment between February 21, 2014, and March 14, 2014. We received 22 responses from stakeholders with comments and suggestions during this public comment period and, based on these comments and suggestions, updated the quality measure specifications, which are available at: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/LTCH-Quality-Reporting/index.html?redirect=/LTCH-Quality-Reporting/.

Comment: A commenter was concerned that CMS did not convene a TEP for any of the new proposed quality measures and noted that TEPs, “which evaluate . . . quality measures for importance, scientific soundness, usability, and feasibility,” are integral to developing health care setting-appropriate quality measures.

Response: Our measure contractor convened a cross-setting functional status quality measures TEP after a public call for TEP nominations. The TEP met in person on September 9, 2013, and then met via Webinar on October 21, 2013, October 28, 2013, and November 6, 2013. TEP members included experts from LTCHs, as well as IRFs and SNFs. A report summarizing recent TEP meetings focused on functional status quality measures titled “Summary of Feedback from TEP on the Development of Cross-Setting Functional Status Quality Measures” is available at: https://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/LTCH-Quality-Reporting/. The functional status quality measure development built on work conducted as part of a project funded by the Assistant Secretary for Planning and Evaluation, and that project also included a cross-setting function quality measure TEP, which was held on August 15, 2012. A report summarizing that meeting is available at: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/Post-Acute-Care-Quality-Initiatives/Downloads/ASPE-Report-Analysis-of-Crosscutting-Medicare-Functional-Status-Quality-Metrics-Using-the-Continuity-and-Assessment-Record-and-Evaluation-CARE-Item-Set-Final-Report.pdf.

Comment: Some commenters indicated that the measure was inappropriate for the LTCH patient population. These commenters noted that many or most LTCH patients are not mobile or functional on admission, a significant percentage are not mobile for the duration of their stay, and the majority of the functional status assessment items, such as walking, picking up items, and car transfers, would not be applicable to these patients. The commenters also noted that for many LTCH patients, a transfer from the bed to a chair is not possible.

Response: The development of this quality measure included a review of published literature, a review of the findings from the PAC-PRD, discussions with an LTCH expert panel and an opportunity for a public comment period. As evidenced in the literature review, the PAC-PRD findings, and through the technical expert panel input we obtained during the measure development process, we concluded that this measure is important and appropriate for the LTCH setting, and items selected during the measure's development were considered applicable.

With respect to the comments about the items selected, LTCH staff who used these items during the PAC-PRD stated that the items and their associated response scale are able to capture small changes in patient improvement, such as the progression from total dependence for task completion to completing a task with much assistance (that is, more than half the task was completed by the helper). The ability to capture this level of change was found to be significant, particularly for the most impaired populations. Further, based on input from the expert panels convened by our measure development contractor and the public comment process, we refined the items selected, including the activities (for example, walking, picking up items, and car transfers) that the commenter suggested were not applicable to this setting.

Comment: Several commenters conveyed their concern regarding the use of the CARE Tool as the data source for the functional status quality measures due to limited testing in LTCHs and reliability testing results. These commenters noted that several self-care and mobility items have Kappa statistics categorizing inter-rater reliability as “fair” or “moderate,” and were based on a small sample of 46 LTCH patients. These commenters stated that “fair” or “moderate” reliability, while acceptable for exploratory studies or internal quality improvement efforts, is insufficient for national use in the LTCHQR Program. Commenters recommended CMS explain the low Kappa statistics and/or re-test these items in significantly more LTCHs to address reliability issues. These commenters noted that measure testing should be oriented towards the intended setting of use of the measure and suggested additional testing in the LTCH setting be conducted.

Response: The reliability results mentioned by these commenters were only one of several reliability analyses conducted to support the development of this measure as part of the PAC-PRD. In addition to the inter-rater reliability study mentioned by these commenters, we also examined: (1) Inter-rater reliability of the CARE items using videotaped case studies, which included 114 LTCH assessments from 3 LTCHs; (2) internal consistency of the function data, which included more than 7,700 assessments from 28 LTCHs; and (3) Rasch analyses of the function data, which included more than 7,700 assessments from 28 LTCHs. The report describing these additional analyses and an interpretation of the Kappa statistics results is available at: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/Post-Acute-Care-Quality-Initiatives/Downloads/The-Development-and-Testing-of-the-Continuity-Assessment-Record-and-Evaluation-CARE-Item-Set-Final-Report-on-Reliability-Testing-Volume-2-of-3.pdf. Therefore, given the totality of the reliability analysis involved, we believe that the development of this measure included a sufficient level of analysis, and at a scientifically acceptable level, such that a quality measure could be derived from these items.
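For context on the statistic the commenters reference, and not as part of the measure specifications, the kappa statistic quantifies inter-rater agreement beyond chance:

\[ \kappa = \frac{p_o - p_e}{1 - p_e} \]

where \(p_o\) is the observed proportion of agreement between raters and \(p_e\) is the proportion of agreement expected by chance. Under the commonly cited Landis and Koch interpretation bands, values of 0.21 to 0.40 are conventionally described as “fair” agreement and values of 0.41 to 0.60 as “moderate” agreement, which is the terminology the commenters used.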

We note that as part of the LTCHQR Program we intend to evaluate the national-level data for this quality measure submitted by LTCHs to CMS. These data will inform ongoing measure development and maintenance efforts, including further analysis of reliability and validity of the data elements and the quality measure.

Comment: Several commenters expressed concern related to undue burden associated with data documentation for the functional status quality measure. Some commenters suggested that several of the functional status assessment items had low or non-response rates when used as part of PAC-PRD. These commenters requested that CMS provide additional information on how the measure has been updated to address these low response rates. Further, in the event no updates have been made, the commenters encouraged CMS to further investigate its use of items having low or high non-response rates and reconsider if all of these items remain necessary and appropriate for LTCHs.

Response: We appreciate the concerns related to any undue burden, including documentation, and take such concerns under consideration when selecting measures for the LTCHQR Program. In the specifications for the measure, we have included several gateway questions that allow the clinician to skip questions that are not pertinent, which we believe helps to reduce undue burden. For example, one item asks if the patient is or is not walking. If the patient is not walking, then the items “Walk 10 feet,” “Walk 50 feet with two turns” and “Walk 150 feet” do not require responses and are therefore skipped.
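To illustrate the gateway logic described above, the following minimal sketch shows how a skip pattern of this kind operates. The item labels are taken from the narrative; the function and variable names are hypothetical and do not correspond to actual LTCH CARE Data Set item codes.

# Minimal sketch of the gateway (skip-pattern) logic described above.
# Item labels come from the narrative; identifiers are illustrative only.
WALKING_ITEMS = ["Walk 10 feet", "Walk 50 feet with two turns", "Walk 150 feet"]

def walking_items_requiring_response(patient_is_walking: bool) -> list:
    """Return the walking items that require a clinician response.

    When the gateway item indicates the patient is not walking, the dependent
    walking items are skipped and no responses are required.
    """
    return WALKING_ITEMS if patient_is_walking else []

# Example: a non-walking patient skips all three dependent items.
assert walking_items_requiring_response(False) == []
assert walking_items_requiring_response(True) == WALKING_ITEMS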

With respect to the comments that some items had low response rates, we interpreted these comments to refer to the coding responses used when a patient does not or cannot attempt a daily activity; in that case, the activity did not occur, and the assessor reports a code indicating the reason the activity was not attempted (for example, “Medical Condition” or “Patient Refusal”). We interpret the comments pertaining to non-response as referring to missing data.

In the development of this measure, these items were included, despite their low response or high non-response rates, as a result of public input and expert opinion. However, we agree with the commenters that some of the self-care and mobility items may not be activities that many LTCH patients perform, particularly at the time of admission. An analysis of the PAC-PRD data showed that LTCHs had the highest percentage of codes for these items indicating “Activity Did Not Occur” as compared to IRFs, SNFs, and HHAs. We also agree that car transfers and picking up an object are challenging activities for LTCH patients. Based on prior public input received through public posting and technical expert panels, we had included these particular items because they were intended to aid in the further development of this measure using national data. However, these items can be removed without impacting the measure outcome in any significant manner.

Because commenters expressed concerns both about burden and about “Activity Did Not Occur” coding for data elements that pertain to specific physical activities or functions that would be assessed and addressed by a care plan, we believe that we should consider removing assessment items from this measure's specifications where possible, particularly when the items are duplicative of items already included, or would often be coded as “Activity Did Not Occur” (for example, due to medical condition) and would not be specifically meaningful in a care plan. We have therefore reviewed our measure and the assessment items needed to address all the key domains of function we proposed (for example, self-care, mobility, cognition, communication, and bladder function). We believe there are items that could be removed from the self-care and mobility domains because they potentially overlap with items that we would retain and because these items had high “Activity Did Not Occur” rates. Further, these items can be removed from the quality measure without affecting the measure substantively.

We followed a specific rationale in our consideration of the removal of these items: (1) That these particular items had high “Activity Did Not Occur” or high non-response rates; and/or (2) that the data elements to be removed were duplicative of the remaining data elements in the quality measure for the LTCH population. We determined this, for each item, based on data analysis and public comment, our review of the item definitions, and our review of the distribution of scores of LTCH patients from the PAC-PRD. Lastly, we evaluated our ability to finalize a modified measure, and we reviewed the modifications made through the removal of these items to ensure that the modification was not substantive in nature.

The data elements specifically analyzed for removal were: “Dressing upper body,” “Dressing lower body,” “Putting on/taking off footwear,” “Shower/bathe self,” “Car transfer,” “1 step,” “4 steps,” “12 steps,” “Walk 10 feet on uneven surfaces” and “Pick up object,” all of which we would remove from the measure specifications for Percent of Long-Term Care Hospital Patients with an Admission and Discharge Functional Assessment and a Care Plan That Addresses Function. Following our analysis, the following items have been finalized for removal, with the associated rationale:

  • The item “Dressing Upper Body” had high “Activity Did Not Occur” response rates and overlaps with the item “Wash Upper Body,” which we are retaining, in that both items pertain to upper body movement and the data captured for “Wash Upper Body” would represent the activity. The item “Dressing Lower Body” had high “Activity Did Not Occur” response rates and overlaps with “Toileting Hygiene,” which we are retaining, in that both items include managing lower body clothing.
  • The item “Putting on/taking off footwear” had high “Activity Did Not Occur” response rates and also overlaps with “Toileting Hygiene,” an item we are retaining. Although we are aware that the item “Toileting Hygiene” is associated with moderate “Activity Did Not Occur” response rates, we have decided to retain it based on feedback from technical expert panels convened by the measure contractor, the public comments from stakeholders, and the relevance of the item for every patient.
  • The item “Shower/bathe self” had high “Activity Did Not Occur” response rates and overlaps with the tasks involved with the item “Wash Upper Body,” which we are retaining.
  • The mobility items we are removing, “Walking 10 Feet on Uneven Surfaces,” “Car transfer,” “1 step,” “4 steps,” “12 steps” and “Pick up object,” had high non-response rates and overlap with items “Walk 150 Feet” and “Walk 50 feet with 2 turns,” which we are retaining.

As stated in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28263) and the December 1, 2013, MUC List (pp. 39-40, 194-95), this measure provides the percentage of all LTCH patients that receive a functional assessment on admission and discharge and a care plan that addresses function. We believe that this measure, as modified in response to public comment, is consistent with the description of the measure reviewed by the MAP, which did not specify the various functions assessed or addressed by a care plan. Moreover, we believe that modification of the quality measure through the removal of duplicative assessment items with low or high non-response rates does not substantively alter this measure's application or its calculation. We have previously explained that substantive measure changes would include “those in which the changes are so significant that the measure is no longer the same measure, or when a standard of performance assessed by a measure becomes more stringent (for example: Changes in acceptable timing of medication, procedure/process, or test administration)” or “where the NQF has extended its endorsement of a previously endorsed measure to a new setting, such as extending a measure from the inpatient setting to the LTCH setting.” (77 FR 53258, 53615 through 53616).

We believe that in this case, the standard of performance assessed by this measure would become less, not more, stringent due to the modifications, and the measure is not being extended to a new patient setting. Moreover, we believe that the modifications to the measure are not “so significant that the measure [would no longer be] the same measure,” as the measure numerator, denominator, and exclusions are unchanged. Therefore, we believe that the modified version of Percent of Long-Term Care Hospital Patients with an Admission and Discharge Functional Assessment and a Care Plan That Addresses Function would not be inconsistent with the descriptions of the measure reviewed by the MAP and that the modifications to the measure are not substantive in nature.

Thus, in response to public comments, we are modifying the proposed quality measure, Percent of Long-Term Care Hospital Patients with an Admission and Discharge Functional Assessment and a Care Plan That Addresses Function, through removal of the data items noted above. Specifically, the data elements we are removing for the reasons discussed above are the following: “Dressing upper body,” “Dressing lower body,” “Putting on/taking off footwear,” “Shower/bathe self,” “Car transfer,” “1 step,” “4 steps,” “12 steps,” “Walk 10 feet on uneven surfaces,” and “Pick up object.”

Comment: Several commenters expressed concerns about the need for standardized training to ensure inter-rater reliability for the CARE Tool function items, and noted that this training would add additional burden to facilities. Several commenters also suggested CMS identify the types of LTCH personnel who would collect and report measure data.

Response: We have addressed similar concerns in the past with public outreach, including training sessions, training manuals, Webinars, open door forums, help desk support, and a Web site that hosts training information (http://www.youtube.com/user/CMSHHSgov), and we will conduct such activities for the new items. All training materials are available on the Web site free of charge. Procedures for data collection, including who may complete functional assessments, are to follow facility policies, and patient assessments are to be completed in compliance with facility and applicable State and Federal requirements. We do not provide guidance on which disciplines may complete patient assessments.

Comment: Several commenters suggested that patients with program interruptions (that is, periods of time during which the patient is transferred from the LTCH to another care setting and subsequently returned to the same LTCH; see the LTCHQR Program Manual 2.0 for current definition—Chapter 3-A https://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/LTCH-Quality-Reporting/) be excluded from the quality measure, because it would be burdensome to collect data when the patient was transferred and then returned to the LTCH. A commenter explained that an interrupted stay patient is “discharged” from the LTCH and then “readmitted” to the LTCH within a certain fixed period under the 3-day or less interrupted stay policy and the greater than 3-day interrupted stay policy. Thus, the commenter felt it would be unnecessary to assess the patient's functional status at both points of admission and discharge since doing so may result in an inaccurate assessment of the patient's condition. The commenter also suggested that if interrupted stay patients are not excluded, then only the initial admission and the last discharge should be assessed for measure data collection purposes.

Response: For LTCH patients who experience one or more program interruptions (3 calendar days or less), completion of the function items would not be required during the program interruption, that is, at the time of the transfer to the acute care setting and the return. Patients with program interruptions are included in the quality measure, but, as the commenter suggested, assessments would only be required for the initial admission assessment and at the time of the discharge from the LTCH stay.

Comment: A commenter expressed general concern about the difficulty of assessing cognitive function in the LTCH patient population, including circumstances such as when any mind-altering medication was given to the patient. The commenter stated that cognitive assessment items have no provisions for accounting for such circumstances, nor could they, since any such mandate would interfere with clinical practice. Due to such considerations, the commenter questioned whether it was possible to accurately capture cognitive status via observational assessment, within two days of admission. The commenter noted that timely completion of the cognitive assessment items necessitates a clinician who is familiar with the patient, which in turn affects whether these items get completed on the admission assessment.

Another commenter stated that the cognitive function assessment tool does not adequately measure baseline cognition because of the variation in LTCH patients' clinical conditions and mental status. LTCH patients are extremely fragile, and their cognition is affected by small changes, such as the time of day and the clinical condition of the patient. The commenter also expressed the opinion that the cognitive function assessment tool provides a snapshot of a patient at a given time on a given day, and is not a true reflection of the patient's cognitive functioning. The commenter added that the expertise of a clinical psychologist would be required to complete this tool.

Thus, these two commenters felt that assessing the patient to complete each of the data elements for the measure would require LTCHs to expend significant time and resources reporting data whose value in measuring quality of care in the LTCH setting is questionable.

Response: We acknowledge the complexity of the LTCH patient population and the potential challenges that can limit certain assessments, for example, the inability to perform a cognitive assessment with a ventilator-dependent patient on sedation. We interpret the commenter to indicate that under such circumstances, it will not be feasible to accurately assess a patient's cognition at the beginning of the LTCH stay and that performing such an assessment would be an interruption in clinical care. We also interpret the commenter to indicate that the assessment would have no capacity to reflect recent administration of medications that impact cognitive status, although assessment of cognition is required for this measure. We agree that at the time of assessment there is information that cannot be obtained from certain patients, such as patients who are ventilator-dependent and on sedation, or patients who are comatose. We agree that there are circumstances in which an assessment cannot be performed, and in which it would be inappropriate to do so; hence, the assessment should allow LTCHs to indicate when the data could not be collected.

We will address these circumstances by providing instructions on when select items can be skipped due to patient conditions and gateway questions in the LTCH CARE Data Set Version 3.00. In the specifications for the measure, we have included several gateway questions that allow the clinician to skip questions that are not pertinent. For example, one item asks if the patient is or is not walking. If the patient is not walking, then the items “Walk 10 feet,” “Walk 50 feet with two turns” and “Walk 150 feet” do not require responses and are therefore skipped. We agree with the commenter that a clinician familiar with the patient would provide the most accurate assessment of the patients' status.

Consistent with the clinical standard of practice, timely admission assessments are conducted on all patient admissions by a clinician, typically by a registered nurse who obtains assessment information to inform care planning so that the care team can become familiar with the patient and develop and implement sound clinical care and interventions. Thus, from the time of admission to an LTCH, we believe that clinical staff should collect health assessment information about the patient to inform their care. Further, we believe that such assessment data would be captured by a clinician familiar with the LTCH patient.

We interpret the second commenter to indicate that the variability in LTCH patients' cognitive status would make it difficult to obtain a baseline for use in this measure. We also interpret the commenter's concern to be related to the importance of capturing causation in mental status change. Causative factors in cognitive change do not impact the calculation of the quality measure. The measure requires the collection of the Confusion Assessment Method (CAM®) in order to capture fluctuations in cognitive function. We selected the CAM® instrument after TEP review and following receipt of several comments from our initial public comment opportunity in March 2014.

We disagree with the commenter's statement that the expertise of a clinical psychologist would be needed to collect the cognitive status assessment. During the PAC-PRD, the CAM® was collected by staff at varying levels, not by clinical psychologists, with strong inter-rater reliability, and no evidence was found during that demonstration to support this concern. Furthermore, the CAM® was tested for use by bedside staff in the Minimum Data Set Version 3.0 and was implemented on October 1, 2010.

Both commenters suggest that capturing the assessment data would require LTCHs to expend significant time and resources to collect this measure, which they further suggest may not be valuable for this setting. We disagree with these commenters because the data collected for use in these measures is consistent with general clinical care, and the CAM® itself is a specific assessment that is already utilized in the health care setting.

Comment: Commenters suggested that CMS take into consideration the addition of a POA indicator in selected portions of the LTCH CARE Data Set. These commenters noted that a POA indicator would be important for performing any risk adjustment of functional status measures to allow for the distinction between complications associated with care at the LTCH and a patient's preexisting conditions.

Response: The admission functional assessment data collected for this quality measure would be based on the patient's functional status at the time of admission, and we would consider the initial assessment to be “present on admission.”

In addition to soliciting comments about our proposal to adopt the functional status measure for the LTCHQR Program, we also invited public comment on our proposal to use the LTCH CARE Data Set and QIES ASAP systems for data collection and submission of the functional status measure. We received no public comments on this proposal.

After consideration of the public comments we received, we are finalizing the adoption of the measure entitled Percent of Long-Term Care Hospital Patients with an Admission and Discharge Functional Assessment and a Care Plan That Addresses Function for use in the LTCHQR Program, with the modifications noted in our responses to public comments above.

(2) Functional Status Quality Measure: Functional Outcome Measure: Change in Mobility Among Long-Term Care Hospital Patients Requiring Ventilator Support

Section 1206(c) of Division B of Public Law 113-67, the Pathway for SGR Reform Act of 2013, amended section 1886(m)(5)(D) of the Act to add a new clause (iv) requiring the Secretary to establish by no later than October 1, 2015, “a functional status quality measure for change in mobility among inpatients requiring ventilator support.” Accordingly, the second functional status quality measure that we proposed was an outcome quality measure entitled the Functional Outcome Measure: Change in Mobility among Long-Term Care Hospital Patients Requiring Ventilator Support. This measure estimates the risk-adjusted change in mobility score between the time of admission and the time of discharge among LTCH patients requiring ventilator support at the time of admission. As noted above, LTCH patients often have functional limitations and receive rehabilitation therapy services so that they can become more independent when performing functional activities. Functional improvement is particularly relevant for patients who require ventilator support because these patients have traditionally had limited mobility due to cardiovascular and pulmonary instability, delirium, sedation, lack of rehabilitation therapy staff, and lack of physician referral.

Zanni et al., “Rehabilitation therapy and outcomes in acute respiratory failure: An observational pilot project.”

Several studies have examined functional improvement among patients in long-term care hospitals. In a sample of 101 patients in LTCHs (three-quarters were ventilator-dependent), median functional status scores using the Functional Status Score (FSS)-ICU (rolling, supine-to-sit transfers, unsupported sitting, sit-to-stand transfers, and ambulation) improved significantly from admission to discharge, with significant change in all five functional items. A separate study of 103 patients with respiratory failure examined functional improvement and found that by the end of the respiratory ICU stay, 69.4 percent of survivors ambulated more than 100 feet, 8.2 percent ambulated less than 100 feet, 15.3 percent could sit in a chair, 4.7 percent could sit on the edge of the bed, and 2.4 percent did not accomplish any of these activities.

A. Thrush, M. Rozek, and J.L. Dekerlegand, “The clinical utility of the functional status score for the intensive care unit (FSS-ICU) at a long-term acute care hospital: A prospective cohort study,” Phys Ther 92, no. 12 (2012).

P. Bailey et al., “Early activity is feasible and safe in respiratory failure patients,” Crit Care Med 35, no. 1 (2007).

The importance of monitoring improvement in mobility skills among LTCH patients who require ventilator support at the time of admission is also supported by the high prevalence of therapy service provision as part of the treatment plan and the percent of patients discharged home after an LTCH stay. In a study of 1,419 ventilator-dependent patients from 23 LTCHs with weaning programs, physical therapy, occupational therapy, and speech therapy were the three most commonly provided services among 34 procedures, services, and treatments provided during the LTCH admission. The very high frequency of physical (84.8 percent), occupational (81.5 percent), and speech (79.7 percent) therapy reflects use of the rehabilitative model of care adopted by many post-ICU weaning programs, which is important in the restoration of function. Improvement in functional status, including mobility and self-care, was noted from admission to discharge. Nearly 30 percent of all patients discharged alive returned directly home or to assisted living.

Scheinhorn et al., “Post-ICU mechanical ventilation at 23 long-term care hospitals: A multicenter outcomes study.”

Ibid.

Ibid.

A TEP convened by our measure development contractor provided input on the technical specifications of this quality measure. We also solicited public comment on the draft specifications of this quality measure on the CMS Quality Measures Public Comment Page (http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/MMS/CallforPublicComment.html) between February 21, 2014, and March 14, 2014, and received 22 responses from stakeholders with comments and suggestions.

Additional information regarding the quality measure may be found on our Web site at: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/LTCH-Quality-Reporting/.

We proposed that data for the proposed quality measure be collected through the LTCH CARE Data Set, with the submission through the QIES ASAP system. For more information on LTCHQR Program reporting using the QIES ASAP system, we refer readers to our Web site at: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/LTCH-Quality-Reporting/LTCHTechnicalInformation.html. We intend to revise the LTCH CARE Data Set to include new items that assess functional status and the risk adjustors, should this proposed measure be adopted. These items, which assess specific functional activities (that is, self-care, mobility, cognition, communication, and bladder continence), would be based on functional status items included in the Post-Acute Care Payment Reform Demonstration version of the CARE Tool. The items have been carefully developed and tested for reliability and validity.

Based on the evidence discussed above, we proposed to adopt for the LTCHQR Program for the FY 2018 payment determination and subsequent years the quality measure entitled Functional Outcome Measure: Change in Mobility among Long-Term Care Hospital Patients Requiring Ventilator Support. This quality measure was developed by CMS, and we plan to submit the quality measure to the NQF for review. The MAP met in December 2013 and January 2014, and the NQF provided the MAP's input to CMS as required under section 1890A(a)(3) of the Act. In its January 2014 Pre-Rulemaking Report, the MAP conditionally supported this proposed measure and stated that the measure concept is promising, but requires modification or further development, and that functional status is a critical area of measurement. Since the time of the MAP meeting, we have continued further development of the measure with input from technical experts, including empirical data analysis. Subsequently, we released draft specifications for the functional status quality measures and requested public comment between February 21, 2014, and March 14, 2014. We received 22 responses from stakeholders with comments and suggestions during the public comment period, and have updated the quality measures specifications based on these comments and suggestions. The updated specifications are available for review at the LTCHQR Program Web site at: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/LTCH-Quality-Reporting/index.html?redirect=/LTCH-Quality-Reporting/. We refer readers to section IX.C.2. of the preamble of this final rule for more information on the MAP.

In section 1886(m)(5)(D)(ii) of the Act, the exception authority provides that “[i]n the case of a specified area or medical topic determined appropriate by the Secretary for which a feasible and practical measure has not been endorsed by the entity with a contract under section 1890(a) of the Act, the Secretary may specify a measure that is not so endorsed as long as due consideration is given to measures that have been endorsed or adopted by a consensus organization identified by the Secretary.” We reviewed the NQF's consensus endorsed measures and were unable to identify any NQF-endorsed quality measures focused on improvement of function among patients in the LTCH setting. We are unaware of any other quality measures for functional improvement that have been endorsed or adopted by another consensus organization for the LTCH setting. Moreover, as discussed above, the Secretary is now required to establish such a measure by October 1, 2015. Therefore, we proposed to adopt this functional improvement measure for use in the LTCHQR Program for the FY 2018 payment determination and subsequent years under the Secretary's authority to select non-NQF-endorsed measures.

We invited public comments on our proposal to adopt the quality measure entitled Functional Outcome Measure: Change in Mobility among Patients Requiring Ventilator Support for the LTCHQR Program, with data collection starting on April 1, 2016, for the FY 2018 payment determination and subsequent years. We refer readers to section IX.C.9.c. of the preamble of this final rule for more information on the proposed data collection and submission timeline for this proposed quality measure.

Comment: Several commenters expressed support for the quality measure “Functional Outcome Measure: Change in Mobility Among Long-Term Care Hospital Patients Requiring Ventilator Support,” because functional improvement is an important patient-centered outcome. A few commenters noted that such improvements reduce the likelihood of infection, morbidity, mortality, and cost and significantly improve quality of life in this vulnerable population. A commenter emphasized the importance of improved functional status and improved, earlier mobility by those patients who are ventilated. Several commenters agreed with the MAP's recommendation to adopt functional status measures as part of the LTCHQR Program. Commenters agreed that functional status is an important measurement gap for LTCHs and supported CMS for proposing measures that address this measurement gap area. A commenter noted support for the use of common functional status items across acute and post-acute care settings. A commenter noted that this quality measure is required by public law.

Response: We appreciate the support for the quality measure entitled Functional Outcome Measure: Change in Mobility among Long-Term Care Hospital Patients Requiring Ventilator Support in the LTCH setting, and for the support of use of standardized functional status items across acute and post-acute care settings.

Comment: Many commenters expressed concern that the measure is not yet fully developed and is not NQF-endorsed. Several commenters noted a number of issues that CMS should consider in the development of these functional status quality measures.

Response: We agree that the NQF endorsement process is an important part of measure development and we have generally adopted NQF-endorsed measures whenever feasible. However, where such measures do not exist for the LTCH setting, as stated in our proposal and noted above, we may adopt measures that are not NQF-endorsed for the LTCHQR Program under the Secretary's exception authority set out in section 1886(m)(5)(D)(ii) of the Act.

We reviewed the NQF's consensus endorsed measures and were unable to identify any NQF-endorsed quality measures that focused on assessment of function for patients in the LTCH setting. We are unaware of any other quality measures for functional assessment that have been endorsed or adopted by another consensus organization for the LTCH setting. Therefore, we proposed to adopt this functional assessment measure for use in the LTCHQR Program for the FY 2018 payment determination and subsequent years under the Secretary's authority to select non-NQF-endorsed measures. We plan to submit an application for NQF endorsement in the fall of 2014.

Comment: While most commenters agreed that functional improvement was an important measure area for the LTCH population and some commenters noted that it addresses a measure area gap identified by the MAP, many commenters expressed concern that the measure is not yet fully developed and had only conditional support from MAP. They noted the MAP's conclusions that while the measure concept is promising, the measure is not ready for implementation and requires further modification and development. Commenters encouraged CMS to refrain from adopting any measures not fully supported by the MAP.

Response: We note that this function quality measure has been under development for more than 3 years. The steps we undertook as part of the measure development process have included an environmental scan, technical expert panel review, and public posting of specifications with public input. A report summarizing the TEP meetings titled “Summary of Feedback from TEP on the Development of Cross-Setting Functional Status Quality Measures” is available at: https://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/LTCH-Quality-Reporting/.

Since the time of the MAP meeting, we have continued further development of the measure, and we posted draft specifications for the functional status quality measures for public comment between February 21, 2014, and March 14, 2014. As discussed above, we received 22 responses from stakeholders with comments and suggestions during the public comment period and, based on these comments and suggestions, have updated the quality measures specifications, which are available at: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/LTCH-Quality-Reporting/index.html?redirect=/LTCH-Quality-Reporting/.

Comment: A commenter expressed concern that CMS did not convene a TEP for any of the new proposed quality measures, and noted that TEPs, “which evaluate . . . quality measures for importance, scientific soundness, usability, and feasibility,” are integral to developing health care setting-appropriate quality measures.

Response: Our measure contractor convened a cross-setting function quality measures TEP after a public call for TEP nominations. The TEP met in person on September 9, 2013, and then by Webinar on October 21, 2013, October 28, 2013, and November 6, 2013. TEP members included experts from LTCHs as well as IRFs and SNFs. A report summarizing recent TEP meetings focused on functional status quality measures titled “Summary of Feedback from TEP on the Development of Cross-Setting Functional Status Quality Measures” is available at: https://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/LTCH-Quality-Reporting/.

The functional status quality measure development builds upon work conducted as part of a project funded by the Assistant Secretary for Planning and Evaluation, and that project also included a cross-setting function TEP, which was held on August 15, 2012. A report summarizing that meeting is available at: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/Post-Acute-Care-Quality-Initiatives/Downloads/ASPE-Report-Analysis-of-Crosscutting-Medicare-Functional-Status-Quality-Metrics-Using-the-Continuity-and-Assessment-Record-and-Evaluation-CARE-Item-Set-Final-Report.pdf.

Comment: Several commenters conveyed concerns related to undue burden associated with data documentation for the functional status quality measure.

Response: In the measure specifications, we included several gateway questions that allow the clinician to skip questions that are not pertinent, which we believe helps to reduce undue burden. For example, one item asks if the patient is or is not walking. If the patient is not walking, then the items “Walk 10 feet,” “Walk 50 feet with two turns” and “Walk 150 feet” do not require responses and are therefore skipped.

Comment: A commenter questioned the value of this measure in the LTCH setting, given that many ventilator patients have no mobility at the time of admission. Another commenter noted that for some patients, the proposed measure may not be meaningful. The commenter added that in such cases, it may be appropriate to apply certain exclusions. Another commenter suggested the use of a process measure due to limited improvement in mobility for ventilator patients.

Response: Our analyses of the PAC-PRD data found that many patients admitted to LTCHs on a ventilator have very limited mobility skills on admission, but that many did show some improvement in mobility skills during the LTCH stay, including bed mobility skills. LTCH clinicians in the PAC-PRD appreciated that the items used in this measure could capture even small improvement. We also list exclusion criteria in the draft measure specifications document, including patients with complete tetraplegia and locked-in state, as well as patients with incomplete LTCH stays. We appreciate the commenter's suggestion on the use of a process measure, and we note that we are finalizing such a process measure that includes this population.

Comment: A commenter stated that in the testing of the CARE Tool, no analysis was reported of differences in functional scores at admission and discharge, thus calling into question whether there is adequate variability in change in function scores to result in a meaningful measure. This commenter stated that the Rasch analysis for assessing validity was not applied to the sensitivity of the measure for chronically and critically ill patients. The commenter concluded that if little difference in functional scores at admission and discharge is expected, then the meaningfulness of the measure is called into question.

Response: The change in self-care and mobility function for LTCH patients was reported in the Post-Acute Care Payment Reform Demonstration Final Report—Volume 4 available at: http://www.cms.gov/Research-Statistics-Data-and-Systems/Statistics-Trends-and-Reports/Reports/Downloads/PAC-PRD_FinalRpt_Vol4of4.pdf. Specifically, on page 57 of this Report, it is noted that the mean self-care change for all patients in the post-acute care setting was an improvement of 12.4 units of self-care function. Among patients with nervous system conditions, LTCH patients and SNF patients achieved very similar unadjusted change in self-care scores (10.4 and 10.1 units of self-care functional improvement, respectively). The mean mobility change for all post-acute care patients was 14.6 units of functional improvement. LTCH patients improved 11.2 units from admission to discharge, which is slightly more than the change in mobility observed among home health patients, which was 10.4 units of change. These results demonstrate that functional improvement was observed among LTCH patients using the function items from the CARE Tool. Our measure development analyses also showed improvement in mobility skills for patients requiring ventilator support.

Comment: In order to more fully evaluate the proposed functional outcome measure, a few commenters requested that CMS provide further details regarding the proposed methodology and expected utilization approach for the measure. Specifically, a commenter was interested in learning more about the risk adjustment procedures. A commenter expressed concern about the lack of a validated model to assess change in mobility among LTCH inpatients requiring ventilator support. Commenters suggested that any such tool would also need to include components for stratification based on comorbidities impacting a patient's ability to demonstrate functional improvement.

Response: The risk adjustment methodology is described in the draft quality measures specification document titled “Draft Specifications for the Proposed Functional Status Quality Measures for Long-Term Care Hospitals” available at the LTCHQR Program Web site at: https://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/LTCH-Quality-Reporting. The risk adjustment analyses are being conducted by the measure contractor and the regression coefficients (that is, weights) will be available as part of the NQF application. Risk adjustment for this measure includes variables that adjust for several comorbidities, including chronic kidney disease or dialysis; septicemia or other severe infections; metastatic lung, colorectal, bladder or other severe cancers; diabetes; paraplegia; and hemiplegia. We received several suggestions for risk adjustors as part of the March 2014 public comment process and have tested all suggested variables.
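To illustrate the general form of such a regression-based adjustment, the following is a minimal sketch in Python. The covariates mirror those listed above, but the intercept, coefficient values, function names, and patient data are hypothetical placeholders, not the weights that will accompany the NQF application.

```python
# Illustrative sketch of regression-based risk adjustment for change in mobility.
# Coefficient values below are hypothetical placeholders; the actual weights are
# being developed by the measure contractor for the NQF application.

HYPOTHETICAL_COEFFICIENTS = {
    "intercept": 8.0,
    "chronic_kidney_disease_or_dialysis": -1.5,
    "septicemia_or_severe_infection": -2.0,
    "metastatic_or_severe_cancer": -2.5,
    "diabetes": -0.5,
    "paraplegia": -3.0,
    "hemiplegia": -2.0,
}

def expected_mobility_change(patient_covariates: dict) -> float:
    """Predict the expected change in mobility score from admission to discharge."""
    expected = HYPOTHETICAL_COEFFICIENTS["intercept"]
    for covariate, weight in HYPOTHETICAL_COEFFICIENTS.items():
        if covariate != "intercept" and patient_covariates.get(covariate, 0):
            expected += weight
    return expected

def facility_risk_adjusted_score(patients: list) -> float:
    """Average difference between observed and expected mobility change."""
    residuals = [
        p["observed_change"] - expected_mobility_change(p["covariates"])
        for p in patients
    ]
    return sum(residuals) / len(residuals)

# Example: a facility whose ventilator patients improve more than expected
patients = [
    {"observed_change": 9.0, "covariates": {"septicemia_or_severe_infection": 1}},
    {"observed_change": 7.5, "covariates": {"diabetes": 1, "hemiplegia": 1}},
]
print(facility_risk_adjusted_score(patients))  # positive value = better than expected
```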

After consideration of the public comments we received, we are finalizing the adoption of the quality measure entitled Functional Outcome Measure: Change in Mobility among Long-Term Care Hospital Patients Requiring Ventilator Support for use in the LTCHQR Program, as proposed.

b. Quality Measure: National Healthcare Safety Network (NHSN) Ventilator-Associated Event (VAE) Outcome Measure

The third quality measure that we proposed was the CDC-developed National Healthcare Safety Network (NHSN) Ventilator-Associated Event (VAE) Outcome measure. The term “Ventilator-Associated Events” incorporates a range of ventilator-associated events, including ventilator-associated pneumonia (VAP), pulmonary edema, acute respiratory distress syndrome, sepsis, and atelectasis. The NHSN VAE Outcome measure provides increased measure sensitivity, more objective definitions for ventilator-associated conditions, and the potential for automated outcome detection. The NHSN VAE Outcome measure is designed for use across multiple inpatient care settings, including LTCHs. The measure specifications were created and tested in the acute care setting. During CY 2013, 105 LTCHs submitted VAE data to CDC's NHSN.

Klompas, M., Y. Khan, et al. (2011). “Multicenter Evaluation of a Novel Surveillance Paradigm for Complications of Mechanical Ventilation.” PLoS ONE 6(3): e18062.

Magill, S. S., M. Klompas, et al. (2013). “Developing a new, national approach to surveillance for ventilator-associated events*.” Crit Care Med 41(11): 2467-2475.

Data from CMS-CDC correspondence on February 10, 2014.

According to the CDC, “more than 300,000 patients receive mechanical ventilation in the United States each year.” These patients are at increased risk for infections, such as pneumonia and sepsis, as well as other serious complications, including pulmonary edema, pulmonary embolism, and death. These complications can lead to longer stays in the ICU and hospital, increased health care costs, and increased risk of disability or death. The estimated mortality rate in patients aged 85 years and older with acute lung injury on mechanical ventilation is 60 percent.

Centers for Disease Control and Prevention (CDC). Ventilator-Associated Event (VAE). January 2014. http://www.cdc.gov/nhsn/PDFs/pscManual/10-VAE_FINAL.pdf.

Esteban, A., A. Anzueto, et al. (2002). “Characteristics and outcomes in adult patients receiving mechanical ventilation: a 28-day international study.” JAMA 287(3): 345-355.

Klompas, M., Y. Khan, et al. (2011). “Multicenter Evaluation of a Novel Surveillance Paradigm for Complications of Mechanical Ventilation.” PLoS ONE 6(3): e18062.

Rubenfeld, G. D., E. Caldwell, et al. (2005). “Incidence and outcomes of acute lung injury.” N Engl J Med 353(16): 1685-1693.

Centers for Disease Control and Prevention (CDC). Ventilator-Associated Event (VAE). January 2014. http://www.cdc.gov/nhsn/PDFs/pscManual/10-VAE_FINAL.pdf.

Rubenfeld, G. D., E. Caldwell, E. Peabody, et al. (2005). “Incidence and outcomes of acute lung injury.” N Engl J Med 353(16): 1685-1693.

Ventilator-Associated Events represent a high-priority complication in the LTCH setting, given the older, medically complex population in LTCHs and the high prevalence of mechanical ventilation in this setting. A MedPAC analysis of MedPAR data found that 16 percent of LTCH patients used at least one ventilator-related service in 2012. In FY 2012, MS-LTC-DRG 207, a diagnosis-related group for respiratory diagnoses with ventilator support for 96 or more hours, represented the most frequently occurring diagnosis among LTCH patients, at 11.3 percent of all LTCH discharges, and MS-LTC-DRG 4, a diagnosis-related group for tracheostomy with ventilator support for 96 or more hours or a principal diagnosis except face, mouth, and neck without major O.R. procedure, represented an additional 1.3 percent of all LTCH discharges. Together, the two diagnosis-related groups account for a total of nearly 18,000 discharges. Furthermore, the number of ventilated patients in LTCHs is increasing: the number of discharged patients with a respiratory diagnosis with ventilator support for 96 or more hours increased 7.4 percent between 2008 and 2011.

MedPAC “Report to Congress: Medicare Payment Policy” Chapter 11 “Long-term care hospital services.” March 2014. http://www.medpac.gov/chapters/Mar14_Ch11.pdf.

Ibid.

Ibid.

Although there are no nationwide or LTCH-specific estimates of the prevalence of ventilator-associated conditions (VACs) and infection-related ventilator-associated complications (IVACs), a recent study of mechanically ventilated patients in ICUs found that approximately 10 percent developed a VAC and 5 percent developed an IVAC. Adherence to clinical practice guidelines for the prevention of VAP has been associated with decreased VAC rates in ICUs. Because VAP, one type of VAC, is considered preventable, surveillance and measurement of infection rates are important to improving quality of care and patient safety.

Muscedere, J., T. Sinuff, et al. (2013). “The clinical impact and preventability of ventilator-associated conditions in critically ill patients who are mechanically ventilated.” Chest 144(5): 1453-1460.

Ibid.

The importance of the NHSN VAE Outcome measure in LTCHs was underscored by the MAP, which stated in its January 2014 Pre-Rulemaking Report that the measure addresses a National Quality Strategy aim or priority that is currently not adequately addressed. The MAP supported the addition of this measure addressing VAEs in the LTCH setting and stated that “although this measure is not NQF-endorsed, it provides useful information for healthcare facilities to help them monitor ventilator use and identify improvements for preventing complications.”

National Quality Forum. Measure Applications Partnership Pre-Rulemaking Report: Public Comment Draft: January 2014. Available: http://www.qualityforum.org/map/.

We reviewed the NQF's consensus endorsed measures and were unable to identify any NQF-endorsed measures for VAEs in the LTCH setting (or a related setting). We are unaware of any other measures for VAEs that have been endorsed or adopted by another consensus organization for the LTCH setting (or a related setting). Therefore, we proposed to adopt the NHSN VAE Outcome measure for use in the LTCHQR Program for the FY 2018 payment determination and subsequent years under the Secretary's authority to select non-NQF-endorsed measures under section 1886(m)(5)(D)(ii) of the Act.

We proposed to use the CDC's NHSN reporting and submission infrastructure for reporting of the NHSN VAE Outcome measure. Details related to the procedures for using CDC's NHSN for data submission and information on definitions, numerator data, denominator data, data analyses, and measure specifications for the NHSN VAE Outcome measure can be found at: http://www.cdc.gov/nhsn/PDFs/pscManual/10-VAE_FINAL.pdf.

CDC's NHSN is the data collection and submission framework currently used for reporting the CAUTI (NQF #0138) and CLABSI (NQF #0139) measures for the LTCHQR Program. Further, CDC's NHSN is the data collection and submission framework adopted for data collection and reporting for the Influenza Vaccination Coverage among Healthcare Personnel measure (NQF #0431) starting on October 1, 2014, and for the NHSN Facility-Wide Inpatient Hospital-Onset Methicillin-Resistant Staphylococcus aureus (MRSA) Bacteremia Outcome Measure (NQF #1716) and NHSN Facility-Wide Inpatient Hospital-Onset Clostridium difficile Infection (CDI) Outcome Measure (NQF #1717) starting on January 1, 2015. By building on the CDC's NHSN reporting and submission infrastructure, we intend to reduce the administrative burden related to data collection and submission for this measure under the LTCHQR Program. We refer readers to section IX.C.9.d. of the preamble of this final rule for more information on the data collection and submission timeline for this quality measure.

We invited public comments on our proposal to adopt the NHSN VAE Outcome measure for the LTCHQR Program, with data collection beginning on January 1, 2016, for the FY 2018 payment determination and subsequent years. We also invited public comments on our proposal to use the CDC's NHSN for data collection and submission for this measure.

Comment: Many commenters agreed that the NHSN VAE Outcome measure is an appropriate quality measurement area for the general LTCH patient population. Several commenters agreed with the NQF MAP's recommendation to adopt HAI measures as part of the LTCHQR Program. Commenters agreed that HAI measures represent an important measurement gap for LTCHs and supported CMS' proposal of a measure that addresses this measurement gap area. A commenter noted that the NHSN VAE Outcome measure is well aligned with the newly identified chronically critically ill (CCI) category of patients.

Response: We thank these commenters for their support of our effort to implement HAI measures that address important measurement gap areas identified by the NQF MAP and other stakeholder groups.

Comment: Some commenters fully supported CMS' proposal to implement the NHSN VAE Outcome measure for the FY 2018 payment year. These commenters agreed with CMS' rationale that VAEs represent a high-priority complication in the LTCH setting and appreciated CMS' consideration for the utility of this measure given that it can be used across multiple settings.

Some commenters specifically noted that the measure offers a mechanism for LTCHs that provide long-term mechanical ventilation to objectively measure the impact of care improvement initiatives. Furthermore, these commenters stated that reporting the NHSN VAE Outcome measure would raise awareness of the medical detriments of extended time on mechanical ventilation and would encourage facilities to implement strategies to reduce time on mechanical ventilation. Further, these commenters noted that the foundational elements of the VAE definition (positive end-expiratory pressure (PEEP), fraction of inspired oxygen (FiO2), temperature, and white blood cell count (WBC)) are readily available, objective, rational, and reportable. The commenters stated that measuring and reporting VAE along with tracking care improvement initiatives could help to quantify the extent to which VAEs are preventable.

Response: We appreciate these commenters' support of our proposal and rationale to implement the NHSN VAE Outcome measure.

Comment: Many commenters expressed concern that the measure is not NQF-endorsed, though several commenters noted that the measure is supported by the MAP. Commenters underscored the importance of the NQF review processes, emphasizing that NQF-endorsement provides assurance that the measure has been tested, can reliably and accurately collect data, is feasible to implement, and is usable. For these reasons, commenters encouraged CMS to refrain from adopting measures into the LTCHQR Program until they have been endorsed by the NQF for use in the LTCH setting. Commenters also emphasized the importance of review by the NQF via the full consensus development process, stating that time-limited endorsement from the NQF is insufficient to consider a measure for adoption in the LTCHQR Program. In addition to securing NQF-endorsement, commenters encouraged CMS to refrain from adopting any measures not supported by the NQF MAP and a TEP.

Response: We agree that the NQF-endorsement process is an important part of measure development, and we have generally adopted NQF-endorsed measures whenever feasible. However, where such measures do not exist for the LTCH setting, as stated in our proposal, we may adopt measures that are not NQF-endorsed under the Secretary's exception authority set out in section 1886(m)(5)(D)(ii) of the Act. As also stated in our proposal, we reviewed the NQF's consensus endorsed measures for VAEs and were unable to identify an NQF-endorsed measure for the LTCH setting. We note that the CDC has conveyed to us that it received preliminary positive feedback from the NQF on the NHSN VAE Outcome measure and plans to submit the measure for NQF endorsement in 2015.

In addition, the NQF MAP supported the use of this measure in the LTCHQR Program and concluded that “although this measure is not NQF-endorsed, it provides useful information for healthcare facilities to help them monitor ventilator use and identify improvements for preventing complications.” Because the NHSN VAE Outcome measure was developed for use in multiple inpatient settings, including LTCHs, and because several stakeholder groups have agreed that the measure provides useful information that can prevent ventilator-associated events and impact patient outcomes, we believe the measure is appropriate for implementation in the LTCHQR Program.

National Quality Forum. Measure Applications Partnership Pre-Rulemaking Report: Final Report January 2014. Available: http://www.qualityforum.org/map/ (page 31).

Comment: A commenter questioned the appropriateness of the NHSN VAE Outcome measure for the LTCH patient population since the primary focus of care for the LTCH patient may include aggressive ventilator weaning. This commenter expressed concern that the definitions for VAE surveillance used in the NHSN VAE Outcome measure are different from the patient outcomes and clinical indicators of VAEs, such as VAP, used in LTCHs. Further, this commenter noted that the surveillance monitoring approach used by the NHSN VAE Outcome measure does not align with LTCH patient goals (which often include aggressive ventilator weaning). Since LTCHs typically use identification of a symptomatic patient and laboratory culture results to identify VAEs, the commenter stated that implementing surveillance monitoring (in particular, ongoing monitoring of positive end-expiratory pressure and fraction of inspired oxygen) to adhere to the NHSN algorithm would be difficult and taxing in the LTCH setting and would divert resources away from other, more valuable monitoring and prevention efforts.

Response: Although we recognize that the implementation of this measure adds burden for LTCHs, the NHSN VAE measure was developed to be more sensitive to VAEs than other VAE identification measures and to be more objective. The VAE algorithm avoids the use of chest radiographs and reliance on specific clinical signs and symptoms to identify VAP because of their subjectivity and the fact that they may be poorly or inconsistently documented in the medical record. The VAE surveillance definition algorithm used in the NHSN VAE Outcome measure was developed by a workgroup based on objective, streamlined, and potentially automatable criteria intended to identify a broad range of conditions and complications occurring in mechanically ventilated adult patients. The measure was designed for use across several different healthcare settings, including LTCHs, and in 2013, 105 LTCHs successfully submitted VAE data to CDC's NHSN, indicating that LTCHs were able to implement and operationalize this measure in their facilities. The NHSN VAE Outcome measure was also developed to facilitate potential automated outcome detection, which will contribute to increased objectivity and decreased burden on LTCHs.

Magill, S. S., M. Klompas, et al. (2013). “Developing a new, national approach to surveillance for ventilator-associated events*.” Crit Care Med 41(11): 2467-2475.

Centers for Disease Control and Prevention (CDC). Ventilator-Associated Event (VAE). January 2014. http://www.cdc.gov/nhsn/pdfs/pscManual/10-VAE_FINAL.pdf.

Centers for Disease Control and Prevention (CDC). Ventilator-Associated Event (VAE). January 2014. http://www.cdc.gov/nhsn/pdfs/pscManual/10-VAE_FINAL.pdf.

Data from CMS-CDC correspondence on February 10, 2014.
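To illustrate the kind of objective, potentially automatable screening logic described in the response above, the following is a simplified sketch in Python. The thresholds, window lengths, and function names are illustrative simplifications built around the foundational elements noted by commenters (PEEP, FiO2, temperature, and WBC); the sketch does not reproduce the CDC's full NHSN VAE surveillance protocol.

```python
# Simplified, illustrative screen for a ventilator-associated condition (VAC) and
# an infection-related ventilator-associated complication (IVAC). Thresholds and
# window lengths are simplifications for illustration only and do not reproduce
# the full CDC NHSN VAE surveillance protocol.

from typing import List, Optional

def detect_vac(daily_min_fio2: List[float], daily_min_peep: List[float],
               stable_days: int = 2, worse_days: int = 2) -> Optional[int]:
    """Return the day index on which a VAC-like deterioration begins, else None.

    A deterioration is flagged when, after `stable_days` of stable or improving
    daily minimum FiO2 and PEEP, FiO2 rises by >= 0.20 or PEEP rises by >= 3 cm H2O
    and the rise is sustained for `worse_days`.
    """
    n = len(daily_min_fio2)
    for day in range(stable_days, n - worse_days + 1):
        baseline_fio2 = min(daily_min_fio2[day - stable_days:day])
        baseline_peep = min(daily_min_peep[day - stable_days:day])
        fio2_worse = all(f >= baseline_fio2 + 0.20
                         for f in daily_min_fio2[day:day + worse_days])
        peep_worse = all(p >= baseline_peep + 3
                         for p in daily_min_peep[day:day + worse_days])
        if fio2_worse or peep_worse:
            return day
    return None

def detect_ivac(vac_day: int, temps: List[float], wbcs: List[float],
                new_antimicrobial_days: int) -> bool:
    """Flag an IVAC-like event: a VAC plus an abnormal temperature or white blood
    cell count near the deterioration and a new antimicrobial continued for days."""
    window = range(max(0, vac_day - 2), min(len(temps), vac_day + 3))
    abnormal = any(temps[d] < 36.0 or temps[d] > 38.0 or
                   wbcs[d] <= 4.0 or wbcs[d] >= 12.0 for d in window)
    return abnormal and new_antimicrobial_days >= 4

# Example: two stable days on the ventilator, then a sustained rise in FiO2
fio2 = [0.40, 0.40, 0.65, 0.70, 0.70]
peep = [5, 5, 5, 5, 5]
print(detect_vac(fio2, peep))  # 2 (deterioration begins on the third day)
```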

Comment: A commenter expressed concern about the validity of certain aspects of the NHSN VAE Outcome measure. The commenter noted that the NHSN VAE Outcome measure is used as a way to capture ventilator-associated pneumonia but asserted that the VAP portion of this measure is no longer valid or effective.

Response: The NHSN VAE algorithm was developed and carefully tested to be more sensitive to VAEs (including VAPs) than other VAE measures and to be more objective. The algorithm is based on objective, streamlined, and potentially automatable criteria and was developed and tested for a range of healthcare settings, including LTCHs.

Magill, S. S., M. Klompas, et al. (2013). “Developing a new, national approach to surveillance for ventilator-associated events*.” Critical Care Medicine 41(11): 2467-2475.

Centers for Disease Control and Prevention (CDC). Ventilator-Associated Event (VAE). January 2014. http://www.cdc.gov/nhsn/pdfs/pscManual/10-VAE_FINAL.pdf.

Research indicates that the VAE algorithm detects clinical conditions, such as pneumonia, ARDS, atelectasis, and pulmonary edema, that may be preventable. For purposes of comparison and benchmarking, the overall VAE rate, which represents all events that met at least the VAC definition, and the “IVAC-plus” rate, which represents all events that met at least the IVAC definition, would be suitable, and all facilities should be able to detect VACs and IVACs. Rates of individual events (for example, “VAC only,” “IVAC only,” “possible VAP,” and “probable VAP”) could be used by LTCHs as internal quality improvement measures. The “possible VAP” and “probable VAP” definitions were developed for internal quality improvement purposes rather than inter-facility comparisons because practices within and among facilities with regard to diagnostic testing of respiratory tract samples vary widely, making these definitions less suitable for surveillance definitions that could potentially be used to make such comparisons in the future. Using the third tier of VAE (“possible or probable VAP”) for public reporting, benchmarking, or comparison purposes is therefore not recommended.
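The tier structure described above lends itself to a straightforward roll-up. The short sketch below, which assumes each event has already been classified to the highest tier it met, illustrates how an overall VAE rate and an “IVAC-plus” rate could be tallied; the per-1,000-ventilator-day denominator and the function names are illustrative assumptions, not the NHSN calculation itself.

```python
# Illustrative roll-up of per-event tier classifications into the summary rates
# discussed above. Assumes each event is labeled with the highest tier it met;
# the per-1,000-ventilator-day denominator is used here only for illustration.

TIER_ORDER = {"VAC": 1, "IVAC": 2, "POSSIBLE_OR_PROBABLE_VAP": 3}

def vae_rates(event_tiers, ventilator_days):
    """Return (overall VAE rate, IVAC-plus rate) per 1,000 ventilator days."""
    overall = sum(1 for t in event_tiers if TIER_ORDER[t] >= 1)    # met at least VAC
    ivac_plus = sum(1 for t in event_tiers if TIER_ORDER[t] >= 2)  # met at least IVAC
    return (1000 * overall / ventilator_days, 1000 * ivac_plus / ventilator_days)

events = ["VAC", "IVAC", "POSSIBLE_OR_PROBABLE_VAP", "VAC"]
print(vae_rates(events, ventilator_days=2500))  # (1.6, 0.8)
```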

Comment: Several commenters expressed concerns regarding recent changes in the NHSN VAE Outcome measure algorithm and definitions, which were updated in January 2013 and July 2013, with additional modifications made in January 2014. A commenter noted that the updated algorithm has been in place for a relatively short period of time (implemented in the NHSN in 2013); thus, the commenter questioned whether data submitted under the new algorithm has been analyzed and validated, particularly in the LTCH setting. The commenter encouraged CMS to exercise caution in adopting the NHSN VAE Outcome measure as part of the LTCHQR Program since the measure was created and tested in the acute care hospital setting and the updated algorithm is still fairly new.

Another commenter expressed similar concerns, noting that the NHSN continues to modify the data collection algorithm based on assessment and user feedback. This commenter mentioned that the NHSN has not yet provided comparative data to enable facilities to set adequate benchmarks for targets. Another commenter noted that further experience with VAE surveillance in the LTCH setting is necessary before moving forward with the adoption of this as a quality measure. As an example of an issue requiring clarification, this commenter asked whether the epidemiology of VAE differs in the LTCH setting, where tracheostomies are largely predominant.

Another commenter stated that, following the recent change, the NHSN algorithm no longer uses the Ventilator-Associated Pneumonia (VAP) bundle. The commenter also stated that the previously used VAP bundle, referenced in the proposed rule, was applicable to intensive care units, is outdated, and is not an appropriate measure for LTCHs.

Response: We agree that comprehensive measure testing is an important part of measure development. The CDC algorithm was developed for several healthcare settings and initial testing was conducted in acute care facilities prior to implementation. As one commenter pointed out, the CDC continues to test the algorithm and to modify it based on assessment and end-user feedback. CDC has implemented an ongoing process to continually improve this measure and ensure it is up to date and reflects the most recently available testing and user feedback results. Although more testing and validation is helpful to inform the use of a measure, based on evidence cited, measure testing conducted to date, and the fact that 105 LTCHs collected and reported data to the CDC's NHSN in CY 2013, we believe the impact this measure could have on the quality of care and patient outcomes supports our proposal to implement this measure starting January 1, 2016.

Data from CMS-CDC correspondence on February 10, 2014.

We will continue to work closely with CDC to review measure testing results and feedback on an ongoing basis and continue to assess the validity of this measure and its impact on the quality of care in LTCHs. Further, CMS and CDC will develop and provide guidance to LTCHs to support the implementation of this measure, including clarification on measure specifications. This guidance will be informed by the current and ongoing CDC NHSN experience with VAE surveillance in the LTCH setting.

Finally, we agree with the commenter who stated that the former VAP bundle is out of date; hence, we have not adopted this bundle for implementation in the LTCHQR Program.

Comment: Commenters requested clarification regarding how CMS intends to define VAEs in the LTCH setting. These commenters encouraged CMS to report only the two standardized infection ratios (SIRs) listed in the NHSN specifications for the measure: VACs and IVACs. The commenters referred to the proposed rule, which states that VAE “incorporates a range of ventilator-associated events, including ventilator-associated pneumonia (VAP), pulmonary edema, acute respiratory distress syndrome, sepsis, and atelectasis” (79 FR 28267). The commenters clarified that according to the current specifications, VAE is defined not by the five aforementioned clinical conditions, but instead by quantitative changes in specific pathophysiologic parameters, including a decline in a patient's oxygenation level after a period of stability or improvement on the ventilator, evidence of infection or inflammation (for example, elevated body temperature), and laboratory evidence of respiratory infection. Commenters noted that the pathophysiologic changes which define VACs and IVACs could be due to a variety of clinical conditions including, but not limited to, those mentioned in the proposed rule. These commenters underscored that, as suggested by the current specifications, the use of quantitative parameters is appropriate at this time because available definitions of specific conditions leading to VAEs are fairly subjective, which could lead to unreliable or invalid data collection and reporting. Commenters noted that, as specified by the NHSN, the NHSN VAE Outcome measure reports two SIRs, VAC and IVAC, which are not intended to be a “roll-up” of the five clinical conditions listed in the proposed rule. The commenters encouraged CMS to report the measure in a manner consistent with those specifications.

Response: Our intent for the NHSN VAE Outcome measure as part of the LTCHQR Program is to collect and report data in alignment with NHSN measure specifications. Specifically, we will collect and report data on the two SIRs (VAC and IVAC) in alignment with the NHSN specifications. The measure would not be reported via a “roll-up,” or combined prevalence or incidence count of the five clinical conditions mentioned in the comment (ventilator-associated pneumonia (VAP), pulmonary edema, acute respiratory distress syndrome (ARDS), sepsis, and atelectasis). In the event that the measure specifications are revised through ongoing measure development by the CDC, the measure steward, we will align the data collection and reporting for the measure with revised measure specifications.
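As general background on the reporting described in this response, an NHSN standardized infection ratio compares the number of observed events to a statistically predicted number derived from baseline risk models. The sketch below illustrates that ratio for the VAC and IVAC categories; the counts and function name are hypothetical, and the predicted values would in practice come from NHSN baseline models rather than the placeholder numbers shown.

```python
# Illustrative computation of standardized infection ratios (SIRs) for the two
# reported categories (VAC and IVAC). The "predicted" counts would come from
# NHSN baseline risk models; the values below are hypothetical.

def standardized_infection_ratio(observed: int, predicted: float) -> float:
    """SIR = observed events / predicted events (undefined if predicted is 0)."""
    if predicted <= 0:
        raise ValueError("Predicted events must be positive to compute an SIR.")
    return observed / predicted

facility_data = {
    "VAC": {"observed": 6, "predicted": 4.8},    # hypothetical counts
    "IVAC": {"observed": 2, "predicted": 2.5},
}

for category, counts in facility_data.items():
    sir = standardized_infection_ratio(counts["observed"], counts["predicted"])
    print(f"{category} SIR: {sir:.2f}")  # > 1.0 means more events than predicted
```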

Comment: A commenter expressed concerns about the NHSN VAE Outcome measure based on recent publications (Klouwenberg et al., 2014, and Lilly et al., 2014) and noted that these studies demonstrate that the new definition of VAE has poor sensitivity for detecting clinically verified VAP. The commenter expressed concern about the appropriateness of developing a quality measure based upon a clinical definition that research has shown to have poor sensitivity. The commenter encouraged CMS to work with stakeholders to improve the VAE definition before implementing the NHSN VAE Outcome measure.

Klein Klouwenberg PM, van Mourik MS, Ong DS, Horn J, Schultz MJ, Cremer OL, Bonten MJ; MARS Consortium. Electronic Implementation of a Novel Surveillance Paradigm for Ventilator-associated Events. Feasibility and Validation. Am J Respir Crit Care Med. 2014 Apr 15;189(8):947-55.

Lilly CM, Landry KE, Sood RN, Dunnington CH, Ellison RT 3rd, Bagley PH, Baker SP, Cody S, Irwin RS; for the UMass Memorial Critical Care Operations Group. Prevalence and Test Characteristics of National Health Safety Network Ventilator-Associated Events. Crit Care Med. 2014 May 7. [Epub ahead of print].

Response: We appreciate the commenter's concern regarding the sensitivity of the measure for detecting clinically verified VAP. Ultimately, VAP is a clinical diagnosis made by taking into account several pieces of information at the bedside, and there is no universally accepted standard approach on which all LTCHs agree. With this in mind, the intent of VAE surveillance is not to provide a new surveillance definition of VAP, but instead to provide an objective measure, based on information that should be available for any patient on mechanical ventilation in any facility, that captures a broad range of conditions and complications in patients on mechanical ventilation, recognizing that infections are not the only potentially preventable complications of mechanical ventilation. Research indicates that the VAE algorithm detects clinical conditions that may be preventable, including, but not limited to, pneumonia, ARDS, atelectasis, and pulmonary edema.

Comment: Several commenters recommended that CMS delay the January 1, 2016, implementation start date for this measure. A commenter recommended CMS delay implementation until data submitted under the new VAE algorithm is reviewed for reliability and in order to allow time to support adequate training and resources for VAE data collection. Several commenters expressed a need for the NHSN VAE Outcome measure to be further tested and refined for the LTCH setting before it is adopted for use in the LTCHQR Program.

A few commenters expressed concern that the NHSN VAE Outcome measure was developed and tested in the acute care setting and recommended that CMS exercise caution in implementing the measure in the LTCH setting. A commenter stated that there is a need for better data on VAEs and their responsiveness to quality improvement programs before the measure is considered suitable for inter-facility comparisons or pay-for-performance programs. This commenter asked that implementation of the NHSN VAE Outcome measure be delayed until the measure can be validated in the LTCH setting, more is known about what portion of VAEs is preventable, and risk adjustment strategies for the measure have been developed. Another commenter expressed similar concerns about the lack of LTCH-specific data currently available under the new VAE algorithm and stated that implementation of the measure in the LTCHQR Program would be premature until further data are available and standards of care are developed for preventing VAEs.

Response: The CDC algorithm was developed for several health care settings, including LTCHs. While initial testing was conducted in the acute care setting, the CDC continues to test the algorithm and to modify it based on assessment and end-user feedback. Further, LTCHs are acute care facilities; hence, while setting-specific testing is important, based on the extensive evidence cited in our proposal, we believe that the impact this measure could have on the quality of care and patient outcomes in the LTCH setting justifies implementing this measure beginning January 1, 2016. CMS will continue to work closely with the CDC to review measure testing results and feedback on an ongoing basis and to continue to assess the validity and reliability of this measure and its impact on the quality of care in LTCHs.

Comment: A few commenters expressed concern about the resource-intensive nature of data collection for this measure. A commenter expressed concern about the limitations of existing resources in LTCHs and noted that implementation of the measure will divert resources to NHSN VAE Outcome measure data collection and reporting activities and away from other valuable prevention activities. A commenter noted that some LTCHs do not have EHRs to facilitate data collection for this measure. Another commenter noted the complexity of the measure algorithm and the variety of patient scenarios that could be implicated and stated that these represent additional burden in collecting data for the NHSN VAE Outcome measure.

Response: Based on the evidence cited in our proposal, we believe the impact this measure could have on quality of care and patient outcomes justifies the additional resources needed for measure data collection. We recognize that the implementation of this measure adds data collection and reporting burdens for facilities; however, we believe the initial burden to implement the measure and train staff is necessary to improve the quality of care for patients in LTCHs. In addition, in 2013, approximately 25 percent (n=105) of all currently Medicare-certified LTCHs reported data on this measure to CDC's NHSN; this is evidence in support of the feasibility of implementing this measure in the LTCH setting. In addition, this measure was developed to facilitate potential automated outcome detection, which could eventually lead to decreased burden for LTCHs.

Data from CMS-CDC correspondence on February 10, 2014.

Further, CMS and CDC will undertake training and stakeholder communication and outreach efforts in CY 2015 and CY 2016 to support the implementation of this measure in the LTCHQR Program, similar to our ongoing efforts since 2012 to support the implementation of previously adopted measures, including the CLABSI, CAUTI, and Healthcare Professional Influenza Vaccination measures.

In addition to soliciting comments on our proposal to adopt the NHSN VAE Outcome measure for the LTCHQR Program, we also invited comments on our proposal to use the CDC's NHSN system for data collection and submission for this measure.

We received no comments on the use of the NHSN system for data collection and submission of the VAE Outcome measure. Therefore, we are finalizing the National Healthcare Safety Network (NHSN) Ventilator-Associated Event (VAE) Outcome Measure, as proposed, for the FY 2018 payment determination and subsequent years.

8. LTCHQR Program Quality Measures and Concepts Under Consideration for Future Years

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28268 through 28269), we stated that we are considering whether to propose one or more of the quality measures and quality measure topics listed in the table below for future years in the LTCHQR Program. We invited public comments on these measures and measure topics. We specifically invited public comments regarding the clinical importance of these measures and measure topics in the LTCH setting, feasibility of data collection and implementation, current use of these measures and measure topics in the LTCH setting, and the usability of data for these measures and measure topics to inform future quality improvements in the LTCH setting.

Future Measures and Measure Topics Under Consideration for Proposal for the LTCH Quality Reporting Program

National Quality Strategy Priority: Patient Safety
• Measures addressing Ventilator Bundle.
• Measures addressing avoidable injuries secondary to polypharmacy.
• Application of Hospital-Based Inpatient Psychiatric Services (HBIPS)-2 Hours of Physical Restraint Use (NQF #0640).
• Application of Percent of Residents Who Were Physically Restrained (Long Stay) (NQF #0687).
National Quality Strategy Priority: Effective Clinical Processes
• Severe Sepsis and Septic Shock: Management Bundle.
• Venous Thromboembolism Prophylaxis (NQF #0371).
• Ventilator Weaning Rate.
• Pain Management.
National Quality Strategy Priority: Patient- and Caregiver-Centered Care
• Depression Assessment and Management.
• Application of Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS) (NQF #0166).
• Measures addressing patients' experience of care.
• Measures addressing pain control—patients' preference.
National Quality Strategy Priority: Communication and Coordination of Care
• Application of Medication Reconciliation (NQF #0097).
• Application of Medication Reconciliation Post-Discharge (NQF #0554).
• Reconciled Medication List Received by Discharged Patients (Discharges from an Inpatient Facility to Home/Self Care or Any Other Site of Care) (NQF #0646).
• Transition Record with Specified Elements Received by Discharged Patients (Discharges from an Inpatient Facility to Home/Self Care or Any Other Site of Care) (NQF #0647).
• Timely Transmission of Transition Record (Discharges from an Inpatient Facility to Home/Self Care or Any Other Site of Care) (NQF #0648).
• Measures addressing care transitions.

Comment: Commenters supported the addition of patient experience of care measures for use in the LTCHQR Program. Specifically, a commenter supported an application of the HCAHPS survey for use in the LTCHQR Program. The commenter supported the collection of patient and caregiver experience through surveys that provide feedback that only a patient or their caregiver can provide. The commenter urged CMS to undertake the necessary testing to modify the HCAHPS survey to be appropriate for use within the LTCHQR Program. The commenter suggested some modifications to the HCAHPS that would be necessary prior to implementation. These include testing the HCAHPS questions in LTCHs and the addition of new questions about key topics relevant to the LTCH patient population. The commenter added that for many patients proxy respondents would be necessary to achieve a reliable response rate.

Response: We appreciate the commenters' support of the HCAHPS survey in the LTCHQR Program, and we will take their recommendations into consideration in our measure development and testing efforts, as well as in our ongoing efforts to identify and propose appropriate measures for the LTCHQR Program in the future.

Comment: Commenters noted the “Severe Sepsis and Septic Shock: Management Bundle” was not ready for use in the LTCHQR Program. A commenter noted that the bundle was endorsed for the acute care hospital setting and would need refinement and testing for use in the LTCH setting. Another commenter recommended additional review of “Severe Sepsis and Septic Shock: Management Bundle” before proposing the bundle as a formal measure. The commenter noted that although sepsis is one of the leading causes of hospitalization and readmissions and results in significant morbidity, mortality, and increased cost in health care, the current bundle definition, including central line placement and central hemodynamic monitoring, may have other unintended consequences. The commenter underscored the NQF Patient Safety Standing Committee's recent recommendation that the item requiring measurement of central venous pressure be removed from this bundle. The commenter added that this recommendation is based on recent literature published on sepsis protocols, which found no significant benefit of the mandated use of central venous catheterization and central hemodynamic monitoring in all patients.

Response: We appreciate the commenters' acknowledgement of the significant burden sepsis can cause on health care outcomes and costs. We will take their comments regarding this measure into consideration in our measure development efforts, as well as in our ongoing efforts to identify and propose appropriate measures for the LTCHQR Program in the future.

Comment: A commenter did not support the inclusion of the “Institute for Healthcare Improvement Ventilator Bundle,” as several components of the bundle (daily sedation reduction and daily weaning of ventilator settings) may not be applicable to patients who are on a long-term ventilator and may never be weaned. Another commenter supported the development of palliative care measures for the LTCHQR Program. Another commenter recommended CMS consider development and pilot testing of measure(s) related to antimicrobial stewardship, citing this measurement area as an important one given the fact that LTCHs are often at the epicenter of clusters and outbreaks of multidrug-resistant organisms. Another commenter recommended CMS consider including The Joint Commission tobacco performance measure set in the LTCHQR Program since identifying and treating tobacco use is a cost-effective and medically effective clinical intervention demonstrated to improve health and reduce costs. Another commenter supported the addition of care coordination measures in the LTCHQR Program and noted that since patients in the LTCH setting often receive services from multiple providers, a care coordination measure would represent an important opportunity to ensure holistic, high-quality care for the LTCH population. Finally, a commenter indicated support and a recommendation to include new quality measures, after the measures have been fully developed, tested, and endorsed by a multi-stakeholder consensus organization. The commenter supported quality of life, functional status, and other patient-reported outcomes performance measures.

Response: We appreciate the commenters' recommendations, and we will take the commenters' suggestions into consideration in our measure development efforts, as well as in our ongoing efforts to identify and propose appropriate measures for the LTCHQR Program in the future.

9. Form, Manner, and Timing of Quality Data Submission for the FY 2016 Payment Determination and Subsequent Years

a. Background

Section 1886(m)(5)(C) of the Act requires that, for the FY 2014 payment determination and subsequent years, each LTCH submit to the Secretary data on quality measures specified by the Secretary and that such data shall be submitted in a form and manner, and at a time, specified by the Secretary. As required by section 1886(m)(5)(A)(i) of the Act, for any LTCH that does not submit data in accordance with section 1886(m)(5)(C) of the Act with respect to a given rate year, any annual update to the standard Federal rate for discharges for the hospital during the rate year must be reduced by two percentage points.

b. Finalized Timeline for Data Submission Under the LTCHQR Program for the FY 2016 and FY 2017 Payment Determinations (Except NQF #0680 and NQF #0431)

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50857 through 50861 and 50878 through 50881), we finalized the data submission timelines and submission deadlines for measures for the FY 2016 and FY 2017 payment determinations. We refer readers to the FY 2014 IPPS/LTCH PPS final rule for a more detailed discussion of these timelines and deadlines. Specifically, we refer readers to the table at 78 FR 50878 of the FY 2014 IPPS/LTCH PPS final rule for the data collection period and submission deadlines for the FY 2016 payment determination and the tables at 78 FR 50881 of that final rule for the data collection timelines and submission deadlines for the FY 2017 payment determination.

c. Revision to the Previously Adopted Data Collection Period and Submission Deadlines for Percent of Residents or Patients Who Were Assessed and Appropriately Given the Seasonal Influenza Vaccine (Short-Stay) (NQF #0680) for the FY 2016 Payment Determination and Subsequent Years

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50858 through 50861), we revised the Percent of Residents or Patients Who Were Assessed and Appropriately Given the Seasonal Influenza Vaccine (Short Stay) (NQF #0680) measure for the FY 2016 payment determination and subsequent years. In that rule (78 FR 50861, 50880 through 50882), we also revised the data collection period and submission deadlines for the FY 2016 through FY 2018 payment determinations for this measure.

For the reasons discussed in section IX.C.6.a. of the preamble of the proposed rule (79 FR 28262), we proposed to change the data collection timeframes and submission deadlines for the FY 2016 payment determination and subsequent years. Specifically, as discussed in section IX.C.6.a. of the preamble of the proposed rule, for the FY 2016 payment determination, we proposed submission deadlines of February 15, 2015, and May 15, 2015, for this measure for the data collection periods October 1-December 31, 2014, and January 1-March 31, 2015, respectively, instead of the previously finalized submission deadline of May 15, 2015, for the data collection period of October 1, 2014-April 30, 2015. The changes applicable to this measure (NQF #0680) are illustrated below for the FY 2016 payment determination. Please refer to section IX.C.6. of the preamble of this final rule for further information regarding this revision.

Data Collection Period and Submission Deadlines of LTCHQR Program Quality Data for the FY 2016 Payment Determination for Percent of Residents or Patients Who Were Assessed and Appropriately Given the Seasonal Influenza Vaccine (Short-Stay) (NQF #0680)

Data collection period (CY) | Final submission deadlines for the LTCHQR Program FY 2016 payment determination
Q4 (October 1-December 31, 2014) | February 15, 2015.
Q1 (January 1-March 31, 2015) | May 15, 2015.

Further, as discussed in section IX.C.6.a. of the preamble of the proposed rule (79 FR 28262), we proposed similar deadlines for the FY 2017 payment determination and subsequent years for the LTCHQR Program. The changes applicable to this measure (NQF #0680) are illustrated below.

Data Collection Period and Submission Deadlines of LTCHQR Program Quality Data for the FY 2017 Payment Determination and Subsequent Years for Percent of Residents or Patients Who Were Assessed and Appropriately Given the Seasonal Influenza Vaccine (Short-Stay) (NQF #0680)

Data collection period (CY) | Final submission deadlines for the LTCHQR Program payment determination (FY)
Q4 of the CY two years before the payment determination year (for example, October 1-December 31, 2015, for the FY 2017 payment determination) | February 15 of the FY preceding the payment determination year (for example, February 15, 2016, for the FY 2017 payment determination).
Q1 of the CY one year before the payment determination year (for example, January 1-March 31, 2016, for the FY 2017 payment determination) | May 15 of the FY preceding the payment determination year (for example, May 15, 2016, for the FY 2017 payment determination).
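The following is a minimal sketch in Python of the generalization in the table above, assuming the stated pattern holds for every payment determination year: Q4 data of the CY two years prior are due February 15, and Q1 data of the prior CY are due May 15, of the year preceding the payment determination year. The function name is illustrative; this is a reading aid, not an official schedule.

```python
# Illustrative helper mapping a payment determination fiscal year (FY) to the
# NQF #0680 data collection windows and submission deadlines generalized in the
# table above. Dates follow the pattern shown there.

import datetime

def nqf0680_schedule(payment_determination_fy: int):
    """Return [(collection_start, collection_end, submission_deadline), ...]."""
    y = payment_determination_fy
    q4_window = (datetime.date(y - 2, 10, 1), datetime.date(y - 2, 12, 31))
    q1_window = (datetime.date(y - 1, 1, 1), datetime.date(y - 1, 3, 31))
    return [
        (*q4_window, datetime.date(y - 1, 2, 15)),  # Q4 data due February 15
        (*q1_window, datetime.date(y - 1, 5, 15)),  # Q1 data due May 15
    ]

# Example: the FY 2017 payment determination reproduces the dates in the table
for start, end, deadline in nqf0680_schedule(2017):
    print(start, "-", end, "due", deadline)
```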

We invited public comment on our proposal to revise the data collection timeline for Percent of Residents or Patients Who Were Assessed and Appropriately Given the Seasonal Influenza Vaccine (Short-Stay) (NQF #0680) for the FY 2016 payment determination and subsequent years.

Comment: A few commenters supported CMS' proposal to revise the data collection period and submission deadlines for Percent of Residents or Patients Who Were Assessed and Appropriately Given the Seasonal Influenza Vaccine (Short-Stay) (NQF #0680) for the FY 2016 payment determination and subsequent years. A commenter also noted this alignment reflects the influenza season and will reduce data entry time for LTCH staff.

Response: We greatly appreciate commenters' support of our proposal to revise the data collection period and submission deadlines for NQF #0680 to better align with the influenza vaccination season.

Comment: A commenter recommended that the NQF #0680 measure not apply to patients transferred from acute care hospitals since this would represent a duplicative compliance requirement between the two care settings.

Response: We did not propose any changes to measure specifications for NQF #0680. As we stated in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50860), the specifications for NQF #0680 are written to ensure that “LTCHs follow current clinical guidelines to assess whether a patient should receive an influenza vaccine and to ensure that, when clinically indicated, each patient only receives one influenza vaccine.” For patients who did not receive the influenza vaccine in the LTCH, item O0250 on the LTCH CARE Data Set allows the LTCH to indicate why the vaccine was not received in the facility, including selecting an option indicating that the patient received the vaccine outside of the facility.

In addition, because this measure reports on patients who received the influenza vaccine either inside or outside the facility/hospital, there is no incentive for the LTCH to over-vaccinate or provide a duplicative vaccination to a patient who received the vaccine at another facility prior to arriving at the LTCH. Facilities will need to adhere to the principles of proper care coordination and documentation to avoid over-immunization as well as under-immunization. However, the measure specifications are designed to encourage facilities to vaccinate only when the patient has not already received the vaccination in another setting and only when clinically indicated. We refer readers to the FY 2014 IPPS/LTCH PPS final rule (78 FR 50860) for more information on this topic.

After consideration of the public comments we received, we are finalizing the revision to the data collection period and submission timeline for Percent of Residents or Patients Who Were Assessed and Appropriately Given the Seasonal Influenza Vaccine (Short-Stay) (NQF #0680) for the FY 2016 payment determination and subsequent years.

d. Data Submission Mechanisms for the FY 2018 Payment Determination and Subsequent Years for New LTCHQR Program Quality Measures and for Revisions to Previously Adopted Quality Measures

For the two functional status measures and the application of the Percent of Residents Experiencing One or More Falls with Major Injury (Long Stay) (NQF #0674) measure, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28270), we proposed that all LTCHs would be required to collect data using the LTCH CARE Data Set. In CY 2015, we will release the technical data submission specifications and an updated LTCHQR Program Manual for the LTCH CARE Data Set (Version 3.00) that include items related to the functional status measures and the application of the Percent of Residents Experiencing One or More Falls with Major Injury (Long-Stay) (NQF #0674) measure. The QIES ASAP system would remain the data submission mechanism for the LTCH CARE Data Set. Further information on data submission of the LTCH CARE Data Set for the LTCHQR Program using the QIES ASAP system is available at: https://www.qtso.com/ and http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/LTCH-Quality-Reporting/LTCHTechnicalInformation.html.

The LTCH CARE Data Set (Version 2.01) was approved by OMB on June 10, 2013, in accordance with the PRA; the OMB control number is 0938-1163, with an expiration date of June 30, 2016. The data set is available on the CMS Web site at: http://www.cms.gov/Regulations-and-Guidance/Legislation/PaperworkReductionActof1995/PRA-Listing-Items/CMS1252160.html. CMS will revise the LTCH CARE Data Set and submit it for OMB review and PRA approval to support data collection for the two functional status measures and the application of the Percent of Residents Experiencing One or More Falls with Major Injury (Long Stay) (NQF #0674).

For the NHSN VAE Outcome measure, we proposed that LTCHs would be required to use the CDC's NHSN reporting and submission infrastructure. Details related to the procedures for using CDC's NHSN for data submission and information on definitions, numerator data, denominator data, data analyses, and measure specifications for the NHSN VAE Outcome Measure can be found at: http://www.cdc.gov/nhsn/PDFs/pscManual/10-VAE_FINAL.pdf.

We invited public comments on these proposals.

Comment: A commenter supported the use of the LTCH CARE Data Set for the two functional status measures. The commenter appreciated CMS' use of the LTCH CARE Data Set to streamline reporting across acute and post-acute settings.

Response: We appreciate the commenter's feedback and support of the use of the LTCH CARE Data Set for collection of the functional status measures.

We received no comments on our proposed data submission mechanisms for the NHSN VAE Outcome measure.

After consideration of the public comments we received, we are finalizing that all LTCHs would use the LTCH CARE Data Set (Version 3.00) to collect data for the application of Percent of Residents Experiencing One or More Falls with Major Injury (Long-Stay) (NQF #0674) and the two functional status measures. We are also finalizing that the QIES ASAP system will remain the data submission mechanism for the LTCH CARE Data Set. Further, we are finalizing that for the NHSN VAE Outcome measure, LTCHs would use the CDC's NHSN reporting and submission infrastructure for the LTCHQR Program.

e. Data Collection Period and Submission Deadlines Under the LTCHQR Program for the FY 2018 Payment Determination

In sections IX.C.9.c. and f. of the preamble of this final rule, we discuss our proposal, for the FY 2016 payment determination and subsequent years, to revise the data collection period and submission deadlines for the Percent of Residents or Patients Who Were Assessed and Appropriately Given the Seasonal Influenza Vaccine (Short-Stay) (NQF #0680) measure and, for the FY 2018 payment determination and subsequent years, to revise the data collection period and submission deadlines for the application of the Percent of Residents Experiencing One or More Falls with Major Injury (Long-Stay) (NQF #0674) measure. In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50882), we adopted the data collection period and submission deadlines for the remaining quality measures applicable to the FY 2018 payment determination as listed in the following tables.

Timeframes for Data Collection of LTCHQR Program Quality Data for the FY 2018 Payment Determination

NQF measure ID | Data collection period
NQF #0138 | January 1, 2016-December 31, 2016.
NQF #0139 | January 1, 2016-December 31, 2016.
NQF #0678 | January 1, 2016-December 31, 2016.
NQF #0431 | October 1, 2016 (or when vaccine becomes available)-March 31, 2017.
NQF #1716 | January 1, 2016-December 31, 2016.
NQF #1717 | January 1, 2016-December 31, 2016.

Timeline for Submission of LTCHQR Program Quality Data for the FY 2018 Payment Determination for All Measures Except Influenza Vaccination Coverage Among Healthcare Personnel (NQF #0431) and Percent of Residents or Patients Who Were Assessed and Appropriately Given the Seasonal Influenza Vaccine (Short-Stay) (NQF #0680)

Data collection period: CY 2016 | Final submission deadlines for the LTCHQR Program FY 2018 payment determination
Q1 (January-March 2016) | May 15, 2016.
Q2 (April-June 2016) | August 15, 2016.
Q3 (July-September 2016) | November 15, 2016.
Q4 (October-December 2016) | February 15, 2017.

Timeline for Submission of LTCHQR Program Quality Data for the FY 2018 Payment Determination: Influenza Vaccination Coverage among Healthcare Personnel (NQF #0431)

Data collection period | Final submission deadlines for the LTCHQR Program FY 2018 payment determination
October 1, 2016 (or when vaccine becomes available)-March 31, 2017 | May 15, 2017.

For the new measures that we proposed to adopt for the FY 2018 payment determination and subsequent years, we proposed the following data collection period and submission deadlines.

Data Collection Period for New LTCHQR Program Measures for the FY 2018 Payment Determination

NQF measure ID or measure name (when NQF measure ID not available) | Data collection period
National Healthcare Safety Network (NHSN) Ventilator-Associated Event (VAE) Outcome Measure | January 1, 2016-December 31, 2016.
Functional Outcome Measure: Change in Mobility among Long-Term Care Hospital Patients Requiring Ventilator Support | April 1, 2016-December 31, 2016.
Percent of Long-Term Care Hospital Patients with an Admission and Discharge Functional Assessment and a Care Plan That Addresses Function | April 1, 2016-December 31, 2016.

Submission Deadlines of LTCHQR Program Quality Data for the FY 2018 Payment Determination: National Healthcare Safety Network (NHSN) Ventilator-Associated Event (VAE) Outcome Measure

Data collection period | Final submission deadlines for the LTCHQR Program FY 2018 payment determination
Q1 (January-March 2016) | May 15, 2016.
Q2 (April-June 2016) | August 15, 2016.
Q3 (July-September 2016) | November 15, 2016.
Q4 (October-December 2016) | February 15, 2017.

Submission Deadlines of LTCHQR Program Quality Data for the FY 2018 Payment Determination: Functional Outcome Measure: Change in Mobility Among Long-Term Care Hospital Patients Requiring Ventilator Support and Percent of Long-Term Care Hospital Patients with an Admission and Discharge Functional Assessment and a Care Plan That Addresses Function

Data collection period | Final submission deadlines for the LTCHQR Program FY 2018 payment determination
Q2 (April-June 2016) * | August 15, 2016.
Q3 (July-September 2016) | November 15, 2016.
Q4 (October-December 2016) | February 15, 2017.
* Note that data collection implementation begins Q2.

We invited public comments on these data collection timelines and submission deadlines for the three new quality measures for the FY 2018 payment determination.

We received no comments on these proposals. Therefore, we are finalizing the data collection period and submission deadlines for the three measures (the two functional measures and the NHSN VAE Outcome measure), as proposed.

f. Data Collection Timelines and Submission Deadlines for the Application of Percent of Residents Experiencing One or More Falls with Major Injury (Long Stay) (NQF #0674) for the FY 2018 Payment Determination and Subsequent Years

In the FY 2014 IPPS/LTCH PPS final rule, we revised the application of the Percent of Residents Experiencing One or More Falls with Major Injury (Long Stay) (NQF #0674) measure for the FY 2018 payment determination and subsequent years (78 FR 50874 through 50877). In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28272), we proposed, for the FY 2018 payment determination only, to move the start date for data collection of this measure to April 1, 2016, instead of the previously finalized start date of January 1, 2016. Data collection and submission of this measure will continue through December 31, 2016, as previously finalized for the FY 2018 payment determination. This change in the data collection start date will only affect CY 2016 data collection and submission for the LTCHQR Program for the FY 2018 payment determination. For all subsequent years, data collection for this measure will begin on January 1 and continue through December 31. We note that these proposed changes are applicable only to the application of the Percent of Residents Experiencing One or More Falls with Major Injury (Long Stay) (NQF #0674) measure, and are not applicable to any other LTCHQR Program measures, proposed or adopted, unless explicitly stated. We refer readers to section IX.C.6. of the preamble of this final rule for further information and rationale.

Data Collection Timelines and Submission Deadlines of LTCHQR Program Quality Data for the FY 2018 Payment Determination for the Application of Percent of Residents Experiencing One or More Falls with Major Injury (Long Stay) (NQF #0674)

Data collection period: CY 2016 | Final submission deadlines for the LTCHQR Program FY 2018 payment determination
Q2 (April-June 2016) * | August 15, 2016.
Q3 (July-September 2016) | November 15, 2016.
Q4 (October-December 2016) | February 15, 2017.
* Note that data collection implementation begins in Q2.

We invited public comment on the proposed data collection timeline and quarterly submission deadlines for the application of Percent of Residents Experiencing One or More Falls with Major Injury (Long Stay) (NQF #0674) for the FY 2018 payment determination.

Comment: Commenters supported CMS' proposal to delay the start of data collection for the NQF #0674 measure until April 1, 2016, for the FY 2018 payment determination.

Response: We appreciate commenters' support of our proposal to revise the data collection period and quarterly submission deadlines for the application of NQF #0674, and we are finalizing the proposed revision for this measure for the FY 2018 payment determination. We reiterate that this change in data collection applies only to the FY 2018 payment determination; for all subsequent years, data collection for this measure will begin on January 1 and continue through December 31.

After consideration of the public comments we received, we are finalizing the data collection period and quarterly submission deadlines for the application of Percent of Residents Experiencing One or More Falls with Major Injury (Long Stay) (NQF #0674) for the FY 2018 payment determination, as proposed. For all subsequent years, data collection for this measure would begin on January 1 and continue through December 31.

g. Data Collection Timelines and Submission Deadlines Under the LTCHQR Program for the FY 2019 Payment Determination and Subsequent Years

For the quality measures applicable to the FY 2019 payment determination and subsequent years, including those that we proposed in section IX.C.7. of the preamble of the proposed rule, we proposed the following data collection timelines and submission deadlines.

Data Collection Period and Submission Deadlines of LTCHQR Program Quality Data for the FY 2019 Payment Determination

NQF measure ID or measure name (when NQF measure ID is not available) | Data collection period
National Healthcare Safety Network (NHSN) Catheter-Associated Urinary Tract Infection (CAUTI) Outcome Measure (NQF #0138) | January 1, 2017-December 31, 2017.
National Healthcare Safety Network (NHSN) Central Line-Associated Bloodstream Infection (CLABSI) Outcome Measure (NQF #0139) | January 1, 2017-December 31, 2017.
Percent of Residents or Patients with Pressure Ulcers That Are New or Worsened (Short-Stay) (NQF #0678) | January 1, 2017-December 31, 2017.
Percent of Residents or Patients Who Were Assessed and Appropriately Given the Seasonal Influenza Vaccine (Short-Stay) (NQF #0680) | October 1, 2017-March 31, 2018.
Influenza Vaccination Coverage among Healthcare Personnel (NQF #0431) | October 1, 2017-March 31, 2018.
National Healthcare Safety Network (NHSN) Facility-Wide Inpatient Hospital-Onset Methicillin-resistant Staphylococcus aureus (MRSA) Bacteremia Outcome Measure (NQF #1716) | January 1, 2017-December 31, 2017.
National Healthcare Safety Network (NHSN) Facility-Wide Inpatient Hospital-Onset Clostridium difficile Infection (CDI) Outcome Measure (NQF #1717) | January 1, 2017-December 31, 2017.
Application of Percent of Residents Experiencing One or More Falls with Major Injury (Long-Stay) (NQF #0674) | January 1, 2017-December 31, 2017.
National Healthcare Safety Network (NHSN) Ventilator-Associated Event (VAE) Outcome Measure | January 1, 2017-December 31, 2017.
Functional Outcome Measure: Change in Mobility among Patients Requiring Ventilator Support | January 1, 2017-December 31, 2017.
Percent of LTCH Patients with an Admission and Discharge Functional Assessment and a Care Plan That Addresses Function | January 1, 2017-December 31, 2017.

Data Collection Period and Submission Deadlines of LTCHQR Program Quality Data for the FY 2019 Payment Determination for All Measures Except Influenza Vaccination Coverage among Healthcare Personnel (NQF #0431) and Percent of Residents or Patients Who Were Assessed and Appropriately Given the Seasonal Influenza Vaccine (Short-Stay) (NQF #0680)

Data collection period: CY 2017 | Final submission deadlines for the LTCHQR Program FY 2019 payment determination
Q1 (January-March 2017) | May 15, 2017.
Q2 (April-June 2017) | August 15, 2017.
Q3 (July-September 2017) | November 15, 2017.
Q4 (October-December 2017) | February 15, 2018.

Data Collection Period and Submission Deadlines of LTCHQR Program Quality Data for the FY 2019 Payment Determination: Percent of Residents or Patients Who Were Assessed and Appropriately Given the Seasonal Influenza Vaccine (Short-Stay) (NQF #0680)

Data collection period | Final submission deadlines for the LTCHQR Program FY 2019 payment determination
October 1, 2017-December 31, 2017 | February 15, 2018.
January 1, 2018-March 31, 2018 | May 15, 2018.

Data Collection Period and Submission Deadlines of LTCHQR Program Quality Data for the FY 2019 Payment Determination: Influenza Vaccination Coverage among Healthcare Personnel (NQF #0431)

Data collection period | Final submission deadlines for the LTCHQR Program FY 2019 payment determination
October 1, 2017-March 31, 2018 | May 15, 2018.

We invited public comment on these proposals. We received no comments on these proposals. Therefore, we are finalizing the data collection period and submission deadlines for the FY 2019 payment determination and subsequent years, as proposed.

10. LTCHQR Program Data Completion Thresholds for the FY 2016 Payment Determination and Subsequent Years

a. Overview

Section 1886(m)(5)(C) of the Act requires that, for the FY 2014 payment determination and subsequent years, each LTCH submit to the Secretary data on quality measures specified by the Secretary in a form and manner, and at a time, specified by the Secretary. As required by section 1886(m)(5)(A)(i) of the Act, for any LTCH that does not submit data in accordance with section 1886(m)(5)(C) of the Act with respect to a given fiscal year, any annual update to the standard Federal rate for discharges for the hospital during the rate fiscal year must be reduced by two percentage points. To date, we have not established a standard for compliance other than that LTCHs submit all applicable required data for all finalized measures by the previously finalized quarterly deadlines. In response to stakeholder requests for additional specificity on how LTCHQR Program compliance affects fiscal year payment update determinations, and because of the importance of ensuring the integrity of quality data submitted to CMS, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28273 through 28275), we proposed to set specific LTCHQR Program thresholds for completeness of LTCH quality data, beginning with data affecting the FY 2016 payment determination and subsequent years.

The LTCHQR Program, through the FY 2012, FY 2013, and FY 2014 IPPS/LTCH PPS final rules, requires LTCHs to submit quality data using two separate data collection/submission mechanisms: Measures collected using the LTCH CARE Data Set (LCDS) are submitted through the CMS Quality Improvement Evaluation System (QIES); and measures stewarded by the CDC (such as Healthcare-Acquired Infection (HAI) and vaccination measures) are submitted using the CDC's National Healthcare Safety Network (NHSN). We have also previously finalized a claims-based measure (All-Cause Unplanned Readmission Measure for 30 Days Post Discharge from Long Term Care Hospitals); however, claims-based measures do not require LTCHs to actually submit quality data to CMS, as they are calculated using claims data submitted to CMS for payment purposes. Thus, for claims-based measures, there is no submitted quality data to which we could apply data completion thresholds.

To ensure that LTCHs are meeting an acceptable standard for completeness of submitted data, we proposed that for the FY 2016 payment determination and subsequent years, LTCHs meet or exceed two separate program thresholds: One threshold for completion of quality measures data collected using the LCDS and submitted through QIES; and a second threshold for quality measures data collected and submitted using the CDC's NHSN. We proposed that LTCHs must meet or exceed both thresholds discussed below, in order to avoid receiving a 2 percentage point reduction to their annual payment update for a given FY, beginning with FY 2016.

We proposed to hold LTCHs accountable for different data completion thresholds for each of the two data submission mechanisms: an 80 percent data completion threshold for data collected using the LCDS and submitted through the QIES mechanism, and a 100 percent data completion threshold for data submitted through the CDC's NHSN. We proposed to hold LTCHs to the higher data completion threshold for the CDC's NHSN initially because many LTCHs were mandated by States to report infection data using the CDC's NHSN system for surveillance purposes prior to the start of the LTCHQR Program on October 1, 2012, and we therefore believe LTCHs are more familiar with the NHSN collection and submission process.

In contrast, before October 1, 2012, LTCHs had never submitted quality data using a standardized data collection instrument such as the LCDS submitted through the QIES mechanism. In addition, we require the submission of LCDS admission and discharge data through QIES in order for LTCHs to meet the proposed data accuracy compliance standard; discharge data, in particular, may be more difficult to collect for patients who are discharged emergently or against medical advice, making it more difficult to meet a higher level of compliance initially. Lastly, through the FY 2014 IPPS/LTCH PPS final rule, we finalized accelerated quarterly deadlines for submission of quality data, beginning January 2014, of 45 days beyond the end of each CY quarter, as opposed to the 135-day post-quarter deadline LTCHs were previously required to meet. We believe this is an additional challenge that LTCHs may face. We invited comment on other obstacles LTCHs may face in meeting a higher level of compliance with regard to submission of quality data using the LCDS.

Comment: A few commenters noted that individual LTCHs may have a higher than average percentage of incomplete data due to emergent discharges, as well as patients with fecal management systems. Commenters stated that emergent discharges do not allow for the collection of complete data, and that CMS guides LTCHs to enter a dash (-) for item H0400 (Bowel Continence) for those patients that have fecal management systems in place, rendering any associated admission assessment incomplete. These commenters suggested that 10 percent to 15 percent of any LTCH's patients may fall under one of the two above categories, making it difficult to comply with proposed data completion thresholds. Finally, the commenters suggested that completeness in the LTCH CARE Data Set Planned Discharge assessments may be a better metric of a facility's compliance with quality reporting completion thresholds.

Response: The proposed data completion threshold for data submitted using the LTCH CARE Data Set is 80 percent. We have considered emergent discharges as one reason that LTCHs may not meet data completion thresholds approaching 100 percent. While we understand that LTCHs may not have the opportunity to complete data item H0400 (Bowel Continence) for those patients with fecal management systems in place, we believe that LTCHs should be able to meet our currently proposed threshold of 80 percent, and we can confirm that the majority of LTCHs currently meet this threshold. With respect to the future expansion of our data completion threshold policy, we will monitor LTCH performance on each required item and take steps to account for any items with low response rates. If we find that the majority of LTCHs are failing to consistently respond to any one of our required items, we will either take action to modify that item on the LTCH CARE Data Set or address the problem as it relates to data completion threshold compliance in future rulemaking.

With regard to the commenters' suggestion that we base completion thresholds on only planned discharge assessment, we respectfully disagree. We believe that the LTCH CARE Data Set admission assessment is an important factor in collecting data with regard to risk adjustment items. However, we will consider the effect of the inclusion of unplanned discharge data elements in our compliance determinations based on data completion thresholds, as we monitor this program.

b. LTCHQR Program Data Completion Threshold for the Required LTCH CARE Data Set (LCDS) Data Items

The LCDS is composed of data collection items designed to inform quality measure calculations, including risk-adjustment calculations, as well as internal consistency checks for logical inaccuracies. In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28274), we proposed that beginning with quality data affecting the FY 2016 payment determination and subsequent years, LTCHs must meet or exceed a proposed LCDS data completion threshold of 80 percent. We proposed to assess the completeness of submitted data by verifying that for all LCDS assessments submitted by any given LTCH, at least 80 percent of those LCDS Assessments must have 100 percent of the required quality data items completed, where, for the purposes of this rule, “completed” is defined as having provided actual patient data, as opposed to a non-informative response, such as a dash (-), that indicates the LTCH was unable to provide patient data. The proposed threshold of 80 percent is based on the need for substantially complete records, which allows appropriate analysis of quality measure data for the purposes of updating quality measure specifications as they undergo yearly and triennial measure maintenance reviews with the NQF. In addition, complete data is needed to understand the validity and reliability of quality data items, including risk-adjustment models. Finally, we want to ensure complete quality data from LTCHs, which will ultimately be reported to the public, allowing our beneficiaries to gain an understanding of LTCH performance related to these quality metrics, and helping them to make informed health care choices.
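
To illustrate how this completeness standard operates, the following is a minimal, hypothetical sketch (in Python) of how an LTCH or vendor might check its own submissions against the threshold before the quarterly deadlines. It is not part of the rule; the item names and record layout shown are illustrative assumptions rather than the LCDS specification.

    # Illustrative sketch only; item names and record layout are hypothetical,
    # not the actual LCDS specification.
    DASH = "-"  # non-informative response: patient data could not be provided

    def assessment_is_complete(assessment, required_items):
        """A single LCDS assessment counts as complete only if 100 percent of the
        required quality data items contain actual patient data (no dashes, no blanks)."""
        return all(assessment.get(item) not in (None, "", DASH) for item in required_items)

    def meets_lcds_completion_threshold(assessments, required_items, threshold=0.80):
        """True if at least 80 percent of the submitted LCDS assessments are complete."""
        if not assessments:
            return False
        complete = sum(assessment_is_complete(a, required_items) for a in assessments)
        return complete / len(assessments) >= threshold

Under this reading, an LTCH that submits 100 LCDS assessments in the applicable period would need at least 80 of them to contain actual patient data for every required quality data item.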

Our data suggest that the majority of current LTCHs already meet or exceed this proposed threshold. Our decision to set this proposed data completion threshold at a lower level initially, with the intent to raise the proposed 80 percent threshold in subsequent program years, is based on our understanding that LTCHs are still new to quality reporting and that their experience and understanding with respect to reporting quality data using a standardized data collection instrument, and thus their compliance, will increase over time. However, we invited public comment on circumstances that might prevent LTCHs from meeting this level of compliance. All items that we proposed to require under the LTCHQR Program are identified in Appendix D of the LTCHQR Program Manual version 2.01, which is available for download on the CMS Web site at: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/LTCH-Quality-Reporting/LTCHTechnicalInformation.html.

We also proposed that any LTCH that does not meet the proposed requirement that 80 percent of all LCDS assessments submitted contain 100 percent of all required quality data items will be subject to a reduction of 2 percentage points to the applicable FY annual payment update, beginning with FY 2016. In order to establish this program threshold, we analyzed all LCDS submissions from January 2013 through September 2013, and we believe that the majority of LTCHs will be able to meet the proposed 80 percent data completion threshold. It is our intent to raise this threshold over the next 2 years through the formal notice-and-comment rulemaking process. As stated above, we believe that as LTCHs continue to submit data using a standardized data collection instrument, such as the LCDS, and as they continue to take advantage of the resources we provide to guide LTCHs in their submission of this data (national trainings, CMS Special Open Door Forums, the LTCHQR Program Manual, and technical trainings available on our Web site), LTCH performance with respect to data completion will improve over time. We proposed that this threshold must be met by LTCHs, in addition to the CDC NHSN threshold discussed below, in order to avoid receiving a 2 percentage point reduction to the applicable FY annual payment update.

c. LTCHQR Program Data Completion Threshold For Measures Submitted Using the Centers for Disease Control and Prevention (CDC) National Healthcare Safety Network (NHSN)

The LTCHQR Program, through the FY 2012, FY 2013, and FY 2014 IPPS/LTCH PPS final rules, requires that LTCHs submit CDC-stewarded quality measure data using the CDC's NHSN, including data for the previously finalized CAUTI, CLABSI, and Influenza Vaccination Coverage among Healthcare Personnel (HCP) quality measures. More specifically, we require that LTCHs follow CDC quality measure protocols, which require LTCHs to complete all data fields required for both numerator and denominator data within NHSN, including the "no events" field for any month during which no infection events were identified. LTCHs are required to submit these data on a monthly basis (except for the HCP measure, which is required to be reported only once per year). However, LTCHs have until the associated quarterly deadline (45 calendar days after the end of each CY quarter) to report infection data to the CDC for each of the three months within any given quarter. For more information on the LTCHQR Program quarterly deadlines, we refer readers to section IX.C.9.b. of the preamble of this final rule.

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28275), we proposed that, beginning with the FY 2016 payment determination and subsequent years, this previously finalized requirement for monthly reporting must be met, in addition to the proposed LCDS data completion threshold discussed above, in order to avoid a 2 percentage point reduction to the applicable FY annual payment update. That is, we proposed that LTCHs must meet a threshold of 100 percent for measures submitted via the NHSN, achieved by submitting relevant infection, vaccination, or other required quality measure data for each month of any given CY, in addition to meeting the above-proposed data completion threshold for required quality data items on the LCDS. As the LTCHQR Program expands and LTCHs begin reporting measures that were previously finalized but not yet implemented, or newly proposed and finalized measures, we proposed to apply this same threshold.

d. Application of the 2 Percentage Point Reduction for LTCHs That Fail To Meet the Data Completion Thresholds

As we discussed above, we proposed that LTCHs must meet two separate data completion thresholds in order to avoid a 2 percentage point reduction to their applicable FY annual payment update: a data completion threshold of 80 percent for those required data elements collected using the LCDS and submitted through QIES, and a second data completion threshold of 100 percent for quality measure data submitted through the CDC's NHSN. In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28275), we proposed that these data completion thresholds must be met, in addition to the data validation threshold of 75 percent we discuss below, in order to avoid a 2 percentage point reduction to their applicable FY annual payment update. While we proposed that LTCHs must meet both the proposed data completion and data validation thresholds, LTCHs cannot have their applicable annual payment update reduced twice. That is, should an LTCH fail to meet either one or both of the proposed thresholds, it will receive only one reduction of 2 percentage points to its applicable fiscal year annual payment update.
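
Purely as an illustration of the proposed logic described above, and not as regulatory text, the interaction of the thresholds can be sketched as follows. The function name and inputs are hypothetical, and the 75 percent data accuracy (validation) threshold is shown only as proposed; as discussed later in this final rule, the data validation proposal is not being finalized at this time.

    # Illustrative sketch of the proposed compliance logic; not regulatory text.
    def payment_update_reduction(lcds_completion, nhsn_completion, validation_upper_bound=None):
        """Return the reduction (in percentage points) to the applicable FY annual
        payment update. An LTCH that fails any applicable threshold receives a single
        2 percentage point reduction; it is never reduced twice.
        Thresholds: 80% LCDS completion, 100% NHSN completion, and (as proposed but
        not finalized) a validation score whose 95% CI upper bound reaches 75%."""
        meets_completion = lcds_completion >= 0.80 and nhsn_completion >= 1.00
        meets_validation = validation_upper_bound is None or validation_upper_bound >= 0.75
        return 0.0 if (meets_completion and meets_validation) else 2.0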

We invited public comment on these proposals.

Comment: A few commenters supported CMS' proposal to establish data completion thresholds, noting that it is a fundamental step to ensure the accuracy of the LTCH quality reporting data. A few commenters stated that CMS' proposed policy will facilitate more accurate public reporting in the future and agreed with our proposed numeric standards.

Response: We thank the commenters for their support.

Comment: Commenters recommended that CMS apply the data completion standards no earlier than the FY 2017 payment determination, instead of FY 2016. These commenters further stated that a significant amount of data for FY 2016 has already been collected and submitted and that it would be inappropriate and unfair to apply the data completion standards to data submitted before the standards were even proposed and therefore known to LTCHs.

Response: Currently, the compliance standard applicable to each LTCH is to timely submit all required quality data, and LTCHs should already be ensuring that the data that they submit is complete and accurate. Thus, applying the data completion standards to CY 2014 data merely ensures that LTCHs are complying with applicable standards and that payments made to LTCHs are based on complete and accurate quality data.

Comment: A commenter suggested that LTCHs should not be penalized by a 2 percentage point reduction to the annual payment update based on completion thresholds, citing emergent discharges that make it difficult to complete assessments.

Response: We believe that the number of unplanned discharges in LTCHs is not so substantial that it will prevent LTCHs from meeting or exceeding the proposed data completion threshold of 80 percent for data submitted using the LTCH CARE Data Set. We will continue to monitor submission patterns and completion thresholds for all data items and appropriately investigate and address any submission patterns that lead us to believe that a systematic issue is preventing LTCHs from complying with our data completion thresholds.

After consideration of the public comments we received, we are finalizing the LTCHQR Program data completion threshold for the FY 2016 payment determination and subsequent years, as proposed.

11. Data Validation Process for the FY 2016 Payment Determination and Subsequent Years

a. Data Validation Process

Historically, we have built consistency and internal validation checks into our data submission specifications to ensure that the basic elements of the LCDS assessments conform to requirements such as proper format and facility information. These internal consistency checks are automated and occur during the LTCH submission process, and help ensure the integrity of the data submitted by LTCHs by rejecting submissions or issuing warnings when LTCH data contain logical inconsistencies. These internal consistency checks are referred to as “system edits” and are further outlined in the LTCH Data Submission Specifications version 1.01, which are available for download on the LTCH Quality Reporting Technical Information Web page at: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/LTCH-Quality-Reporting/LTCHTechnicalInformation.html.

Validation is intended to provide added assurance of the accuracy of the data that will be reported to the public as required by section 1886(m)(5)(E) of the Act. In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28275 through 28276), we proposed, for the FY 2016 payment determination and subsequent years, to validate the data elements submitted to CMS for quality purposes. Initially, for the FY 2016 payment determination, this data accuracy validation will apply only to the LCDS items that inform the measure Percent of Residents or Patients with Pressure Ulcers That Are New or Worsened (Short-Stay) (NQF #0678). We intend to expand this validation process to quality measures affecting the FY 2017 payment determination and subsequent years through future notice-and-comment rulemaking.

We proposed to validate the data elements submitted to CMS for Percent of Residents or Patients with Pressure Ulcers That Are New or Worsened (Short-Stay) (NQF #0678) under the LTCHQR Program by requesting, from a statistically valid random sample of 260 LTCHs, the minimum chart data necessary to confirm the accuracy of the submitted data. For each LTCH in the random sample of 260 LTCHs, 5 LCDS assessments submitted through the National Assessment Collection Database would be randomly selected by the CMS validation contractor. In accordance with § 164.512(d)(1)(iii) of the HIPAA Privacy Rule, we would request from these LTCHs the specified portions of the 5 Medicare patient charts that correspond to the randomly selected assessments, which would need to be copied and submitted via traceable mail to a CMS contractor for validation. We proposed that the specific portions of the 5 beneficiary charts would be identified in the written request, but may include: Admission and discharge assessments, relevant nursing notes following the admission, relevant nursing notes preceding the discharge, the physician admission summary and discharge summary, and any Assessment of Pressure Ulcer Form the facility may utilize. We proposed that the CMS contractor would use these portions of the patient charts to compare that information with the quality data submitted to CMS. Differences that would affect measure outcomes or measure rates would be identified and reported to CMS. These differences could include, but are not limited to, unreported worsened pressure ulcers.

We proposed that all data that has been submitted to the National Assessment Collection Database under the LTCHQR Program would be subject to the data validation process. Specifically, we proposed that the contractor would request copies of the randomly selected medical charts from each LTCH via certified mail (or other traceable methods that require an LTCH representative to sign for CMS correspondence), and the LTCH would have 45 days from the date of the request (as documented on the request letter) to submit the requested records to the contractor. If the LTCH does not comply within 30 days, the contractor would send a second certified letter reminding the LTCH that it must return copies of the requested medical records within 45 calendar days following the date of the initial contractor medical record request. If the LTCH still does not comply, the contractor would assign a "zero" score to each measure in each missing record. If, however, the LTCH does comply, the contractor would review the data submitted by the LTCH on the LCDS assessments for the required data elements associated with the Pressure Ulcer measure, until such time as LTCHs begin to submit data for additional quality measures that are collected using the LCDS. Initially, this review would consist solely of those required data elements that inform the Pressure Ulcer measure calculation and checks for logical inconsistencies. As LTCHs begin to report additional finalized measures, we intend to expand this validation process to quality measures affecting the FY 2017 payment determination and subsequent years through future notice-and-comment rulemaking. The contractor would then calculate the percentage of matching data elements, which would constitute a validation score. Because we would not be validating all records, we would need to calculate a confidence interval that accounts for potential sampling error.

To receive the full FY 2016 annual payment update, we proposed that LTCHs in the random sample must attain at least a 75 percent validation score, based upon our validation process, which would use charts requested from patient assessments submitted for CY 2013. We would calculate a 95 percent confidence interval associated with the observed validation score. If the upper bound of this confidence interval is below the 75 percent cutoff point, we would not consider a hospital's data to be "validated" for payment purposes. We proposed that LTCHs failing the validation requirements would be subject to the 2 percentage point reduction to their applicable fiscal year annual payment update, beginning with the FY 2016 payment determination. In addition, all LTCHs that undergo validation would receive educational feedback, including specific case details.
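
To make the arithmetic concrete, the following sketch illustrates one way the check described above could be computed. The rule does not specify a particular confidence interval method, so the normal-approximation interval used here is an assumption for illustration only, as are the example counts (5 sampled charts with 9 required data elements each, or 45 element comparisons).

    import math

    # Illustrative only; the interval method and example counts are assumptions,
    # not specified by the rule.
    def ci_upper_bound(matches, comparisons, z=1.96):
        """Upper bound of an approximate 95 percent confidence interval for the
        aggregate proportion of data elements that match the medical record."""
        p = matches / comparisons
        return min(1.0, p + z * math.sqrt(p * (1 - p) / comparisons))

    def is_validated(matches, comparisons, cutoff=0.75):
        """Data are treated as 'validated' only if the confidence interval's
        upper bound is at or above the 75 percent cutoff."""
        return ci_upper_bound(matches, comparisons) >= cutoff

    # Example: 5 charts x 9 required elements = 45 comparisons; 31 matches.
    # The point estimate is about 68.9 percent, with an upper bound of about
    # 82.4 percent, so the data would be treated as validated under this sketch.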

Comment: Several commenters believed that the proposed validation is a fundamental step to ensure the accuracy of the LTCH quality reporting data.

Response: We thank the commenters for their support for this proposal.

Comment: Several commenters suggested that CMS begin the validation standards no earlier than FY 2017. Although the commenters believed that validation is an important step to ensuring that hospitals are collecting measure data appropriately, they believed it would be inappropriate to validate data submitted for FY 2016 payment determination, as much of those data will be submitted prior to the effective date of CMS' finalized data accuracy validation policy on October 1, 2014.

Response: We agree that validation is important not only to ensure hospitals are collecting data appropriately, but also in providing feedback to LTCHs regarding possible differences in the findings of our validation effort. We believe the feedback a facility will receive, even if they are well above the validation minimum, could be valuable to both the LTCHs and to CMS. We are confident that most LTCHs have been submitting data accurately. Although much of the data for FY 2016 has been submitted, the FY 2013 IPPS/LTCH PPS final rule (77 FR 53620) states that LTCHs are required to submit the subset of data elements necessary to enable CMS to validate that the pressure ulcer measure data elements were accurately reported. We believe that we are operating within our authority to validate quality data. Currently, the compliance standard applicable to each LTCH is to timely submit all required quality data, and LTCHs should already be ensuring that the data that they submit is complete and accurate. Thus, validating CY 2014 data ensures that LTCHs are complying with applicable standards and that payments made to LTCHs are based on complete and accurate quality data.

Comment: Several commenters recommended that the CMS make the validation process as transparent as possible, particularly since it is new to the LTCHQR Program.

Response: We will use the requested charts to validate the following data elements: Functional Mobility: "Lying to Sitting on Side of Bed;" "Bowel Continence;" "Active Diagnosis: PVD;" "Active Diagnosis: Diabetes Mellitus;" "Height;" "Weight;" "Worsening Stage 2 Pressure Ulcer;" "Worsening Stage 3 Pressure Ulcer;" and "Worsening Stage 4 Pressure Ulcer." We intend to share our data accuracy validation findings with the randomly selected LTCHs, so that they may gain an understanding of any discrepancies between the medical record and the LTCH CARE Data Set assessment to which the medical record is being compared. We will also incorporate examples of our findings into LTCH training, special open door forums, and LTCH manuals, ensuring that the greater LTCH community benefits from this validation effort as well.

b. Application of the 2 Percentage Point Reduction for LTCHs That Fail To Meet the Data Accuracy Threshold

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28276), we proposed that LTCHs must meet a data accuracy threshold of 75 percent in order to avoid receiving a 2 percentage point reduction to their applicable fiscal year annual payment update. We proposed that this data accuracy threshold of 75 percent must be met in addition to the proposed data completion thresholds (80 percent for data collected using the LTCH CARE Data Set and submitted using QIES, and 100 percent for data submitted using the CDC's NHSN) in order to avoid receiving a 2 percentage point reduction to their applicable FY annual payment update. While we proposed that LTCHs must meet both the proposed data accuracy and data completion thresholds, LTCHs cannot have their applicable annual payment update reduced twice. That is, should an LTCH fail to meet either one or both of the proposed thresholds (data completion and/or data accuracy), it will receive only one reduction of 2 percentage points to its applicable FY annual payment update.

We invited public comment on these proposals and suggestions to improve the utility of the approach or to reduce the burden on LTCHs.

Comment: A commenter noted that 260 LTCHs would represent approximately 60 percent of the entire industry, which they believed was excessive.

Response: We thank the commenter for voicing this concern and will take the proportion into consideration in future rulemaking.

Comment: A commenter asked whether "IPPS comparable" cases will be required to meet LTCHQR Program requirements or whether they instead fall under ACH reporting requirements.

Response: We presume that the commenter is referring to current short stay outlier policy, but they could be referencing future regulation under the SGR Reform Act, where the IPPS comparable amount is one of the payment options for a “site neutral” case. Regardless, the facility/unit would be subject to the LTCHQR Program, as it is still an LTCH when it is paid an IPPS comparable amount, and the payment is a form of LTCH PPS payment.

Comment: A commenter recommended that CMS annually announce which LTCHs will be subject to validation and disseminate information about when these LTCHs should expect to begin receiving requests for medical records.

Response: We recognize the need to communicate with LTCHs whether or not they will be selected for validation. We will use the LTCHQR Program Web site, as well as direct communication with LTCHs selected for validation, to communicate time frames and deadlines regarding the data accuracy validation effort. In addition, we will use the LTCHQR Program Web site to announce, and offer access to, a new listserv specifically for the LTCHQR Program, which we will use to communicate with the provider community in the near future.

Comment: Commenters expressed concern that the 75 percent agreement compliance threshold was too high for this first attempt to validate the Pressure Ulcer data. Commenters suggested that there would be a great deal of variability in the reporting of the Pressure Ulcer measure and that this should be an opportunity for CMS to educate LTCHs on appropriate documentation and reporting to improve the process. Commenters suggested that a 60 percent compliance threshold would be a more appropriate validation standard.

Response: We note that the observed validation score is a single point estimate of the proportion in agreement; we proposed that the upper bound of a 95 percent confidence interval around that estimate be the value that must meet or exceed the 75 percent compliance threshold. We believe this takes into account the inherent variability to be found in the pressure ulcer data. In addition, the 75 percent proportion agreement is consistent with other data quality programs currently underway, for example, the Hospital IQR Program (42 CFR 412.140(d)(2)) and the Hospital OQR Program (42 CFR 419.46(e)(2)). We believe it is important to promulgate consistent standards across the various quality data we are collecting.

Comment: A commenter requested that CMS promulgate regulations for the validation process and provide the credentials of, the inter-rater reliability of, and details of the training provided to the contractor performing the validation.

Response: We will make any future data accuracy validation regulations known to the LTCH community through future notice-and-comment rulemaking. All chart reviews will be performed by a licensed registered nurse trained in medical record review and comparison, utilizing the quality measure data specifications in the LTCH Quality Reporting Program Manual. Specified training will be provided before the actual reviews, which will include ensuring that there is inter-rater reliability among the reviewers prior to implementation of the data validation process.

Comment: A few commenters suggested that CMS adopt a two-level data validation process similar to the process used by the MACs for the IRF Compliance Percentage Threshold. An initial small sample of charts would be requested from the facilities randomly selected for validation. If the facility did not meet the initial threshold for compliance, a larger, second sample of charts would be requested. The commenters believed that 5 charts is too small a sample size and that, if two of the five charts selected for review were perceived to contain errors, the facility would not meet the 75 percent validation score. Lastly, the commenters suggested that CMS select the LTCHs for validation from all LTCHs participating in the Medicare program.

Response: We will consider this approach for future years. We understand the concern regarding a relatively small sample of charts, but wish to explain that the overall validation score will be determined based on the aggregate percentage of reported elements (out of all reportable elements) in all of the sampled charts, not on the percentage of reported elements in each individual chart. Each chart will be evaluated on the 9 required data elements. Finally, we would like to confirm that the sample of randomly selected LTCHs will be drawn from the universe of all Medicare-certified LTCHs, as suggested by the commenter.

After consideration of the public comments we received, we have decided to further explore suggestions from commenters before finalizing the LTCH data validation process that we proposed. Therefore, we are not finalizing our LTCH data validation proposal at this time.

12. Public Display of Quality Measure Data for the LTCHQR Program

Under section 1886(m)(5)(E) of the Act, the Secretary is required to establish procedures for making data submitted under section 1886(m)(5)(C) of the Act available to the public. Section 1886(m)(5)(E) of the Act requires that such procedures shall ensure that an LTCH has the opportunity to review the data that is to be made public with respect to the LTCH prior to such data being made public. The statute also requires that the Secretary report quality measures that relate to services furnished in inpatient settings in LTCHs on our Web site. In the FY 2013 IPPS/LTCH PPS final rule (77 FR 53637), we received and responded to public comments regarding the public reporting of quality data under the LTCHQR Program.

Currently, we are developing plans regarding the implementation of these provisions. We appreciate the need for transparency into the processes and procedures that will be implemented to allow for public reporting of the LTCHQR Program data and to afford LTCHs the opportunity to review that data before it is made public. At this time, we have not established procedures or timelines for public reporting of data, but we intend to include related proposals in future rulemaking.

We welcomed public comment on what we should consider when developing future proposals related to public reporting of quality measures for the LTCHQR Program.

Comment: Several commenters encouraged CMS to work with LTCHs to ensure an opportunity to review potential displays of quality data and to provide feedback prior to public reporting.

Response: We thank the commenters for taking the time to express these views and suggestions regarding public reporting and will take it into consideration for future public reporting development.

Comment: A commenter noted that CMS should develop reports in the CASPER Reporting Application to indicate which patients are included in the Pressure Ulcer measure.

Response: We plan to begin designing and making CASPER reports accessible for LTCHs in the near future.

We thank the commenters for the responses, and we will consider them as we develop future proposals related to public reporting of quality measures for the LTCHQR Program.

13. LTCHQR Program Submission Exception and Extension Requirements for the FY 2017 Payment Determination and Subsequent Years

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50883 through 50885), we referred to these requirements as submission “waiver” requirements. We proposed to instead use the phrase “exception and extension” requirements for purposes of clarity. For the FY 2017 payment determination and subsequent years, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28276 through 28277), we proposed to continue using the LTCHQR Program's requirements that we adopted in the FY 2014 IPPS/LTCH PPS final rule for the FY 2015 payment determination and subsequent years, although the term “waiver” is replaced by “exception and extension.”

In the FY 2014 IPPS/LTCH PPS final rule, we finalized a process for LTCHs to request and for us to grant waivers with respect to the quality data reporting requirements of the LTCHQR Program for one or more quarters, beginning with the FY 2015 payment determination, when there are certain extraordinary circumstances beyond the control of the LTCH. We proposed to continue to use this previously finalized process.

In the event that an LTCH seeks to request a submission exception or extension for quality reporting purposes, the LTCH must request an exception or extension within 30 days of the date that the extraordinary circumstances occurred by submitting a written request to CMS via email to the LTCH mailbox at LTCHQRPReconsiderations@cms.hhs.gov. Exception or extension requests sent to CMS through any other channel will not be considered a valid request for an exception or extension from the LTCHQR Program's reporting requirements for any payment determination. The written request must contain all of the requirements finalized in the FY 2014 IPPS/LTCH PPS final rule and listed on our Web site at: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/LTCH-Quality-Reporting/LTCH-Quality-Reporting-Reconsideration-and-Disaster-Waiver-Requests.html.

When an exception or extension is granted, an LTCH will not incur payment reduction penalties for failure to comply with the requirements of the LTCHQR Program, for the timeframe specified by CMS. If an LTCH is granted an exception, we will not require that the LTCH submit any quality data for a given period of time. If we grant an extension to an LTCH, the LTCH will still remain responsible for submitting quality data collected during the time frame in question, although we will specify a revised deadline by which the LTCH must submit this quality data.

In addition, in the FY 2014 IPPS/LTCH PPS final rule, we finalized a policy that allowed CMS to grant exceptions or extensions to LTCHs that have not requested them if it is determined that extraordinary circumstances affect an entire region or locale. We stated that if this determination is made, we will communicate the decision through routine communication channels to LTCHs and vendors, including, but not limited to, issuing memos, emails, and notices at: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/LTCH-Quality-Reporting/index.html. More information on the LTCHQR Program exception and extension requirements and processes, and all related announcements, may be found at: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/LTCH-Quality-Reporting/index.html.

For the FY 2017 payment determination and subsequent years, we proposed that we may grant an exception or extension to LTCHs if we determine that a systemic problem with one of our data collection systems directly affected the ability of the LTCH to submit data. Because we do not anticipate that these types of systemic problems will happen often, we do not anticipate granting an exception or extension on this proposed basis frequently. We proposed that if we make the determination to grant an exception or extension, we would communicate this decision through routine communication channels to LTCHs and vendors, including, but not limited to, issuing memos, emails, and notices on our Web site at: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/LTCH-Quality-Reporting/index.html.

We invited public comment on these proposals.

Comment: A few commenters supported the proposed exception and extension policy.

Response: We thank the commenters for their support.

After consideration of the public comments we received, we are finalizing the LTCHQR Program submission exception and extension requirements for the FY 2017 payment determination and subsequent years, as proposed.

14. LTCHQR Program Reconsideration and Appeals Procedures for the FY 2016 Payment Determination and Subsequent Years

a. Previously Finalized LTCHQR Program Reconsideration and Appeals Procedures for the FY 2014 and FY 2015 Payment Determinations

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50885 through 50887), we finalized a voluntary process that allowed LTCHs the opportunity to seek reconsideration of our initial noncompliance decision for the FY 2014 and FY 2015 payment determinations. We refer readers to that rule for a discussion of this process.

b. LTCHQR Program Reconsideration and Appeals Procedures for the FY 2016 Payment Determination and Subsequent Years

For the FY 2016 payment determination and subsequent years, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28277 through 28278), we proposed to adopt an updated process, as described below, that will enable an LTCH to request a reconsideration of our initial noncompliance decision in the event that an LTCH believes that it was incorrectly identified as being subject to the 2-percentage point reduction to its annual payment due to noncompliance with the LTCHQR Program reporting requirements for a given reporting period.

For the FY 2016 payment determination, and subsequent years, we proposed that an LTCH would receive a notification of noncompliance if we determine that the LTCH did not submit data in accordance with section 1886(m)(5)(C) of the Act with respect to the applicable fiscal year and that the LTCH is therefore subject to a 2-percentage point reduction in the applicable payment determination as required by section 1886(m)(5)(A)(i) of the Act. We would only consider requests for reconsideration after an LTCH has been found to be noncompliant and not before.

An LTCH would have 30 days from the date of the initial notification of noncompliance to review its payment determination and submit to us a request for reconsideration. This proposed time frame would allow us to balance our desire to ensure that LTCHs have the opportunity to request reconsideration with our need to complete the process and provide LTCHs with our reconsideration decision in a timely manner. Notifications of noncompliance and any subsequent notifications from CMS would be sent via a traceable delivery method, such as certified U.S. mail or registered U.S. mail. We proposed that an LTCH may withdraw its request at any time and may file an updated request within the proposed 30-day deadline. We also proposed that, in very limited circumstances, we may grant a request by an LTCH to extend the proposed deadline for reconsideration requests. It would be the responsibility of an LTCH to request an extension and demonstrate that extenuating circumstances existed that prevented the filing of the reconsideration request by the proposed deadline.

We also proposed that as part of the LTCH's request for reconsideration, the LTCH would be required to submit all supporting documentation and evidence demonstrating: (1) Full compliance with all LTCHQR Program reporting requirements during the reporting period; or (2) extenuating circumstances that affected noncompliance if the LTCH was not able to comply with the requirements during the reporting period. We would not review any reconsideration request that fails to provide the necessary documentation and evidence along with the request. The documentation and evidence may include copies of any communications that demonstrate its compliance with the program's requirements, as well as any other records that support the LTCH's rationale for seeking reconsideration. A sample list of acceptable supporting documentation and evidence, as well as instructions for LTCHs to retrieve copies of the data submitted to CMS for the appropriate program year can be found on our Web site at: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/LTCH-Quality-Reporting/LTCH-Quality-Reporting-Reconsideration-and-Disaster-Waiver-Requests.html.

We proposed that an LTCH wishing to request a reconsideration of our initial noncompliance determination would be required to do so by submitting an email to the following email address: LTCHQRPReconsiderations@cms.hhs.gov. Any request for reconsideration submitted to us by an LTCH would be required to follow the guidelines outlined on our Web site at: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/LTCH-Quality-Reporting/LTCH-Quality-Reporting-Reconsideration-and-Disaster-Waiver-Requests.html.

Following receipt of a request for reconsideration, we will provide—

  • An email acknowledgment, using the contact information provided in the reconsideration request, to the CEO or CEO-designated representative that the request has been received; and
  • Once we have reached a decision regarding the reconsideration request, an email to the LTCH CEO or CEO-designated representative, using the contact information provided in the reconsideration request, regarding our decision.

We proposed to require an LTCH that believes it was incorrectly identified as being subject to the 2-percentage point reduction to its annual payment update to submit a timely request for reconsideration and receive a decision on that request before the LTCH can file an appeal with the Provider Reimbursement Review Board (PRRB). If the LTCH is dissatisfied with the decision rendered at the reconsideration level, the LTCH could appeal the decision with the PRRB under 42 CFR 405.1835. We believe this proposed process is more efficient and less costly for CMS and for LTCHs because it decreases the number of PRRB appeals by resolving issues earlier in the process. Additional information about the reconsideration process including requirements for submitting a reconsideration request is posted on our Web site at: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/LTCH-Quality-Reporting/LTCH-Quality-Reporting-Reconsideration-and-Disaster-Waiver-Requests.html.

We invited public comment on the proposed procedures for reconsideration and appeals.

Comment: Several commenters supported the proposal to continue the reconsideration process for FY 2016.

Response: We thank the commenters for their support.

Comment: A commenter supported the reconsideration process, but believed that it should be expanded to include data validation.

Response: We believe the current reconsideration process could be utilized for reconsideration of the validation findings, as long as all of the documentation used for the request for reconsideration was submitted at the time of validation. As noted above, we are finalizing our data completeness proposal, but we are not finalizing our data validation proposal at this time.

Comment: A commenter stated that CMS should set the reconsideration process in regulation as has been done in other administrative appeals processes. In addition, the commenter did not believe that CMS has demonstrated the ability to manage this level of additional administrative complexity in a prompt manner. The commenter believed that CMS should allow LTCHs to appeal to the PRRB without having to go through CMS first.

Response: We plan to propose regulations for reconsideration in future rulemaking. We note that while some CMS programs have codified their reconsideration processes in regulations, not all CMS reconsideration processes have been codified. We disagree that we have not demonstrated the ability to manage this level of additional administrative complexity. The LTCHQR Program completed all reconsiderations and notified all LTCHs of those reviews within 60 days in FY 2013. We believe that requiring LTCHs to first submit to the CMS reconsideration process prior to requesting a hearing at the PRRB will allow us the opportunity to overturn an erroneous decision when we have a systematic process and resources in place to do so, and ultimately decrease any unnecessary burden on the PRRB process.

After consideration of the public comments we received, we are finalizing the LTCHQR Program reconsideration and appeals procedures for the FY 2016 payment determination and subsequent years, as proposed.

15. Electronic Health Records (EHR) and Health Information Exchange (HIE)

We are also interested in understanding the current state of electronic health record (EHR) adoption and use of Health Information Exchange (HIE) in the LTCH community. Therefore, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28278), we solicited feedback and input from LTCHs and the public on EHR adoption and HIE usage. We noted that we are especially interested in LTCH feedback and input on the following questions:

  • Have you adopted an EHR in your LTCH setting?
  • If your LTCH setting uses EHRs, what functional aspects of EHRs do you find most important (for example, the ability to send or receive transfer of care information; the ability to support medication orders/medication reconciliation)?
  • Does the EHR system used in your LTCH setting support interoperable document exchange with other healthcare providers (for example, acute care hospitals, physician practices, skilled nursing facilities, etc.)?

In addition to seeking public feedback and input on the feasibility and desirability of EHR adoption and use of HIE in LTCHs, we stated that we are also interested in public comment on the need to develop electronic clinical quality measures, and the benefits and limitations of implementing these measures for LTCHs.

Comment: Commenters expressed support of the adoption and use of EHRs, HIEs and electronic prescribing in the LTCH setting. The commenters suggested that it is a critical step to achieving efficiencies and improving the quality of care provided by LTCHs, and that it is important to allow LTCHs to exchange information with other types of providers to improve care coordination and to participate in Accountable Care Organizations and other reform efforts.

Response: We thank the commenters for their support.

Comment: Some commenters urged CMS to consider a funding incentive program for the adoption of EHR technology by LTCHs that includes the same opportunities afforded to eligible physicians, CAHs, and acute care hospitals under the HITECH provisions of Public Law 111-5, the American Recovery and Reinvestment Act of 2009. The commenters noted that the lack of funding is a significant challenge to EHR adoption in the LTCH setting and calls into question the feasibility of requiring EHR use. Another commenter suggested that it is premature to consider the further development of electronic clinical quality measures for the LTCH setting until compensation is offered for implementing EHRs.

Response: We believe that these recommendations and concerns are important considerations related to EHR adoption and HIE usage in the LTCH setting and help to inform our understanding of these issues.

Comment: Several commenters indicated that their LTCHs have adopted EHR technology and described challenges they have been facing. First, the amount of information generated by EHRs can be overwhelming, and there is a significant challenge associated with utilizing the information in a timely and meaningful way. Second, the lack of interoperability between acute care hospitals' EHRs and LTCHs' EHRs makes information exchange difficult. Third, the information currently being collected by HIEs is rudimentary and does not necessarily meet the information needs of LTCHs.

A commenter indicated that not all proposed and new LTCH quality measures utilize EHR information and, therefore, suggested that LTCHs face the burden of manually reviewing each patient's entire medical record regardless of whether EHR technology has been adopted.

Response: We thank the commenters for their observations. We believe that these concerns are important considerations related to EHR adoption and HIE usage in the LTCH setting and help to inform our understanding of these issues.

D. Electronic Health Record (EHR) Incentive Program and Meaningful Use (MU)

1. Background

The HITECH Act (Title IV of Division B of the ARRA, together with Title XIII of Division A of the ARRA) authorizes incentive payments under Medicare and Medicaid for the adoption and meaningful use of certified electronic health record (EHR) technology (CEHRT). We refer to this program as the EHR Incentive Program. Eligible hospitals (EHs) and critical access hospitals (CAHs) may qualify for these incentive payments under Medicare (as authorized under sections 1886(n) and 1814(l) of the Act, respectively) if they successfully demonstrate meaningful use of CEHRT, which includes reporting on clinical quality measures (CQMs) using CEHRT. Sections 1886(b)(3)(B) and 1814(l) of the Act also establish downward payment adjustments under Medicare, beginning with fiscal year 2015, for eligible hospitals and CAHs that are not meaningful users of CEHRT for certain associated reporting periods. We refer to this part of the EHR Incentive Program as the Medicare EHR Incentive Program. Sections 1903(a)(3)(F) and 1903(t) of the Act provide the statutory basis for Medicaid incentive payments.

The set of CQMs from which eligible hospitals and CAHs will select measures to report under the EHR Incentive Program beginning in FY 2014 is listed in Table 10 of the EHR Incentive Program Stage 2 final rule (77 FR 54083 through 54087). We continue to believe there are important synergies between the Medicare EHR Incentive Program and the Hospital IQR Program. We believe the financial incentives under the Medicare EHR Incentive Program for the adoption and meaningful use of CEHRT by EHs and CAHs will encourage the adoption and use of CEHRT for the electronic reporting of CQMs under the Hospital IQR Program. We expect that the electronic submission of quality data from EHRs under the Medicare EHR Incentive Program will provide a foundation for establishing the capacity of hospitals to send, and for CMS to receive, CQMs via CEHRT for certain Hospital IQR Program measures.

2. Alignment of the Medicare EHR Incentive Program Reporting and Submission Timelines for Clinical Quality Measures With Hospital IQR Program Reporting and Submission Timelines

We believe it is important to continue our goal of aligning the Medicare EHR Incentive Program with the Hospital IQR Program because alignment of these programs will serve to reduce hospital reporting burden and encourage the adoption and meaningful use of CEHRT by eligible hospitals and CAHs. Section 1886(n)(3)(B)(iii) of the Act requires that, in selecting measures and establishing the form and manner for reporting measures under the Medicare EHR Incentive Program, the Secretary shall seek to avoid redundant or duplicative reporting with reporting otherwise required, including reporting under section 1886(b)(3)(B)(viii) of the Act (the Hospital IQR Program). The reporting and submission timelines for the Medicare EHR Incentive Program for eligible hospitals and CAHs currently operate on a Federal fiscal year basis, while the reporting and submission timelines for the Hospital IQR Program currently operate on a calendar year basis. This difference may create confusion and additional burden for hospitals attempting to report data to both programs. To alleviate this possible confusion, reduce provider burden, and strengthen our commitment to aligning programs, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28278 through 28279) we proposed to align the reporting and submission periods for clinical quality measures for the Medicare EHR Incentive Program with that of the Hospital IQR Program on a calendar year basis in 2015 and 2016.

We realize that aligning the Medicare EHR Incentive Program to the calendar year would mean shifting the timeline for reporting and submission of CQMs such that the submission period would continue through February of the subsequent calendar year rather than ending in November, as it currently does. This shift would delay the incentive eligibility assessment and, subsequently, the Medicare EHR incentive payments made to eligible hospitals and CAHs. In order to ease the transition of the reporting period to the calendar year, and to prevent the delay of Medicare EHR incentive payments, we proposed to incrementally shift the Medicare EHR Incentive Program reporting periods for CQMs. Specifically, for 2015 and 2016, we proposed to require calendar year reporting under the Medicare EHR Incentive Program for CQM data that are submitted electronically, but to require that the data be reported only for the first three calendar quarters (that is, January through March, April through June, and July through September), allowing the reporting period, incentive eligibility assessment, and incentive payments to remain on their current schedule.

We noted that this proposal would only apply for eligible hospitals and CAHs submitting CQMs electronically for 2015 and 2016, and that hospitals demonstrating meaningful use for the first time in 2015 or 2016 would still be required to report CQMs by attestation for a continuous 90-day period in FY 2015 or 2016, or report CQMs electronically, by July 1 of the given year to avoid the Medicare penalty in the subsequent year as finalized in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50903 through 50905). Medicaid-only providers would continue to report according to State requirements. The proposal would not change the reporting periods or requirements for the meaningful use objectives and associated measures under 42 CFR 495.6 or for CQMs that are reported by attestation via the Registration and Attestation System. This proposal would allow us to align the CQM reporting periods for the Medicare EHR Incentive Program with that of the Hospital IQR Program without delaying payment of the Medicare EHR incentive payments for 2015 and 2016.

To further align CQM reporting for the two programs, we proposed to require quarterly reporting of electronically reported CQMs for the Medicare EHR Incentive Program, matching the currently established quarterly electronic CQM reporting periods for the Hospital IQR Program. Additionally, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28242 through 28243), we proposed to change the Hospital IQR Program's submission period for electronic CQMs from annual to quarterly submission. We refer readers to the Hospital IQR Program discussion in section IX.A.7.h. of the preamble of that proposed rule for more information about this proposal. Therefore, for the CY 2015 and 2016 reporting periods, we also proposed to align the Medicare EHR Incentive Program submission period with the submission period proposed for the Hospital IQR Program. The first table below illustrates the current reporting periods, and the second table illustrates our proposals.

Current (2014) Timelines for EHR Incentive Program and Hospital IQR Program Reporting and Submission

EHR Incentive Program CQM reporting requirements:
  • 2014 Reporting Period: FY 2014 (October 1, 2013-September 30, 2014). Report one full year, OR report one three-month quarter, OR report any continuous 90-day period.
  • Submission Period: January 2, 2014-November 30, 2014.

Hospital IQR Program reporting requirements for FY 2016 payment determination:
  • 2014 Reporting Period: Q4 CY 2013 (October 1, 2013-December 31, 2013; N/A for 2014 Hospital IQR Program reporting); Q1 CY 2014 (January 1-March 31, 2014); Q2 CY 2014 (April 1-June 30, 2014); Q3 CY 2014 (July 1-September 30, 2014).
  • Submission Period: October 1, 2013-November 30, 2014.

Proposed Timelines To Align the Medicare EHR Incentive Program With Proposed Hospital IQR Program Reporting and Submission

2015 Reporting Period:
  • Q1 (January 1-March 31, 2015): EHR Incentive Program* and Hospital IQR Program; data must be submitted by May 30, 2015.**
  • Q2 (April 1-June 30, 2015): EHR Incentive Program* and Hospital IQR Program; data must be submitted by August 30, 2015.**
  • Q3 (July 1-September 30, 2015): EHR Incentive Program* and Hospital IQR Program; data must be submitted by November 30, 2015.**
  • Q4 (October 1-December 31, 2015): N/A for EHR Incentive Program; for the Hospital IQR Program, data must be submitted by February 28, 2016.**

2016 Reporting Period:
  • Q1 (January 1-March 31, 2016): EHR Incentive Program* and Hospital IQR Program; data must be submitted by May 30, 2016.**
  • Q2 (April 1-June 30, 2016): EHR Incentive Program* and Hospital IQR Program; data must be submitted by August 30, 2016.**
  • Q3 (July 1-September 30, 2016): EHR Incentive Program* and Hospital IQR Program; data must be submitted by November 30, 2016.**
  • Q4 (October 1-December 31, 2016): N/A for EHR Incentive Program; for the Hospital IQR Program, data must be submitted by February 28, 2017.**

* Calendar year alignment and quarterly reporting for 2015 and 2016 would apply for electronically reported CQM data only.
** Proposed EHR Incentive Program and Hospital IQR Program submission period would allow data submission on an ongoing basis starting January 2 of the reporting year, and ending approximately 60 days after the end of the quarter.

We invited public comment on these proposals.

Comment: Many commenters supported CMS' proposed alignment between the Medicare EHR Incentive Program and Hospital IQR Program. Commenters appreciated CMS' efforts to align these programs and felt alignment would reduce overall quality reporting burden. Several commenters specifically expressed their support of the proposal to align the reporting and submission timelines of CQMs for the Medicare EHR Incentive Program with reporting and submission timelines for the Hospital IQR Program stating that this alignment would reduce confusion among the programs and reduce reporting burden. A few commenters noted that the proposal did not address the reporting and submission timeline for reporting CQMs via attestation, or the reporting and submission timelines of the meaningful use objectives. Some of these commenters requested that CMS clarify whether those timelines would also be affected by this proposal.

Response: We appreciate the comments in support of our alignment efforts with the Hospital IQR Program, and agree that our proposal to align timelines for the programs would reduce confusion and reporting burden. For this reason, we are finalizing our proposal, with the modifications discussed below, to align the reporting and submission timelines for CQMs that are reported electronically for the Medicare EHR Incentive Program with the reporting and submission timelines of the Hospital IQR Program on the calendar year for 2015. Although it is still our general goal to continue this alignment on a calendar year basis for 2016, we are not finalizing the proposals for 2016 at this time and will address the policy for 2016 in future rulemaking. We will continue to evaluate our policies for 2016, and maintain our goal of alignment with the Hospital IQR Program.

We note that we did not propose to change the reporting periods or requirements for the meaningful use objectives and associated measures under 42 CFR 495.6 or for CQMs that are reported by attestation via the Registration and Attestation System, and thus, the policy will remain the same. We also note that we will consider these comments and possible alignment of CQMs reported by attestation in future rulemaking.

Comment: Many commenters expressed their views regarding CMS' proposal to require quarterly submission of CQMs reported electronically for the Medicare EHR Incentive Program. In general, commenters felt it was premature to require quarterly submission of CQMs in 2015 for the Medicare EHR Incentive Program given the delays with certification of EHR technology in 2014 and anticipated changes in attestation requirements. Commenters also expressed concerns over whether EHRs would be ready for quarterly reporting by the first quarter of 2015, and suggested that CMS consider a pilot program for quarterly reporting instead of requiring it for 2015.

Response: We refer readers to the Hospital IQR Program discussion in section IX.A.9.d. of the preamble of this final rule for further discussion of the comments related to quarterly reporting.

We appreciate and understand the commenters' concerns regarding quarterly reporting, and we understand the feedback we have received from stakeholders concerning delays in certification of EHR technology. We additionally acknowledge that our requirement to report the most recent version of the CQMs, as finalized below, poses a challenge to eligible hospitals and CAHs in implementing quarterly reporting because EHR vendors can be certified to 2014 CEHRT without updating to the most recent version of the CQMs. We note that, at this time, we do not plan to offer quarterly reporting on a pilot basis in 2015.

Based on commenters' concerns, and the additional challenges posed by requiring the most recent version of the CQMs for 2015 reporting, we have decided not to finalize our proposal to require quarterly submission of electronically reported CQMs for the Medicare EHR Incentive Program in 2015, and instead maintain in 2015 our policy of one annual submission period to align with the submission period for CQMs reported electronically under the Hospital IQR Program. This annual submission period begins on January 2 and ends on November 30 (for example, for the reporting periods in 2015, the submission period is January 2, 2015 through November 30, 2015).

In addition, and to align with the Hospital IQR Program in 2015, we are not finalizing our proposal to require three quarters of CQM data for calendar year 2015. Instead, for CQM data submitted electronically, we will require one calendar quarter of data for 2015 from either Q1 (January 1, 2015-March 31, 2015), Q2 (April 1, 2015-June 30, 2015), or Q3 (July 1, 2015-September 30, 2015). As noted above, at this time, we are not finalizing any proposals related to our reporting and submission requirements for 2016. We refer readers to the Hospital IQR Program discussion in section IX.A.9.d. of the preamble of this final rule for further discussion of the comments related to quarterly reporting.
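
To restate the finalized 2015 policy for electronically reported CQM data in concrete terms, the following sketch (in Python, using hypothetical function and variable names that are not part of this rule) checks whether a chosen reporting quarter and submission date fall within the finalized timeline: one calendar quarter of data from Q1, Q2, or Q3 of 2015, submitted during the annual submission period of January 2, 2015 through November 30, 2015.

    from datetime import date

    # Illustrative sketch only; not part of this final rule. Under the finalized 2015
    # policy for electronically reported CQMs, a hospital reports one calendar quarter
    # of data from Q1, Q2, or Q3 of CY 2015 and submits it between January 2, 2015
    # and November 30, 2015.
    REPORTING_QUARTERS_2015 = {
        "Q1": (date(2015, 1, 1), date(2015, 3, 31)),
        "Q2": (date(2015, 4, 1), date(2015, 6, 30)),
        "Q3": (date(2015, 7, 1), date(2015, 9, 30)),
    }
    SUBMISSION_PERIOD_2015 = (date(2015, 1, 2), date(2015, 11, 30))

    def meets_2015_electronic_cqm_timeline(reporting_quarter, submission_date):
        """Return True if the quarter and submission date fit the finalized 2015 timeline."""
        if reporting_quarter not in REPORTING_QUARTERS_2015:
            # Q4 2015 data are not accepted electronically for the Medicare EHR Incentive Program.
            return False
        start, end = SUBMISSION_PERIOD_2015
        return start <= submission_date <= end

    # Example: a hospital reporting Q2 2015 data and submitting on October 15, 2015.
    print(meets_2015_electronic_cqm_timeline("Q2", date(2015, 10, 15)))  # True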

We also note that this policy only applies for eligible hospitals and CAHs submitting CQMs electronically for 2015. Therefore, as finalized in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50903 through 50905), hospitals demonstrating meaningful use for the first time in 2015 are still required to report CQMs by attestation for a continuous 90-day period in FY 2015, or to report CQMs electronically, by July 1 of the given year to avoid the Medicare penalty in the subsequent year. Medicaid-only providers will continue to report according to State requirements. In addition, as stated above, this policy does not change the reporting periods or requirements for the meaningful use objectives and associated measures under 42 CFR 495.6 or for CQMs that are reported by attestation via the Registration and Attestation System.

In summary, after consideration of the public comments we received, we are finalizing our proposal, with the modifications described above, to align the reporting and submission timelines of the Medicare EHR Incentive Program with those of the Hospital IQR Program on the calendar year for CQMs that are reported electronically in 2015.

We are not finalizing our proposal to require quarterly submission of CQM data for 2015; instead, we will maintain one annual submission period. We are also not finalizing our proposal to require three calendar quarters of CQM data for 2015, but instead, for data submitted electronically, we will require one calendar quarter of data from Q1, Q2, or Q3 of 2015. We are not finalizing our proposals for 2016 in this final rule, and will address the policy for 2016 in future rulemaking.

3. Quality Reporting Document Architecture Category III (QRDA-III) Option in 2015

In the EHR Incentive Program Stage 2 final rule (77 FR 54088), we finalized two options for eligible hospitals and CAHs to electronically submit CQMs beginning in FY 2014 under the Medicare EHR Incentive Program. Option 1 was to electronically submit aggregate-level CQM data using QRDA-III. Option 2 was to electronically submit data using a method similar to the 2012 and 2013 EHR Incentive Program electronic reporting pilot for EHs and CAHs, which used QRDA-I (patient-level data). We also stated in that final rule that, consistent with section 1886(n)(3)(B)(ii) of the Act, in the event the Secretary does not have the capacity to receive CQM data electronically, eligible hospitals and CAHs that are beyond their first year of meaningful use may continue to report aggregate CQM results through attestation.

We noted in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50904 through 50905) that we had determined that the electronic submission of aggregate-level data using QRDA-III would not be feasible in 2014 for eligible hospitals and CAHs under the Medicare EHR Incentive Program. Therefore, for the 2014 reporting period under the Medicare EHR Incentive Program, eligible hospitals and CAHs would have the option to continue to report aggregate CQM results through attestation. We stated that we would reassess this policy for the 2015 and future reporting periods.

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28279 through 28280), we stated that we have determined that the electronic submission of aggregate-level data using QRDA-III will not be feasible in 2015 for eligible hospitals and CAHs under the Medicare EHR Incentive Program. Therefore, for the 2015 reporting period under the Medicare EHR Incentive Program, eligible hospitals and CAHs would have the option to continue to report aggregate CQM results through attestation. We noted that submissions of aggregate CQM data via attestation would not satisfy the reporting requirements for the Hospital IQR Program, and consistent with our proposal above regarding alignment of these programs, attested CQM data would need to be submitted for one full fiscal year in 2015 via the Registration and Attestation System, and would not require quarterly submissions. Hospitals in their first year of demonstrating meaningful use in 2015 would still be required to report CQMs by attestation for a continuous 90-day period in FY 2015, or report CQMs electronically, by July 1, 2015 to avoid the Medicare penalty in FY 2016 as finalized in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50903 through 50905). We also noted that this policy does not apply to the Medicaid EHR Incentive Program. Therefore, States may still require the submission of QRDA-III files to fulfill the CQM reporting requirements for hospitals that participate in the Medicaid EHR Incentive Program.

In order to remain aligned with the Hospital IQR Program, and because over 66 percent of hospitals that participate in the Hospital IQR Program are already meaningful users, we strongly recommended that hospitals that are eligible to participate in both programs electronically submit up to 16 electronic clinical quality measures of the 28 inpatient measures identified by the Hospital IQR Program. We believe that keeping the two programs aligned will ultimately reduce reporting burden for hospitals. We note again that reporting via attestation would not count towards the reporting requirements for the Hospital IQR Program.

Comment: Several commenters expressed views related to CMS' proposal not to accept aggregate-level data using QRDA-III for reporting in 2015. Most commenters were disappointed to learn that it was not feasible for CMS' systems to accept QRDA-III files in 2015 and urged CMS to continue to improve systems such that we would be able to accept QRDA-III data in the future. Some commenters requested further discussion of CMS' plan to accept QRDA-III data in the future.

Response: We understand the concerns raised by commenters, and we expect to continue to review and improve our systems in future years so that we will be able to accept aggregate-level QRDA-III files. We note that our plans regarding the acceptance of QRDA-III files will be addressed in future rulemaking.

Comment: A few commenters suggested that CMS and ONC remove the requirement for EHR technology designed for the inpatient setting to be certified to produce QRDA-III formatted files if CMS would not be able to receive QRDA-III data in the future in order to prevent unnecessary work related to the development of these files.

Response: We appreciate the commenters' concerns and suggestion. As we continue to review and improve our systems, we will continue to evaluate whether QRDA-III is a feasible option for future years and whether changes to existing policies would be appropriate.

Comment: A few commenters requested additional information about the storage and maintenance of QRDA-I files.

Response: We note that the storage and maintenance of QRDA-I files is outside the scope of this final rule.

After consideration of the public comments we received, and for the reasons set forth above, we are finalizing the policy as proposed. For the Medicare EHR Incentive Program, eligible hospitals and CAHs may report their CQMs electronically using QRDA-I (patient-level data) or via attestation (aggregate-level data). We note again that reporting via attestation would not count towards the reporting requirements for the Hospital IQR Program.

4. Electronically Specified Clinical Quality Measures (CQMs) Reporting for 2015

In the EHR Incentive Program Stage 2 final rule, we finalized the CQMs that eligible hospitals and CAHs would be required to report for purposes of meeting the CQM component of meaningful use under the EHR Incentive Program starting in 2014 (77 FR 54083 through 54087 Table 10). These CQMs are updated routinely to account for changes, including but not limited to changes in billing and diagnosis codes and changes in medical practices. The requirements specified in the EHR Incentive Program Stage 2 final rule allow for the reporting of different versions of the CQMs. For 2015, it is not technically feasible for CMS to accept data that is electronically reported according to the specifications of the older versions of the CQMs, including versions that may be allowed for reporting under the EHR Incentive Program. We stated in the EHR Incentive Program Stage 2 final rule that, consistent with section 1886(n)(3)(B)(ii) of the Act, in the event that the Secretary does not have the capacity to receive CQM data electronically, eligible hospitals and CAHs may continue to report aggregate CQM results through attestation (77 FR 54088). In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28280) we proposed that eligible hospitals and CAHs that seek to report CQMs electronically under the Medicare EHR Incentive Program must use the most recent version of the electronic specifications for the CQMs and have CEHRT that is tested and certified to the most recent version of the electronic specifications for the CQMs.

Eligible hospitals and CAHs that do not wish to report CQMs electronically using the most recent version of the electronic specifications (for example, if their CEHRT has not been certified for that particular version) would be allowed to report CQM data by attestation for the Medicare EHR Incentive Program.

We invited public comment on these proposals. We have addressed several of the public comments received in this section of this final rule, and we also refer readers to the Hospital IQR Program discussion in section IX.A.9.d. of the preamble of this final rule for further discussion of the comments related to CQM versions.

Comment: Commenters expressed concern and requested clarification regarding the timeframe between publication of the revised specifications and the quarter in which hospitals must begin using the new version. Commenters stated that the timeline was too short for adequate development and implementation of the new specifications.

Response: CQMs are updated routinely to account for changes including, but not limited to, changes in billing and diagnosis codes and changes in medical practices. In order for CQMs to remain current and clinically valid, the specifications must be updated on a regular basis. We note that specifications are posted at least 6 months prior to the reporting period, and as we align the reporting and submission timelines of the Medicare EHR Incentive Program with those of the Hospital IQR Program, we provide an even greater window of time between the posting of the specifications and the start of the reporting period.

Comment: Commenters specifically requested clarification regarding the timing and reporting of the updated specifications with respect to CMS' proposal to require quarterly reporting of electronically reported CQMs. Commenters stated that the two proposals would require hospitals to use an EHR that is certified to one set of specifications and then re-certified to a different set of specifications within a given reporting year in order to satisfy the quarterly reporting requirement.

Response: As we discussed above, and in section IX.D.2. of the preamble of this final rule, we are not finalizing our proposal to require quarterly submission of electronically reported CQMs for 2015. For electronic reporting of CQM data for 2015, we will require one calendar quarter of data from Q1, Q2 or Q3 of 2015 submitted during the period January 2, 2015-November 30, 2015. We believe this revised policy will allow additional time for eligible hospitals and CAHs to implement the updates required to submit the most recent version of the CQMs in 2015.

Comment: One commenter suggested that CMS accept multiple versions of CQMs during the reporting year to account for the period of transition between CQM versions.

Response: We appreciate the commenter's suggestion, but unfortunately, as noted above, for 2015, it is not technically feasible for us to accept data that is electronically reported according to the specifications of the older versions of the CQMs, including versions that may be allowed for reporting under the EHR Incentive Program. We note that eligible hospitals and CAHs that do not wish to report CQMs electronically using the most recent version of the electronic specifications would be allowed to report CQM data by attestation for the Medicare EHR Incentive Program.

Comment: One commenter supported the proposal to require that eligible hospitals and CAHs ensure that their CEHRT products are tested and certified to the most recent version of the electronic specifications for the CQMs, and many others opposed the recertification requirement, citing the additional burden and cost that recertification would impose.

Response: We have received feedback from stakeholders regarding the difficulty and expense of having to test and recertify CEHRT products to the most recent version of the electronic specifications for the CQMs. While we still believe eligible hospitals and CAHs should test and certify their products to the most recent version of the electronic specifications for the CQMs when feasible, we understand the burdens associated with this requirement. Therefore, to avoid this added burden, we are not finalizing our proposal to require eligible hospitals and CAHs to ensure that their CEHRT products are recertified to the most recent version of the electronic specifications for the CQMs. Please note that, although we are not requiring recertification, eligible hospitals and CAHs must still report the most recent version of the electronic specifications for the CQMs.

After consideration of the public comments we received, and for the reasons set forth above, we are finalizing the policy that eligible hospitals and CAHs that seek to report CQMs electronically under the Medicare EHR Incentive Program must use the most recent version of the electronic specifications for the CQMs; however, we will not require eligible hospitals and CAHs to ensure that their CEHRT products are recertified to the most recent version of the electronic specifications for the CQMs.

5. Clarification Regarding Reporting Zero Denominators

As we stated in the EHR Incentive Program Stage 2 final rule (77 FR 54079) we expect eligible hospitals and CAHs to adopt EHR technology that includes CQMs relevant to each eligible hospital's or CAH's patient mix. We understand, however, that there are situations in which an eligible hospital or CAH does not have data to report on a particular CQM, and its EHR is not certified to additional CQMs that can be used to replace that CQM with another for which it has data. For example, a health system with multiple eligible hospitals or CAHs may have an EHR certified for 16 CQMs, which is the minimum number of required CQMs for reporting, but not all of the eligible hospitals or CAHs in the health system may have cases to report on those particular 16 CQMs. We have received questions on how eligible hospitals and CAHs should meet their reporting requirements in this situation; therefore, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28280) we clarified our policy as set forth below regarding the reporting of a zero denominator for the purposes of the Medicare EHR Incentive Program and the Hospital IQR Program.

If the eligible hospital's or CAH's EHR is certified to a CQM, but the eligible hospital or CAH does not have patients that meet the denominator criteria of that CQM, the eligible hospital or CAH can submit a zero in the denominator for that CQM. Submission of a zero in the denominator for a CQM counts as a successful submission for that CQM for both the Medicare EHR Incentive Program and the Hospital IQR Program. For example, if the eligible hospital or CAH within the previously mentioned health system does not provide maternity services, but one of the 16 CQMs the health system's EHR is certified to is a maternity measure, that eligible hospital's or CAH's EHR may render a zero in the denominator for that CQM. The eligible hospital or CAH would therefore report a zero denominator for that maternity care CQM, and this would count toward the 16 required CQMs for the Medicare EHR Incentive Program and the Hospital IQR Program. Eligible hospitals or CAHs within that health system for which that maternity CQM does apply would provide data on that measure.
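
As an illustrative sketch only (in Python, with hypothetical names not drawn from this rule), the zero denominator policy described above can be summarized as follows: a CQM to which the hospital's EHR is certified counts toward the 16 required CQMs even when the submitted denominator is zero.

    # Illustrative sketch only; hypothetical names, not part of this final rule.
    def counts_toward_required_cqms(ehr_certified_to_cqm, denominator):
        """A submitted CQM counts toward the 16 required CQMs when the EHR is certified
        to that CQM, even if no patients met the denominator criteria (denominator of zero)."""
        if not ehr_certified_to_cqm:
            # A hospital can only report CQMs to which its EHR is certified.
            return False
        # A zero denominator still counts as a successful submission for that CQM
        # under both the Medicare EHR Incentive Program and the Hospital IQR Program.
        return denominator >= 0

    # Example: a hospital with no maternity services submits a zero denominator for a
    # maternity CQM included in its certified EHR; the submission still counts.
    print(counts_toward_required_cqms(ehr_certified_to_cqm=True, denominator=0))  # True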

Comment: Commenters supported and appreciated the clarification regarding zero denominators. Some commenters requested clarification as to whether the above stated zero denominator policy would be effective in CY 2015 or upon publication of this final rule.

Response: The clarification set forth in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28280) and stated above represents our current policy. The additional information and examples provided in the proposed rule were intended for clarification only and do not represent a change to our existing policy.

Comment: One commenter requested clarification as to whether this policy extends to issues resulting from the maintenance of value sets, specifically those related to medications codified in RxNorm, required by the CQM specifications. The commenter stated that these issues often result in a zero denominator being produced by the EHR, and went on to suggest that these issues may be resolved by modifying CQM specifications to be more in line with how medications are evaluated in the Hospital IQR Program chart-abstracted measures.

Response: While we cannot explore all the possible explanations and reasons why an EHR would produce a zero denominator in this final rule, we hope that the above clarification regarding zero denominators will provide guidance in these instances.

6. Case Threshold Exemption Policy; Clarification for 2014 and Change for 2015

In the EHR Incentive Program Stage 2 final rule (77 FR 54080), we finalized the policy that eligible hospitals and CAHs are exempted from reporting a CQM if, as defined by that CQM's denominator population (Medicare and non-Medicare discharges combined) for which data are being electronically submitted, they have 5 or fewer discharges in the quarter used as their reporting period in FY 2014, or 20 or fewer discharges in a full fiscal year reporting period beginning in FY 2015. To be eligible for the exemption, eligible hospitals and CAHs must submit their aggregate population and sample size counts for Medicare and non-Medicare discharges for the CQM for the reporting period.

In the Health Information Technology: Revisions to the 2014 Edition Electronic Health Record Certification Criteria; and Medicare and Medicaid Programs; Revisions to the Electronic Health Record Incentive Program interim final rule, we revised the case threshold exemption policy to make it applicable for eligible hospitals and CAHs in all stages of meaningful use beginning with FY 2013, including those that are demonstrating meaningful use for the first time and submitting CQMs by attestation (77 FR 72988 through 72989). Eligible hospitals and CAHs with 5 or fewer discharges during the relevant EHR reporting period (if attesting to a 90-day EHR reporting period), or 20 or fewer discharges during the year (if attesting to a full year EHR reporting period) as defined by the CQM's denominator population would be exempted from reporting on that CQM.

We stated in the interim final rule (77 FR 72989) that, beginning in FY 2014, the reporting requirement is to report 16 CQMs covering at least 3 domains from a list of 29 CQMs. We stated further that, in order to report fewer than 16 CQMs, the eligible hospital or CAH would need to qualify for the case threshold exemption for more than 13 of the 29 CQMs. If the eligible hospital or CAH does not qualify for the case threshold exemption for more than 13 CQMs, it would be able to report at least 16 CQMs. Likewise, we stated that if the CQMs for which the eligible hospital or CAH can meet the case threshold of discharges do not cover at least 3 domains, the eligible hospital or CAH would be exempt from the requirement to cover the remaining domains. For example, if the eligible hospital or CAH qualifies for the case threshold exemption for 13 of the 29 clinical quality measures, and thus could report the remaining 16 clinical quality measures, but those 16 clinical quality measures cover only 2 of the 3 required domains, the eligible hospital or CAH would be exempt from covering the third domain.

For the reporting periods in 2014, our policy requires that an eligible hospital or CAH that claims a case threshold exemption for one CQM must choose another CQM on which to submit data, or continue to invoke the case threshold exemption until it exceeds 13 case threshold exemptions and may therefore report fewer than the 16 required CQMs. This policy assumes that the eligible hospital or CAH has an EHR that is certified to more than the minimum of 16 CQMs, and the eligible hospital or CAH has other CQMs in its EHR to choose from for reporting. We realize, however, that there could be many EHRs that are certified to only the minimum of 16 CQMs required by ONC's regulations at 45 CFR 170.102 (the definition of “Base EHR”), and for eligible hospitals and CAHs using those EHRs, this policy may result in the eligible hospital or CAH needing to submit data on a CQM for which the EHR is not certified. It was not our intent to have eligible hospitals or CAHs report on measures for which their EHRs are not certified.

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28280 through 28281), beginning with the reporting periods in 2015, we proposed to change the case threshold exemption policy so that if an eligible hospital or CAH qualifies for an exemption from reporting on a particular CQM, the exemption would count toward the 16 required CQMs. For example, if the eligible hospital's or CAH's EHR is certified to report 16 CQMs, and for one of those CQMs the eligible hospital or CAH has 5 or fewer discharges during the relevant EHR reporting period (if attesting to a 90-day EHR reporting period), or 20 or fewer discharges during the year (if attesting to a full year EHR reporting period) as defined by the CQM's denominator population, the eligible hospital or CAH would report data for the 15 CQMs for which the case threshold exemption does not apply, and invoke a case threshold exemption for the one CQM for which the exemption does apply for a total of 16 CQMs.
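
For illustration only, the following sketch (in Python, with hypothetical names that are not part of this rule) restates the revised counting proposed for 2015: a CQM whose discharges fall at or below the case threshold (5 or fewer for a 90-day reporting period, 20 or fewer for a full-year reporting period) may be credited toward the 16 required CQMs through an exemption, alongside the CQMs reported with data.

    # Illustrative sketch only; hypothetical names, not part of this final rule.
    def case_threshold_exemption_applies(discharges_in_denominator, full_year_reporting):
        """5 or fewer discharges for a 90-day EHR reporting period, or 20 or fewer
        discharges for a full-year EHR reporting period, as defined by the CQM's
        denominator population."""
        threshold = 20 if full_year_reporting else 5
        return discharges_in_denominator <= threshold

    def meets_16_cqm_requirement(cqm_discharges, full_year_reporting):
        """Under the policy proposed for 2015, CQMs reported with data and CQMs for
        which the case threshold exemption is invoked both count toward the 16."""
        reported = [c for c, d in cqm_discharges.items()
                    if not case_threshold_exemption_applies(d, full_year_reporting)]
        exempted = [c for c, d in cqm_discharges.items()
                    if case_threshold_exemption_applies(d, full_year_reporting)]
        return len(reported) + len(exempted) >= 16

    # Example: an EHR certified to exactly 16 CQMs, one of which has no qualifying
    # discharges during a 90-day reporting period, still satisfies the requirement.
    discharges = {"CQM-%d" % i: 40 for i in range(1, 16)}
    discharges["CQM-16"] = 0
    print(meets_16_cqm_requirement(discharges, full_year_reporting=False))  # True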

We expect eligible hospitals and CAHs to adopt EHR technology that includes CQMs relevant to the eligible hospital's or CAH's case mix, though we understand that in some cases, the eligible hospital or CAH may not meet the case threshold of discharges for a particular CQM. We believe this proposed policy better reflects our intent for eligible hospitals and CAHs to report on only those measures for which their EHRs are certified while meeting the reporting requirements for the Medicare EHR Incentive Program and Hospital IQR Program.

We invited public comment on this proposal.

Comment: Several comments supported the proposed change to CMS' case threshold exemption policy. Commenters felt that this change in policy acknowledged that an eligible hospital or CAH should receive credit for meeting the CQM even though the eligible hospital or CAH may not meet the case threshold of discharges for that particular CQM.

Response: We appreciate the comments in support of our proposal.

After consideration of the public comments we received, we are finalizing the policy as proposed. We note that for CQM data reported by attestation, this policy applies to eligible hospitals or CAHs that have 5 or fewer discharges during the relevant EHR reporting period (if attesting to a 90-day EHR reporting period), or 20 or fewer discharges during the year (if attesting to a full year EHR reporting period), as defined by the CQM's denominator population. For CQM data submitted electronically in 2015, this policy applies to eligible hospitals or CAHs that have 5 or fewer discharges during their chosen reporting period of one calendar quarter, as defined by the CQM's denominator population. We note that because there is no option for a full year reporting period for data submitted electronically in 2015, the exemption based on 20 or fewer discharges for a full year EHR reporting period would not apply.

X. Revision of Regulations Governing Use and Release of Medicare Advantage Risk Adjustment Data

A. Background

Section 1853 of the Act requires the Secretary to make payments to Medicare Advantage (MA) organizations offering local and regional MA plans with respect to coverage of individuals enrolled under Medicare Part C. Section 1853(a)(1)(C) of the Act requires the Secretary to adjust such payments for such risk factors as age, disability status, gender, institutional status, and such other factors as the Secretary determines appropriate, including health status. To support these risk adjustments, section 1853(a)(3)(B) of the Act requires submission of data by MA organizations regarding the services provided to enrollees and other information the Secretary deems necessary but does not limit the Secretary's use of such data or information. Section 1106 of the Act authorizes the Secretary to adopt regulations governing release of information gathered in the course of administering programs under the Act.

Implementing regulations at 42 CFR 422.310 set forth the requirements for the submission of risk adjustment data that CMS uses to risk-adjust payments. MA organizations must submit data, in accordance with CMS instructions, to characterize the context and purposes of items and services provided to their enrollees by a provider, supplier, physician, or other practitioner. Section 422.310(d)(1) provides that MA organizations submit risk adjustment data to CMS as specified by CMS. Risk adjustment data refers to data submitted in two formats: comprehensive data equivalent to Medicare fee-for-service claims data (often referred to as encounter data); and data in abbreviated formats (often referred to as RAPS data). Section 422.310(f) currently specifies CMS' uses of the risk adjustment data.

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 27978), we proposed to revise the existing regulation at § 422.310(f) to broaden the specified uses and disclosures of risk adjustment data in order to strengthen program management and increase transparency in the MA program, and to specify the conditions for release of risk adjustment data to entities outside of CMS.

We received approximately 28 pieces of correspondence from MA organizations and trade associations, beneficiary advocacy organizations, hospital systems and trade associations, a government agency, a research firm, and individuals. Summaries of and our responses to the public comments on the uses and bases for disclosure of risk adjustment data (§ 422.310(f)(1)) are presented in section X.B.1 of the preamble of this final rule. Summaries of and responses to the public comments on the conditions for release of risk adjustment data outside of CMS (§ 422.310(f)(2)) are presented in section X.B.2. of the preamble of this final rule.

B. Proposed and Finalized Regulatory Changes

1. Expansion of Uses and Reasons for Disclosure of Risk Adjustment Data

In the FY 2015 IPPS/LTCH PPS proposed rule, we first proposed to revise a reference in existing § 422.310(f) from “data obtained under this section” to “data described in paragraphs (a) through (d) of this section” in both paragraphs (f)(1) and (f)(2); this new text would indicate that the data used or released under proposed paragraph (f) would not include the medical records and other data collected separately under paragraph (e) for the purpose of risk adjustment data validation (RADV) audits. We stated that we did not intend for the proposed § 422.310(f) to authorize any additional use or release of the data described in paragraph (e). We proposed that the data described in paragraphs (a) through (d) would include those elements that constitute an encounter data record, including contract, plan, and provider identifiers, with the exception of disaggregated payment data as discussed below. In addition, we noted that paragraph (d)(1) also authorizes the collection of abbreviated data and that the proposed regulation would apply both to the abbreviated data and to the more detailed (encounter-level) data collected from MA organizations pursuant to § 422.310(a) through (d).

Comment: A few commenters stated that, without a specific exclusion, the data that CMS proposed to release would seem to include audit data and additional data collected as part of these audits, which could include price and charge information. These commenters urged CMS to restrict the inclusion of additional data collected as part of audits from the data eligible for release.

Response: We did propose a specific exclusion: medical records and other data that MA organizations submit to CMS as part of a Risk Adjustment Data Validation (RADV) audit at § 422.311 are excluded from the data release provisions of this rulemaking, because § 422.310(f) refers to the data described in § 422.310(a) through (d) as the data that would be available under this rule. This text excludes data collected pursuant to § 422.310(e) for RADV.

We did not receive any public comments challenging our proposed exclusion of audit data under paragraph (e) of § 422.310, nor did we receive any public comments on the application of this rule to abbreviated data under paragraph (d)(1) of § 422.310. Comments about additional or fewer protections for encounter data under the final rule are addressed below. Therefore, we are finalizing the use of this language limiting the scope of paragraph (f) to data described in paragraphs (a) through (d) as proposed.

The existing regulation at § 422.310(f) specifies five purposes for which CMS may use risk adjustment data obtained from MA organizations. In the proposed rule, we clarified that CMS' uses of these data may include disclosure to CMS contractors or other agents that perform activities or analyses on CMS' behalf in connection with authorized use of the data. The existing specified purposes are: (1) To determine the risk adjustment factors used to adjust payments, as required under §§ 422.304(a) and (c); (2) to update risk adjustment models; (3) to calculate Medicare DSH percentages; (4) to conduct quality review and improvement activities; and (5) for Medicare coverage purposes. We proposed to restructure paragraph (f) to identify the purposes for which CMS may use and release risk adjustment data and to impose certain conditions on any release of that data.

We proposed to revise paragraph (f) to add four purposes, as paragraphs (f)(1)(vi) through (ix), for which CMS may use risk adjustment data submitted by MA organizations: (1) To conduct evaluations and other analysis to support the Medicare program (including demonstrations) and to support public health initiatives and other health care-related research; (2) for activities to support the administration of the Medicare program; (3) for activities conducted to support program integrity; and (4) for purposes permitted by other laws. We stated our expectation that, in general, comprehensive risk adjustment data submitted by MA organizations, which MA organizations began submitting to CMS effective CY 2012, will enable CMS to generate improved data analyses that could support Medicare program evaluations, demonstration designs, and CMS' effective and efficient operational management of the Medicare program. Risk adjustment data also could be useful to support public health initiatives by governmental entities and to advance health care-related research by universities and other research organizations. We stated that we also believe that risk adjustment data can support CMS' program integrity activities in the Medicare program and other Federal health care and related programs. This general term encompasses audits, investigations, efforts to combat waste, fraud, and abuse, and any other actions designed to ensure that the program operates within its authority including audits, evaluations, and investigations by the Office of the Inspector General (OIG) as well as CMS' own efforts. In addition, we stated that risk adjustment data may be useful in supporting Medicare administrative activities, such as the review of the validity of bid and medical loss ratio data submitted by MA organizations. Finally, we proposed to acknowledge that other laws may permit other uses of risk adjustment data and that this regulation is not intended to supersede such other laws.

Regarding the use of risk adjustment data outside of CMS, we proposed at § 422.310(f)(2) that other HHS agencies, other Federal executive branch agencies, States, and external entities would only be able to obtain from CMS and use risk adjustment data for one or more of the purposes listed in proposed paragraph (f)(1). An external entity may be an individual, group, or organization. In the proposed rule, we acknowledged our expectation that other HHS agencies and other Federal executive branch agencies may request these data for the same purposes CMS proposed to use the data and that we believe such use is appropriate. Under our proposal, other agencies that evaluate and analyze the Medicare program, perform health care-related research, support public health initiatives, perform activities in the administration of the Medicare program, or conduct activities to support program integrity in the Medicare program and other Federal health care and related programs would be able to access and use risk adjustment data for these purposes. States, while conducting program integrity activities for Medicaid programs or in the administration of Medicare-Medicaid demonstrations (for example, refer to the Web site at: http://www.cms.gov/Medicare-Medicaid-Coordination/Medicare-and-Medicaid-Coordination/Medicare-Medicaid-Coordination-Office/FinancialAlignmentInitiative/FinancialModelstoSupportStatesEffortsinCareCoordination.html), may access and use risk adjustment data under the proposal. We stated that we anticipate that nongovernmental external entities would generally only gain access to risk adjustment data under this proposal in connection with public health initiatives and health care-related research, as such external entities appear to have limited, if any, roles in the other purposes identified in our proposal.

Regarding the use of risk adjustment data for purposes permitted by other laws, we noted that, to the extent that a requestor has separate statutory authority for requiring CMS disclosure of data, our proposed provisions would not limit or supersede such authority. For example, some Congressional support agencies may compel release of data under separate statutory authority, such as 31 U.S.C. 716, 2 U.S.C. 166(d)(1) and 601(d), and section 1805 of the Act (42 U.S.C. 1395b-6), for the purposes of conducting Congressional oversight, monitoring, making recommendations and analysis of the Medicare program. In addition, the OIG has separate statutory authority under section 1128J of the Act (42 U.S.C. 1320a-7k), coupled with section 6(a) of the Inspector General Act of 1978 (5 U.S.C. App. 3) authorizing the OIG to access data as necessary to perform its responsibilities. This regulation will not limit that authority.

Finally, in the proposed rule, we stated that we are seeking to balance protection of confidential beneficiary information and the proprietary interests of MA organizations with the need to effectively administer Federal health care programs and to encourage research into better ways to provide health care. We also noted a goal of the proposal to increase transparency in the administration of the Medicare program. We sought public comments on the proposed uses and release of data and how else to achieve the necessary balance. In particular, we solicited public comment on the extent to which a commercial purpose underlying a request for risk adjustment data should be a factor in evaluating whether the request is for one of the purposes that permit a disclosure under this regulation or if one of the purposes in paragraph (f)(1) of § 422.310, for which CMS would disclose data under this section, should address commercial uses of the data. The topic of commercial purposes is discussed later in section X.B.2. of the preamble of this final rule as a condition of data release.

Comment: Several commenters supported CMS' proposal for expanding the use and distribution of MA risk adjustment data to support and strengthen the Medicare program, as well as supporting public health initiatives and health care-related research. Commenters stated that risk adjustment data are valuable to researchers for analyzing health care trends, public health research initiatives, and improving management of the Medicare program. These commenters expressed support of CMS' efforts to move toward greater transparency through the release of risk adjustment data. One commenter believed that greater data transparency would not only further public health research but also might serve to further educate consumer organizations, patient advocates, and ultimately beneficiaries about the Medicare program. Generally, commenters supporting the proposals in the proposed rule simultaneously recognized the importance of balancing these goals with the importance of protecting the confidentiality of beneficiary information, and one commenter agreed with CMS' proposal to aggregate data on negotiated rates paid to providers.

Response: We appreciate the commenters' support.

Comment: A number of commenters generally addressed the proposed uses of risk adjustment data, characterizing the listed purposes as too broad and asking CMS to more specifically and narrowly define them. One commenter stated that the purposes, as stated in the proposed rule, are so broad as to justify release of these sensitive data for almost any research activity. Several commenters were concerned that having permitted uses of risk adjustment data for such broad-based purposes leaves a large gap in the protection of potentially proprietary information, especially given the concern about usage of these data by agencies with limited knowledge or understanding of the data and how to make accurate interpretations.

Response: Section 1853 of the Act does not limit the uses of risk adjustment data, and section 1106 of the Act authorizes the adoption of regulations governing how CMS will disclose data obtained in the course of CMS' duties. We have reviewed the proposed uses of risk adjustment data (which are for analytical purposes), and we do not believe that they are too broad. We reiterate that the list of permissible bases under this regulation for use and disclosure is exhaustive and that uses of the risk adjustment data that are outside of the scope of these nine categories will not be authorized. Accordingly, we see no compelling reason to further limit uses of this data by eliminating or narrowing any of the proposed purposes.

Comment: Several commenters expressed concern about CMS' use of risk adjustment data, under the purpose stated under § 422.310(f)(1)(vii), “for activities to support the administration of the Medicare program.” In particular, commenters requested clarification and specificity regarding how these data would be used in the example provided in the preamble, which was to validate the MA organizations' bid and Medical Loss Ratio (MLR) data. They argued that the risk adjustment data could not be used to inform bid review or MLR analyses: The data MA organizations submit to CMS cannot provide a complete picture of MA organizations' costs because many organizations have payment arrangements with providers that are not fee-for-service based, for example, capitation arrangements, bundled pricing, incentive payments, and multi-year contracting.

Response: In the preamble of the proposed rule, we identified two examples of using risk adjustment data in support of Medicare administrative activities: Review of the validity of bid and MLR data submitted by MA organizations. We anticipate that there will be other uses in support of Medicare administrative activities, such as the development of quality measures. Regarding the two examples we provided in the proposed rule, while we recognize that many MA organizations have alternative arrangements other than fee-for-service payments, we believe that encounter data will be useful for understanding patterns of beneficiary utilization and aspects of MA organizations' expenditures, as reported in bid and MLR submissions.

Comment: One commenter asked how CMS could use risk adjustment data, under § 422.310(f)(1)(viii), “for activities conducted to support program integrity,” particularly when the data are incomplete due to payment arrangements with providers that are not fee-for-service based.

Response: We believe that risk adjustment data are valuable for program integrity purposes. For example, encounter data could be used to compare MA and FFS billing to identify aberrant patterns, which may inform efforts to combat fraud, waste, and abuse.

Comment: Several commenters expressed concern about CMS' use of risk adjustment data, under § 422.310(f)(1)(ix), “for purposes permitted by other laws.” Commenters requested CMS to further clarify this purpose in regulation, for example, to distinguish Federal laws from State laws and to specify that this provision only applies to health care laws. Another commenter asked how CMS intends to evaluate the other laws that permit use or release of these data; for example, would CMS allow risk adjustment data to be used to evaluate risk adjustments for insurance exchanges created under the Affordable Care Act, and, if so, the commenter expressed concern that the data would not provide a valid or accurate comparison, given the unique patient population.

Response: As we noted in the preamble of the proposed rule, we believe it is important to acknowledge that this regulation is not intended to supersede other laws that permit other uses of risk adjustment data. For example, this regulation cannot override separate statutory authorities that require CMS disclosure of data to other Federal agencies. We refer readers, for example, to 31 U.S.C. 716; 2 U.S.C. 166(d)(1) and 601(d); and section 1805 of the Act (42 U.S.C. 1395b-6).

Regarding the commenters' request that we further specify in regulation text the types of laws to which paragraph (f)(1)(ix) applies (such as Federal laws versus State laws), we do not believe that detailed specification of laws is necessary because we believe it is clear that only laws that apply to CMS or to data held by CMS are within the scope of the provision. However, in response to these comments, we are finalizing the regulation text at § 422.310(f)(1)(ix) to state “for purposes authorized by other applicable laws” to emphasize that the provisions in other laws must be applicable to CMS and to MA risk adjustment data.

Finally, we are not certain what the commenter means by evaluating other laws, other than as a request for clarification that this provision encompasses laws that are applicable to CMS or to data held by CMS. If the question is about how we would determine the appropriateness of a research topic and study design that involves both Medicare and another program enacted under other laws, we do not believe we can comment on the appropriateness of specific designs in this preamble. The approval of any research study requesting use of MA risk adjustment data would be handled through the existing process CMS uses for data sharing, as described elsewhere in this preamble in the discussion of paragraph (f)(2)(ii) of § 422.310.

Comment: One commenter stated that it is not appropriate for external entities to receive the data for uses that are exclusively within CMS' authority—specifically, that CMS should not release data to entities outside of CMS for the determination of risk adjustment factors, updating risk adjustment models, the calculation of Medicare DSH percentages, or Medicare coverage purposes (§ 422.310(f)(1)(i) and (f)(1)(ii) or (f)(1)(v)). Another commenter asked CMS to expressly limit, in regulatory text, the bases upon which nongovernmental external entities receive the data to one purpose: Support of public health initiatives and other health care-related research. Furthermore, one commenter stated that neither States nor external entities should receive the data to conduct quality review and improvement activities, for activities to support the administration of the Medicare program, or for activities conducted to support program integrity (§ 422.310(f)(1)(iv), (f)(1)(vii), or (f)(1)(viii)) because these are purposes exclusive to the Federal Medicare program. Another commenter stated that it is unclear what uses States would have for these data, given the Federal administration of the MA program and the differences between the population enrolled in MA plans and the populations in commercial health insurance coverage, for which States may be administering risk adjustment or other programs, or in Medicaid coverage, for which a State Medicaid agency would administer benefits; this commenter concluded that States should not receive the MA risk adjustment data.

Response: We have reviewed our proposed purposes and believe that there may be cases in which researchers, whether external entities or other governmental agencies, might have well-designed research projects that meet CMS' stringent requirements, under our longstanding data sharing procedures, thus warranting use of the data for an approved project. For example, other Federal governmental agencies may want to use the data to conduct research on new developments in risk adjustment models or an external entity may want to propose research on the design of quality measures that could apply to beneficiaries in both the MA and FFS programs. Both of these examples illustrate the point that greater data transparency could improve administration of the Medicare program and improve public health. As noted in the preamble of the proposed rule, we also believe that risk adjustment data can support program integrity activities in the Medicare program and in other Federal health care and related programs funded in whole or in part by Federal funds.

Furthermore, we believe that our approach to determining whether to disclose risk adjustment data, which incorporates the Medicare Part A/B and Part D minimum necessary data policy, with additional restrictions to protect beneficiary privacy and commercially sensitive information of MA organizations, strikes an appropriate balance between the significant benefits of furthering knowledge through health care research and concerns regarding the release of risk adjustment data. Finally, we believe this process has sufficient protections to ensure compliance with the applicable laws and guard against the potential misuse of data. External entities requesting access to risk adjustment data will have to enter into a Data Use Agreement with us that includes provisions protecting the data from improper release.

Comment: One commenter asked CMS to further define what CMS means by external entities in paragraphs (f)(1) and (f)(2) of § 422.310.

Response: An external entity may be an individual, group, or organization that is not a Federal executive branch agency or a State.

After consideration of the public comments we received, we are finalizing, as proposed, the four additional permitted uses of risk adjustment data at § 422.310(f)(1)(vi) through (f)(1)(ix), with the exception that we are changing the language for the purpose under paragraph (f)(1)(ix) to read: “For purposes authorized by other applicable laws.”

2. Conditions for CMS Release of Data

The existing regulations at § 422.310 do not specify conditions for release by CMS of risk adjustment data that are submitted by MA organizations to CMS. In the FY 2015 IPPS/LTCH PPS proposed rule, we proposed to add a paragraph (2) to § 422.310(f) to address CMS' release of such data to non-CMS entities. First, as discussed above in connection with proposed paragraph (f)(1), our proposal was limited to the risk adjustment data described in § 422.310(a) through (d) and did not include the medical records and other data collected separately under paragraph (e) for the purpose of risk adjustment data validation (RADV) audits. We stated that we did not intend for the proposed revision to § 422.310(f) to authorize any additional use or release of the data described in paragraph (e).

Second, we proposed that CMS would release only the minimum data that CMS determines is necessary to fulfill the analytical or operational goal for a particular project. In other words, our proposal provided that CMS could determine that the appropriate data release for an approved research project is a subset of encounter data records requested to conduct the proposed inquiry (instead of all encounter data in CMS' systems for all years and provider types) or is a subset of the abbreviated data requested.

Third, we proposed that CMS may release data under this authority to other HHS agencies, other Federal executive branch agencies, States, and external entities, only for purposes identified in paragraph (f)(1) (discussed above) and subject to a number of additional limitations: (i) Applicable Federal laws; (ii) CMS data sharing procedures; (iii) protection of beneficiary identifier elements and beneficiary confidentiality, including: (A) a prohibition against public disclosure of beneficiary identifying information; (B) release of beneficiary identifying information to other HHS agencies, other Federal executive branch agencies, Congressional support agencies, and States only when such information is needed to accomplish the purpose(s) of the disclosure; and (C) release of beneficiary identifying information to external entities only to the extent needed to link datasets; and (iv) the aggregation of payment data to protect commercially sensitive data.

These limitations were included at proposed paragraphs (f)(2)(i) through (f)(2)(iv), respectively, of § 422.310. We solicited public comment on other conditions or limitations on the release of these data that would help maintain a balance between protecting confidential and proprietary information and the need to effectively administer Federal health care programs and to encourage research into better ways to provide health care. We also solicited public comments on the extent to which a commercial purpose underlying a request for risk adjustment data should be a factor in evaluating whether the request is for one of the purposes that permit a disclosure under this regulation, or on whether one of the purposes in paragraph (f)(1) of § 422.310, for which CMS would disclose data under this section, should address commercial uses of the data.

Under the provisions at proposed § 422.310(f)(2)(iv), we would aggregate payment data to protect commercially sensitive information. We stated our belief that release of payment data at the level of the encounter record might reveal proprietary negotiated payment rates between MA plans and providers. Given the commercially sensitive nature of this information, we did not propose to release payment data at the level of the encounter record without taking steps to protect the commercially sensitive information. In the interest of providing as much transparency as possible, while at the same time protecting proprietary information related to the payments made by MA organizations to health care providers, we proposed to authorize release of aggregate payment information. For example, we could aggregate the payment data by service category, by plan, by contract, or across contracts. We sought public comments on these or other approaches to aggregating payment data for release and whether the specified options are sufficiently aggregated to protect commercially sensitive information. In addition, we sought public comment on our conclusion that releasing payment rates at the level of the encounter data record would reveal proprietary negotiated payment rates. Specifically, we requested public comment on what strategies might be used under which payment data could be released while protecting commercially sensitive information.
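As a purely illustrative sketch of the kinds of aggregation described above, the following Python fragment rolls hypothetical encounter-level dollar amounts up to the plan, contract, service category, and national levels. The field names (contract_id, plan_id, service_category, paid_amount) are assumptions made for illustration only and do not correspond to any actual CMS or Encounter Data System file layout.

    # Illustrative sketch only: aggregating hypothetical encounter-level
    # dollar amounts to coarser levels before release. Field names are
    # assumptions, not an actual CMS file layout.
    import pandas as pd

    encounters = pd.DataFrame({
        "contract_id": ["H0001", "H0001", "H0002", "H0002"],
        "plan_id": ["001", "002", "001", "001"],
        "service_category": ["inpatient", "outpatient", "inpatient", "outpatient"],
        "paid_amount": [12000.0, 350.0, 9500.0, 420.0],
    })

    # Candidate aggregation levels, from finer to coarser.
    by_plan = encounters.groupby(["contract_id", "plan_id"])["paid_amount"].sum()
    by_contract = encounters.groupby("contract_id")["paid_amount"].sum()
    by_service = encounters.groupby("service_category")["paid_amount"].sum()
    national_total = encounters["paid_amount"].sum()

A coarser grouping (for example, by service category across all contracts, or a single national total) reveals less about any one organization's negotiated payments than a finer grouping such as plan-level sums.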

Comment: A number of commenters argued that no risk adjustment data should be released, in order to protect the proprietary nature of encounter data, including data on payments, diagnoses, and National Provider Identifiers (NPIs), among other data fields. A few commenters used the term “payment data” when referring to other data elements, such as service categories and diagnoses.

Response: In reviewing the comments, we observed that several commenters distinguished payment data from other elements of encounter data, while other commenters did not make this distinction and instead used the term “risk adjustment data” or “encounter data” when arguing that all data should be aggregated. Therefore, our response here is intended to make clearer the distinction between payment data and other data elements.

In the proposed rule, we stated at § 422.310(f)(2)(iii) that beneficiary identifier elements would not be disclosed, to protect beneficiary confidentiality, and we stated at § 422.310(f)(2)(iv) that payment data would be aggregated as necessary to protect commercially sensitive data. Our proposed rule thus implied that data outside of these two protected categories would be released without redaction or aggregation. In light of some comments we received, we believe that the regulation text should be more detailed in describing the risk adjustment data that do not fall into the two protected categories at § 422.310(f)(2)(iii) and (f)(2)(iv). Therefore, we are finalizing this rule with two changes to the regulation text. First, to clarify that the term “payment data” means the dollar amounts reported on an associated encounter data record, we are finalizing § 422.310(f)(2)(iv) to use the more specific phrase “dollar amounts reported for the associated encounter” instead of “payment data.” Therefore, in this final rule, we have revised § 422.310(f)(2)(iv) to specify that risk adjustment data are subject to the aggregation of dollar amounts reported for the associated encounter to protect commercially sensitive data. (We note that dollar amounts are only reported in encounter data records and not in the other type of risk adjustment data referred to as abbreviated (RAPS) data.)

This rule does not address the release of data that relates to how much CMS pays MA organizations. In the final rule, CMS-4144-F, published in the Federal Register on April 15, 2011 (76 FR 21432), we adopted regulations on that topic.

Second, we are clarifying that risk adjustment data elements that do not fall into either of the two protected categories (beneficiary identifiers and dollar amounts) are subject to release without redaction or aggregation, respectively. Specifically, we are adding paragraph (f)(2)(v) to clarify that risk adjustment data other than data described in paragraphs (f)(2)(iii) and (f)(2)(iv) of the section will be released without the redaction or aggregation described in paragraphs (f)(2)(iii) and (f)(2)(iv), respectively. (We note that we use the term “redaction” to include deletion, encryption, and obscuring or changing the form of something for legal or security purposes.) We discuss in more detail below our analysis of this new language.
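The three-way treatment described in this section (redaction of beneficiary identifiers under paragraph (f)(2)(iii), aggregation of dollar amounts under paragraph (f)(2)(iv), and release of all other elements without redaction or aggregation under new paragraph (f)(2)(v)) can be pictured with the minimal sketch below. The field names and the function are hypothetical illustrations, not an actual CMS release process.

    # Minimal sketch, assuming hypothetical field names, of the treatment of
    # risk adjustment data elements under paragraphs (f)(2)(iii) through
    # (f)(2)(v): redact beneficiary identifiers, withhold encounter-level
    # dollar amounts (released only in aggregated form), pass through the rest.
    BENEFICIARY_ID_FIELDS = {"beneficiary_id", "hic_number"}  # redacted, (f)(2)(iii)
    DOLLAR_FIELDS = {"paid_amount"}                           # aggregated, (f)(2)(iv)

    def prepare_release_record(record: dict) -> dict:
        """Return the fields of one encounter record that would be released
        without redaction or aggregation under (f)(2)(v)."""
        return {field: value for field, value in record.items()
                if field not in BENEFICIARY_ID_FIELDS and field not in DOLLAR_FIELDS}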

Comment: A number of commenters responded to the request for public comments on the release of payment data and possible ways they could be aggregated in order to protect commercially sensitive information. Many commenters thanked CMS for the opportunity to comment on this issue and expressed gratitude for CMS' concern to protect proprietary information on prices negotiated between MA organizations and health care providers.

Response: We appreciate the commenters' support.

Comment: A number of commenters addressed the aggregation of risk adjustment payment data for release under this rule. Several commenters asked that CMS only release payment data that have been aggregated to the national or regional level. Some commenters were concerned that the release of such data, even in an aggregated form, has the potential to provide detailed insight about aspects of MA plan experience under the MA program (for example, utilization and cost experience) that are fundamental to bidding and benefit design decisions and, as a result, release of these data would undermine the integrity of the bidding process and the competitive structure of the MA marketplace, both in terms of plan competition for enrollees and competitive negotiations with providers regarding payment rates. One commenter stated that public transparency of negotiated rates could actually inflate prices by discouraging private negotiations that can result in lower prices for some buyers. One commenter who requested regional aggregation expressed concern that if CMS discloses payment data at a lower level of aggregation, it may be easy for competitors to identify sensitive business information on smaller plans and on plans serving targeted populations or providing specific services, such as Special Needs Plans (SNPs), which would undermine their market position. Another commenter requested that CMS not release payment data at all (at the encounter level or aggregated). Several other commenters asked that aggregated payment data only be released to government agencies and not to external entities. One commenter made the general request that CMS aggregate the data in a manner and at a level that protect the confidentiality of information and that take into account that an organization in some instances may be the principal MA plan in a particular geographic region. Some commenters argued that using encounter data fields such as contract, plan, and provider identifiers as categories by which to aggregate payment data could still lead to exposure of sensitive business strategies (including details about exclusive contracts, pricing, incentive programs, and other information that would disadvantage identifiable plans).

A number of commenters provided suggestions for approaches to aggregation of payment data. One commenter suggested releasing national per member per month averages, which would protect negotiated rates while still allowing comparison with other areas of Medicare spending. Another commenter suggested aggregating risk adjustment payment data at the county level in areas where there are three or more MA plans, but aggregating across counties in areas with two or fewer MA plans. In addition, this commenter suggested that CMS identify when area-specific aggregation approaches are needed, such as where a single MA plan dominates a market and could be identifiable even where there are multiple plans within one or across several counties. Several commenters suggested releasing only data aggregated either at the level of the service categories in the MA bid or at the level of HCCs in the Part C risk adjustment model. Finally, one commenter suggested that CMS make available average pricing per relative value unit (RVU) for given geographies or patient demographic categories, which could provide helpful information regarding payment levels without exposing commercially sensitive negotiated rates.

Response: We appreciate all the responses to our request for comment on ways to aggregate risk adjustment payment data, and we will take these ideas and concerns into consideration when determining the appropriate level of aggregation of the dollar amounts associated with each encounter. We understand the commenters' concerns about the proprietary nature of the payment data and believe that this rule, as finalized, provides the flexibility to protect commercially sensitive data as necessary. It is important to note that, in some instances, the payment data may not require aggregation to protect commercial sensitivity; for example, a request could be made for data that are over 15 years old and no longer relevant to current payment amounts. In such a case, we would assess the unique circumstances of the request, determine whether the data are commercially sensitive, and could decide, after that consideration, to release the data at the encounter level because the need to protect commercially sensitive data would not be implicated.

We note that we do not agree that only payment data aggregated at the national level should ever be disclosed for any approved research project because such a narrow approach would eliminate too many research questions appropriate to the permitted uses of the data under § 422.310(f)(1) and would not account for situations where less than a national level of aggregation is sufficient to protect the commercial interests of the applicable MA organization(s). In addition, we are not convinced that the release of aggregated payment data would have the negative impact on competition and the integrity of the MA bidding process that is described by a number of commenters. CMS expects to aggregate the dollar amounts on encounter data records as necessary to prevent researchers from determining payment amounts to individual providers, and in this way we would protect competition. As we noted in the April 15, 2011 final rule, CMS-4144-F (76 FR 21516), the MA program is not competitive in the way that term is normally understood. Although MA organizations do compete for members, primarily through the plan benefits offered and the cost (member cost sharing and premium) of those benefits, they do not directly compete for the payments that CMS makes. Rather, we approve all sustainable bids that are otherwise qualified without preference for the lowest bidder. The fact that MA-eligible Medicare beneficiaries can, generally, select from a number of plans offered in their county of residence is evidence that competition is robust.

Comment: Several commenters were concerned with the release and use of the payment data included in risk adjustment data, especially to external entities, even if the data are aggregated for release. These commenters argued that the MA encounter data are incomplete due to the nature of MA organizations' financial arrangements with providers and it is inappropriate to use this data to draw conclusions that may inform policy or program management. Specifically, these commenters noted that encounter data cannot provide a complete picture of MA organizations' costs because many MA organizations have payment arrangements with providers that are not fee-for-service based and are instead based on capitation arrangements (for which $0 is reported on an encounter record), bundled pricing, incentive payments, and multi-year contracting. One commenter noted that, for some MA plans, most outpatient services are capitated, except for some referrals and emergency services, and nonbenefit costs are not reflected in encounter data, rendering financial analysis and comparisons for these plans inappropriate and impractical. Another commenter also recommended that information about the limitations of the data should accompany any release of risk adjustment data, to reduce misinterpretations of the data by the end users and to prevent policy development based on inaccurate analyses of risk adjustment data.

Response: We understand the commenters' concerns that risk adjustment data may not provide a complete picture of the costs associated with care of MA plan enrollees due to these alternative payment arrangements. However, we believe that broader release of risk adjustment data to external entities can increase the positive contributions researchers make to the evaluation and function of the MA program and improve the efficiency of the program and the clinical care of its beneficiaries, which is in the interest of public health. Sharing this information with entities outside of CMS will assist CMS in evaluating the MA program and assessing related policies to improve the clinical care of beneficiaries. In addition, broader release of the data also has the potential to assist in addressing public health issues of the general population beyond just Medicare beneficiaries. Regarding the suggestion to provide approved requestors with information on the limitations of encounter data, we believe this is a good suggestion and will consider what disclaimers are appropriate to accomplish this.

Comment: A number of commenters also expressed concern about the proprietary nature of other data elements in the encounter record in addition to payment data, stating that releasing plan-specific and provider-specific data such as diagnoses, service categories, Current Procedural Terminology codes (CPTs), and NPIs has the potential to provide detailed insight about aspects of MA plan experience under the MA program that are fundamental to bidding and benefit design decisions and could undermine the competitive structure of the health care market in many areas. In contrast, one commenter agreed that proprietary payment data should be aggregated to protect proprietary information on negotiated prices, but further emphasized that all other encounter claims data should be widely available to commercial entities—including providers, medical societies, ERISA plans and insurers—for the purposes of improving the value of health care to the consumer (subject to privacy protections under HIPAA and other statutes).

Response: In the proposed rule, we only raised the issue of commercial sensitivity with regard to payment data. As noted in an earlier response, we are clarifying that the term “payment data” means the dollar amounts reported on an associated encounter data record, and that risk adjustment data elements that do not fall into either of the two protected categories of beneficiary identifiers and dollar amounts are subject to release without redaction or aggregation, respectively. We are not persuaded by the argument that data elements aside from beneficiary identifiers and dollar amounts require protection because they are relevant to competition that MA organizations face. We are mirroring the effort within CMS to increase transparency through broadened release of Parts A and B data. We routinely make Medicare FFS claim data available to interested parties for research, and these data include information on procedure codes and diagnosis codes. Furthermore, on April 9, 2014, CMS released detailed service use data on nearly 1 million physicians and health care providers. Thus, as clarified in § 422.310(f)(2)(v), CMS will release risk adjustment data—other than beneficiary identifier data described in § 422.310(f)(2)(iii) and dollar amounts reported for the associated encounter described in § 422.310(f)(2)(iv)—without the redaction or aggregation described in paragraphs (f)(2)(iii) and (f)(2)(iv), respectively.

Comment: One commenter was concerned that encounter data from Medicare-Medicaid Plan (MMP) demonstrations would be used for analyses, such as OIG studies and validation of bids and medical loss ratios, and believed this would be a mistake because these are new plans and there remain many operational questions about submission of this encounter data, including coordinating with States and processing and submitting claims in a manner seamless to both the member and provider.

Response: Our policy on the use and release of risk adjustment data in this final rule will apply the same way to the Medicare risk adjustment data of MMP demonstrations as it does to the risk adjustment data of MA organizations. We appreciate the comment on the important distinctions in the encounter data collection process for MMP data compared to MA data, and we will consider the unique aspects of MMP data in their ultimate application.

Comment: Several commenters asked CMS to provide a definition of commercially sensitive.

Response: There is extensive case law under the Trade Secrets Act (18 U.S.C. 1905) and FOIA Exemption 4 (5 U.S.C. 552(b)(4)) that addresses the concept of commercially sensitive, and we do not believe this is an appropriate venue for summarizing the case law. We also discuss the relationship of this regulation to the Trade Secrets Act and FOIA below.

We add that two commenters appeared to blur the concepts of commercially sensitive and commercial purpose; therefore, we are clarifying here that these are unrelated concepts for the purpose of this rulemaking. Issues around releasing data for a commercial purpose pertain to CMS' data sharing procedures and are discussed in a separate comment and response below.

Comment: Several commenters asserted that even risk adjustment data aggregated up to the level of contract or parent organization (for example, service category and diagnosis data) could be considered to meet the elements required for application of the exemption under FOIA Exemption 4 (5 U.S.C. 552(b)(4)). The commenters stated that risk adjustment data submitted by an MA organization are protected by 45 CFR 5.65(b)(1) because: (1) It is supplied by someone outside the government having a financial interest in the information, namely the MA organization providing the data; (2) it is “confidential commercial or financial information” and proprietary and confidential; and (3) disclosure of each MA organization's service category and/or diagnosis data could result in competitive harm for the MA organization.

Response: In response to comments arguing that the Trade Secrets Act (18 U.S.C. 1905) or FOIA exemptions prohibit release of this information on the basis that the information is the confidential commercial information of the MA organization, we do not believe that the release of the risk adjustment data under our final rule will violate either the Trade Secrets Act or FOIA. With respect to the risk adjustment data described at § 422.310(f)(2)(iv), the regulation explicitly provides for aggregation at the level necessary to protect commercially sensitive data. Under this regulation, when CMS aggregates, as necessary, the dollar amounts associated with the risk adjustment data—whether at a regional, contract, or other level—any detailed (encounter-level) data protected by the Trade Secrets Act or FOIA Exemption 4 will be withheld from disclosure. With respect to the risk adjustment data described at § 422.310(f)(2)(v), we are not persuaded that data elements aside from beneficiary identifiers and dollar amounts require protection and, therefore, are adopting a regulation that directs disclosure of such information (assuming all other conditions in this rule are met to obtain such a release) without redaction or aggregation. Section 1106(a) of the Act (42 U.S.C. 1306(a)) provides authority to enact regulations that would enable the agency to release information filed with the agency. (See Parkridge Hospital, Inc. v. Califano, 625 F.2d 719, 724-25 (6th Cir. 1980).) We have engaged in notice-and-comment rulemaking to promulgate regulations to enable the disclosure of the data described at § 422.310(f)(2)(v). The Trade Secrets Act permits government officials to release otherwise confidential information when authorized by law. A substantive regulation issued following notice-and-comment rulemaking, such as this one, provides the authorization of law required by the Trade Secrets Act. Because the Trade Secrets Act would allow disclosure, Exemption 4 (5 U.S.C. 552(b)(4)), which is coextensive with the Trade Secrets Act, would also not preclude disclosure with respect to the information that would be released under this final rule. We recognize that this conclusion would not apply to the dollar amounts data described at § 422.310(f)(2)(iv).

Comment: Several commenters stated that releasing payment data may trigger antitrust concerns within both the health plan and provider communities, and cited the August 1996 “Statements of Antitrust Enforcement Policy” (http://www.ftc.gov/sites/default/files/documents/reports/revised-federal-trade-commission-justice-department-policy-statements-health-care-antritrust/hlth3s.pdf), where the Federal Trade Commission and the Department of Justice laid out several conditions for an antitrust safety zone (pages 44 and 45) related to the collective release of negotiated provider payment rates, noting that there would be instances where negotiated rates possibly could be discerned, such as areas with a dominant private payer.

Response: We are not clear what the “collective release of negotiated provider payment rates” has to do with this rulemaking. We understand the term “collective” in this context to mean more than one actor releasing its own specific rates. When CMS approves a release of aggregated payment data under this rule, that release is performed by one actor and not a collective of some sort. Further, our proposed policy of aggregating payment data as necessary will protect the proprietary nature of the payment data. In cases where there is a dominant private payer in a certain geographic area, we will take this into account when determining the appropriate level of aggregation. We understand the Federal Trade Commission and Department of Justice guidance to address when health care providers act in concert to share or release their payment terms and what circumstances those enforcement agencies believe would ameliorate any collusive intent in such actions. However, this rulemaking pertains to a single actor (CMS), not to a collective action; specifically, CMS may release risk adjustment data for approved research projects, and these are data that were submitted to CMS by MA organizations on the basis of statutory and regulatory requirements regarding risk adjustment data collection in the MA program. The underlying negotiation of the payment terms, such as whether the health care providers collectively negotiated them and the respective negotiating positions of the MA organizations and the other parties, is not part of the information submitted to CMS or disclosed by CMS under § 422.310.

Comment: A few commenters stated that payment data should not be collected by CMS as part of encounter data and should not be used by CMS or released outside of CMS because such data are not relevant to risk adjustment.

Response: We believe that payment data are useful for all of the purposes set forth in this regulation, including, but not limited to, the purpose of risk-adjusting payments to MA organizations. Therefore, we are finalizing in this rule the release of aggregated payment data as necessary to protect commercially sensitive data, subject to all the conditions established in this rule.

Comment: A number of commenters raised issues about the timing for release of risk adjustment data under the rule. Some commenters supported release of risk adjustment data to improve transparency; one commenter stated that there is an urgent need for more reliable consumer comparison shopping tools due to extreme provider price variations in local healthcare markets, and Medicare data could be valuable for this purpose. A few commenters requested that CMS delay release of encounter data to any governmental agency outside of CMS and/or delay release to external entities.

A number of other commenters presented requests for two different types of delay in release of encounter data: (1) A routine delay for all data releases; and (2) a delay applicable only to the first few years of encounter data collected by CMS. First, commenters requested that CMS implement a routine lag in release of encounter data. Some commenters argued that, before release of the data for any given year, CMS should ensure that the data are complete and accurate, for example by validating and identifying any limitations in the data. Other commenters suggested timeframes of between 4 and 6 years for such routine lags, arguing that CMS should allow an established number of years to pass before release because utilization, pricing, and similar data elements remain sensitive for a number of years (and could be used for trending competitors' patterns), and many MA organizations have multi-year contracts with their providers (5 or more years), making data even a few years old still commercially sensitive in a marketplace.

Second, a few commenters requested that CMS never release encounter data that were submitted in the initial years of required submission (in particular, that data for 2012 dates of service—the first year of submission—never be released). Alternatively, other commenters suggested that CMS lag release of data from the initial years of submission because: (1) Implementation of encounter data collection via CMS' Encounter Data System (EDS) has required frequent and ongoing systems development and modifications on the part of the agency and MA organizations since the initiative began in January 2012, which has resulted in challenges in submission and acceptance of encounter data; and (2) the ICD-10 code set transition likely will result in some instances in which encounter data are incomplete or inaccurate for the 2015 data year, as providers adjust to the coding differences.

Response: Regarding commenters' requests that CMS implement both routine multi-year lags in release of encounter data and targeted delays in the release of encounter data from the initial years of submission, we believe that such delays in release to any agencies and entities described in this rule would defeat the goals of improving transparency in the Medicare program and allowing researchers to use data in a timely manner to improve the administration of and advance policy research on the Medicare program. Also, we refer readers to our response elsewhere in this preamble regarding the impact of such releases on MA bidding.

However, CMS recognizes that there are circumstances unique to the process for collecting risk adjustment data that should be addressed in the timing of releases of such data. CMS allows 13 months after the end of a risk adjustment data collection year for MA organizations to update the risk adjustment data submitted under § 422.310; this period provides MA organizations an opportunity to identify and correct errors in data they have submitted for that data collection year, and ensures that the risk adjustment data is complete and accurate. We do not plan to regularly release risk adjustment data for a data collection year prior to the completion of this period because of the possibility that the data may contain errors or be incomplete for the applicable year. Therefore, to clarify our processes for the purposes of this rule, we have added paragraph (f)(3) to § 422.310, which states that risk adjustment data will not be available for release under paragraph (f) unless:

  • The risk adjustment reconciliation for the applicable payment year has been completed;
  • CMS determines that the data release is necessary under paragraph (f)(1)(vi) for emergency preparedness purposes before reconciliation; or
  • CMS determines that extraordinary circumstances exist to release the data before reconciliation.

An example of an extraordinary circumstance would be a request by the Department of Justice for data for a qui tam case under the False Claims Act. We believe these restrictions on the timing of data releases will address some of the commenters' concerns.
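Read together, the conditions in new paragraph (f)(3) amount to a simple eligibility test, sketched below for illustration only; the function name and boolean inputs are hypothetical and do not represent any actual CMS system or process.

    # Illustrative sketch of the timing conditions added at § 422.310(f)(3).
    # The function name and inputs are hypothetical.
    def release_permitted(reconciliation_complete: bool,
                          emergency_preparedness_use: bool,
                          extraordinary_circumstances: bool) -> bool:
        """Risk adjustment data for a collection year may be released only after
        risk adjustment reconciliation is complete, unless CMS determines the
        release is needed for emergency preparedness under (f)(1)(vi) or that
        extraordinary circumstances (for example, a Department of Justice
        request for a qui tam case) exist."""
        return (reconciliation_complete
                or emergency_preparedness_use
                or extraordinary_circumstances)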

Comment: A few commenters suggested that CMS not release encounter data until CMS uses it for risk-adjusted payment purposes in place of RAPS data. One commenter stated that no data should be released until MA organizations are provided with the MAO-004 encounter data reports because these reports will allow the MA organizations to validate that encounter data are complete for risk adjustment purposes.

Response: First, we do not agree with the commenters that CMS' transition from the use of RAPS data to encounter data for risk-adjusted payment should be a prerequisite for releasing encounter data for the purposes under this final rule. The data are valuable for other purposes besides calibration of the risk adjustment model, as listed in paragraph (f)(1) of § 422.310, and the release of the data is important for transparency. Second, the MAO-004 report, also known as a filtering report, will contain detailed information on which diagnoses are eligible for risk adjustment purposes and will be part of the process that CMS will undertake for risk score calculation. The intent of this report is to assist MA organizations and other encounter data submitters to understand risk score calculation; it is not intended to support validation by MA organizations of the encounters that they have submitted to CMS. Therefore, we do not believe that these filtering reports should be a prerequisite to releasing encounter data associated with any payment year.

Comment: A number of commenters responded to our request for public comment on releasing risk adjustment data for commercial purposes. Many commenters asked CMS not to release data to external entities for commercial purposes. Commenters also noted that CMS does not currently release Part A, Part B, or Part D data for commercial purposes, and argued that CMS should have a consistent policy for release of data for commercial purposes across all Medicare programs, including the Part C Medicare Advantage program; these commenters cited CMS' discussion about such a consistent policy in a final rule, CMS-4159-F, published on May 23, 2014 (79 FR 29844).

In contrast, one commenter supported the use of risk adjustment data by commercial entities to conduct research when the research is focused on legislative, regulatory, or policy development aimed at improving the Medicare program, including projects focused on patterns of care of MA enrollees. This commenter suggested that if CMS moves to define commercial purpose, it should do so narrowly so that, for example, firms requesting data releases under the rule for research on regulatory or policy issues for their clients would not have this type of work construed as commercial. Another commenter stated that encounter data should be widely available to commercial entities, including providers, medical societies, ERISA plans, and insurers, for purposes of improving the value of health care to the consumer. This commenter encouraged CMS to put providers and insurers on an equal footing with each other, with respect to sharing of public data on safety, quality, volume, and intensity of care. Finally, a number of commenters requested that CMS define the term “commercial purposes.”

Response: We appreciate all of the responses to our request for public comments on releasing risk adjustment data for commercial purposes. We recognize that some commenters would like risk adjustment data to be available for commercial purposes, while others are concerned about external entities having risk adjustment data releases approved for projects that have commercial purposes and/or for researchers employed by commercial entities. We consider the issue of whether or not a request for data has a commercial purpose to be an issue that would be addressed under CMS' data sharing policies, which are referenced in § 422.310(f)(2)(ii) of this rule. Regarding commercial purpose, we refer commenters to page 30674 of the preamble of the final rule, CMS-4119-F, published on May 28, 2008 (73 FR 30664), where, for example, there is discussion of research whose primary purpose is to contribute to general knowledge in the public domain.

We agree with commenters that it is appropriate to have consistent policies for the release of data across the original Medicare (Parts A and B) program, the Part D prescription drug program, and the Part C Medicare Advantage program. Although we are not changing CMS' existing policy against releasing data for commercial purposes at this time, we note that, in the event the policy regarding the release of Parts A, B, and D data for commercial purposes were to change, we also would revise our Part C risk adjustment data sharing policies to be consistent with that change. Therefore, if a request for the data under the current policy is for one of the purposes outlined in paragraph (f)(1)(i) through (ix) and also for a commercial purpose, we would consider the commercial purpose as a barrier to the release in the same way here as in the other Medicare programs.

Comment: A few commenters expressed concern about how CMS will evaluate requests for risk adjustment data release. One commenter asked CMS to keep in mind that broad disclosures of data could lead to creation of non-Federal databases, which could negatively affect the privacy and security of beneficiary-specific data. Another commenter cautioned that, in determining what is a minimum dataset necessary for a particular data request, CMS must not approve release of a subset of data or variables that omits critical data, resulting in an analysis with false assumptions about MA encounters. In addition, other commenters were particularly concerned about requests by external entities. One commenter stated that, in evaluating requests from commercial entities, CMS should consider relationships between the corporate and research arms of the commercial entity, and CMS should not assume that data released for research purposes will not be made available to related commercial entities unless specific prohibitions are put in place, or that published research findings will not be used for commercial purposes. Another commenter also expressed concern that external entities may use data released to them for a CMS-approved research project for purposes that go beyond the initial intent of the request.

Response: We will release only the minimum data necessary for a particular study design that CMS has determined meets a use (analytical purpose) finalized in this final rule and if the research project also complies with all other conditions established in this final rule. We believe that CMS' longstanding data sharing procedures (a condition for data release referenced at § 422.310(f)(2)(ii)) will allow CMS to determine the appropriateness of a requested data set and will limit inappropriate use of encounter data. CMS considers all data requests to ensure that the use of the data will not exploit or negatively impact Medicare beneficiaries.

In order for a researcher to gain access to CMS data, the researcher must complete an application process, including submission of a research protocol. The researcher must receive approval of the protocol from CMS. In addition, all requestors are required to sign a Data Use Agreement with the agency that limits the use of the data to only the approved purposes. The Data Use Agreements that CMS uses have and will continue to have enforcement mechanisms. For example, one of CMS' Data Use Agreement forms states: “The User acknowledges that penalties under § 1106(a) of the Social Security Act [42 U.S.C. 1306(a)], including possible imprisonment, may apply with respect to any disclosure of information in the files(s) that is inconsistent with the terms of the Agreement. The User further acknowledges that criminal penalties under the Privacy Act [5 U.S.C. 552a(i)(3)] apply if it is determined that the User, or any individual employed or affiliated therewith, knowingly and willfully obtained the file(s) under false pretenses. The User also acknowledges that criminal penalties may be imposed under 18 U.S.C. 641.”

We believe these restrictions are necessary in order to ensure that data are only requested in compliance with the requirements of the regulations and CMS data sharing procedures, and that data shared by CMS are appropriately protected and are not reused or redisclosed without the necessary approval. Under our data sharing policies, we generally require the requester not to disclose the data to third parties without specific written authorization from us. CMS expects that researchers who receive a CMS-approved release of risk adjustment data will abide by the law, policies, and procedures surrounding use of that data, particularly where the regulation conditions release of the data on CMS data procedures being followed.

Comment: A few commenters requested that, when CMS is making a determination about whether to release risk adjustment data to a requestor, CMS reach out to MA organizations to consult on whether to approve the request. One commenter stated that plans would appreciate the opportunity to advise the agency of any specific concerns they have with respect to release of data for certain purposes and to certain entities, while another commenter asked CMS to allow plans to deny certain requests for data. Finally, a few commenters stated that whenever a stakeholder's data is part of an approved release, that stakeholder should have access to the entire data release for purposes of verification, equity, and accuracy.

Response: Under this rulemaking, we will use CMS' existing data sharing procedures (in accordance with § 422.310(f)(2)(ii)) for responding to requests for risk adjustment data. It is not part of CMS' data sharing procedures to contact a submitter of data (for example, a FFS provider, a supplier, a Part D sponsor, or an MA organization) whenever a researcher requests or receives approval for access to a data set for a study that includes some of that particular submitter's data (unless the request is made under FOIA). Nor is it part of CMS' data sharing procedures to allow an MA organization or another entity to have approval rights with regard to the release of data. In addition, this rule itself sets the standards under which data will be released. Therefore, CMS will not be notifying MA organizations or their contracted providers when data requests that may include their risk adjustment data are received or approved.

Finally, CMS could not release a risk adjustment data set to a stakeholder that had not received approval through CMS' data sharing procedures simply on the grounds that the stakeholder's risk adjustment data submissions to CMS comprise one part of a data file released to a researcher for an approved study.

One of the best ways MA organizations can address their concerns about the accuracy of risk adjustment data available for release is to continue working with CMS to improve the quality of risk adjustment data they submit to CMS.

Comment: One commenter opposed the release of beneficiary identifying information to entities outside of CMS, including other HHS agencies, other Federal executive branch agencies, Congressional support agencies, and States. Another commenter encouraged CMS to establish and impose appropriate penalties for any breach of privacy related to beneficiary identifiable information by external entities.

Response: We understand the need to protect beneficiary identifying information. As finalized in § 422.310(f)(2)(iii) of the regulation, CMS release of risk adjustment data is subject to the protection of beneficiary identifier elements and beneficiary confidentiality, including—

  • A prohibition against public disclosure of beneficiary identifying information;
  • Release of beneficiary identifying information to other HHS agencies, other Federal executive branch agencies, and States only when such information is needed; and
  • Release of beneficiary identifying information to external entities only to the extent needed to link datasets.

Any release of beneficiary-identifiable data must follow the policies in CMS' data sharing procedures. We intend to protect the beneficiary data through, for example, encryption, or removal of the confidential fields when risk adjustment data is released. As we discussed above and in the final rule, CMS-4159-F, published on May 23, 2014 (79 FR 29844), all users accessing beneficiary identifiable data are required to sign CMS' Data Use Agreement, which addresses privacy and security for the data CMS discloses. The Data Use Agreement also contains provisions regarding access to and storage of CMS data to ensure that beneficiary identifiable information is stored in a secure system and handled according to CMS' security policies. CMS has an established process to evaluate requests for data to ensure that there are appropriate safeguards in place to protect beneficiary privacy. We believe this process contains the necessary checks to ensure that the risks of the disclosure are minimal.

Comment: One commenter requested that CMS release risk adjustment data by creating an MA encounter data Standard Analytic File (SAF) in limited data set (LDS) form to extend research that can currently be done by users of LDS SAFs across sites using fee-for-service Medicare claims data. A few other commenters argued that these data should be routinely available through public use files, not just through the project-specific process set forth in this rule.

Response: We appreciate the suggestions and will take them into consideration for future additional guidance. With regard to the issue of public use files, we believe that the nature of the data, as discussed above, makes it appropriate to include the protections imposed by this rule, including the limits on the purpose of the disclosure, release of only the minimum necessary data, the incorporation of CMS data sharing policies and procedures, and additional protections for certain data elements.

After consideration of the public comments we received, we are finalizing, as proposed, the policies regarding CMS release of data in § 422.310(f)(2), with the exception of five changes to the regulation text. First, we are revising the purpose at paragraph (f)(1)(ix) from “purposes permitted by other laws” to “purposes authorized by other applicable laws.” Second, we have deleted the term “Congressional support agencies” from paragraph (f)(2)(iii)(B) in order to be consistent with the introductory language at paragraph (f)(2) of this regulation. Third, to clarify that data aggregation will be of the dollar amounts reported on an associated encounter data record, we are finalizing paragraph (f)(2)(iv) to state that release is subject to the aggregation of dollar amounts reported for the associated encounter to protect commercially sensitive data. Fourth, in order to explicitly address the risk adjustment data elements that do not fall into either of the two protected categories (beneficiary identifiers and dollar amounts) and to clarify that such data are subject to release without redaction or aggregation, we are finalizing an additional paragraph (f)(2)(v) to state that risk adjustment data other than data described in paragraphs (f)(2)(iii) and (f)(2)(iv) of the section will be released without the redaction or aggregation described in paragraphs (f)(2)(iii) and (f)(2)(iv) of the section, respectively. Finally, we are adding paragraph (f)(3) to § 422.310 to clarify when risk adjustment data will be available for release, stating that risk adjustment data will not be available for release under paragraph (f) unless—

  • The risk adjustment reconciliation for the applicable payment year has been completed;
  • CMS determines that the data release is necessary under paragraph (f)(1)(vi) of the section for emergency preparedness purposes before reconciliation; or
  • CMS determines that extraordinary circumstances exist to release the data before reconciliation.

3. Technical Change

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 27978), we proposed to amend § 422.300, which identifies the basis and scope of the regulations for payments to MA organizations, to add a reference to section 1106 of the Social Security Act, which governs the release of information gathered in the course of administering our programs under the Act.

We did not receive any public comments on this technical change, and we are finalizing, without modification, our proposed amendment to § 422.300 to add a reference to section 1106 of the Social Security Act.

XI. Changes to Enforcement Provisions for Organ Transplant Centers

A. Background

In February 2004, the Office of the Inspector General (OIG) published a report entitled “Medicare-Approved Heart Transplant Centers” (OEI-01-02-00520), in which the OIG outlined three recommendations for CMS' oversight of heart transplant centers: (1) that CMS expedite the development of continuing criteria for volume and survival-rate performance and for periodic recertification; (2) that CMS develop guidelines and procedures for taking actions against centers that do not meet Medicare criteria for volume and survival-rate performance requirements; and (3) that CMS take immediate steps to improve its ability to maintain accurate and timely data regarding the performance of transplant centers.

As part of CMS' efforts to strengthen oversight of organ transplant centers, we published the final rule “Medicare Program: Hospital Conditions of Participation, Requirements for Approval and Re-approval of Transplant Centers to Perform Organ Transplants” in the Federal Register on March 30, 2007 (72 FR 15198), which established conditions of participation (CoPs) for organ transplant centers and applied the survey and certification enforcement process (that is used for all other providers and suppliers of Medicare services) to Medicare-approved transplant centers. In the preamble of that final rule, we discussed our efforts to improve organ donation and transplantation services and our goals to: (1) Protect patients who are awaiting organs for transplantation; (2) establish key quality and procedural standards; and (3) improve outcomes for patients (such as patient survival) and reduce Medicare expenses by decreasing the likelihood that a transplant would fail.

In the March 30, 2007 final rule, we codified the CoPs for transplant centers at 42 CFR Part 482, Subpart E (§§ 482.68 through 482.104) and the special procedures for approval and re-approval of organ transplant centers at 42 CFR 488.61. The CoPs set forth explicit expectations for outcomes, patient safety, informed choice, and quality of transplantation services. In particular, §§ 482.80 and 482.82 specify that a transplant center's outcomes are not acceptable if, among other factors, the number of observed patient deaths or graft failures 1 year after receipt of a transplant exceeds the risk-adjusted expected number by 1.5 times, based on the most recent program-specific report from the Scientific Registry of Transplant Recipients (SRTR).
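For illustration only, the outcome standard in §§ 482.80 and 482.82 can be expressed as an observed-to-expected comparison; the figures in the short Python sketch below are hypothetical and are not drawn from any SRTR program-specific report.

    # Hypothetical illustration of the one-year outcome threshold: outcomes are
    # not acceptable if observed deaths (or graft failures) exceed 1.5 times the
    # risk-adjusted expected number from the SRTR program-specific report.
    observed_deaths = 10
    expected_deaths = 6.2  # risk-adjusted expectation (hypothetical)

    ratio = observed_deaths / expected_deaths
    outcomes_acceptable = observed_deaths <= 1.5 * expected_deaths
    print(f"O/E ratio = {ratio:.2f}; outcomes acceptable = {outcomes_acceptable}")
    # Prints: O/E ratio = 1.61; outcomes acceptable = False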

Failure to meet the transplant center requirements will lead CMS to deny approval or re-approval of a center's Medicare participation under § 488.61. However, §§ 488.61(a)(4) and (c)(4) authorize CMS to consider mitigating factors when determining approval and re-approval, respectively, for a transplant center that has not met the data submission, clinical experience, or outcome requirements, or other CoPs, if the center submits a formal, written request for such a review. The existing regulations do not limit the factors that CMS may consider, but enumerate, at a minimum, the following factors to be considered: (1) The extent to which outcome measures are met or exceeded; (2) the availability of Medicare-approved transplant centers in the area; and (3) extenuating circumstances that may have a temporary effect on a transplant center meeting the requirements under the CoPs, such as a natural disaster. CMS approval or re-approval based on mitigating factors permits a transplant center to operate as a Medicare-approved transplant center under certain circumstances despite a finding of noncompliance. Under existing regulations at §§ 488.61(b)(4)(iv) and (c)(4)(iv), CMS will not approve a center with condition-level deficiencies but may re-approve a center with standard-level deficiencies.

B. Basis for Proposed and Final Policies

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 27977), we proposed to strengthen, clarify, and provide additional transparency for the survey, certification, and enforcement procedures under § 488.61 for transplant centers that are requesting initial approval or re-approval for participation in the Medicare program when the centers have not met one or more of the CoPs but wish to have certain mitigating factors taken into consideration.

1. Expansion of Mitigating Factors Based on CMS' Experience

The existing organ transplant enforcement regulation at § 488.61 does not provide detailed information on the factors generally needed for CMS approval or re-approval based on a mitigating factors request that a transplant center may make in order to participate, or continue to participate, in Medicare. However, since the adoption of the organ transplant CoPs and corresponding enforcement regulations, we have expanded our knowledge regarding: (a) The factors and processes that promote improvement in transplant center outcomes; and (b) other mitigating factors that merit explicit recognition under CMS regulations.

Most of the requests that we have approved based on mitigating factors have been for transplant centers that were out of compliance with CMS outcomes requirements, but were then able to (a) effect substantial program improvements and (b) based on meaningful post-transplant survival data, demonstrate recent and much improved patient and graft survival subsequent to those program reforms. These performance improvements occurred after the program was cited for substandard performance by CMS and was at risk of losing Medicare participation, usually while the program was operating during the mitigating factors review process or under a binding Systems Improvement Agreement (SIA) with CMS. Under an SIA, CMS agrees to extend the effective date of a prospectively scheduled termination from Medicare participation (that is, denial of re-approval) and holds in temporary abeyance a final review of the transplant center's mitigating factors request, if the transplant center agrees to engage in a structured regimen of quality improvement to improve performance during a specified period of time. At the end of the SIA period (typically 12 months), we review the transplant center's performance and make a final decision as to whether: (a) The transplant center's patient and graft survival is within the acceptable limits set forth in the regulations; or (b) the transplant center qualifies for approval or re-approval based on mitigating factors.

As of August 2013, CMS had rendered a final determination for 129 requests for approval to operate as a Medicare-approved transplant center based on mitigating factors. Of those determinations, 48 of the requests (37.2 percent) were approved based on information provided by the transplant center on its mitigating factors alone (that is, without entering into an SIA) because the transplant program had implemented substantial program improvements during the extended CMS review period, and CMS concluded that the most recent patient and graft survival data (taking into consideration the lag time in data inherent in the SRTR reports) demonstrated current compliance with outcome requirements. Another 33 requests (25.6 percent) were eventually approved on the basis of each transplant program's successful SIA completion and much improved outcome data for the affected program. A total of 24 requests (18.6 percent) involved transplant programs that were approved (and the transplant centers were permitted to continue Medicare participation) because CMS determined that the transplant centers met the outcome requirements during the time period it took for CMS to review the mitigating factors request, based on a new SRTR report that became available during the 210-day mitigating factors review period. Requests from another 2 programs (1.6 percent) were approved where the programs had not entered into an SIA but had made extensive use of innovative practices involving key factors that were not included in the SRTR risk-adjustment methodology. An additional 2 requests (1.6 percent) were approved because natural disasters temporarily impacted the transplant centers. Finally, 20 requests (15.5 percent) were denied approval based on mitigating factors because the programs failed to meet the outcome or clinical experience requirements and were not able to demonstrate improvements and recent outcomes or experience that would warrant approval based on mitigating factors. These 20 programs voluntarily withdrew their participation from the Medicare program.
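As a purely arithmetical illustration of the disposition counts above (the category labels are shorthand introduced here for readability, not CMS terminology), the percentages can be reproduced as follows.

    # Tally of the 129 mitigating factors determinations described above; the counts are
    # taken from the preamble text and the labels are informal shorthand.
    determinations = {
        "approved on mitigating factors alone (no SIA)": 48,
        "approved after successful SIA completion": 33,
        "approved based on a newer SRTR report during the 210-day review": 24,
        "approved based on innovative practices (no SIA)": 2,
        "approved due to natural disasters": 2,
        "denied (programs withdrew from Medicare)": 20,
    }
    total = sum(determinations.values())  # 129
    for label, count in determinations.items():
        print(f"{label}: {count} ({100 * count / total:.1f} percent)")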

Comment: Commenters supported CMS' efforts to add clarity and increase transparency, and most commenters conveyed specific suggestions for further clarity or revision.

Response: We appreciate the thoughtful nature of all comments we received and the specificity of the suggestions that were made. We address those specific suggestions below in the context of each relevant section of our proposed language.

2. Coordination With Efforts of the Organ Procurement and Transplantation Network (OPTN) and Health Resources and Services Administration

When we adopted the outcome standards for transplant programs in 2007, we sought to harmonize CMS' outcome standards with standards of the Organ Procurement and Transplantation Network (OPTN) so that transplant centers would have a single, consistent set of outcome expectations on which to focus. We also sought to organize CMS activities in a manner that would reinforce and continue the OPTN as the first line of external review and quality improvement for transplant centers. The OPTN is the unified transplant network established under the National Organ Transplant Act (NOTA) of 1984. The NOTA called for the network to be operated by a private, nonprofit organization under Federal contract. The OPTN is a public-private partnership that links all of the professionals involved in the donation and transplantation system. The primary goals of the OPTN are to: (a) Increase the effectiveness and efficiency of organ-sharing and equity in the national system of organ allocation; and (b) increase the supply of donated organs available for transplantation. For more details about the OPTN, we refer readers to the Web site at: http://optn.transplant.hrsa.gov/optn/profile.asp.

The OPTN and the Health Resources and Services Administration (HRSA) have been considering adoption of an alternative methodology for calculating expected transplant outcomes, known as the “Bayesian” methodology, and for setting a threshold that would “flag” a transplant center for OPTN review of performance. At its June 2014 meeting, the OPTN determined to adopt the new methodology and begin using the new methodology in CY 2015. HRSA and the SRTR will continue to provide the information needed by CMS, and we will work together to track how transplant programs are rated under the “Bayesian” methodology and the CMS outcomes standards. At the present time, CMS has insufficient experience with the new “Bayesian” methodology, and insufficient data, to determine an appropriate threshold for a Medicare outcomes deficiency under a “Bayesian” methodology. Therefore, in the FY 2015 IPPS/LTCH PPS proposed rule, we did not propose any changes in our CoPs regarding this new methodology. However, we wish to continue to coordinate with the OPTN's efforts. Therefore, we proposed that if a program has been cited for an outcomes deficiency by CMS, but has not been flagged for review by the OPTN, CMS would take these facts into consideration if the transplant program has requested approval based on mitigating factors. For a perspective on the “Bayesian” methodology, we refer readers to the Web site at: http://www.srtr.org/faqs/16.aspx.

Comment: One commenter stated that an important lesson learned over the past several years is the need to further coordinate and reconcile differences between the requirements and processes used by CMS and the OPTN in regulating the quality of services provided by transplant centers throughout the country.

Response: We concur with the value of coordinating requirements and processes to the extent permitted by the different roles played by the OPTN and CMS. Our desire to coordinate with HRSA and OPTN gave rise to many of the proposals discussed here. Further, staff from the United Network for Organ Sharing (UNOS, which is under contract with HRSA) and CMS developed a cross-walk of the OPTN and CMS requirements, updated the cross-walk in July 2014, and published it online at: http://www.optn.transplant.hrsa.gov/content/policiesAndBylaws/evaluation_plan.asp. While CMS and OPTN have many mutually-reinforcing requirements, the two organizations largely cover different aspects of the transplant universe. The OPTN, for example, excels at the data reporting and management that CMS does not address, but CMS reinforces OPTN through a CMS requirement that transplant centers timely and adequately report data in accordance with OPTN requirements. Similarly, there are other areas not covered by OPTN that CMS addresses (such as the CMS requirement that every transplant program have an effectively functioning, internal quality assessment and performance improvement (QAPI) system). CMS, HRSA, and the OPTN observed both CMS' and OPTN's onsite surveys in the past year, with the intent to identify areas to reduce the burden on transplant programs, as well as improve the efficiency of the survey process. Although the surveys are conducted very differently based on the distinct roles of the two agencies, the OPTN has now combined the living donor survey with its regular survey to eliminate the need for an additional survey. HRSA and CMS also maintain monthly meetings and, as the need arises, more frequent meetings of workgroups. Another recent development was the CMS final rule (“Medicare and Medicaid Programs; Regulatory Provisions to Promote Program Efficiency, Transparency, and Burden Reduction; Part II; Final Rule”) published in the Federal Register on May 12, 2014 (79 FR 27106), which is designed to reduce regulatory burden and increase efficiency. Among other features, the rule permits CMS to vary the frequency of onsite transplant center surveys compared to the earlier standard CMS 3-year cycle that applied to all transplant programs. CMS maintains continuous review of transplant outcomes, responds to complaints at any time, and is notified by transplant centers when there is a major change in a center. With these continuous activities, and the added flexibility of the May 12, 2014 final rule, we expect to extend the average onsite survey frequency to a range of approximately 3 to 5 years. We expect some centers will be surveyed more frequently than the average and other centers less frequently, depending on CMS' assessment of the need for a survey of a particular program. We expect that this change will help reduce the extent to which any particular transplant program will have two different surveys (OPTN and CMS) that occur within a proximate time of each other. We look forward to continuing to work with HRSA, UNOS, OPTN, and the transplant community on these and other coordination issues.

C. Provisions of the Proposed and Final Regulations

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 27978), we proposed to revise the regulations at § 488.61 to include specific additional provisions describing and expanding the mitigating factors that CMS may consider when evaluating requests and explaining the conditions under which each factor would apply.

Comment: One commenter recommended that CMS modify § 482.82 of the regulations to state that a transplant center that does not meet the data submission, clinical experience, or outcomes requirements would be considered to be out of compliance at the Condition level only if CMS determines that a mitigating factors request would not be approved. In other words, a mitigating factors request would be processed before citing a center for a deficiency at a Condition level.

Response: We are not adopting this recommendation for a number of important reasons. First, the mitigating factors provision is intended to enable CMS to recognize special situations so that we may calibrate enforcement actions appropriately. The provision is not intended to remove the possibility of enforcement or the likelihood of enforcement if appropriate corrections are not made.

Second, we believe the mitigating factors and SIA processes have been quite successful in promoting improved patient outcomes as a direct result of the full hospital alignment in support of each involved transplant center. Based on the past 7 years of experience with programs that have had substandard outcomes, we believe that strong whole-institution support has been generated directly in response to the deficiency citation, accompanied by the clear potential for enforcement action against a program that has had substandard performance.

We note that most transplant programs maintain excellent outcomes continuously and are not cited for noncompliance with CMS outcome expectations. Another group of transplant programs temporarily exceed the outcomes thresholds based on a single SRTR report, but soon manifest outcomes within the acceptable tolerance limits in the next report. Such programs may be cited by CMS at a Standard level, rather than the Condition level, because the results are evident in only a single SRTR report. A Standard level citation requires corrective action but does not require mitigating factors approval because a Standard level citation by itself does not put the program on a schedule for termination of Medicare participation. A smaller third group of transplant programs experience long-term problems and may be cited by CMS at the Condition level, but engage with the OPTN and soon recover. These programs may apply for approval based on mitigating factors, but are more likely than other applicants to be approved without greater involvement with CMS. An even smaller group of programs are cited by CMS at the Condition level and are eventually approved based on mitigating factors, but only after an extended period of time and a more involved regimen of quality improvement through an SIA. CMS' policy has been to cite at the Condition level only if the tolerance limits are exceeded in the most recent SRTR report and in at least one other SRTR report within the past five SRTR reports. Although the number and percentages vary depending on the particular SRTR reports involved, we generally find that approximately 9 to 11 percent of the programs do not meet the CMS outcome thresholds in any one particular SRTR report. However, only approximately 3 to 4 percent of the programs tend to fail to meet the outcomes requirement in the most recent SRTR report and in at least one other report of the most recent 5 reports. This number is further reduced during the 210-day period that we permit for consideration of a mitigating factors request by CMS. The 210-day period allows sufficient time for a sixth SRTR report to appear and perhaps show evidence of outcomes that would remove the deficiency and remove the need for mitigating factors approval.
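A minimal sketch of the Condition-level citation practice described above is shown below, for illustration only; the function and its boolean inputs are hypothetical constructs introduced here and are not part of the regulation.

    # Illustrative sketch of the citation practice described above: cite at the Condition
    # level only if the most recent SRTR report exceeds the tolerance limits and at least
    # one other report among the most recent five also did. Inputs are hypothetical.
    def warrants_condition_level_citation(reports_exceed_limits: list) -> bool:
        """reports_exceed_limits lists the five most recent SRTR reports, most recent first;
        each entry is True if that report exceeded the outcome tolerance limits."""
        most_recent = reports_exceed_limits[0]
        any_prior = any(reports_exceed_limits[1:5])
        return most_recent and any_prior

    # Hypothetical examples.
    print(warrants_condition_level_citation([True, False, True, False, False]))   # True
    print(warrants_condition_level_citation([True, False, False, False, False]))  # False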

As a result of the way we implemented the citation and mitigating factors processes, those programs cited by CMS at the Condition level tend to be programs that generally have more extensive issues for which full hospital alignment and support are most needed, or programs that, for a variety of reasons, have been immune to prior efforts to improve outcomes. We believe we have structured CMS enforcement actions in a manner that permits considerable time for a transplant program to take action on its own, and allows many programs to engage successfully with the OPTN for timely resolution. However, for the residual, smaller number of programs that have not been so successful, our experience indicates that it is questionable whether the hospital alignment and other actions needed to achieve substantial and sustainable improvement would have occurred without the clear prospect that Medicare participation might soon end. The fact that many of the transplant programs cited at the Condition level had already been engaged with the OPTN in a peer review process without timely improvement in outcomes lends credence to the belief that the clear, and potentially imminent, prospect of losing Medicare participation has been the stimulus that eventually brought the various departments of the hospital together to finally achieve the results that had eluded the many dedicated individuals who had previously labored to achieve better outcomes.

Third, only two mitigating factors approvals were based on natural disasters, two involved extensive use of innovative practices, and a small number involved clinical experience. The remainder, constituting the vast preponderance of mitigating factors requests we have received, involved transplant programs whose patient or graft survival outcomes exceeded the tolerance limits in the CMS regulation for programmatic reasons. When such cases have been approved, the approval has been based on recovery and improvement in outcomes during the extended time period (210 days) that CMS permits for mitigating factors consideration or the even longer time period that CMS permits under an SIA. We believe that eventual approval of a mitigating factors request should not be construed to imply that the substandard outcomes never occurred, or to obscure the history or facts that led to the recovery efforts. We regard such history as important matters of public record.

Approval of transplant programs based on mitigating factors due to confirmed innovative practice may be an area for which the commenter's suggestion has merit, and we will give further consideration to this area for potential future action. Unlike mitigating factors approval that is made pursuant to recovery from a period of substandard outcomes or even a natural disaster, approval of mitigating factors due to innovative practice may indicate the absence of a deficiency in the first place.

We will be pleased to continue a dialogue with the transplant community regarding these issues and to consider other approaches to ensure that a strenuous improvement effort, such as that which is required in an SIA, is not misinterpreted.

1. Expansion of Mitigating Factors List, Content, and Timeframe

In the FY 2015 IPPS/LTCH PPS proposed rule, we noted that the regulations at §§ 488.61(a)(4) and (c)(4) provide three specific mitigating factors for review by CMS when determining whether a transplant center can be approved or re-approved, respectively, based on mitigating factors. These mitigating factors are: (1) The extent to which outcome measures are met or exceeded; (2) the availability of Medicare-approved transplant centers in the area; and (3) extenuating circumstances that may have a temporary effect on meeting the CoPs. In the FY 2015 IPPS/LTCH PPS proposed rule, we proposed to move the listing of mitigating factors from paragraphs (a)(4)(i) through (a)(4)(iii) and (c)(4)(i) through (c)(4)(iii) to new proposed paragraphs (f), (g), and (h) under § 488.61, and to include additional factors under these three new proposed paragraphs that may be reviewed in addition to the existing three factors. We proposed to move existing paragraphs (a)(4)(iv) and (c)(4)(iv) to the proposed new paragraph (g)(2). We also proposed to provide clarification of the existing three mitigating factors and the conditions under which they would apply. Finally, we proposed to revise the preexisting paragraphs (a)(4) and (c)(4) of § 488.61 to include cross-references to the new proposed paragraphs (f), (g), and (h).

We note that an administrative rule we published as a final rule in the Federal Register on May 12, 2014 (79 FR 27106), within days of publication of the FY 2015 IPPS/LTCH PPS proposed rule, renumbered the elements of § 488.61(c), added § 488.61(c)(3)(v), and made other amendments to this section. Specifically, the May 12, 2014 administrative rule removed § 488.61(a)(7), revised the introductory text of paragraphs (c) and (c)(1), and revised paragraph (c)(1)(ii). In addition, the final administrative rule removed paragraph (c)(2) and redesignated paragraphs (c)(3), (c)(4), and (c)(5) as paragraphs (c)(2), (c)(3), and (c)(4), respectively. Finally, the final administrative rule revised the text of newly designated paragraphs (c)(2), (c)(3)(i), and (c)(3)(ii), added a new paragraph (c)(3)(v), and revised paragraph (e). As a result of these changes, in this final rule, we are redesignating the renumbered paragraphs § 488.61(c)(3)(i) through (c)(3)(iii) of the administrative final rule as § 488.61(f)(1)(i), (f)(1)(ii), and (f)(1)(iii), respectively. The renumbered paragraph § 488.61(c)(3)(iv) is moved to the new § 488.61(g)(2). We also are incorporating the new paragraph that was added in the final administrative rule (§ 488.61(c)(3)(v)) as the new paragraph § 488.61(f)(1)(iv).

We note that in all subsequent references involving § 488.61(c), we use the regulatory citations as revised by the May 12, 2014 final rule (79 FR 27106) and described above. Under proposed new paragraph § 488.61(f)(1), we proposed to move and relist the three mitigating factors currently under both paragraphs (a)(4)(i) through (a)(4)(iii) and paragraphs (c)(3)(i) through (c)(3)(iii). We further proposed to expand the mitigating factors that CMS may consider by adding more description to those factors, as well as by adding new factors that may be reviewed. We also proposed to specify the procedures and timeframes for transplant centers to request consideration for approval based on mitigating factors.

Specifically, in proposed new paragraph (f)(1), we proposed to specify the mitigating factors, except for situations of immediate jeopardy, as described below.

  • The extent to which outcome measures are not met or exceeded (existing paragraphs (a)(4)(i) and (c)(3)(i); proposed paragraph (f)(1)(i)).
  • Availability of Medicare-approved transplant centers in the area (existing paragraphs (a)(4)(ii) and (c)(3)(ii); proposed paragraph (f)(1)(ii)).
  • Extenuating circumstances (for example, natural disaster) that may have a temporary effect on meeting the CoPs (existing paragraphs (a)(4)(iii) and (c)(3)(iii); proposed paragraph (f)(1)(iii)).
  • Program improvements that substantially address root causes of graft failures or patient deaths and that have been implemented and institutionalized on a sustainable basis (proposed new paragraph (f)(1)(iv)).
  • Recent patient and graft survival data to determine if there is sufficient clinical experience and survival for CMS to conclude that the program is in compliance with CMS requirements, except for the data lag inherent in the reports from the SRTR (proposed new paragraph (f)(1)(v)).
  • Extensive use of innovative transplantation practices relative to other transplant programs, such as a high rate of transplantation of individuals who are highly sensitized or children who have undergone the Fontan procedure, where CMS finds that the innovative practices are supported by evidence-based, published research or nationally recognized standards or Institutional Review Board (IRB) approvals, and the SRTR risk-adjustment methodology does not take the relevant key factors into consideration (proposed new paragraph (f)(1)(vi)).
  • The program's performance, based on the OPTN method of calculating patient and graft survival, is within the OPTN's thresholds for acceptable performance and does not trigger OPTN performance review under the applicable OPTN policy (proposed new paragraph (f)(1)(vii)).

Under proposed new paragraph (f)(2), we proposed to include details for the content of the request for consideration of mitigating factors, based on examples that have proven to be most useful in considering successful mitigating factors requests. Specifically, we proposed that a request for consideration of mitigating factors include sufficient information to permit an adequate review and understanding of the transplant program, the factors that have contributed to outcomes, program improvements or innovations that have been implemented or planned, and, in the case of natural disasters, the recovery actions planned. Examples of information to be submitted with each request could include, but are not limited to, the following:

(i) The name and contact information for the transplant hospital and the names and roles of key personnel of the transplant program;

(ii) The type of organ transplant program(s) for which approval is requested;

(iii) The CoPs that the program failed to meet, and with respect to which the transplant center is requesting CMS' review of mitigating factors;

(iv) The rationale and relevant supporting evidence for CMS' review must include, but not be limited to—

○ Root Cause Analysis of patient deaths and graft failures, including factors the program has identified as likely causal or contributing factors for patient deaths and graft failures;

○ Program improvements or innovations (where applicable) that have been implemented and improvements that are planned;

○ Patient and donor/organ selection criteria and evaluation protocols, including methods for pre-transplant patient evaluation by cardiologists, hematologists, nephrologists, and psychiatrists or psychologists, to the extent applicable;

○ Organizational chart with full-time equivalent levels, roles, and structure for reporting to hospital leadership;

○ Waitlist management protocols and practices relevant to outcomes;

○ Pre-operative management protocols and practices;

○ Immunosuppression/infection prophylaxis protocols;

○ Post-transplant monitoring and management protocols and practices;

○ Quality Assessment and Performance Improvement (QAPI) Program meeting minutes from the most recent four meetings and attendance rosters from the most recent 12 months;

○ Quality dashboard and other performance indicators;

○ Recent outcomes data for both patient survival and graft survival; and

○ Documentation of whether the program has engaged with the OPTN to review program outcomes, the status of any such review, and any steps taken to address program outcomes in accordance with the OPTN review.

Under proposed new paragraph (f)(3), we proposed to specify a timeline for the transplant program to submit a request for mitigating factors and to make clear that, for requests related to clinical experience or outcomes, the program has additional time within which to submit supporting information. Specifically, we proposed that within 10 days after CMS has issued formal written notice of a Condition-level deficiency to the program, CMS must receive notice of the program's request to seek consideration of mitigating factors. CMS would require that all information necessary for consideration be received within 30 days of CMS' initial notification for any deficiency, except a deficiency based on insufficient clinical experience or outcomes; and within 120 days of CMS' written notification for a deficiency based on insufficient clinical experience or outcomes. Failure of a transplant program to meet these timeframes may be the basis for denial of requests for consideration based on mitigating factors.
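For illustration only, the following small sketch works through the proposed submission timeframes described above (10 days for the notice of intent; 120 days for supporting information when the deficiency involves clinical experience or outcomes; 30 days otherwise, as proposed); the function and the example date are hypothetical, and, as discussed later in this section, the 30-day period is removed in this final rule.

    # Illustrative deadline calculation for the proposed timeframes described above.
    # Note: the final rule removes the 30-day period because mitigating factors are
    # limited to data submission, clinical experience, or outcomes deficiencies.
    from datetime import date, timedelta

    def proposed_deadlines(cms_notice_date: date, outcomes_or_experience: bool) -> dict:
        return {
            "notice_of_intent_due": cms_notice_date + timedelta(days=10),
            "supporting_information_due": cms_notice_date
            + timedelta(days=120 if outcomes_or_experience else 30),
        }

    # Hypothetical example using an arbitrary notice date.
    print(proposed_deadlines(date(2014, 10, 1), outcomes_or_experience=True))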

Comment: One commenter stated that mitigating factors consideration should not be available for either initial applications or for deficiencies that involve process requirements (as opposed to clinical experience or outcomes). Examples of process requirements include the requirement to match donor and recipient blood types, ensure informed consent, or engage in multi-disciplinary planning. The commenter suggested that if the mitigating factors provision applied to process CoPs, CMS should clarify the circumstances under which a program ought to apply for mitigating factors rather than submit a plan of correction. The commenter suggested that process CoPs be handled through plans of correction rather than through mitigating factors.

Response: We agree that all process CoPs should be handled through the plan of correction process and that only a deficiency involving data submission, clinical experience, or outcomes should involve both the required plan of correction and an optional mitigating factors request. A transplant program cited for a process CoP deficiency (or any deficiency) would not risk termination of its Medicare participation without a prior opportunity to submit a plan of correction and demonstrate that the deficiency has been removed in a timely manner. Since the time the CMS transplant regulation became effective in 2007, every instance in which a program was cited for a process CoP deficiency has been successfully remedied by timely action on the part of the transplant program. As a result, no transplant program has had its Medicare participation terminated as a result of failure to comply with a process CoP. We agree that the use of mitigating factors in the case of process deficiencies is not only unnecessary, but is also confusing to transplant centers and may interfere with the prompt remediation of those process deficiencies. In the final regulation, at § 488.61(f)(1) and elsewhere, we therefore limit the mitigating factors provision to deficiencies cited for noncompliance with the data submission, clinical experience, or outcomes requirements specified at §§ 482.80 and 482.82.

We also agree with the commenter that, in the case of an initial application for Medicare certification, every transplant program should be in full compliance with all process CoPs without needing to rely on mitigating factors consideration. However, we are retaining the ability of an initial applicant to request mitigating factors consideration with respect to the data submission, clinical experience, or outcomes requirements. We retain such ability because there may be situations where a transplant program has gone inactive beyond the time period allowed by CMS and is seeking Medicare reinstatement, or has withdrawn or lost Medicare participation due to substandard outcomes or lack of sufficient clinical experience, and is seeking reinstatement. In such situations, the latest available SRTR report may still show the program to have substandard outcomes or insufficient clinical experience for the 2.5 year retrospective period covered in the report, despite the fact that subsequent program improvements may have enabled greater clinical experience, or much better patient and graft survival, supported by more recent data that meet the criteria for CMS approval based on mitigating factors. With the refinements discussed previously (wherein the mitigating factors provision will not apply to process CoPs), we believe that the retention of the mitigating factors provision for initial applicants is warranted.

Comment: With regard to the expanded list of factors at § 488.61(f)(1) that CMS may consider, one commenter suggested that not every request should necessarily be required to cover all of the factors listed. Instead, the factors covered should be tailored to the particular circumstances in question.

Response: We agree with the commenter. The intent of § 488.61(f)(1) was not to require every application to address every possible factor, but to recognize CMS' obligation to consider all of the listed factors, as applicable. We acknowledge the potential for confusion on this matter, and therefore, at § 488.61(f)(1) in this final rule, we have clarified that CMS will consider such mitigating factors as may be appropriate in light of the nature of the deficiency and circumstances involved. We would not necessarily expect a program that requests consideration on the basis of innovative practice, for example, to detail all the improvements that have been implemented. We would instead expect such a program to explain its innovative practice, the extent of such practice, the evidentiary basis that established it as an innovative practice, the particular cases in the applicable SRTR report that involved innovative practices, and similar factors applicable to the use and outcomes of innovative approaches.

Comment: One commenter noted that a final rule published in the Federal Register on May 12, 2014 (79 FR 27106) made changes to § 488.61(c)(4), renumbering it as § 488.61(c)(3) and adding a factor at § 488.61(c)(3)(v), and specifying that CMS would consider program improvements that address root causes of patient deaths or graft failures if the improvements are supported by recent outcomes data that permit CMS to conclude that the program is in compliance with CMS outcomes expectations. In other words, in the May 12, 2014 final rule, CMS sought to clarify that both program improvements and recent data showing acceptable outcomes, together, comprise a single critical factor in our determination as to whether mitigating factors approval should be granted. CMS also sought to make clear that CMS will examine data that are more recent than the data in the latest available SRTR report that covers a retrospective 2.5 year period. The commenter observed that the subsequent regulation proposed in the FY 2015 IPPS/LTCH PPS proposed rule would move this provision to the new § 488.61(f)(1)(iv) and (f)(1)(v) as two separate considerations, making it unclear whether both improved data and substantial improvements are needed. The commenter asked that CMS clarify whether a program must demonstrate both substantial improvements and recent improved outcomes data, or whether program improvements without better outcomes data would suffice. The commenter expressed the opinion that it should be sufficient for a program to demonstrate that it had implemented substantial program improvements on a sustainable basis, without having to provide recent outcomes data that allow CMS to conclude that the improvements have resulted in recent observed deaths or graft failures that are less than 150 percent of the risk-adjusted expected number 1 year after transplant. Several other commenters simply stated that these parts of the proposed regulation were not entirely clear and should be clarified.

Response: We believe that program improvements and data showing improved outcomes subsequent to the program improvements are both needed and should be considered together as a single, integrated factor. We do not agree that mitigating factors should be approved without recent data that demonstrate actual improvements in outcomes in the manner described above. In our considerable experience with more than 129 mitigating factors applications, we have encountered many situations where program improvements were implemented on a sustainable basis, but outcomes either did not improve or did not improve sufficiently to bring the program into compliance within a reasonable period of time. Often the problem was that the improvements were well-warranted, but insufficient. Often the improvements did not address root causes, and the root causes did not become apparent until a multidisciplinary peer review team (organized under an SIA) conducted an onsite review and, together with the transplant program staff, gained new insights into systemic factors that contributed to substandard outcomes. In other cases, the program implemented improvements that were within the transplant program's purview, but the hospital did not alter other aspects of hospital operations that were instrumental in affecting patient or graft survival. The transplant programs often were aware of other hospital-wide factors that were important, but were unable to effect change in those hospital-wide factors until the hospital agreed to enter into an SIA with CMS. Examples of hospital-wide factors include, but are not limited to, the working relationships between the transplant center and the intensive care unit (ICU), availability of transplant-trained specialty physicians (such as in cardiology, hepatology, anesthesiology, or nephrology), adequacy of staffing levels, and information technology support, among others.

With regard to the requested clarification for the new section § 488.61(f)(1)(iv) of the regulations, we note that the SRTR data, upon which the CMS outcomes expectations rely, cover a retrospective 2.5 year period. The data are further dependent on a 1-year post-transplant period during which patient and graft survival are tracked. We appreciate that a transplant program may implement improvements and that it will take time for the results of the improvements to become manifest in the SRTR data. This new section is intended to make clear that CMS will examine data that are more recent than the data in the latest available SRTR report. We will make a judgment as to the usability of those data depending on the number of transplants and the amount of post-transplant survival time available to be analyzed after major program improvements have been implemented. For example, a kidney transplant center may conduct 80 transplants per year, but have only 35 transplants that both occurred after the major program improvements were implemented and also have a sufficient post-transplant survival period (a 1-year post-transplant survival period) to constitute data that are reasonable to use in evaluating outcomes. It is not likely that the 35 transplant cases would be included in the latest available SRTR report. Nonetheless, this number of cases with such a post-transplant tracking period may be sufficient and would be considered by CMS. We acknowledge that, by looking at a time period shorter than the 2.5 year period of the SRTR reports and looking only at the observed/expected ratio, we may approve programs that seem to have improved outcomes merely by chance. However, if there is a combined demonstration of implemented program changes and an improved survival ratio based on adequate numbers, we believe the risk is warranted. We also do not mean to imply that 35 cases is a magic number; the number is illustrative for purposes of clarifying CMS' intention.
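A minimal, hypothetical sketch of the post-improvement observed-to-expected comparison described above follows; the per-case expected probabilities and the observed count are placeholders introduced for illustration, not actual SRTR risk-adjusted values.

    # Illustrative only: evaluate a post-improvement cohort (for example, the 35 cases
    # discussed above) by comparing observed deaths to the sum of per-case expected
    # probabilities. The numbers here are hypothetical placeholders.
    per_case_expected = [0.05] * 35        # hypothetical expected 1-year death probability per case
    observed_deaths = 2                    # hypothetical observed 1-year deaths in the cohort

    expected_deaths = sum(per_case_expected)        # 1.75
    oe_ratio = observed_deaths / expected_deaths    # about 1.14
    print(f"O/E ratio: {oe_ratio:.2f}; within the 1.5 tolerance: {oe_ratio <= 1.5}")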

Therefore, we are finalizing these provisions at § 488.61(f)(1)(iv) as a combined factor (program improvements plus improved outcomes data). The final paragraph is consistent with the final regulation published as § 488.61(c)(3)(v) on May 12, 2014 (79 FR 27106), but now is moved to the new § 488.61(f)(1)(iv). Paragraph (f)(1)(iv) of § 488.61 in this new final rule now combines the two factors that were proposed in the FY 2015 IPPS/LTCH PPS proposed rule as § 488.61(f)(1)(iv) and (f)(1)(v).

Comment: With regard to the content of mitigating factors requests described in proposed § 488.61(f)(2), one commenter suggested that instructions related to specific information that must be included as part of a mitigating factors request should not be included in regulations but, instead, in CMS instructions that can be more easily modified as transplant centers and CMS gain additional experience with the types of information that may be useful. The commenter also expressed concern that it appeared that all the information was required of every request even if certain items were not relevant.

Response: We appreciate the commenter's concern. On the basis of 7 years of experience and review of 129 mitigating factors requests, we are confident that there are certain items of information that are almost always important in a mitigating factors request. We agree that not every item is needed in every request. Therefore, in this final rule, we reorganized into various categories the informational items for a mitigating factors request that were originally proposed in the new § 488.61(f)(2). In this final rule, the first category comprises the items required for all requests (new paragraphs (f)(2)(i) through (f)(2)(iv)). Additional information required for requests pertaining to data submission, clinical experience, or outcomes is described in new paragraph (f)(2)(v); additional material required for requests pertaining to innovative practice is described in new paragraph (f)(2)(vi); and additional material required for requests based on natural disasters or emergencies is described in new paragraph (f)(2)(vii). We believe that this reorganization makes it clear that information not pertinent to the request is not needed, while continuing to provide additional transparency and continuing to communicate (in advance of a request) the type of information that a transplant center should be prepared to provide if it wishes to request consideration of mitigating factors.

Comment: With regard to the proposed content at § 488.61(f)(2), one commenter stated that it did not believe CMS has the authority to require a root cause analysis of patient deaths or graft failures that is specified by the program as a patient safety work product (PSWP) and submitted to (or received from) a Patient Safety Organization (PSO). Further, the commenter stated that to require such disclosure may place a transplant center in a situation in which it must choose between foregoing a mitigating factors review, which could keep the center open, or facing fines under the Patient Safety and Quality Improvement Act of 2005 (PSQIA).

Response: By way of background, the PSQIA amended Title IX of the Public Health Service Act (PHSA) (42 U.S.C. 299 et seq.). Section 921(7)(A) of the PSQIA defines “patient safety work product” (PSWP) as including “any data, reports, records, memoranda, analyses (such as root cause analyses) . . . which are assembled or developed by a provider for reporting to a patient safety organization and are reported to a patient safety organization.” Section 921(7)(B) of the PSQIA clarifies that PSWP does not include certain information, such as a patient's medical record (section 921(7)(B)(i) of the PSQIA) or “information that is collected, maintained, or developed separately, or exists separately, from a patient safety evaluation system . . .” (section 921(7)(B)(ii) of the PSQIA). Section 921(7)(B)(iii) of the PSQIA further specifies that “nothing in this part shall be construed to limit . . . the reporting of information to a Federal, State, or local governmental agency for public health surveillance, investigation, or other public health purposes or health oversight purposes.” In addition, section 922(c)(1)(C) of the PSQIA provides for an exception to the privilege and confidentiality restrictions for “disclosure of identifiable patient safety work product if authorized by each provider identified in such work product.”

We appreciate the commenter's concerns. However, after 7 years and 129 mitigating factors reviews, we have not experienced this problem in relation to organ transplant centers. This may be because adequate root cause analyses of peri- and post-transplant deaths or graft failures require such specialized expertise that the more generalized patient safety expertise of PSOs is less likely than in other areas to be the resource to which transplant centers turn.

We also note that, in certain other types of providers where the PSQIA has arisen as an issue, the providers have often taken advantage of the exceptions in the PSQIA noted previously (such as the exceptions at section 921(7)(B)(ii) or section 921(7)(B)(iii) of the PSQIA).

CMS does not require submission of a PSWP, and hospitals have choices with regard to what to place in a patient safety evaluation system as a PSWP, to what extent the hospital will use any of the exceptions provided in the PSQIA as noted above, and to what extent the hospital will seek to demonstrate compliance with the CoPs through the provision of other information. With regard to root cause analyses, rather than being a cause of Medicare termination, we have found root cause analyses to have been among the most important considerations in CMS decisions to grant mitigating factors approval that allowed continued Medicare participation for most of the requests we have received. In many cases, the analyses demonstrated the program's definite capability to identify root causes. In other cases, the analyses demonstrated the program's clear inability to conduct adequate root cause analyses, but CMS review of the analyses (by clinical and quality improvement personnel, both in-house and contracted) allowed us to gain sufficient insights, particularly with respect to areas that might be further investigated, that we determined an SIA and more time would be warranted for the transplant program to make improvements. Rather than constituting an expectation that leads to closure of a transplant program, root cause analyses, in our experience, have prevented many programs from having their Medicare participation terminated and from facing the risk that such termination might have led to closure.

Finally, the regulations at § 482.21 and § 482.96 oblige each hospital and transplant program to maintain an effectively functioning quality assessment and performance improvement system. A key expectation is that every adverse event be identified and investigated and the results of the investigation used to prevent recurrence. In the case of patient deaths and graft failures, this means root cause analyses to identify systemic factors that may have caused or contributed to the adverse events. The ability of a transplant program to demonstrate that it has adequately conducted such analyses, used the results to prevent recurrence, and has the capability to continue to do so is fundamental to the program's demonstration of compliance required by CMS regulations.

Therefore, we are retaining in this final regulation the language we originally proposed.

Comment: One commenter objected to the provision at proposed § 488.61(f)(2)(iv)(L) that each program must describe whether it has engaged with the OPTN to review program outcomes, the status of any such review, and any steps taken to address program outcomes pursuant to the OPTN review. The commenter believed that this provision would unnecessarily mandate disclosure of the institution's involvement with the peer review function of the OPTN under 42 CFR 121.10(b). The commenter stated that assurances of confidentiality and protection from disclosure are the foundations of effective medical peer review processes. The commenter suggested that the proposed paragraph be modified to specify only submission of the steps taken by the program to address program outcomes. Another commenter suggested that the proposed language at § 488.61(f)(1) be expanded to include consideration of whether the Membership and Professional Standards Committee (MPSC) of the OPTN has reviewed the program's performance and found it acceptable.

Response: We appreciate the tremendous value of the OPTN peer review process and its statutory responsibilities under the National Organ Transplant Act (Pub. L. 98-507). We believe that the MPSC process of the OPTN may often result in improved outcomes, thereby rendering a CMS Condition-level deficiency citation unnecessary, or at least setting the stage for approval of a mitigating factors request during the extended period of time CMS allows for consideration of such requests. While we had proposed the regulatory language in order to further strengthen CMS coordination with the OPTN, we also appreciate the nature of the peer review process itself, as described by the commenter. Therefore, in this final rule, we have removed reference to the OPTN peer review process that was originally proposed at § 488.61(f)(2)(iv)(L). We note that programs may still voluntarily disclose any engagement with the MPSC of the OPTN. If the program is entering into an SIA with CMS, the program may also wish to disclose whether the OPTN has already conducted a recent onsite peer review of the program so that CMS may consider if an adjustment to the SIA peer review is warranted, or it may choose to describe any recent onsite peer review without reference to whether the onsite review was conducted under OPTN auspices or not.

Comment: With regard to the timelines for submitting information that we proposed at § 488.61(f)(3), several commenters suggested that more than the proposed 10 days be permitted for a program to notify CMS of an intent to apply for mitigating factors consideration, and 30 days to submit written documentation when the pertinent deficiencies do not involve citation for clinical experience or outcomes. These commenters suggested that 20 days and 45 days, respectively, should be permitted.

Response: With respect to mitigating factors, the 10-day timeline only obliges programs to notify CMS of the program's intent to request such consideration, and no information is required beyond a simple statement of intent. We regard the 10-day timeframe for submission of a simple notice of intent to be a reasonable expectation. With the modification in this final rule discussed above (in which the mitigating factors provision is limited to deficiencies of data submission, clinical experience, or outcomes), the 30-day time period is no longer necessary. We already proposed to permit a longer period of time (120 days) for submission of the application when the deficiency is for data submission, clinical experience, or outcomes. Therefore, we are finalizing the rule with the proposed 10-day and 120-day timelines. We have removed the reference to the 30-day time period. In response to comments that suggested more time might be needed in some cases, we also added a provision in this final rule that permits CMS to extend the timelines for good cause. An example of good cause would be a natural disaster, such as the 2012 Hurricane Sandy event, that may intervene in the middle of the applicable period.

After consideration of the public comments we received, we are retaining the 10-day timeframe to notify CMS of an intent to apply for mitigating factors; reorganizing § 488.61(f)(1) and making it clear that not all factors pertain to every application; combining the proposed § 488.61(f)(1)(iv) and (f)(1)(v) into a single factor (program improvements plus improved outcomes data, consistent with the May 12, 2014 final regulation (79 FR 27106)) at the new and clarified § 488.61(f)(1)(iv); and retaining the reorganized content of § 488.61(f)(2) except for the removal of references to a transplant program's engagement with the OPTN.

2. Results of Mitigating Factors Review

Under proposed new § 488.61(g), we proposed to clarify and expand on the description of the mitigating factors review process and results. Under existing regulations, a transplant center seeking initial approval or re-approval of Medicare participation based on the presence of mitigating factors is required to submit a formal written request to the CMS Central Office, as described earlier. If there are no deficiencies that constitute immediate jeopardy to a patient's health and safety, in limited circumstances, CMS may approve continued Medicare participation based on mitigating factors. However, where a transplant program demonstrates that it is making significant progress toward correction and program improvement, but does not yet qualify for approval based on mitigating factors, we believe there may be merit, in many cases, in temporarily extending the effective date of the program's Medicare participation termination in exchange for a hospital's agreement to engage in a significant and directed regimen of further quality improvement under a Systems Improvement Agreement (SIA). As we noted above, programs that have entered into SIAs have demonstrated significant improvements. Therefore, we proposed to provide an explicit procedure in the regulations at proposed new § 488.61(g)(1)(iii) for CMS to offer an SIA and hold in abeyance a final decision on the mitigating factors request until the SIA period has ended. Proposed new paragraphs (g)(1)(i), (g)(1)(ii), and (g)(1)(iii) outline the three outcomes of CMS mitigating factors decisions: (i) Initial approval or re-approval of a program's Medicare participation based upon consideration of mitigating factors; (ii) denial of the program's request; or (iii) offer of a time-limited SIA when a transplant program has waived its appeal rights, has committed to substantial program improvements that address root causes and are institutionally supported by the hospital's governing body on a sustainable basis, and has requested more time to design or implement additional improvements or demonstrate compliance with CMS outcome requirements. Under the proposed new paragraph (g)(1)(iii), we would clarify that, during the SIA, CMS holds the mitigating factors request in abeyance and makes a final decision to approve or deny Medicare participation when the SIA is ended, based on the results of the program's performance of the SIA.

Existing regulations state that CMS will not approve any program with a Condition-level deficiency. However, CMS could approve a program with a Standard-level deficiency upon receipt of an acceptable plan of correction. A Condition-level deficiency represents a serious classification and, unless the deficiency is remedied, precludes a provider from participating in Medicare. A Standard-level deficiency represents a less serious deficiency, such as one in which just a small part of a CoP is found to be out of compliance. We proposed to move this provision to the proposed new paragraph § 488.61(g)(2).

We did not receive any public comments on this proposal and, therefore, are finalizing it as proposed.

3. System Improvement Agreements (SIAs)

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 27977), we proposed to add a new paragraph (h) to § 488.61 to set forth the purpose, intent, and contents of an SIA and the timeframes for an approved SIA with CMS.

a. Purpose and Intent of an SIA

Based on information and documentation provided by the transplant program at the time of its request, CMS may determine that, despite a deficiency or deficiencies, the transplant center has made substantial progress, has full support of the hospital governing body, and is on a quality improvement path that promises to improve prospects for patient survival. In such cases, we exercise our limited discretion to offer the transplant program the opportunity to enter into an SIA. In the absence of a written request for consideration on the basis of mitigating factors, CMS would otherwise proceed with the proposed date of termination based on noncompliance with one or more of the CoPs. In the proposed regulation, we clarified and specified the terms for such SIAs.

CMS may offer an SIA to a transplant program if the transplant center can show that it has identified, or is actively improving its identification of, the root causes of its noncompliance and if the transplant center has initiated actions to correct those root causes. However, if we conclude that a transplant center does not qualify for initial approval or re-approval based on mitigating factors, the proposed rule would explicitly provide CMS with the option of offering a time-limited SIA to those transplant centers that have demonstrated progress in making substantive program improvements to address root causes of deficient outcomes, agree to undertake a structured regimen of further quality improvement, and agree to waive their appeal rights. In some instances, a voluntary period of inactivity of the transplant center is warranted, or a period of inactivity may be required by CMS as a condition of an SIA approval, as a requirement of initiating an SIA for a specified period, or until certain milestones are achieved.

During the SIA period, CMS' oversight and enforcement authority continue and CMS may conduct routine unannounced surveys, complaint investigations, and/or terminate the transplant center's participation in the Medicare program if there is not substantial compliance with Federal requirements under 42 CFR Part 482 or if the program fails to follow the terms of the SIA. In consideration for the opportunity to continue to participate in the Medicare program under an SIA during the time that structured improvements and corrections are made, despite having been found to be in noncompliance with the requirements, a transplant center would be required to waive any appeal rights that it may have, either administratively or judicially, if CMS ultimately terminates Medicare participation or denies initial approval of the transplant center. We proposed that such a waiver applies, regardless of whether revocation or termination of approval/re-approval occurs due to a finding that the hospital failed to fulfill the terms of the SIA or due to the deficiency findings that the SIA was designed to address, pursuant to CMS' enforcement authority under the regulations.

A transplant center's approval to operate as a Medicare-approved transplant center does not guarantee any subsequent re-approvals and may be time-limited. The transplant center must submit a separate request for consideration of mitigating factors, including updated supporting documentation, each time a CMS review (generally on a 3- to 5-year cycle) or complaint investigation determines that the transplant center does not meet one or more of the data submission, clinical experience, and outcomes requirements, or other CoPs. At such time, we would review any prior mitigating factors approval to determine if the circumstances that originally warranted approval would still apply. However, in the case of a past mitigating factors approval based on innovative practice, CMS may seek information in advance of a recertification survey to determine if the reasons for past approval still prevail and, in such a case, CMS may consider mitigating factors concomitantly with the recertification survey.

We did not receive any public comments on this policy and, therefore, are finalizing it as proposed.

b. Description and Contents of an SIA

The SIA is a binding agreement between CMS and the hospital within which a transplant center operates. A transplant center, in turn, may have one or more organ-specific programs, such as a heart, kidney, pancreas, liver, or lung transplant program. Each SIA is focused on a particular organ transplant program. The SIA is a plan for a series of actions, activities, and goals that provide opportunities for the hospital and transplant center to conduct internal improvement analysis and action, and engage external experts to ensure that the transplant center is in compliance with evidence-based standards and advances in the field that would optimize the care provided to patients.

Through an SIA, CMS is able to offer transplant centers additional time to achieve compliance with the CoPs through a structured and monitored process. In particular, the use of the formal SIA process reflects CMS' recognition that it may sometimes require more than the usual time to correct deficiencies in 1-year post-transplant patient or graft survival and to have the results of such improvement become manifest in the tracking data, or to develop and implement a plan to correct low-volume performance rates. We generally do not expect to use an SIA in cases of noncompliance with other CoPs, although we do not preclude such a possibility if highly unusual circumstances are present.

The SIA process (discussed in more detail below) has demonstrated effectiveness in improving patient and graft survival. An important measure of outcome is the extent to which observed patient deaths 1 year after transplant compare with the risk-adjusted expected number of deaths or graft failures for a particular transplant program. The SRTR risk adjustment methodology (used to calculate the expected numbers) takes into consideration the organs transplanted and the characteristics of the donors and recipients (for example, factors that have a bearing on the risk to patient or graft survival, such as diabetes, hypertension, advanced age, or cold ischemic time of the organ to be transplanted, among others). Across all transplant centers in the United States, the ratio of observed to expected deaths 1 year after transplant is, by construction, 1.0; a transplant center that had twice the expected number of deaths would have a standardized mortality ratio (SMR) of 2.0. As of August 2013, adult kidney transplant programs cited by CMS for substandard outcomes and placed on a Medicare enforcement track, for which there was a 2-year post-CMS survey tracking period (N=15), improved their average SMR for 1-year post-transplant patient survival from 2.05 to 1.17 (close to the 1.0 national average). Transplant centers under an approved SIA improved from an average SMR of 2.41 before the SIA to 0.76 after the SIA (much better than the national average). Transplant centers not cited for substandard kidney transplant outcomes improved slightly, from 0.89 to 0.84.

Hamilton, Thomas E., “Regulatory Oversight in Transplantation: Are Patients Really Better Off?” Curr Opin Organ Transplant 2013; 18:203-209. Available at: http://www.co-transplantation.com.
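
For illustration only, the following minimal sketch (not part of the regulatory text) shows how a standardized mortality ratio is computed as the ratio of observed deaths to the risk-adjusted expected number of deaths. The 20-observed/10-expected figures are hypothetical; the before-and-after averages simply restate the figures quoted above.

```python
def smr(observed_deaths: float, expected_deaths: float) -> float:
    """Ratio of observed to risk-adjusted expected deaths; 1.0 equals the national average."""
    return observed_deaths / expected_deaths

# Hypothetical program with twice the expected number of 1-year post-transplant deaths.
print(smr(observed_deaths=20, expected_deaths=10))  # 2.0

# Average SMR improvements cited in the preamble (before -> after).
improvements = {
    "Programs on a Medicare enforcement track (N=15)": (2.05, 1.17),
    "Programs under an approved SIA": (2.41, 0.76),
    "Programs not cited for substandard outcomes": (0.89, 0.84),
}
for label, (before, after) in improvements.items():
    print(f"{label}: average SMR {before} -> {after}")
```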

In proposed new § 488.61(h), we proposed to explicitly incorporate and specify elements that have been important to the successful use of the SIA structure. We proposed to define an SIA as a binding agreement, entered into voluntarily by the hospital and CMS, through which CMS extends the effective date of a prospectively scheduled termination of the center's Medicare participation (thereby permitting the program additional time to achieve compliance with the CoPs), contingent on the hospital's agreement to participate in a structured regimen of quality improvement activities and subsequent demonstration of improved outcomes. In some cases, transplant programs have entered a period of inactivity, either voluntarily or as a condition of the SIA.

Under proposed new § 488.61(h)(1)(i) through (h)(1)(x), we proposed that in the SIA, in exchange for additional time to initiate or continue activities to achieve compliance with the CoPs, the transplant center must agree to a regimen of specified activities, including (but not limited to) all of the following:

  • Patient notification about the degree and type of noncompliance by the program, an explanation of what the program improvement efforts mean for patients, and financial assistance to defray the out-of-pocket costs of copayments and testing expenses for any wait-listed individual who wishes to be listed with another program (proposed paragraph (h)(1)(i)).
  • An external independent peer review team that conducts an onsite assessment of program policies, staffing, operations, relationship to hospital services, and factors that contribute to program outcomes; that suggests quality improvements the hospital should consider; that provides both verbal and written feedback to the hospital; and that provides a verbal debriefing to CMS. Neither the hospital nor the peer review team is required to provide a written report to CMS. The peer review team would include a transplant surgeon with expertise in the relevant organ type(s), a transplant administrator, an individual with expertise in transplant QAPI systems, a social worker or psychologist or psychiatrist, and a specialty physician with expertise in conditions particularly relevant to the applicable organ type(s), such as a cardiologist, nephrologist, or hepatologist. Except for the transplant surgeon, CMS may permit substitution of an individual with one type of expertise for another individual who has expertise particularly needed for the type of challenges experienced by the program, such as substitution of an infection control specialist in lieu of, or in addition to, a social worker (proposed paragraph (h)(1)(ii)).
  • An action plan that addresses systemic quality improvements and is updated after the onsite peer review (proposed paragraph (h)(1)(iii)).
  • An onsite consultant whose qualifications are approved by CMS, and who provides services for 8 days per month on average for the duration of the agreement, except that CMS may permit a portion of the time to be spent offsite and may agree to fewer consultant days each month after the first 3 months of the SIA (proposed paragraph (h)(1)(iv)).
  • A comparative effectiveness analysis that compares policies, procedures, and protocols of the transplant program with those of other programs in areas of endeavor that are relevant to the transplant center's current quality improvement needs (proposed paragraph (h)(1)(v)).
  • Development of increased proficiency, or demonstration of current proficiency, with patient-level data from the SRTR and the use of registry data to analyze outcomes and inform quality improvement efforts (proposed paragraph (h)(1)(vi)).
  • A staffing analysis that examines the level, type, training, and skill of staff in order to inform transplant center efforts to ensure the engagement and appropriate training and credentialing of staff (proposed paragraph (h)(1)(vii)).
  • Activities to strengthen performance of the Quality Assessment and Performance Improvement (QAPI) Program to ensure full compliance with the requirements at § 482.96 (proposed paragraph (h)(1)(viii)).
  • Monthly (unless otherwise specified) reporting and conference calls with CMS regarding the status of programmatic improvements; the results of the actions, data, reports, or other deliverables specified in the SIA; and the number of transplants and the deaths and graft failures that occur within 1 year post-transplant (proposed paragraph (h)(1)(ix)).
  • Additional or alternative requirements specified by CMS, tailored to the transplant program type and circumstances (proposed paragraph (h)(1)(x)).

Comment: One commenter suggested that less detail be provided with regard to the content of an SIA in favor of more flexibility for CMS and transplant centers. Another commenter observed that the SIA content was robust and could conceivably constitute a best practice for transplant centers. The commenter also noted that, despite the high specificity of the required activities, proposed language at § 488.61(h)(1)(x) allowing CMS to specify alternate requirements, provides the flexibility needed if there are elements a transplant program cannot meet due to circumstances beyond its control.

Response: We agree that the requirements are reasonably robust and specific. The SIA content was developed after early experiences in 2007-2010 with lesser requirements that failed to produce the results a number of programs needed to achieve and sustain improved outcomes. We then entered into a number of SIAs that had additional requirements which we did not include here, either because they proved unnecessary in many cases or because they caused excessive risk avoidance on the part of some transplant centers. The remaining requirements we proposed have now been well-tested in 35 SIAs so far, with exceptional success. We agree with the commenter who observed that the language proposed at § 488.61(h)(1)(x) provides CMS with advisable flexibility to tailor alternate requirements when necessary. In response to the concern of the first commenter, however, in this final rule we expanded § 488.61(h)(1)(x) to allow CMS to waive certain enumerated elements of the SIA (rather than requiring alternates) if the agency finds that the program has already adequately fulfilled the task.

Comment: Several commenters stated that transplant programs should not be obliged to waive their appeal rights in order to engage in an SIA with CMS.

Response: We do not agree. Prior to any SIA, each transplant program will already have had a full opportunity to appeal a prospectively scheduled termination of Medicare participation. Further, while a prospective termination deriving from all other CoP deficiencies must be resolved within 90 days, in the case of clinical experience or outcomes, CMS sets the prospective Medicare termination at 210 days and allows for consideration of mitigating factors. We provide for an SIA for certain programs when a program is making substantial progress but is not able to demonstrate compliance or qualify for outright approval of its mitigating factors request within the 210-day period. Under an SIA, CMS agrees to extend the prospectively scheduled Medicare termination date for up to another 12 months. Given these considerations, we do not agree that a program should be able to reach the end of an SIA, fail to demonstrate the improved outcomes necessary, and then appeal. We believe such an arrangement would only serve to prolong the termination date and reduce incentives to correct deficiencies and achieve compliance promptly. Further, in our experience to date, only one transplant program has chosen to appeal a Medicare termination for any reason. The hospital involved expended considerable legal effort over the course of a year and did not prevail. In the succeeding year, the program applied for reinstatement and was eventually recertified for Medicare participation after making further improvements and demonstrating compliance with the CMS clinical experience and outcomes requirements. In short, in the case of an SIA, we provide for an exceptional extension of time and believe it is preferable for the available resources of all parties to be invested in the process of improving patient care rather than in a legal contest. If a program wishes to appeal, we suggest the appeal be made within the 60-day post-notification period permitted by regulation rather than pursuing an SIA (because the SIA would require waiver of appeal rights).

After consideration of the public comments we received, we are making a minor change at § 488.61(h)(1)(x) to allow some added flexibility to the SIA content, and are otherwise finalizing § 488.61(h)(1)(i) through (h)(1)(ix) as proposed.

c. Effective Period for an SIA

Under proposed new § 488.61(h)(2), we proposed to specify that an SIA will be established for a 12-month period, subject to CMS' discretion to determine if a shorter time period would suffice. At the hospital's request and at CMS' discretion, CMS may extend an SIA for up to one additional 6-month period.

Comment: A number of commenters supported these time periods.

Response: We appreciate the commenters' support.

After consideration of the public comments we received, we are finalizing § 488.61(h)(2) as proposed.

XII. MedPAC Recommendations

Under section 1886(e)(4)(B) of the Act, the Secretary must consider MedPAC's recommendations regarding hospital inpatient payments. Under section 1886(e)(5) of the Act, the Secretary must publish in the annual proposed and final IPPS rules the Secretary's recommendations regarding MedPAC's recommendations. We have reviewed MedPAC's March 2014 “Report to the Congress: Medicare Payment Policy” and have given the recommendations in the report consideration in conjunction with the policies set forth in this final rule. MedPAC recommendations for the IPPS for FY 2015 are addressed in Appendix B to this final rule.

For further information relating specifically to the MedPAC reports or to obtain a copy of the reports, contact MedPAC at (202) 653-7226, or visit MedPAC's Web site at: http://www.medpac.gov.

XIII. Other Required Information

A. Requests for Data From the Public

In order to respond promptly to public requests for data related to the prospective payment system, we have established a process under which commenters can gain access to raw data on an expedited basis. Generally, the data are now available in compact disc (CD) format. However, many of the files are available on the Internet at: http://www.cms.hhs.gov/Medicare/Medicare-Fee-for-Service-Payment/AcuteInpatientPPS/index.html. We listed the data files and the cost for each file, if applicable, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28288 through 28289).

Commenters interested in discussing any data used in constructing the proposed rule and this final rule should contact Nisha Bhat at (410) 786-5320.

B. Collection of Information Requirements

1. Statutory Requirement for Solicitation of Comments

Under the Paperwork Reduction Act of 1995, we are required to provide 60-day notice in the Federal Register and solicit public comment before a collection of information requirement is submitted to the Office of Management and Budget (OMB) for review and approval. In order to fairly evaluate whether an information collection should be approved by OMB, section 3506(c)(2)(A) of the Paperwork Reduction Act of 1995 requires that we solicit comment on the following issues:

  • The need for the information collection and its usefulness in carrying out the proper functions of our agency.
  • The accuracy of our estimate of the information collection burden.
  • The quality, utility, and clarity of the information to be collected.
  • Recommendations to minimize the information collection burden on the affected public, including automated collection techniques.

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28289 through 28294), we solicited public comment on each of these issues for the following sections of this document that contain information collection requirements (ICRs). We discuss and respond to any public comments we received in the relevant sections.

2. ICRs for Add-On Payments for New Services and Technologies

Section II.I.1. of the preamble of the proposed rule and of this final rule discuss add-on payments for new services and technologies. Specifically, this section states that applicants for add-on payments for new medical services or technologies for FY 2016 must submit a formal request. A formal request includes a full description of the clinical applications of the medical service or technology and the results of any clinical evaluations demonstrating that the new medical service or technology represents a substantial clinical improvement. In addition, the request must contain a significant sample of the data to demonstrate that the medical service or technology meets the high-cost threshold.

We believe the burden associated with this requirement is exempt from the PRA under 5 CFR 1320.3(c), which defines a collection of information subject to the requirements of the PRA as one imposed on 10 or more persons within any 12-month period. This information collection does not impact 10 or more entities in a 12-month period. In FYs 2008, 2009, 2010, 2011, 2012, 2013, 2014, and 2015, we received 1, 4, 5, 3, 3, 5, 5, and 7 applications, respectively.

We did not receive any public comments regarding this information collection.

3. ICRs for the Occupational Mix Adjustment to the FY 2015 Wage Index (Hospital Wage Index Occupational Mix Survey)

Section III.F. of the preambles of the proposed rule (79 FR 28066 through 28067) and of this final rule discusses the occupational mix adjustment to the proposed and final FY 2015 wage index, respectively. While the preambles of these rules do not contain any new ICRs, we note that there is an OMB-approved information collection request associated with the hospital wage index.

Section 304(c) of Public Law 106-554 amended section 1886(d)(3)(E) of the Act to require CMS to collect data at least once every 3 years on the occupational mix of employees for each short-term, acute care hospital participating in the Medicare program in order to construct an occupational mix adjustment to the wage index. We collect the data via the occupational mix survey.

The burden associated with this information collection requirement is the time and effort required to collect and submit the data in the Hospital Wage Index Occupational Mix Survey to CMS. The aforementioned burden is subject to the PRA; it is currently approved under OCN 0938-0907.

We did not receive any public comments regarding this information collection.

4. Hospital Applications for Geographic Reclassifications by the MGCRB

Section III.H.2. of the preambles of the proposed rule (79 FR 28070 through 28075) and of this final rule discusses the proposed and final changes to the wage index based on hospital reclassifications. As stated in that section, under section 1886(d)(10) of the Act, the MGCRB has the authority to accept applications from short-term IPPS hospitals requesting geographic reclassification for wage index purposes and to issue decisions on those requests for purposes of payment under the IPPS.

The burden associated with this application process is the time and effort necessary for an IPPS hospital to complete and submit an application for reclassification to the MGCRB. The burden associated with this requirement is subject to the PRA. It is currently approved under OCN 0938-0573.

We did not receive any public comments regarding this information collection.

5. ICRs for Application for GME Resident Slots

The information collection requirements associated with the preservation of resident cap positions from closed hospitals, addressed under section IV.J.3. of the preamble of this final rule, are not subject to the Paperwork Reduction Act, as stated in section 5506 of the Affordable Care Act.

6. ICRs for the Hospital Inpatient Quality Reporting (IQR) Program

The Hospital IQR Program (formerly referred to as the Reporting Hospital Quality Data for Annual Payment Update (RHQDAPU) Program) was originally established to implement section 501(b) of the MMA, Public Law 108-173. This program expanded our voluntary Hospital Quality Initiative. The Hospital IQR Program originally consisted of a “starter set” of 10 quality measures. The collection of information associated with the original starter set of quality measures was previously approved under OMB control number 0938-0918. All of the information collection requirements previously approved under OMB control number 0938-0918 have been combined with the information collection request previously approved under OMB control number 0938-1022. We no longer use OMB control number 0938-0918.

We added additional quality measures to the Hospital IQR Program and submitted the information collection request to OMB for approval. This expansion of the Hospital IQR measures was part of our implementation of section 5001(a) of the DRA. Section 1886(b)(3)(B)(viii)(III) of the Act, added by section 5001(a) of the DRA, requires that the Secretary expand the “starter set” of 10 quality measures that were established by the Secretary as of November 1, 2003, to include measures “that the Secretary determines to be appropriate for the measurement of the quality of care furnished by hospitals in inpatient settings.” The burden associated with these reporting requirements was previously approved under OMB control number 0938-1022.

In the FY 2013 IPPS/LTCH PPS final rule (77 FR 53666), we stated that, for the FY 2016 payment determination and subsequent years, we sought OMB approval for a revised information collection request using the same OMB control number (0938-1022). The FY 2014 IPPS/LTCH PPS final rule (78 FR 50955) did not change the method for information collection requests. In a revised request for the FY 2017 payment determination, we will add the four claims-based measures and the one chart-abstracted measure that we are finalizing in this final rule as proposed. The claims-based measures are: (1) Hospital 30-day, all-cause, unplanned, risk-standardized readmission rate (RSRR) following coronary artery bypass graft (CABG) surgery; (2) Hospital 30-day, all-cause, risk-standardized mortality rate (RSMR) following coronary artery bypass graft (CABG) surgery; (3) Hospital-level, risk-standardized 30-day episode-of-care payment measure for pneumonia; and (4) Hospital-level, risk-standardized 30-day episode-of-care payment measure for heart failure. The chart-abstracted measure we are finalizing in this final rule is: Severe sepsis and septic shock: management bundle (NQF #0500).

Because claims-based measures can be calculated based on data that are already reported to the Medicare program for payment purposes, we believe no additional information collection will be required from the hospitals for the four finalized claims based measures. However, we believe that the chart-abstracted measure will cause some additional burden.

In addition, we believe there will be a reduction in burden as a result of removing a total of 19 measures in this rule. We note that we are not removing SCIP-Inf-4 Cardiac Surgery Patients with Controlled 6 a.m. Postoperative Blood Glucose as proposed. The measures we are removing are: (1) AMI-1 Aspirin at Arrival; (2) AMI-3 ACEI/ARB for left ventricular systolic dysfunction; (3) AMI-5 Beta-blocker prescribed at discharge; (4) AMI-8a Timing of Receipt of Primary Percutaneous Coronary Intervention (PCI); (5) HF-2 Evaluation of left ventricular systolic function; (6) SCIP-Inf-1 Prophylactic antibiotic received within 1 hour prior to surgical incision; (7) SCIP-Inf-2 Prophylactic antibiotic selection for surgical patients; (8) SCIP-Inf-3 Prophylactic antibiotics discontinued within 24 hours after surgery end time (48 hours for cardiac surgery); (9) SCIP-Inf-6 Appropriate hair removal; (10) SCIP-Inf-9 Postoperative urinary catheter removal on postoperative day 1 or 2 with day of surgery being day zero; (11) SCIP-VTE-2 Surgery patients who received appropriate VTE prophylaxis within 24 hours pre/post-surgery; (12) SCIP-Cardiovascular-2 Surgery Patients on a Beta Blocker prior to arrival who received a Beta Blocker during the perioperative period; (13) PN-6 Appropriate initial antibiotic selection; (14) STK-2 Antithrombotic therapy for ischemic stroke; (15) STK-3 Anticoagulation therapy for Afib/flutter; (16) STK-5 Antithrombotic therapy by the end of hospital day 2; (17) STK-10 Assessed for rehab; (18) VTE-4 Patients receiving unfractionated Heparin with doses/labs monitored by protocol; and (19) one structural measure, Participation in a Systematic Database for Cardiac Surgery.

We note that some of these measures are being removed as chart-abstracted measures, but are being retained as electronic clinical quality measures. We refer readers to section IX.A.2.b. of the preamble of this final rule for further discussion.

The numbers included in our finalized policy more accurately reflect the burden associated with the Hospital IQR Program than the estimates provided in our proposal. In the FY 2014 IPPS/LTCH PPS final rule, we estimated that the burden for the FY 2016 payment determination was 1,775 hours annually per hospital and 5.86 million hours across approximately 3,300 hospitals participating in the Hospital IQR Program (78 FR 50956). These estimates (at 78 FR 50956 for chart-abstracted measures) were based on the projected numbers of records to be abstracted for VTE and stroke. Using actual data from the Hospital IQR Program's clinical data warehouse, we have since revised these estimates downward to 1,309 hours per hospital and 4.3 million hours across all hospitals.
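
For readers who wish to trace these totals, the short sketch below assumes only that the aggregate figures are the per-hospital hours multiplied by the approximately 3,300 participating hospitals; it uses the numbers quoted above.

```python
# Rough check of the aggregate chart-abstraction burden estimates quoted above.
hospitals = 3_300                    # approximate number of Hospital IQR Program hospitals
prior_hours_per_hospital = 1_775     # FY 2014 final rule estimate for FY 2016
revised_hours_per_hospital = 1_309   # revised estimate using clinical data warehouse data

print(hospitals * prior_hours_per_hospital)    # 5,857,500 -> approximately 5.86 million hours
print(hospitals * revised_hours_per_hospital)  # 4,319,700 -> approximately 4.3 million hours
```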

We believe that there will be a reduction in burden for hospitals due to 14 of the 19 chart-abstracted measures that we are removing: (1) AMI-8a Timing of Receipt of Primary Percutaneous Coronary Intervention (PCI); (2) HF-2 Evaluation of left ventricular systolic function; (3) SCIP-Inf-1 Prophylactic antibiotic received within 1 hour prior to surgical incision; (4) SCIP-Inf-2 Prophylactic antibiotic selection for surgical patients; (5) SCIP-Inf-3 Prophylactic antibiotics discontinued within 24 hours after surgery end time (48 hours for cardiac surgery); (6) SCIP-Inf-9 Postoperative urinary catheter removal on postoperative day 1 or 2 with day of surgery being day zero; (7) SCIP-VTE-2 Surgery patients who received appropriate VTE prophylaxis within 24 hours pre/post-surgery; (8) SCIP-Cardiovascular-2 Surgery Patients on a Beta Blocker prior to arrival who received a Beta Blocker during the perioperative period; (9) PN-6 Appropriate initial antibiotic selection; (10) STK-2 Antithrombotic therapy for ischemic stroke; (11) STK-3 Anticoagulation therapy for Afib/flutter; (12) STK-5 Antithrombotic therapy by the end of hospital day 2; (13) STK-10 Assessed for rehab; and (14) VTE-4 Patients receiving unfractionated Heparin with doses/labs monitored by protocol.

The remaining four chart-abstracted measures that we are removing have been previously suspended from the program; therefore, their removal will not impact the reporting burden. The structural measure we are removing, Participation in a Systematic Database for Cardiac Surgery (NQF #0113), has an estimated burden of nearly zero hours; therefore, its removal will not result in a significant burden reduction.

Therefore, for the FY 2017 payment determination, accounting for both the addition of one chart-abstracted measure, severe sepsis and septic shock: Management bundle (NQF #0500), and our removal of 19 measures (both chart-abstracted and structural), we estimate a net reduction in burden of 160 hours annually per hospital. We estimate the total reduction in burden for chart abstraction and structural measures for the approximately 3,300 Hospital IQR Program-participating hospitals to be 0.5 million hours (please note the stated number appears to be off by 0.1 due to rounding).
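
Under the same assumption that the total is simply the per-hospital net reduction multiplied by approximately 3,300 hospitals, the 0.5 million-hour figure can be checked as sketched below.

```python
# Rough check of the total FY 2017 burden reduction quoted above.
hospitals = 3_300
net_reduction_per_hospital = 160  # hours per year

total_reduction_hours = hospitals * net_reduction_per_hospital
print(total_reduction_hours)                         # 528,000 hours
print(round(total_reduction_hours / 1_000_000, 1))   # 0.5 (million hours, as rounded in the text)
```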

In addition, we intend to enroll up to 100 hospitals in a voluntary large-scale test of validation for electronic clinical quality measures for the Hospital IQR Program. We estimate a total burden of 16 hours for each participating hospital. We intend to reimburse hospitals $26 per hour for up to 16 hours for their participation in this test. Details regarding this reimbursement rate are as follows:

  • The labor performed can be accomplished by medical records and health information technology staff, with a mean hourly wage in general medical and surgical hospitals of $19.24.

For the FY 2017 payment determination, we also are encouraging hospitals to voluntarily submit up to 16 measures electronically for the Hospital IQR Program in a manner that would permit eligible hospitals to partially align Hospital IQR Program requirements with some requirements under the Medicare EHR Incentive Program. We estimate that the total burden associated with the electronic clinical quality measure reporting option will be similar to the burden outlined for hospitals in the EHR Incentive Program Stage 2 final rule (77 FR 53968 through 54162). As described above for participation in the test of validation for electronic clinical quality measures in the Hospital IQR Program, we believe an individual with commensurate skills will submit electronic clinical quality measures on behalf of the hospital at a rate of approximately $26.00 per hour. Therefore, we believe it will cost a hospital approximately $277.33 ($26.00 x 10 hours and 40 minutes) to report 16 electronic clinical quality measures. Additional information about the chart abstraction burden is detailed in section II.K. of Appendix A to this final rule.
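
The $277.33 figure follows directly from the stated hourly rate and reporting time; the short sketch below simply reproduces that arithmetic.

```python
# Cost of electronically reporting 16 clinical quality measures, per the figures above.
hourly_rate = 26.00
reporting_hours = 10 + 40 / 60   # 10 hours and 40 minutes

print(round(hourly_rate * reporting_hours, 2))  # 277.33
```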

Previously, we required hospitals to provide 96 patient charts for validation per hospital per year, including 36 charts for HAI validation (averaging 1,500 pages per chart) and 60 charts for clinical process of care measure validation (averaging 300 pages per chart), for a total of 72,000 pages per hospital per year. For the FY 2017 payment determination and subsequent years, we are reducing this requirement to 72 charts per hospital per year, including 40 charts for HAI validation and 36 charts for clinical process of care validation, for a total of 70,800 pages per hospital per year, a decrease of 1,200 pages per hospital per year. We reimburse hospitals at 12 cents per photocopied page (68 FR 67956 and 70 FR 23667). Therefore, the reduction in burden is $144 per hospital for up to 600 hospitals.
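
The page and dollar figures above can be reproduced from the per-category chart counts and average pages per chart as stated; the sketch below assumes those per-chart page lengths apply uniformly.

```python
# Photocopying page counts and reimbursement, using the parameters stated above.
HAI_PAGES_PER_CHART = 1_500
CLINICAL_PAGES_PER_CHART = 300
REIMBURSEMENT_PER_PAGE = 0.12  # dollars per photocopied page (68 FR 67956 and 70 FR 23667)

prior_pages = 36 * HAI_PAGES_PER_CHART + 60 * CLINICAL_PAGES_PER_CHART  # 72,000 pages
new_pages = 40 * HAI_PAGES_PER_CHART + 36 * CLINICAL_PAGES_PER_CHART    # 70,800 pages

page_reduction = prior_pages - new_pages
print(page_reduction)                           # 1,200 pages per hospital per year
print(page_reduction * REIMBURSEMENT_PER_PAGE)  # 144.0 dollars per hospital
```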

To support validation of four HAI measures for the FY 2017 payment determination and subsequent years, we estimate an annual burden of 43,200 hours. This estimate is based on up to 600 hospitals completing HAI Templates averaging 18 hours per quarter over 4 quarters. This burden is 10,800 hours more than the 32,400 hours finalized for the FY 2016 payment determination in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50822 through 50825), because the HAI measures are to be validated for 4 quarters instead of 3 quarters. However, this change for the FY 2017 payment determination was previously finalized (78 FR 50822 through 50825).
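
The 43,200-hour estimate and the 10,800-hour increase follow from the stated parameters (up to 600 hospitals, an average of 18 hours per quarter, and a move from 3 to 4 validated quarters), as illustrated below.

```python
# HAI Template validation hours, per the parameters stated above.
hospitals = 600          # up to 600 hospitals selected for validation
hours_per_quarter = 18   # average hours to complete the HAI Templates each quarter

fy2016_hours = hospitals * hours_per_quarter * 3  # 3 validated quarters -> 32,400 hours
fy2017_hours = hospitals * hours_per_quarter * 4  # 4 validated quarters -> 43,200 hours

print(fy2017_hours)                 # 43200
print(fy2017_hours - fy2016_hours)  # 10800 additional hours
```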

Using the estimates above, we estimate an overall reduction in burden from the FY 2016 estimate. We anticipate the reduction in total burden to be 160 hours per hospital, or 0.5 million hours across all hospitals, for the FY 2017 payment determination as compared to FY 2016, for reporting chart-abstracted and structural measures, completing forms, reviewing reports, and submitting validation templates. This burden estimate includes new, readopted, and previously finalized measures. The estimate excludes the burden associated with the NHSN and HCAHPS measures, both of which are submitted under separate information collection requests and are approved under separate OMB control numbers.

The table below describes the hospital burden associated with the previously finalized Hospital IQR Program requirements and shows how that burden changed based upon the policies finalized for the FY 2017 payment determination. The numbers included in our finalized policy more accurately reflect the burden associated with the Hospital IQR Program than the estimates provided in our proposal. The burden estimates in this final rule are the estimates for which we are requesting OMB approval.

Burden Impact of Hospital IQR Program Requirements

Hospital IQR Program requirement | Number of hospitals impacted | Burden per hospital for previously finalized requirements | Burden per hospital for all requirements finalized in this rule (continuing, removed, added) | Net change in burden per hospital
Chart-abstracted and structural measures, forms | 3,300 | 1,291 hours | 1,131 hours | −160 hours
Review reports for claims-based measures | 3,300 | 4 hours | 4 hours | 0
Reporting of voluntary electronic clinical quality measures in place of chart-abstracted measures | Unknown * | −385 hours | −425 hours | −40 hours
Validation templates | Up to 600 ** | 72 hours | 72 hours | 0
Electronic clinical quality measure validation test | Up to 100 ** | 0 | 16 hours | 16 hours
Validation charts photocopying | Up to 600 | $8,640 | $8,496 | −$144
* This number is unknown at the time this table was prepared because final submission deadlines have not passed. Because the burden associated with participation is negative, we conservatively assumed the number of participating hospitals to be 0 in summary calculations included in the narrative.
** Maximum numbers were used in summary calculations included in the narrative.

7. ICRs for PPS-Exempt Cancer Hospital Quality Reporting (PCHQR) Program

As discussed in section IX.B. of the preamble of the proposed rule and this final rule, section 1866(k)(1) of the Act requires, for purposes of FY 2014 and each subsequent fiscal year, that a hospital described in section 1886(d)(1)(B)(v) of the Act (a PPS-exempt cancer hospital, or a PCH) submit data in accordance with section 1866(k)(2) of the Act with respect to such fiscal year.

In this final rule, we are finalizing our proposal to adopt one new clinical effectiveness measure (External Beam Radiotherapy for Bone Metastases) for the FY 2017 program and subsequent years, which will bring the FY 2017 PCHQR Program measure set to a total of 19 measures.

We also are finalizing an update to the specifications for the five previously finalized clinical process/oncology care measures to require PCHs to report all-patient data for each of these measures, and adopting a new sampling methodology that PCHs can use to report these measures, as well as the External Beam Radiotherapy measure.

We believe that requiring PCHs to report the new External Beam Radiotherapy for Bone Metastases measure, as well as to use the sampling methodology, will not be burdensome. At least seven PCHs are currently reporting quality measure data (including population and sampling data for HCAHPS measures) on a voluntary basis to CMS. PCHs may also have experience submitting quality and population/sample size data to other entities, such as State survey agencies and The Joint Commission. As a result, we believe that the new reporting requirements we are adopting will not significantly impact PCHs.

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50957 through 50959), we included burden estimates for the FY 2015 and FY 2016 programs. We noted in that final rule that those estimates represented a worst case scenario of estimated burden. We are providing a revised burden estimate for FY 2016 and a burden estimate for FY 2017 that take into account our finalized sampling methodologies for all applicable measures. The anticipated revised burden on PCHs for the FY 2016 program and the anticipated new burden on PCHs for the FY 2017 program consist of the following: New measure training and measure maintenance, and the time required for collection, aggregation, and submission of data for all measures.

We estimate that 11 PCHs will submit quality measure data on approximately 37,596 cancer cases annually beginning with FY 2016 and FY 2017. In addition, we estimate that PCHs will spend 0.5 hours on chart abstraction and data submission per case/event, 0.5 hours on training per each new measure, 0.25 hours on measure maintenance per each existing measure, and a maximum of 5 hours summarizing and reporting population and sample size counts for the six SCIP measures and five oncology care measures.

FY 2011 CMS MedPAR file based on Medicare data alone.

We are reducing the burden estimates for the HCAHPS Survey, the six SCIP measures, and the five clinical process/oncology care measures in this final rule to take into consideration the sampling that PCHs may use for these measures. As a result, we estimate that the reporting burden on each PCH for the FY 2016 program will be 18,758 hours. We estimate that the reporting burden on each PCH for FY 2017 would increase by 50 hours because PCHs will be required to report an additional quality measure (External Beam Radiotherapy for Bone Metastases). Therefore, we estimate the overall burden for all of the FY 2017 PCHQR Program requirements to be 18,808 hours per PCH. This FY 2017 estimate, which includes an additional finalized measure, represents a decrease of 33,122 hours per PCH from the FY 2016 burden estimate of 51,930 hours that we published in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50957 through 50959), or an overall decrease of 64 percent in the number of hours for each PCH. Coupled with our estimated salary costs, this revised estimate results in a net reduction in estimated cost of $472,362 per PCH. We believe that this burden estimate more accurately captures the hour and cost impact on PCHs participating in the PCHQR Program and reflects our efforts to minimize the burden impact through the proposed adoption of a new sampling methodology that PCHs can use to report the clinical process/oncology care measures.

This figure represents the difference between the previous burden estimate (51,930 hours) in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50958) and the current burden estimate (18,808 hours).

We are now estimating an hourly salary of $33 (http://swz.salary.com/salarywizard/Staff-Nurse-RN-Hourly-Salary-Details.aspxper). After accounting for employee benefits and overhead, this results in a total cost of $66 per labor hour.
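
The hour and percentage reductions, and the loaded labor rate, can be checked from the figures quoted above; the sketch below assumes only that benefits and overhead double the $33 hourly salary, as stated.

```python
# PCHQR burden change and labor-rate figures quoted above.
fy2016_hours_estimate = 51_930  # published in the FY 2014 IPPS/LTCH PPS final rule
fy2017_hours_estimate = 18_808  # estimate in this final rule

reduction = fy2016_hours_estimate - fy2017_hours_estimate
print(reduction)                                       # 33,122 hours
print(round(100 * reduction / fy2016_hours_estimate))  # 64 (percent decrease)

hourly_salary = 33
print(hourly_salary * 2)  # 66 dollars per labor hour after benefits and overhead
```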

However, we note that these estimates are based on PCH reporting of Medicare data only. We intend to update the burden estimate to more accurately reflect the burden on PCHs for reporting all-patient data in future years.

Comment: One commenter supported CMS' efforts to reduce the reporting burden of the PCHQR Program but raised concern about the variation in estimated burden between the Hospital IQR Program and the PCHQR Program, and the possibility that the large variation in PCH patient volume may leave some PCHs with a greater burden than is estimated on average. The commenter also noted that the burden estimates provided in the FY 2015 IPPS/LTCH PPS proposed rule do not consider the need for PCHs to build a reporting infrastructure, report non-Medicare data, or make efforts to ensure consistent application of measure specifications across PCHs.

Response: We thank the commenter for their support and will consider this feedback for future years. We incorporated a sampling approach for non-Medicare patients, abstraction, training, computer edits, and labor hours in our burden estimates. We also note that we will revise our estimates to account for the burden associated with reporting patient level data for the six SCIP measures in future years, once we have data on which submission option PCHs select for SCIP data submission. Finally, in response to the commenter's concern that our burden estimates do not account for ensuring consistent application of measure specifications across PCHs, we note that it is our role to ensure that PCHs report each measure consistent with the measure specifications and, therefore, this task does not affect PCH burden.

We will submit a revision of the information collection request currently approved under OMB 0938-1175 to account for the aforementioned changes to the PCHQR Program.

8. ICRs for the Hospital Value-Based Purchasing (VBP) Program

In section IV.I. of the preamble of the proposed rule and of this final rule, we discuss requirements for the Hospital VBP Program. Specifically, in this final rule, we are adopting three new measures for the FY 2017 Hospital VBP Program: (1) Methicillin-Resistant Staphylococcus aureus (MRSA) Bacteremia; (2) Clostridium difficile; and (3) PC-01: Elective Delivery Prior to 39 Completed Weeks Gestation. The first two measures are measures of healthcare-associated infections reported via the CDC's National Healthcare Safety Network, while the last measure is a chart-abstracted measure.

We also are adopting Hospital-level Risk-Standardized Complication Rate (RSCR) Following Elective Primary Total Hip Arthroplasty (THA) and Total Knee Arthroplasty (TKA) for the FY 2019 Hospital VBP Program.

As provided for in section 1886(o)(2)(A) of the Act, all of these additional measures are required for the Hospital IQR Program. Therefore, their inclusion in the Hospital VBP Program does not result in any additional burden because the Hospital VBP Program uses data that are required for the Hospital IQR Program.

9. ICRs for the Long-Term Care Hospital Quality Reporting (LTCHQR) Program

As discussed in sections IX.C.3. through IX.C.5. of the preamble of the proposed rule and of this final rule, for the LTCHQR Program, for the FY 2015 payment determination and subsequent years, we are retaining the following three quality measures: (1) National Healthcare Safety Network (NHSN) Catheter-Associated Urinary Tract Infections (CAUTI) Outcome Measure (NQF #0138); (2) National Healthcare Safety Network (NHSN) Central Line Catheter-Associated Blood Stream Infection Event (CLABSI) Outcome Measure (NQF #0139); and (3) Percent of Residents or Patients with Pressure Ulcers That Are New or Worsened (Short-Stay) (NQF #0678).

For the FY 2016 payment determination and subsequent years, we are retaining the following two measures in addition to the measures finalized for previous years: (1) Percent of Residents or Patients Who Were Assessed and Appropriately Given the Seasonal Influenza Vaccine (Short-Stay) (NQF #0680); and (2) Influenza Vaccination Coverage among Healthcare Personnel (NQF #0431).

For the FY 2017 payment determination and subsequent years, we are retaining the following three measures in addition to the measures finalized for previous years: (1) National Health Safety Network (NHSN) Facility-Wide Inpatient Hospital-Onset Methicillin-resistant Staphylococcus aureus (MRSA) Bacteremia Outcome Measure (NQF #1716); (2) National Health Safety Network (NHSN) Facility-Wide Inpatient Hospital-Onset Clostridium difficile Infection (CDI) Outcome Measure (NQF #1717); and (3) All-Cause Unplanned Readmission Measure for 30 Days Post-Discharge from Long-Term Care Hospitals.

For the FY 2018 payment determination and subsequent years, we are retaining the following measure in addition to the measures finalized for previous years: Application of Percent of Residents Experiencing One or More Falls with Major Injury (Long Stay) (NQF #0674).

As discussed in section IX.C.7. of the preamble of the proposed rule and this final rule, we are finalizing three new quality measures for inclusion in the LTCHQR Program for the FY 2018 payment determination and subsequent years: (1) Percent of Long-Term Care Hospital Patients with an Admission and Discharge Functional Assessment and a Care Plan That Addresses Function; (2) Functional Outcome Measure: Change in Mobility among Long-Term Care Hospital Patients Requiring Ventilator Support; and (3) National Healthcare Safety Network (NHSN) Ventilator-Associated Event (VAE) Outcome Measure.

Six of the previously adopted and newly finalized measures will be collected via the NHSN. The NHSN is a secure, Internet-based healthcare-associated infection (HAI) tracking system maintained and managed by the CDC. The NHSN enables health care facilities to collect and use data about HAIs, adherence to clinical practices known to prevent HAIs, and other adverse events within their organizations. NHSN data collection occurs via a Web-based tool hosted by the CDC and provided free of charge to facilities. We believe that any burden increase related to complying with the submission of the NHSN VAE Outcome Measure would be minimal because LTCHs have already completed the initial setup of the NHSN submission process and have become familiar with reporting data in the NHSN system due to the requirement to report the CAUTI and CLABSI measures. While this requirement is subject to the PRA, we believe that the associated burden is approved under OMB control number 0920-0666, for those measures previously finalized, with an expiration date of November 31, 2016.

The All-Cause Unplanned Readmission Measure for 30 Days Post-Discharge from Long-Term Care Hospitals is a Medicare claims-based measure. Because claims-based measures can be calculated based on data that are already reported to the Medicare program for payment purposes, we believe that this measure will not add any additional reporting burden for LTCHs.

The remaining five previously adopted and newly finalized measures will be collected utilizing the LTCH CARE Data Set. The LTCH CARE Data Set, in its current form, has been approved under OMB control number 0938-1163. Additions will need to be made to the LTCH CARE Data Set in order to allow for collection of the two functional status measures we are finalizing in section IX.C.7.a. of the preamble of this final rule: (1) Percent of Long-Term Care Hospital Patients with an Admission and Discharge Functional Assessment and a Care Plan That Addresses Function; and (2) Functional Outcome Measure: Change in Mobility among Long-Term Care Hospital Patients Requiring Ventilator Support. The revised data collection will be resubmitted to OMB for approval. While this requirement is subject to the PRA, we believe the associated burden is either approved under OMB control number 0938-1163, for those measures previously finalized, with an expiration date of June 30, 2016, or is contained in this updated information collection request section.

Assuring data accuracy is vital to public reporting programs. However, as discussed in section IX.C.11. of the preamble of this final rule, we are not finalizing our proposal, for the FY 2016 payment determination and subsequent years, to validate data submitted to CMS on the LTCH CARE Data Set at this time.

We discuss and respond to the public comments we received on these information collection requirements in section IX.C. of the preamble of this final rule.

10. Electronic Health Record (EHR) Incentive Program and Meaningful Use (MU)

In section IX.D. of the preamble of the proposed rule and of this final rule, we discuss our proposal to align the Medicare EHR Incentive Program reporting and submission timelines for clinical quality measures for eligible hospitals and CAHs with the Hospital IQR Program's reporting and submission timelines. In addition, we provide guidance and clarification of certain policies for reporting zero denominators on clinical quality measures and our policy on case threshold exemptions. Because these proposals for data collection would align with the reporting requirements in place for the Hospital IQR Program, we do not believe there is any additional burden for this collection of information.

11. ICR Regarding Revision of Regulations Governing Use and Release of Medicare Advantage (MA) Risk Adjustment Data (§ 422.310(f))

Medicare Advantage (MA) organizations are required to submit risk adjustment data to CMS under current authority at § 422.310(b) through (d). The changes we are finalizing regarding the use and release of MA risk adjustment data under section X. of the preamble of this final rule do not change the requirements on MA organizations for submission of information to CMS, which have been in place for several years. Therefore, these finalized changes do not impose new information collection requirements on MA organizations. Consequently, because there are no new information collection requirements in our proposal, the proposal does not require a review by OMB under the authority of the Paperwork Reduction Act of 1995.

C. Waiver of 60-Day Delay in the Effective Date

We ordinarily provide a 60-day delay in the effective date of the provisions of a rule in accordance with the Administrative Procedure Act (APA) (5 U.S.C. 553(d)), which requires a 30-day delayed effective date, and the Congressional Review Act (5 U.S.C. 801(a)(3)), which requires a 60-day delayed effective date for major rules. However, we can waive the delay in the effective date if the Secretary finds, for good cause, that the delay is impracticable, unnecessary, or contrary to the public interest, and incorporates a statement of the finding and the reasons in the rule issued. 5 U.S.C. 553(d)(3); 5 U.S.C. 808(2).

The Hospital Inpatient Prospective Payment Systems for Acute Care Hospitals and the Long-Term Care Hospital Prospective Payment System are fiscal year payment systems, and we typically issue the final rule by August 1 of each year both to comply with the requirement to annually review and update these payment systems and to ensure that the payment policies for these systems are effective, following the required 60-day delay in the effective date, on October 1, the first day of the fiscal year to which the policies are intended to apply. If the agency finds, for good cause, that a 60-day delay is impracticable, unnecessary, or contrary to the public interest, and the agency incorporates a statement of the findings and its reasons in the rule issued, the agency may specify an earlier effective date. The timeframes for developing annual rules are extremely compressed, and processing issues complicated this year's rule. We believe it would be contrary to the public interest to delay the effective date of the payment system portions of this rule. We therefore specify that those portions of the rule will be effective October 1, 2014.

List of Subjects

42 CFR Part 405

  • Administrative practice and procedure
  • Health facilities
  • Health professions
  • Kidney diseases
  • Medicare
  • Reporting and recordkeeping requirements
  • Rural areas
  • X-rays

42 CFR Part 412

  • Administrative practice and procedure
  • Health facilities
  • Medicare
  • Puerto Rico
  • Reporting and recordkeeping requirements

42 CFR Part 413

  • Health facilities
  • Kidney diseases
  • Medicare
  • Puerto Rico
  • Reporting and recordkeeping requirements

42 CFR Part 415

  • Health facilities
  • Health professions
  • Medicare
  • Reporting and recordkeeping requirements

42 CFR Part 422

  • Administrative practice and procedure
  • Health facilities
  • Health maintenance organizations (HMO)
  • Medicare
  • Penalties
  • Privacy
  • Reporting and recordkeeping requirements

42 CFR Part 424

  • Emergency medical services
  • Health facilities
  • Health professions
  • Medicare

42 CFR Part 485

  • Grant programs—health
  • Health facilities
  • Medicaid
  • Medicare
  • Reporting and recordkeeping requirements

42 CFR Part 488

  • Administrative practice and procedure
  • Health facilities
  • Medicare
  • Reporting and recordkeeping requirements

For the reasons stated in the preamble of this final rule, the Centers for Medicare & Medicaid Services is confirming, as final, interim rules published on October 3, 2013 (78 FR 61191) and March 18, 2014 (79 FR 15022) and is further amending 42 CFR Chapter IV as set forth below:

Title 42—Public Health

PART 405—FEDERAL HEALTH INSURANCE FOR THE AGED AND DISABLED

Subpart R—Provider Reimbursement Determinations and Appeals

1. The authority citation for Subpart R continues to read as follows:

Authority: Secs. 205, 1102, 1814(b), 1815(a), 1833, 1861(v), 1871, 1872, 1878, and 1886 of the Social Security Act (42 U.S.C. 405, 1302, 1395f(b), 1395g(a), 1395l, 1395x(v), 1395hh, 1395ii, 1395oo, and 1395ww).

2. Section 405.1811 is amended by—

a. Revising paragraphs (a) introductory text and (a)(3).

b. Revising paragraph (b) introductory text.

c. Redesignating paragraph (c) as paragraph (e).

d. Adding new paragraphs (c) and (d).

e. Revising newly redesignated paragraph (e).

The revisions and additions read as follows:

§ 405.1811
Right to contractor hearing; contents of, and adding issues to, hearing request.

(a) Right to hearing on final contractor determination. A provider (but no other individual, entity, or party) has a right to a contractor hearing, as a single provider appeal, for specific items claimed for a cost reporting period covered by a final contractor or Secretary determination if—

(3) Unless the provider qualifies for a good cause extension under § 405.1813, the date of receipt by the contractor of the provider's hearing request is no later than 180 days after the date of receipt by the provider of the final contractor or Secretary determination.

(b) Contents of request for a contractor hearing on final contractor determination. The provider's request for a contractor hearing under paragraph (a) of this section must be submitted in writing to the contractor, and the request must include the elements described in paragraphs (b)(1) through (b)(3) of this section. If the provider submits a hearing request that does not meet the requirements of paragraph (b)(1), (b)(2), or (b)(3) of this section, the contractor hearing officer may dismiss with prejudice the appeal or take any other remedial action he or she considers appropriate.

(c) Right to hearing based on untimely contractor determination. Notwithstanding the provisions of paragraph (a) of this section, a provider (but no other individual, entity, or party) has a right to a contractor hearing, as a single provider appeal, for a cost reporting period if—

(1) A final contractor determination for the provider's cost reporting period is not issued (through no fault of the provider) within 12 months after the date of receipt by the contractor of the provider's perfected cost report or amended cost report (as specified in § 413.24(f) of this chapter). The date of receipt by the contractor of the provider's perfected cost report or amended cost report is presumed to be the date the contractor stamped “Received” on such cost report unless it is shown by a preponderance of the evidence that the contractor received the cost report on an earlier date.

(2) Unless the provider qualifies for a good cause extension under § 405.1813, the date of receipt by the contractor of the provider's hearing request is no later than 180 days after the expiration of the 12 month period for issuance of the final contractor determination (as determined in accordance with paragraph (c)(1) of this section); and

(3) The amount in controversy (as determined in accordance with § 405.1839) is at least $1,000 but less than $10,000.

(d) Contents of request for a contractor hearing based on untimely contractor determination. The provider's request for a contractor hearing under paragraph (c) of this section must be submitted in writing to the contractor, and the request must include the elements described in paragraphs (d)(1) through (d)(3) of this section. If the provider submits a hearing request that does not meet the requirements of paragraph (d)(1), (d)(2), or (d)(3) of this section, the contractor hearing officer may dismiss with prejudice the appeal or take any other remedial action he or she considers appropriate.

(1) A demonstration that the provider satisfies the requirements for a contractor hearing as specified in paragraph (c) of this section.

(2) An explanation (for each specific item at issue) of the following:

(i) Why the provider believes Medicare payment is incorrect for each disputed item (or, where applicable, why the provider is unable to determine whether Medicare payment is correct because it does not have access to underlying information concerning the calculation of Medicare payment).

(ii) How and why the provider believes Medicare payment must be determined differently for each disputed item.

(iii) If the provider self-disallows a specific item, a description of the nature and amount of each self-disallowed item and the reimbursement or payment sought for the item.

(3) A copy of any documentary evidence the provider considers necessary to satisfy the hearing request requirements of paragraphs (d)(1) and (d)(2) of this section.

(e) Adding issues to the hearing request. After filing a hearing request in accordance with paragraphs (a) and (b), or paragraphs (c) and (d), of this section, a provider may add specific Medicare payment issues to the original hearing request by submitting a written request to the contractor hearing officer, only if—

(1) The request to add issues complies with the requirements of paragraphs (a) and (b), or paragraphs (c) and (d), of this section as to each new issue.

(2) The specific matters at issue raised in the initial hearing request and the matters identified in subsequent requests to add issues, when combined, satisfy the amount in controversy requirements of paragraph (a)(2) or paragraph (c)(3) of this section.

(3) The contractor hearing officer receives the provider's request to add issues no later than 60 days after the expiration of the applicable 180-day period prescribed in paragraph (a)(3) or paragraph (c)(2) of this section.

3. Section 405.1835 is amended by—

a. Revising paragraphs (a) introductory text and (a)(3).

b. Revising paragraph (b) introductory text.

c. Redesignating paragraph (c) as paragraph (e).

d. Adding new paragraphs (c) and (d).

e. Revising newly redesignated paragraph (e).

The revisions and additions read as follows:

§ 405.1835
Right to Board hearing; contents of, and adding issues to, hearing request.

(a) Right to hearing on final contractor determination. A provider (but no other individual, entity, or party) has a right to a Board hearing, as a single provider appeal, for specific items claimed for a cost reporting period covered by a final contractor or Secretary determination if—

(3) Unless the provider qualifies for a good cause extension under § 405.1836, the date of receipt by the Board of the provider's hearing request is no later than 180 days after the date of receipt by the provider of the final contractor or Secretary determination.

(b) Contents of request for a Board hearing on final contractor determination. The provider's request for a Board hearing under paragraph (a) of this section must be submitted in writing to the Board, and the request must include the elements described in paragraphs (b)(1) through (b)(4) of this section. If the provider submits a hearing request that does not meet the requirements of paragraph (b)(1), (b)(2), or (b)(3) of this section, the Board may dismiss with prejudice the appeal or take any other remedial action it considers appropriate.

(c) Right to hearing based on untimely contractor determination. Notwithstanding the provisions of paragraph (a) of this section, a provider (but no other individual, entity, or party) has a right to a Board hearing, as a single provider appeal, for specific items claimed for a cost reporting period if—

(1) A final contractor determination for the provider's cost reporting period is not issued (through no fault of the provider) within 12 months after the date of receipt by the contractor of the provider's perfected cost report or amended cost report (as specified in § 413.24(f) of this chapter). The date of receipt by the contractor of the provider's perfected cost report or amended cost report is presumed to be the date the contractor stamped “Received” on such cost report unless it is shown by a preponderance of the evidence that the contractor received the cost report on an earlier date.

(2) Unless the provider qualifies for a good cause extension under § 405.1836, the date of receipt by the Board of the provider's hearing request is no later than 180 days after the expiration of the 12 month period for issuance of the final contractor determination (as determined in accordance with paragraph (c)(1) of this section); and

(3) The amount in controversy (as determined in accordance with § 405.1839) is $10,000 or more.

(d) Contents of request for a Board hearing based on untimely contractor determination. The provider's request for a Board hearing under paragraph (c) of this section must be submitted in writing to the Board, and the request must include the elements described in paragraphs (d)(1) through (d)(4) of this section. If the provider submits a hearing request that does not meet the requirements of paragraph (d)(1), (d)(2), or (d)(3) of this section, the Board may dismiss with prejudice the appeal or take any other remedial action it considers appropriate.

(1) A demonstration that the provider satisfies the requirements for a Board hearing as specified in paragraph (c) of this section.

(2) An explanation (for each specific item at issue) of the following:

(i) Why the provider believes Medicare payment is incorrect for each disputed item (or, where applicable, why the provider is unable to determine whether Medicare payment is correct because it does not have access to underlying information concerning the calculation of Medicare payment).

(ii) How and why the provider believes Medicare payment must be determined differently for each disputed item.

(iii) If the provider self-disallows a specific item, a description of the nature and amount of each self-disallowed item and the reimbursement or payment sought for the item.

(3) A copy of any documentary evidence the provider considers necessary to satisfy the hearing request requirements of paragraphs (d)(1) and (d)(2) of this section.

(4) With respect to a provider under common ownership or control, the name and address of its parent corporation, and a statement that meets all of the requirements of paragraphs (b)(4)(i) and (b)(4)(ii) of this section.

(e) Adding issues to the hearing request. After filing a hearing request in accordance with paragraphs (a) and (b), or paragraphs (c) and (d), of this section, a provider may add specific Medicare payment issues to the original hearing request by submitting a written request to the Board only if—

(1) The request to add issues complies with the requirements of paragraphs (a) and (b), or paragraphs (c) and (d), of this section as to each new issue.

(2) The specific matters at issue raised in the initial hearing request and the matters identified in subsequent requests to add issues, when combined, satisfy the amount in controversy requirements of paragraph (a)(2) or paragraph (c)(3) of this section.

(3) The Board receives the provider's request to add issues no later than 60 days after the expiration of the applicable 180-day period prescribed in paragraph (a)(3) or paragraph (c)(2) of this section.
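For readers tracking the appeal thresholds above, the following is a minimal illustrative sketch, in Python, of how the amount-in-controversy routing (at least $1,000 but less than $10,000 for a contractor hearing; $10,000 or more for a Board hearing) and the 180-day filing window fit together. The function names and dates are hypothetical, and the sketch is not regulatory text.

```python
from datetime import date, timedelta

# Illustrative only: routes an appeal by amount in controversy and checks the
# 180-day filing window described in the provisions above. Thresholds follow
# the regulation text; the function names are hypothetical.
CONTRACTOR_MIN = 1_000    # contractor hearing: at least $1,000 but less than $10,000
BOARD_MIN = 10_000        # Board hearing: $10,000 or more

def hearing_forum(amount_in_controversy: float) -> str:
    """Return which hearing forum the amount in controversy supports."""
    if amount_in_controversy >= BOARD_MIN:
        return "Board hearing"
    if amount_in_controversy >= CONTRACTOR_MIN:
        return "contractor hearing"
    return "below the contractor hearing threshold"

def request_timely(determination_received: date, request_received: date,
                   good_cause_extension: bool = False) -> bool:
    """True if the hearing request was received within 180 days of the provider's
    receipt of the final determination (absent a good cause extension)."""
    if good_cause_extension:
        return True
    return request_received <= determination_received + timedelta(days=180)

print(hearing_forum(7_500))     # contractor hearing
print(hearing_forum(25_000))    # Board hearing
print(request_timely(date(2014, 10, 1), date(2015, 3, 15)))  # True (165 days elapsed)
```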

Nomenclature Changes

Subpart R [Amended]

4. Amend Subpart R by removing the term or phrase in the first column and replacing it with the term or phrase in the second column:

Remove                 Add
an intermediary        a contractor
intermediary           contractor
intermediaries'        contractors'
intermediary's         contractor's

Subpart X—Rural Health Clinic and Federally Qualified Health Center Services

5. The authority citation for Subpart X continues to read as follows:

Authority: Secs. 1102 and 1871 of the Social Security Act (42 U.S.C. 1302 and 1395hh).

6. Section 405.2468 is amended by revising paragraph (f)(1) to read as follows:

§ 405.2468
Allowable costs.

(f) * * *

(1) Effective for portions of cost reporting periods occurring on or after January 1, 1999, if an RHC or an FQHC incurs “all or substantially all” of the costs for the training program in the nonhospital setting as defined in § 413.75(b) of this chapter, the RHC or FQHC may receive direct graduate medical education payment for those residents. However, in connection with cost reporting periods for which “all or substantially all of the costs for the training program in the nonhospital setting” is not defined in § 413.75(b) of this chapter, if an RHC or an FQHC incurs the salaries and fringe benefits (including travel and lodging where applicable) of residents training at the RHC or FQHC, the RHC or FQHC may receive direct graduate medical education payments for those residents.

PART 412—PROSPECTIVE PAYMENT SYSTEMS FOR INPATIENT HOSPITAL SERVICES

7. The authority citation for Part 412 is revised to read as follows:

Authority: Secs. 1102 and 1871 of the Social Security Act (42 U.S.C. 1302 and 1395hh), sec. 124 of Pub. L. 106-113 (113 Stat. 1501A-332), sec. 1206 of Pub. L. 113-67, and sec. 112 of Pub. L. 113-93.

8. Section 412.23 is amended by—

a. Revising paragraphs (e)(6)(i), (e)(6)(ii) introductory text, and (e)(6)(ii)(B)(2).

b. Revising paragraphs (e)(7)(i) and (e)(7)(ii) introductory text.

c. Adding new paragraph (e)(7)(iii).

The revisions and additions read as follows:

§ 412.23
Excluded hospitals: Classifications.

(e) * * *

(6) * * *

(i) General rule. Except as specified in paragraphs (e)(6)(ii) and (e)(6)(iii) of this section for the period beginning December 29, 2007 and ending December 28, 2012, and the period beginning April 1, 2014 and ending September 30, 2017, a moratorium applies to the establishment and classification of a long-term care hospital as described in paragraphs (e) and (e)(1) through (e)(5) of this section or a long-term care hospital satellite facility as described in § 412.22(h).

(ii) Exception. The moratorium specified in paragraph (e)(6)(i) of this section is not applicable to the establishment and classification of a long-term care hospital that meets the requirements of paragraphs (e) and (e)(1) through (e)(5) of this section, or a long-term care hospital satellite facility that meets the requirements of § 412.22(h), if the long-term care hospital or long-term care satellite facility meets the following criteria on or before December 29, 2007, or prior to April 1, 2014, as applicable:

(B) * * *

(2)(i) Has expended prior to December 29, 2007, at least 10 percent (or, if less, $2.5 million) of the estimated cost of the project specified in paragraph (e)(6)(ii)(B)(1) of this section; or

(ii) Has expended, before April 1, 2014, at least 10 percent (or, if less, $2.5 million) of the estimated cost of the project specified in paragraph (e)(6)(ii)(B)(1) of this section.

(7) * * *

(i) For purposes of this paragraph, an existing long-term care hospital or long-term care hospital satellite facility means a long-term care hospital that meets the requirements of paragraph (e) of this section or a long-term care hospital satellite facility that meets the requirements of § 412.22(h) that received payment under the provisions of subpart O of this part prior to the dates noted in the following moratorium clauses.

(ii) December 29, 2007 through December 28, 2012—

(iii) April 1, 2014 through September 30, 2017—The number of Medicare-certified beds in an existing long-term care hospital or an existing long-term care hospital satellite facility must not be increased beyond the number of Medicare-certified beds prior to April 1, 2014.

9. Section 412.64 is amended by—

a. Removing paragraph (b)(1)(ii)(D).

b. Revising paragraph (b)(3)(i).

c. Revising paragraphs (d)(1), (d)(2)(i) introductory text, (d)(2)(ii), and (d)(3) introductory text.

d. In paragraphs (h)(4) introductory text and (h)(4)(vi), removing the date “October 1, 2014” and adding in its place the date “October 1, 2015”.

The revisions read as follows:

§ 412.64
Federal rates for inpatient operating costs for Federal fiscal year 2005 and subsequent fiscal years.

(b) * * *

(3)(i) For discharges occurring on or after October 1, 2004, a hospital that is located in a rural county adjacent to one or more urban areas is deemed to be located in an urban area and receives the Federal payment amount for the urban area to which the greater number of workers in the county commute if the rural county would otherwise be considered part of an urban area, under the standards for designating MSAs if the commuting rates used in determining outlying counties were determined on the basis of the aggregate number of resident workers who commute to (and, if applicable under the standards, from) the central county or central counties of all adjacent MSAs. Qualifying counties are determined based upon OMB standards, using the most recent OMB standards for delineating statistical areas adopted by CMS.

(d) * * *

(1) The applicable percentage change for updating the standardized amount for all hospitals in all areas is—

(i) For fiscal year 2005 through fiscal year 2009, the percentage increase in the market basket index (as defined in § 413.40(a)(3) of this chapter) for prospective payment hospitals, subject to the provisions of paragraph (d)(2) of this section.

(ii) For fiscal year 2010, for discharges—

(A) On or after October 1, 2009 and before April 1, 2010, the percentage increase in the market basket index (as defined in § 413.40(a)(3) of this chapter) for prospective payment hospitals, subject to the provisions of paragraph (d)(2) of this section; and

(B) On or after April 1, 2010 and before October 1, 2010, the percentage increase in the market basket index (as defined in § 413.40(a)(3) of this chapter) for prospective payment hospitals, subject to the provisions of paragraph (d)(2) of this section, less 0.25 percentage point.

(iii) For fiscal year 2011, the percentage increase in the market basket index (as defined in § 413.40(a)(3) of this chapter) for prospective payment hospitals, subject to the provisions of paragraph (d)(2) of this section, less 0.25 percentage point.

(iv) For fiscal years 2012 and 2013, the percentage increase in the market basket index (as defined in § 413.40(a)(3) of this chapter) for prospective payment hospitals, subject to the provisions of paragraph (d)(2) of this section, less a multifactor productivity adjustment (as determined by CMS) and less 0.1 percentage point.

(v) For fiscal year 2014, the percentage increase in the market basket index (as defined in § 413.40(a)(3) of this chapter) for prospective payment hospitals, subject to the provisions of paragraph (d)(2) of this section, less a multifactor productivity adjustment (as determined by CMS) and less 0.3 percentage point.

(vi) For fiscal year 2015, the percentage increase in the market basket index (as defined in § 413.40(a)(3) of this chapter) for prospective payment hospitals, subject to the provisions of paragraphs (d)(2) and (d)(3) of this section, less a multifactor productivity adjustment (as determined by CMS) and less 0.2 percentage point.

(2)(i) In the case of a “subsection (d) hospital,” as defined under section 1886(d)(1)(B) of the Act, that does not submit quality data on a quarterly basis to CMS, in the form and manner specified by CMS, the percentage increase in the market basket index (as defined in § 413.40(a)(3) of this chapter) for prospective payment hospitals is reduced—

(ii) Any reduction pursuant to this paragraph (d)(2) will apply only to the fiscal year involved and will not be taken into account in computing the applicable percentage change for a subsequent fiscal year.

(3) Beginning fiscal year 2015, in the case of a “subsection (d) hospital,” as defined under section 1886(d)(1)(B) of the Act, that is not a meaningful electronic health record (EHR) user as defined in Part 495 of this chapter for the applicable EHR reporting period and does not receive an exception, three-fourths of the percentage increase in the market basket index (as defined in § 413.40(a)(3) of this chapter) for prospective payment hospitals is reduced—
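As a rough illustration of the arithmetic in § 412.64(d)(1)(vi), the sketch below (Python; the function name and numeric inputs are hypothetical) computes the FY 2015 applicable percentage change as the market basket increase, less the multifactor productivity adjustment, less 0.2 percentage point. Any further reductions under paragraphs (d)(2) (quality reporting) and (d)(3) (meaningful EHR use) are treated as a separate input rather than hard-coded, since their amounts are not set out in the text above. This is not regulatory text.

```python
# Illustrative arithmetic only. Computes the FY 2015 applicable percentage
# change described in § 412.64(d)(1)(vi): market basket increase, less a
# multifactor productivity (MFP) adjustment, less 0.2 percentage point.
# Reductions under (d)(2) or (d)(3) are passed in as a separate input here.
def fy2015_update(market_basket_pct: float,
                  mfp_adjustment_pct: float,
                  additional_reduction_pct: float = 0.0) -> float:
    """Applicable percentage change, in percentage points."""
    return market_basket_pct - mfp_adjustment_pct - 0.2 - additional_reduction_pct

# Example with placeholder figures for illustration:
print(round(fy2015_update(market_basket_pct=2.9, mfp_adjustment_pct=0.5), 1))  # 2.2
```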

10. Section 412.101 is amended by revising paragraphs (b)(2)(i), (b)(2)(ii), (c)(1), (c)(2) introductory text, and (d) to read as follows:

§ 412.101
Special treatment: Inpatient hospital payment adjustment for low-volume hospitals.

(b) * * *

(2) * * *

(i) For FY 2005 through FY 2010 and the portion of FY 2015 beginning on April 1, 2015, and subsequent fiscal years, a hospital must have fewer than 200 total discharges, which includes Medicare and non-Medicare discharges, during the fiscal year, based on the hospital's most recently submitted cost report, and be located more than 25 road miles (as defined in paragraph (a) of this section) from the nearest “subsection (d)” (section 1886(d) of the Act) hospital.

(ii) For FY 2011 through FY 2014, and the portion of FY 2015 before April 1, 2015, a hospital must have fewer than 1,600 Medicare discharges, as defined in paragraph (a) of this section, during the fiscal year, based on the hospital's Medicare discharges from the most recently available MedPAR data as determined by CMS, and be located more than 15 road miles, as defined in paragraph (a) of this section, from the nearest “subsection (d)” (section 1886(d) of the Act) hospital.

(c) * * *

(1) For FY 2005 through FY 2010 and the portion of FY 2015 beginning on April 1, 2015 and subsequent fiscal years, the adjustment is an additional 25 percent for each Medicare discharge.

(2) For FY 2011 through FY 2014 and the portion of FY 2015 before April 1, 2015, the adjustment is as follows:

(d) Eligibility of new hospitals for the adjustment. For FYs 2005 through 2010 and the portion of FY 2015 beginning on April 1, 2015, and subsequent fiscal years, a new hospital will be eligible for a low-volume adjustment under this section once it has submitted a cost report for a cost reporting period that indicates that it meets discharge requirements during the applicable fiscal year and has provided its fiscal intermediary or Medicare administrative contractor with sufficient evidence that it meets the distance requirement, as specified under paragraph (b)(2) of this section.
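The low-volume criteria and add-on in revised § 412.101 can be sketched as follows for the periods that use the 25 percent adjustment (FY 2005 through FY 2010 and the portion of FY 2015 beginning April 1, 2015, and subsequent years). This is an illustrative sketch only; the sliding-scale adjustment for FY 2011 through the early portion of FY 2015 is not reproduced because its schedule is not set out in the text above, and the function names and figures are hypothetical.

```python
# Minimal sketch of the § 412.101 low-volume criteria and add-on for the
# periods that use the 25 percent adjustment. Names and figures are illustrative.
def qualifies_25_percent(total_discharges: int, road_miles_to_nearest_ipps: float) -> bool:
    """Fewer than 200 total (Medicare plus non-Medicare) discharges and more
    than 25 road miles from the nearest subsection (d) hospital."""
    return total_discharges < 200 and road_miles_to_nearest_ipps > 25

def low_volume_payment(base_medicare_payment: float, qualifies: bool) -> float:
    """Apply the additional 25 percent per-discharge adjustment if the hospital qualifies."""
    return base_medicare_payment * 1.25 if qualifies else base_medicare_payment

print(low_volume_payment(10_000.00, qualifies_25_percent(150, 30.0)))  # 12500.0
```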

11. Section 412.102 is revised to read as follows:

§ 412.102
Special treatment: Hospitals located in areas that are changing from urban to rural as a result of a geographic redesignation.

An urban hospital that was part of an MSA, but was redesignated as rural as a result of the most recent OMB standards for delineating statistical areas adopted by CMS, may receive an adjustment to its rural Federal payment amount for operating costs for 2 successive fiscal years as provided in paragraphs (a) and (b) of this section.

(a) First year adjustment. (1) Effective on or after October 1, 1983 and before October 1, 2014, the hospital's rural average standardized amount and disproportionate share payments as described in § 412.106 are adjusted on the basis of an additional amount that equals two-thirds of the difference between the urban standardized amount and disproportionate share payments applicable to the hospital before its geographic redesignation and the rural standardized amount and disproportionate share payments otherwise applicable to the Federal fiscal year for which the adjustment is made.

(2) Effective on or after October 1, 2014, the hospital's rural disproportionate share payments as described in § 412.106 are adjusted on the basis of an additional amount that equals two-thirds of the difference between the disproportionate share payments as an urban hospital applicable to the hospital before its geographic redesignation to a rural area as a result of implementation of the most recent OMB standards for delineating statistical areas adopted by CMS and the rural disproportionate share payment otherwise applicable to the Federal fiscal year for which the adjustment is made.

(b) Second year adjustment. (1) Effective on or after October 1, 1983 and before October 1, 2014, if a hospital's status continues to be rural as a result of geographic redesignation, its rural average standardized amount and disproportionate share payments are adjusted on the basis of an additional amount that equals one-third of the difference between the urban standardized amount and disproportionate share payments applicable to the hospital before its redesignation and the rural standardized amounts and disproportionate share payments otherwise applicable to the Federal fiscal year for which the adjustment is made.

(2) Effective on or after October 1, 2014, if a hospital's status continues to be rural as a result of geographic redesignation, its disproportionate share payments are adjusted on the basis of an additional amount that equals one-third of the difference between the disproportionate share payments applicable to the hospital before its geographic redesignation to a rural area as a result of implementation of the most recent OMB standards for delineating statistical areas adopted by CMS and the rural disproportionate share payments otherwise applicable to the Federal fiscal year for which the adjustment is made.
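The two-year transition in revised § 412.102 steps down the add-on a hospital receives after an urban-to-rural redesignation. A minimal worked sketch of the disproportionate share arithmetic that applies on or after October 1, 2014 follows (Python; dollar figures hypothetical; not regulatory text).

```python
# Illustrative arithmetic for the two-year transition in § 412.102 as it
# applies on or after October 1, 2014 (disproportionate share payments only).
def transition_dsh(urban_dsh: float, rural_dsh: float, transition_year: int) -> float:
    """Rural DSH payment plus two-thirds (year 1) or one-third (year 2) of the
    urban-to-rural difference; no add-on after the second year."""
    fraction = {1: 2 / 3, 2: 1 / 3}.get(transition_year, 0.0)
    return rural_dsh + fraction * (urban_dsh - rural_dsh)

print(round(transition_dsh(urban_dsh=900_000, rural_dsh=600_000, transition_year=1)))  # 800000
print(round(transition_dsh(urban_dsh=900_000, rural_dsh=600_000, transition_year=2)))  # 700000
```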

12. Section 412.103 is amended by adding a new paragraph (a)(6) to read as follows:

§ 412.103
Special treatment: Hospitals located in urban areas and that apply for reclassification as rural.

(a) * * *

(6) For any period on or after October 1, 2014, a CAH in a county that was not in an urban area as defined by the Office of Management and Budget (OMB), but was included in an urban area as a result of the most recent OMB standards for delineating statistical areas adopted by CMS and the most recent Census Bureau data, may be reclassified as being located in a rural area for purposes of meeting the rural location requirement at § 485.610(b) of this chapter for a period of 2 years, beginning with the date of the implementation of the new labor market area delineations, if it meets any of the requirements under paragraph (a)(1), (a)(2), or (a)(3) of this section.

13. Section 412.105 is amended by revising paragraphs (a)(1)(ii), (f)(1)(iv)(D), and (f)(1)(v), to read as follows:

§ 412.105
Special treatment: Hospitals that incur indirect costs for graduate medical education programs.

(a) * * *

(1) * * *

(ii)(A) For new programs started prior to October 1, 2012, the exception for new programs described in paragraph (f)(1)(vii) of this section applies to each new program individually for which the full-time equivalent cap may be adjusted based on the period of years equal to the minimum accredited length of each new program.

(B) For new programs started on or after October 1, 2012, the exception for new programs described in paragraph (f)(1)(vii) of this section applies to each new program individually during the cost reporting periods prior to the beginning of the applicable hospital's cost reporting period that coincides with or follows the start of the sixth program year of the first new program started, for hospitals for which the full-time equivalent cap may be adjusted in accordance with § 413.79(e)(1) of this chapter, and prior to the beginning of the applicable hospital's cost reporting period that coincides with or follows the start of the sixth program year of each individual new program started, for hospitals for which the full-time equivalent cap may be adjusted in accordance with § 413.79(e)(3) of this chapter.

(f) * * *

(1) * * *

(iv) * * *

(D) A rural hospital redesignated as urban after September 30, 2004, as a result of the most recent census data and implementation of the new labor market area definitions announced by OMB on June 6, 2003, may retain the increases to its full-time equivalent resident cap that it received under paragraphs (f)(1)(iv)(A) and (f)(1)(vii) of this section while it was located in a rural area. Effective October 1, 2014, if a rural hospital is redesignated as urban due to the most recent OMB standards for delineating statistical areas adopted by CMS, the redesignated urban hospital may retain any existing increases to its FTE resident cap that it had received prior to when the redesignation became effective. Effective October 1, 2014, if a rural hospital is redesignated as urban due to the most recent OMB standards for delineating statistical areas adopted by CMS, the redesignated urban hospital may receive an increase to its FTE resident cap for a new program, in accordance with paragraph (e) of this section, if it received a letter of accreditation for the new program and/or started training residents in the new program prior to the redesignation becoming effective.

(v)(A) For a hospital's cost reporting periods beginning on or after October 1, 1997, and before October 1, 1998, the total number of full-time equivalent residents for payment purposes is equal to the average of the actual full-time equivalent resident counts (subject to the requirements listed in paragraphs (f)(1)(ii)(C) and (f)(1)(iv) of this section) for that cost reporting period and the preceding cost reporting period.

(B) For a hospital's cost reporting periods beginning on or after October 1, 1998, the total number of full-time equivalent residents for payment purposes is equal to the average of the actual full-time equivalent resident count (subject to the requirements set forth in paragraphs (f)(1)(ii)(C) and (f)(1)(iv) of this section) for that cost reporting period and the preceding two cost reporting periods.

(C) For new programs started prior to October 1, 2012, if a hospital qualified for an adjustment to the limit established under paragraph (f)(1)(iv) of this section for new medical residency programs created under paragraph (f)(1)(vii) of this section, the count of residents participating in new medical residency training programs above the number included in the hospital's full-time equivalent count for the cost reporting period ending during calendar year 1996 is added after applying the averaging rules in paragraph (f)(1)(v)(B) of this section for a period of years. Residents participating in new medical residency training programs are included in the hospital's full-time equivalent count before applying the averaging rules after the period of years has expired. For purposes of this paragraph, for each new program started, the period of years equals the minimum accredited length for each new program. The period of years for each new program begins when the first resident begins training in each new program.

(D) For new programs started on or after October 1, 2012, for hospitals for which the full-time equivalent cap may be adjusted in accordance with § 413.79(e) of this chapter, full-time equivalent residents participating in new medical residency training programs are excluded from the hospital's full-time equivalent count before applying the averaging rules during the cost reporting periods prior to the beginning of the applicable hospital's cost reporting period that coincides with or follows the start of the sixth program year of the first new program started, for hospitals for which the full-time equivalent cap may be adjusted in accordance with § 413.79(e)(1) of this chapter, and prior to the beginning of the applicable hospital's cost reporting period that coincides with or follows the start of the sixth program year of each individual new program started, for hospitals for which the full-time equivalent cap may be adjusted in accordance with § 413.79(e)(3) of this chapter. Beginning with the applicable hospital's cost reporting period that coincides with or follows the start of the sixth program year of the first new program started for hospitals for which the full-time equivalent cap may be adjusted in accordance with § 413.79(e)(1) of this chapter, and beginning with the applicable hospital's cost reporting period that coincides with or follows the start of the sixth program year of each individual new program started for hospitals for which the full-time equivalent cap may be adjusted in accordance with § 413.79(e)(3) of this chapter, full-time equivalent residents participating in new medical residency training programs are included in the hospital's full-time equivalent count before applying the averaging rules in paragraph (f)(1)(v)(B) of this section.

(E) Subject to the provisions of paragraph (f)(1)(ix) of this section, full-time equivalent residents that are displaced by the closure of either another hospital or another hospital's program are added to the full-time equivalent count after applying the averaging rules in paragraph (f)(1)(v)(B) of this section for the receiving hospital for the duration of time that the displaced residents are training at the receiving hospital.

(F) Subject to the provisions of paragraph (f)(1)(x) of this section, effective for cost reporting periods beginning on or after April 1, 2000, full-time equivalent residents at an urban hospital in a rural track program are included in the urban hospital's rolling average calculation described in paragraph (f)(1)(v)(B) of this section.
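The rolling average in paragraph (f)(1)(v)(B), with certain residents added back after the averaging, can be sketched as follows. This is illustrative only; the counts and function name are hypothetical, and it is not regulatory text.

```python
# Minimal sketch of the three-year rolling average in § 412.105(f)(1)(v)(B):
# the FTE count used for payment is the average of the current and two
# preceding cost reporting periods' actual counts, with certain residents
# (for example, those in qualifying new programs during the cap-building
# window, or residents displaced by a closure) added after the averaging.
def rolling_average_fte(current: float, prior1: float, prior2: float,
                        added_after_averaging: float = 0.0) -> float:
    return (current + prior1 + prior2) / 3 + added_after_averaging

# A hospital with actual counts of 45, 42, and 39 FTEs, plus 3 new-program
# FTEs excluded from the average and added afterward:
print(rolling_average_fte(45, 42, 39, added_after_averaging=3.0))  # 45.0
```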

14. Section 412.106 is amended by revising paragraph (g)(1)(iii)(C) to read as follows:

§ 412.106
Special treatment: Hospitals that serve a disproportionate share of low-income patients.

(g) * * *

(1) * * *

(iii) * * *

(C) For fiscal year 2014 and for fiscal year 2015, CMS will base its estimates of the amount of hospital uncompensated care on the most recent available data on utilization for Medicaid and Medicare SSI patients, as determined by CMS in accordance with paragraphs (b)(2)(i) and (b)(4) of this section.

§ 412.108
[Amended]

15. In § 412.108, paragraph (a)(1) introductory text and paragraph (c)(2)(iii) introductory text, remove the date “April 1, 2014” and add in its place the date “April 1, 2015”.

16. Section 412.140 is amended by revising paragraph (c)(2) to read as follows:

§ 412.140
Participation, data submission, and validation requirements under the Hospital Inpatient Quality Reporting (IQR) Program.

(c) * * *

(2) Exception. Upon request by a hospital, CMS may grant an extension or exemption of one or more data submission deadlines in the event of extraordinary circumstances beyond the control of the hospital. Specific requirements for submission of a request for an extension or exemption are available on QualityNet.org.

17. Section 412.152 is amended by revising the definition of “Applicable hospital” to read as follows:

§ 412.152
Definitions for the Hospital Readmissions Reduction Program.

Applicable hospital is a hospital described in section 1886(d)(1)(B) of the Act.

§ 412.154
[Amended]

18. Section 412.154 is amended by removing and reserving paragraph (d).

19. Section 412.160 is amended by revising the definitions of “Base operating DRG payment amount” and “Performance standards” to read as follows:

§ 412.160
Definitions for the Hospital Value-Based Purchasing (VBP) Program.

Base operating DRG payment amount means the following:

(1) With respect to a subsection (d) hospital (as defined in section 1886(d)(1)(B) of the Act), the wage-adjusted DRG operating payment plus any applicable new technology add-on payments under subpart F of this part. This amount is determined without regard to any payment adjustments under the Hospital Readmissions Reduction Program, as specified under § 412.154. This amount does not include any additional payments for indirect medical education under § 412.105, the treatment of a disproportionate share of low-income patients under § 412.106, outliers under subpart F of this part, or a low volume of discharges under § 412.101.

(2) With respect to a Medicare-dependent, small rural hospital that receives payments under § 412.108(c) or a sole community hospital that receives payments under § 412.92(d), the wage-adjusted DRG operating payment plus any applicable new technology add-on payments under subpart F of this part. This amount does not include any additional payments for indirect medical education under § 412.105, the treatment of a disproportionate share of low-income patients under § 412.106, outliers under subpart F of this part, or a low volume of discharges under § 412.101. With respect to a Medicare-dependent, small rural hospital that receives payments under § 412.108(c) (for discharges occurring in FY 2013) or a sole community hospital that receives payments under § 412.92(d), this amount also does not include the difference between the hospital-specific payment rate and the Federal payment rate determined under subpart D of this part.

Performance standards are the levels of performance that hospitals must meet or exceed in order to earn points under the Hospital VBP Program, and are calculated with respect to a measure for a fiscal year no later than 60 days prior to the start of the performance period for that measure for that fiscal year. The performance standards for a measure may be updated as follows:

(1) To make a single correction to correct a calculation error, data issue, or other problem that would significantly change the performance standards; or

(2) To incorporate nonsubstantive technical updates made to the measure between the time that CMS first displays the performance standards for that measure for a fiscal year and the time that CMS calculates hospital performance on that measure at the conclusion of the performance period for that measure for a fiscal year.
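For orientation, the composition of the "base operating DRG payment amount" defined above can be sketched as follows (Python; dollar values hypothetical). IME, DSH, outlier, and low-volume amounts are deliberately absent from the calculation because the definition excludes them, and any Hospital Readmissions Reduction Program adjustment is disregarded. This is an illustration, not regulatory text.

```python
# Illustrative composition of the base operating DRG payment amount: the
# wage-adjusted DRG operating payment plus any new technology add-on,
# computed without IME, DSH, outlier, or low-volume amounts and without
# regard to any Readmissions Reduction Program adjustment.
def base_operating_drg_amount(wage_adjusted_drg_payment: float,
                              new_tech_add_on: float = 0.0) -> float:
    return wage_adjusted_drg_payment + new_tech_add_on

# IME, DSH, outlier, and low-volume payments are intentionally not inputs;
# they are excluded from this amount by definition.
print(base_operating_drg_amount(8_200.00, new_tech_add_on=500.00))  # 8700.0
```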

20. Section 412.161 is revised to read as follows:

§ 412.161
Applicability of the Hospital Value-Based Purchasing (VBP) Program.

The Hospital VBP Program applies to hospitals, as that term is defined in § 412.160.

§ 412.172
[Amended]

21. Section 412.172 is amended by removing and reserving paragraph (c).

22. Section 412.232 is amended by revising paragraph (b)(2) to read as follows:

§ 412.232
Criteria for all hospitals in a rural county seeking urban redesignation.

(b) * * *

(2) For fiscal years beginning with FY 2005, the group of hospitals must demonstrate that the county in which the hospitals are located meets the standards for redesignation to an MSA as an outlying county using the most recent OMB standards for delineating statistical areas adopted by CMS and the most recent Census Bureau data.

23. Section 412.234 is amended by revising paragraph (a)(3)(iv) to read as follows:

§ 412.234
Criteria for all hospitals in an urban county seeking redesignation to another urban area.

(a) * * *

(3) * * *

(iv) For Federal fiscal year 2008 and thereafter, hospitals located in counties that are in the same Combined Statistical Area (CSA) or Core-Based Statistical Area (CBSA) (under the most recent OMB standards for delineating statistical areas adopted by CMS and the most recent Census Bureau data) as the urban area to which they seek redesignation qualify as meeting the proximity requirement for reclassification to the urban area to which they seek redesignation.

24. Section 412.500 is amended by adding paragraphs (a)(4), (a)(5), and (a)(6) to read as follows:

§ 412.500
Basis and scope of subpart.

(a) * * *

(4) Section 4302(a) of Public Law 111-5, which amended sections 114(c) and (d) of Public Law 110-173 relating to several moratoria on the establishment of new long-term care hospitals and satellite facilities and on the increase in the number of beds in existing long-term care hospitals and satellite facilities under the long-term care hospital prospective payment system.

(5) Sections 3106(a) and 10312(a) of Public Law 111-148, which extended certain payment rules and moratoria under the long-term care hospital prospective payment system by further amending sections 114(c) and (d) of Public Law 110-173.

(6) Section 1206 of Public Law 113-67, which further extended certain payment rules and moratoria under the long-term care hospital prospective payment system by amending sections 114(c) and (d) of Public Law 110-173, and which:

(i) Added a new section 1886(m)(6) to the Act to establish a site neutral payment amount for long-term care hospital discharges that fail to meet the applicable criteria in cost reporting periods beginning on or after October 1, 2015; and

(ii) Requires the Secretary's review of the payment rates and regulations governing long-term care hospitals established under section 1886(d)(1)(B)(iv)(II) of the Act and application of payment adjustments based on that review.

25. Section 412.521 is amended by revising paragraph (a)(2) to read as follows:

§ 412.521
Basis for payment.

(a) * * *

(2) Except as provided for in § 412.526, the amount of payment under the prospective payment system is based on the Federal payment rate established in accordance with § 412.523, including adjustments described in § 412.525, and, if applicable during a transition period, on a blend of the Federal payment rate and the cost-based reimbursement rate described in § 412.533.

26. Section 412.523 is amended by adding a new paragraph (c)(3)(xi) to read as follows:

§ 412.523
Methodology for calculating the Federal prospective payment rates.

(c) * * *

(3) * * *

(xi) For long-term care hospital prospective payment system fiscal year beginning October 1, 2014, and ending September 30, 2015. The standard Federal rate for the long-term care hospital prospective payment system beginning October 1, 2014, and ending September 30, 2015, is the standard Federal rate for the previous long-term care hospital prospective payment system fiscal year updated by 2.2 percent, and further adjusted, as appropriate, as described in paragraph (d) of this section.

§ 412.525
[Amended]

27. Section 412.525 is amended by removing and reserving paragraph (d)(3).

28. A new § 412.526 is added to read as follows:

§ 412.526
Payment provisions for a “subclause (II)” long-term care hospital.

(a) Definition. A “subclause (II)” long-term care hospital is a hospital that qualifies as an LTCH under section 1886(d)(1)(B)(iv)(II) of the Act.

(b) Method of payment. (1) For cost reporting periods beginning on or after October 1, 2003 and before October 1, 2014, payment to a “subclause (II)” long-term care hospital is made under the prospective payment system specified in § 412.1(a)(4) and Subpart O of this part.

(2) For cost reporting periods beginning on or after October 1, 2014, payment to a “subclause (II)” long-term care hospital is made under the prospective payment system specified in § 412.1(a)(4) and under Subpart O of this part, as adjusted. The adjusted payment amount is determined based on reasonable cost, as described at § 412.526(c).

(c) Determining the adjusted payment for Medicare inpatient operating and capital-related costs under the reasonable cost-based reimbursement rules. Medicare inpatient operating costs are paid based on reasonable cost, subject to a ceiling. The ceiling is the aggregate upper limit on the amount of a hospital's net Medicare inpatient operating costs that the program will recognize for payment purposes, as determined under paragraph (c)(1) of this section.

(1) Ceiling. For each cost reporting period, the ceiling is determined by multiplying the updated target amount, as defined in paragraph (c)(2) of this section, for that period by the number of Medicare discharges paid under this subpart during that period.

(2) Target amounts. (i) For cost reporting periods beginning during Federal fiscal year 2015, the target amount equals the hospital's target amount determined under § 413.40(c)(4) for its cost reporting period beginning during Federal fiscal year 2000, updated by the applicable annual rate-of-increase percentages specified in § 413.40(c)(3) to the subject period.

(ii) For subsequent cost reporting periods, the target amount equals the hospital's target amount for the previous cost reporting period updated by the applicable annual rate-of-increase percentage specified in § 413.40(c)(3) for the subject cost reporting period.

(3) Payment for inpatient operating costs. For cost reporting periods subject to this section, the hospital's Medicare allowable net inpatient operating costs for that period (as defined at § 413.40(a)(3)) are paid on a reasonable cost basis, subject to that hospital's ceiling (as determined under paragraph (c)(1) of this section) for that period.

(4) Payment for inpatient capital-related costs. Medicare allowable net inpatient capital costs are paid on a reasonable cost basis, in accordance with the regulations under Part 413 of this chapter.

(5) Adjustments for extraordinary circumstances—(i) General rules. (A) CMS may adjust the ceiling determined under paragraph (c)(1) of this section for one or more cost reporting periods when unusual inpatient operating costs have resulted in the hospital exceeding its ceiling imposed under this section due to extraordinary circumstances beyond the hospital's control. These circumstances include, but are not limited to, strikes, fire, earthquakes, floods, or similar unusual occurrences with substantial cost effects.

(B) When the hospital requests an adjustment, CMS makes an adjustment only to the extent that the hospital's operating costs are reasonable, attributable to the circumstances specified, separately identified by the hospital, and verified by the Medicare administrative contractor.

(ii) Process for adjustment requests. The provisions of §§ 413.40(e)(1) through (e)(5) of this subchapter are applicable to extraordinary circumstances adjustment requests under this section.
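A minimal sketch of the ceiling mechanics in § 412.526(c), assuming hypothetical figures: the ceiling is the updated target amount multiplied by the number of Medicare discharges for the period, and Medicare allowable net inpatient operating costs are paid on a reasonable cost basis subject to that ceiling. The Python below is illustrative only; function names and dollar amounts are not from the rule.

```python
# Minimal sketch of the reasonable-cost payment with a ceiling described in
# § 412.526(c) for "subclause (II)" LTCHs. All figures are hypothetical.
def ceiling(updated_target_amount: float, medicare_discharges: int) -> float:
    """Ceiling = updated target amount times Medicare discharges for the period."""
    return updated_target_amount * medicare_discharges

def operating_payment(net_inpatient_operating_costs: float,
                      updated_target_amount: float,
                      medicare_discharges: int) -> float:
    """Reasonable cost, subject to the ceiling."""
    return min(net_inpatient_operating_costs,
               ceiling(updated_target_amount, medicare_discharges))

print(operating_payment(3_750_000.00, 30_000.00, 120))  # 3600000.0 (capped at the ceiling)
print(operating_payment(3_400_000.00, 30_000.00, 120))  # 3400000.0 (below the ceiling)
```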

§ 412.532
[Removed]

29. Section 412.532 is removed.

30. Section 412.534 is amended by—

a. Revising paragraphs (c)(1) introductory text and (c)(1)(i).

b. Removing the year “2013” and adding in its place the year “2016” in paragraphs (c)(1) and (c)(2) paragraph headings.

c. Revising paragraph (c)(3).

d. Removing the year “2013” and adding in its place the year “2016” in paragraphs (d)(1) paragraph heading, (d)(1)(i), and (d)(2) paragraph heading.

e. Revising paragraph (d)(3).

f. Removing the year “2013” and adding in its place the year “2016” in paragraphs (e)(1) paragraph heading, (e)(1)(i), and (e)(2) paragraph heading.

g. Revising paragraph (e)(3).

h. Revising paragraphs (h) introductory text, (h)(4), and (h)(5).

i. Removing paragraph (h)(6).

The revisions read as follows:

§ 412.534
Special payment provisions for long-term care hospitals within hospitals and satellites of long-term care hospitals.

(c) * * *

(1) For cost reporting periods beginning on or after October 1, 2004 and before October 1, 2007 and for cost reporting periods beginning on or after October 1, 2016. (i) Except as provided in paragraphs (c)(3), (g), and (h) of this section, for any cost reporting period beginning on or after October 1, 2004 and before October 1, 2007, and for cost reporting periods beginning on or after October 1, 2016 in which the long-term care hospital or its satellite facility has a discharged Medicare inpatient population of whom no more than 25 percent were admitted to the hospital or its satellite facility from the co-located hospital, payments are made under the rules at §§ 412.500 through 412.541 with no adjustment under this section.

(3) For a long-term care hospital satellite facility described in § 412.22(h)(3)(i), for cost reporting periods beginning on or after July 1, 2007 and before July 1, 2016, payments will be determined using the methodology specified in paragraph (c)(1) of this section, except that the applicable percentage threshold for Medicare discharges is 50 percent.

(d) * * *

(3) For cost reporting periods beginning on or after July 1, 2007 and before July 1, 2016, payment for a long-term care hospital satellite facility described in § 412.22(h)(3)(i) will be determined using the methodology specified in paragraph (c)(1) of this section, except that the applicable percentage threshold for Medicare discharges is 75 percent.

(e) * * *

(3) For cost reporting periods beginning on or after July 1, 2007 and before July 1, 2016, payments for a long-term care hospital satellite facility described in § 412.22(h)(3)(i) will be determined using the methodology specified in paragraph (c)(1) of this section, except that the applicable percentage threshold for Medicare discharges is 75 percent.

(h) Effective date of policies in this section for certain co-located long-term care hospitals and satellite facilities of long-term care hospitals. Except as specified in paragraph (h)(4) of this section, the policies set forth in this paragraph (h) apply to Medicare patient discharges that were admitted from a hospital located in the same building or on the same campus as a long-term care hospital described in § 412.23(e)(2)(i) that meets the criteria in § 412.22(f) and a satellite facility of a long-term care hospital as described under § 412.22(h)(3)(i) for discharges occurring in cost reporting periods beginning on or after July 1, 2007.

(4) For a long-term care hospital described in § 412.23(e)(2)(i) that meets the criteria in § 412.22(f), the policies set forth in this paragraph (h) and in § 412.536 do not apply for discharges occurring in cost reporting periods beginning on or after July 1, 2007.

(5) For a long-term care hospital or a satellite facility that, as of December 29, 2007, was co-located with an entity that is a provider-based, off-campus location of a subsection (d) hospital which did not provide services payable under section 1886(d) of the Act at the off-campus location, the policies set forth in this paragraph (h) and in § 412.536 do not apply for discharges occurring in cost reporting periods beginning on or after July 1, 2007 and before July 1, 2016.
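The percentage-threshold test that drives § 412.534 can be sketched as follows (Python; counts hypothetical): for the periods described above, no adjustment applies where no more than the applicable percentage of the long-term care hospital's or satellite facility's discharged Medicare inpatients were admitted from the co-located hospital (25 percent generally; 50 or 75 percent for certain satellite facilities). This is an illustration, not regulatory text.

```python
# Illustrative check of the co-located LTCH admission threshold in § 412.534.
def within_threshold(medicare_discharges_total: int,
                     admitted_from_colocated: int,
                     threshold_pct: float = 25.0) -> bool:
    """True if the share of Medicare discharges admitted from the co-located
    hospital does not exceed the applicable percentage threshold."""
    share = 100.0 * admitted_from_colocated / medicare_discharges_total
    return share <= threshold_pct

print(within_threshold(400, 90))                       # True  (22.5% <= 25%)
print(within_threshold(400, 120))                      # False (30% > 25%)
print(within_threshold(400, 180, threshold_pct=50.0))  # True  (45% <= 50%)
```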

31. Section 412.536 is amended by—

a. Removing and reserving paragraph (a)(1)(iii).

b. Revising paragraph (a)(2) introductory text.

c. Removing and reserving paragraph (a)(2)(ii).

d. Removing paragraph (a)(3).

The revisions read as follows:

§ 412.536
Special payment provisions for long-term care hospitals and satellites of long-term care hospitals that discharged Medicare patients admitted from a hospital not located in the same building or on the same campus as the long-term care hospital or satellite of the long-term care hospital.

(a) * * *

(1) * * *

(iii) [Reserved].

(2) For cost reporting periods beginning on or after July 1, 2007 and before July 1, 2016, the policies set forth in this section are not applicable to discharges from:

(ii) [Reserved].

PART 413—PRINCIPLES OF REASONABLE COST REIMBURSEMENT; PAYMENT FOR END-STAGE RENAL DISEASE SERVICES; OPTIONAL PROSPECTIVELY DETERMINED PAYMENT RATES FOR SKILLED NURSING FACILITIES

32. The authority citation for Part 413 continues to read as follows:

Authority: Secs. 1102, 1861(v)(1)(A), and 1871 of the Social Security Act (42 U.S.C. 1302, 1395x(v)(1)(A), and 1395hh).

Nomenclature Changes

PART 413—[AMENDED]

33. Amend Part 413 by removing the term or phrase in the first column and replacing it with the term or phrase in the second column:

Remove                   Add
an intermediary's        a contractor's
fiscal intermediary      contractor
fiscal intermediary's    contractor's
intermediary             contractor
intermediaries           contractors
intermediary's           contractor's

34. Section 413.75(b)(5) is amended by revising the definition of “Rural track FTE limitation” to read as follows:

§ 413.75
Direct GME payments: General requirements.

(b) * * *

(5) * * *

Rural track FTE limitation means the maximum number of residents (as specified in § 413.79(k)) training in a rural track residency program that an urban hospital may include in its FTE count and that is in addition to the number of FTE residents already included in the hospital's FTE cap.

35. Section 413.78 is amended by revising paragraph (g)(6) to read as follows:

§ 413.78
Direct GME payment: Determination of the total number of FTE residents.

(g) * * *

(6) The provisions of paragraphs (g)(1)(ii), (g)(2), (g)(3), and (g)(5) of this section shall not be applied in a manner that requires reopening of any settled cost reports as to which there is not a jurisdictionally proper appeal pending as of March 23, 2010, on direct GME or IME payments. Cost reporting periods beginning before July 1, 2010 are not governed by paragraph (g) of this section.

36. Section 413.79 is amended by revising paragraphs (c)(6), (d)(5), and (k)(7), to read as follows:

§ 413.79
Direct GME payments: Determination of the weighted number of FTE residents.

(c) * * *

(6) FTE resident caps for rural hospitals that are redesignated as urban. A rural hospital redesignated as urban after September 30, 2004, as a result of the most recent census data and implementation of the new MSA definitions announced by OMB on June 6, 2003, may retain the increases to its FTE resident cap that it received under paragraphs (c)(2)(i), (e)(1)(iii), and (e)(3) of this section while it was located in a rural area. Effective October 1, 2014, if a rural hospital is redesignated as urban due to the most recent OMB standards for delineating statistical areas adopted by CMS, the redesignated urban hospital may retain any existing increases to its FTE resident cap that it had received prior to when the redesignation became effective. Effective October 1, 2014, if a rural hospital is redesignated as urban due to the most recent OMB standards for delineating statistical areas adopted by CMS, the redesignated urban hospital may receive an increase to its FTE resident cap for a new program, in accordance with paragraph (e) of this section, if it received a letter of accreditation for the new program and/or started training residents in the new program prior to the redesignation becoming effective.

(d) * * *

(5)(i) For new programs started prior to October 1, 2012, if a hospital qualifies for an adjustment to the limit established under paragraph (c)(2) of this section for new medical residency programs created under paragraph (e) of this section, the count of the residents participating in new medical residency training programs above the number included in the hospital's FTE count for the cost reporting period ending during calendar year 1996 is added after applying the averaging rules in this paragraph (d), for a period of years. Residents participating in new medical residency training programs are included in the hospital's FTE count before applying the averaging rules after the period of years has expired. For purposes of this paragraph (d), for each new program started, the period of years equals the minimum accredited length for each new program. The period of years begins when the first resident begins training in each new program.

(ii) For new programs started on or after October 1, 2012, for hospitals for which the FTE cap may be adjusted in accordance with § 413.79(e), FTE residents participating in new medical residency training programs are excluded from the hospital's FTE count before applying the averaging rules during the cost reporting periods prior to the beginning of the applicable hospital's cost reporting period that coincides with or follows the start of the sixth program year of the first new program started, for hospitals for which the FTE cap may be adjusted in accordance with § 413.79(e)(1), and prior to the beginning of the applicable hospital's cost reporting period that coincides with or follows the start of the sixth program year of each individual new program started, for hospitals for which the FTE cap may be adjusted in accordance with § 413.79(e)(3). Beginning with the applicable hospital's cost reporting period that coincides with or follows the start of the sixth program year of the first new program started for hospitals for which the FTE cap may be adjusted in accordance with § 413.79(e)(1), and beginning with the applicable hospital's cost reporting period that coincides with or follows the start of the sixth program year of each individual new program started for hospitals for which the FTE cap may be adjusted in accordance with § 413.79(e)(3), FTE residents participating in new medical residency training programs are included in the hospital's FTE count before applying the averaging rules.

(k) * * *

(7)(i) Effective prior to October 1, 2014, if an urban hospital had established a rural track training program under the provisions of this paragraph (k) with a hospital located in a rural area and that rural area subsequently becomes an urban area due to the most recent census data and implementation of the new labor market area definitions announced by OMB on June 6, 2003, the urban hospital may continue to adjust its FTE resident limit in accordance with this paragraph (k) for the rural track programs established prior to the adoption of such new labor market area definitions. In order to receive an adjustment to its FTE resident cap for a new rural track residency program, the urban hospital must establish a rural track program with hospitals that are designated rural based on the most recent geographical location delineations adopted by CMS.

(ii) Effective October 1, 2014, if an urban hospital started a rural track training program under the provisions of this paragraph (k) with a hospital located in a rural area and, during the 3-year period that is used to calculate the urban hospital's rural track FTE limit, that rural area subsequently becomes an urban area due to the most recent OMB standards for delineating statistical areas adopted by CMS and the most recent Census Bureau data, the urban hospital may continue to adjust its FTE resident limit in accordance with this paragraph (k) and subject to paragraph (k)(7)(iii) of this section for the rural track programs started prior to the adoption of such new OMB standards for delineating statistical areas.

(iii) Effective October 1, 2014, if an urban hospital started a rural track training program under the provisions of this paragraph (k) with a hospital located in a rural area and that rural area subsequently becomes an urban area due to the most recent OMB standards for delineating statistical areas adopted by CMS and the most recent Census Bureau data, regardless of whether the redesignation of the rural hospital occurs during the 3-year period that is used to calculate the urban hospital's rural track FTE limit, or after the 3-year period used to calculate the urban hospital's rural track FTE limit, the urban hospital may continue to adjust its FTE resident limit in accordance with this paragraph (k) based on the rural track programs started prior to the change in the hospital's geographic designation. In order for the urban hospital to receive or use the adjustment to its FTE resident cap for training FTE residents in the rural track residency program that was started prior to the most recent OMB standards for delineating statistical areas adopted by CMS, one of the following two conditions must be met by the end of a period that begins when the most recent OMB standards for delineating statistical areas are adopted by CMS and continues through the end of the second residency training year following the date the most recent OMB delineations are adopted by CMS: the hospital that has been redesignated from rural to urban must reclassify as rural under § 412.103 of this chapter, for purposes of IME only; or the urban hospital must find a new site that is geographically rural consistent with the most recent geographical location delineations adopted by CMS. In order to receive an adjustment to its FTE resident cap for an additional new rural track residency program, the urban hospital must participate in a rural track program with sites that are geographically rural based on the most recent geographical location delineations adopted by CMS.

PART 415—SERVICES FURNISHED BY PHYSICIANS IN PROVIDERS, SUPERVISING PHYSICIANS IN TEACHING SETTINGS, AND RESIDENTS IN CERTAIN SETTINGS

37. The authority citation for Part 415 continues to read as follows:

Authority: Secs. 1102 and 1871 of the Social Security Act (42 U.S.C. 1302 and 1395hh), and sec. 124 of Pub. L. 106-113 (113 Stat. 1501A-332).

38. Section 415.70 is amended by revising paragraph (b) to read as follows:

§ 415.70
Limits on compensation for physician services in providers.

(b) Methodology for establishing limits. (1) For cost reporting periods beginning before January 1, 2015. CMS establishes a methodology for determining annual reasonable compensation equivalency limits and, to the extent possible, considers average physician incomes by specialty and type of location using the best available data.

(2) For cost reporting periods beginning on or after January 1, 2015. CMS establishes a methodology for determining annual reasonable compensation equivalency limits and, to the extent possible, considers average physician incomes by specialty using the best available data.

PART 422—MEDICARE ADVANTAGE PROGRAM

39. The authority citation for Part 422 continues to read as follows:

Authority: Secs. 1102 and 1871 of the Social Security Act (42 U.S.C. 1302 and 1395hh).

40. Section 422.300 is revised to read as follows:

§ 422.300
Basis and scope.

This subpart is based on sections 1106, 1128J(d), 1853, 1854, and 1858 of the Act. It sets forth the rules for making payments to Medicare Advantage (MA) organizations offering local and regional MA plans, including calculation of MA capitation rates and benchmarks, conditions under which payment is based on plan bids, adjustments to capitation rates (including risk adjustment), collection of risk adjustment data, conditions for use and disclosure of risk adjustment data, and other payment rules. See § 422.458 in subpart J for rules on risk sharing payments to MA regional organizations.

41. Section 422.310 is amended by revising paragraph (f) to read as follows:

§ 422.310
Risk adjustment data.

(f) Use and release of data.

(1) CMS use of data. CMS may use the data described in paragraphs (a) through (d) of this section for the following purposes:

(i) To determine the risk adjustment factors used to adjust payments, as required under §§ 422.304(a) and (c);

(ii) To update risk adjustment models;

(iii) To calculate Medicare DSH percentages;

(iv) To conduct quality review and improvement activities;

(v) For Medicare coverage purposes;

(vi) To conduct evaluations and other analysis to support the Medicare program (including demonstrations) and to support public health initiatives and other health care-related research;

(vii) For activities to support the administration of the Medicare program;

(viii) For activities conducted to support program integrity; and

(ix) For purposes authorized by other applicable laws.

(2) CMS release of data. Regarding data described in paragraphs (a) through (d) of this section, CMS may release the minimum data it determines is necessary for one or more of the purposes listed in paragraph (f)(1) of this section to other HHS agencies, other Federal executive branch agencies, States, and external entities in accordance with the following:

(i) Applicable Federal laws;

(ii) CMS data sharing procedures;

(iii) Subject to the protection of beneficiary identifier elements and beneficiary confidentiality, including—

(A) A prohibition against public disclosure of beneficiary identifying information;

(B) Release of beneficiary identifying information to other HHS agencies, other Federal executive branch agencies, and States only when such information is needed; and

(C) Release of beneficiary identifying information to external entities only to the extent needed to link datasets.

(iv) Subject to the aggregation of dollar amounts reported for the associated encounter to protect commercially sensitive data.

(v) Risk adjustment data other than data described in paragraphs (f)(2)(iii) and (f)(2)(iv) of this section will be released without the redaction or aggregation described in paragraphs (f)(2)(iii) and (f)(2)(iv) of this section, respectively.

(3) Risk adjustment data will not become available for release under this paragraph (f) unless—

(i) The risk adjustment reconciliation for the applicable payment year has been completed;

(ii) CMS determines that data release is necessary under paragraph (f)(1)(vi) of this section for emergency preparedness purposes before reconciliation; or

(iii) CMS determines that extraordinary circumstances exist to release the data before reconciliation.

PART 424—CONDITIONS FOR MEDICARE PAYMENT

42. The authority citation for Part 424 continues to read as follows:

Authority: Secs. 1102 and 1871 of the Social Security Act (42 U.S.C. 1302 and 1395hh).

43. Section 424.11 is amended by revising paragraph (d)(5) to read as follows:

§ 424.11
General procedures.

(d) * * *

(5) For all inpatient hospital services, including inpatient psychiatric facility services, a delayed certification may not extend past discharge.

44. Section 424.15 is amended by revising paragraph (b) to read as follows:

§ 424.15
Requirements for inpatient CAH services.

(b) Certification begins with the order for inpatient admission. All certification requirements must be completed, signed, and documented in the medical record no later than 1 day before the date on which the claim for payment for the inpatient CAH service is submitted.

PART 485—CONDITIONS OF PARTICIPATION: SPECIALIZED PROVIDERS

45. The authority citation for Part 485 continues to read as follows:

Authority: Secs. 1102 and 1871 of the Social Security Act (42 U.S.C. 1302 and 1395hh).

46. Section 485.610 is amended by revising paragraph (b) introductory text and adding a new paragraph (b)(5) to read as follows:

§ 485.610
Conditions of participation: Status and location.

(b) Standard: Location in a rural area or treatment as rural. The CAH meets the requirements of either paragraph (b)(1) or (b)(2) of this section or the requirements of paragraph (b)(3), (b)(4), or (b)(5) of this section.

(5) Effective on or after October 1, 2014, for a period of 2 years beginning with the effective date of the most recent Office of Management and Budget (OMB) standards for delineating statistical areas adopted by CMS, the CAH no longer meets the location requirements in either paragraph (b)(1) or (b)(2) of this section and is located in a county that, prior to the most recent OMB standards for delineating statistical areas adopted by CMS and the most recent Census Bureau data, was located in a rural area as defined by OMB, but under the most recent OMB standards for delineating statistical areas adopted by CMS and the most recent Census Bureau data, is located in an urban area.

PART 488—SURVEY, CERTIFICATION, AND ENFORCEMENT PROCEDURES

47. The authority citation for Part 488 continues to read as follows:

Authority: Secs. 1102, 1128I, and 1871 of the Social Security Act (42 U.S.C. 1302, 1320a-7j, and 1395hh); Pub. L. 110-149, 121 Stat. 1819.

48. Section 488.61 is amended by—

a. Revising paragraphs (a)(4) and (c)(3).

b. Adding new paragraphs (f), (g), and (h).

The revisions and additions read as follows:

§ 488.61
Special procedures for approval and re-approval of organ transplant centers.

(a) * * *

(4) CMS will consider mitigating factors in accordance with paragraphs (f), (g), and (h) of this section.

(c) * * *

(3) CMS will consider mitigating factors in accordance with paragraphs (f), (g), and (h) of this section.

(f) Consideration of mitigating factors in initial approval and re-approval survey, certification, and enforcement actions for transplant centers.

(1) Factors. Except for situations of immediate jeopardy or deficiencies other than failure to meet the requirements of § 482.80 or § 482.82 of this chapter, CMS will consider such mitigating factors as may be appropriate in light of the nature of the deficiency and circumstances, including (but not limited to) the following, in making a decision of initial approval or re-approval of a transplant center that does not meet the data submission, clinical experience, or outcome requirements:

(i) The extent to which outcome measures are not met or exceeded;

(ii) Availability of Medicare-approved transplant centers in the area;

(iii) Extenuating circumstances (for example, natural disaster) that have a temporary effect on meeting the conditions of participation;

(iv) Program improvements that substantially address root causes of graft failures or patient deaths, that have been implemented and institutionalized on a sustainable basis, and that are supported by outcomes more recent than the latest available SRTR report, for which there is a sufficient post-transplant patient and graft survival period and a sufficient number of transplants such that CMS finds that the program demonstrates present-day compliance with the requirements at § 482.80(c)(2)(ii)(C) or § 482.82(c)(2)(ii)(C) of this chapter;

(v) Whether the program has made extensive use of innovative transplantation practices relative to other transplant programs, such as a high rate of transplantation of individuals who are highly sensitized or children who have undergone a Fontan procedure compared to most other transplant programs, where CMS finds that the innovative practices are supported by evidence-based published research literature or nationally recognized standards or Institutional Review Board (IRB) approvals, and the SRTR risk-adjustment methodology does not take the relevant key factors into consideration; and

(vi) Whether the program's performance, based on the OPTN method of calculating patient and graft survival, is within the OPTN's thresholds for acceptable performance and is not flagged for OPTN performance review under the applicable OPTN policy.

(2) Content. A request for consideration of mitigating factors must include sufficient information to permit an adequate review and understanding of the transplant program, the factors that have contributed to outcomes, program improvements or innovations that have been implemented or planned, and in the case of natural disasters, the recovery actions planned. Examples of information to be submitted with each request include (but are not limited to) the following:

(i) The name and contact information for the transplant hospital and the names and roles of key personnel of the transplant program;

(ii) The type of organ transplant program(s) for which approval is requested;

(iii) The conditions of participation that the program does not meet for which the transplant center is requesting CMS' review for mitigating factors;

(iv) The program's organizational chart with full-time equivalent levels, roles, and structure for reporting to hospital leadership;

(v) For applications involving substandard patient or graft survival, the rationale and supporting evidence for CMS' review includes, but is not limited to—

(A) Root Cause Analysis for patient deaths and graft failures, including factors the program has identified as likely causal or contributing factors for patient deaths and graft failures;

(B) Program improvements that have been implemented and improvements that are planned;

(C) Patient and donor/organ selection criteria and evaluation protocols, including methods for pre-transplant patient evaluation by cardiologists, hematologists, nephrologists, and psychiatrists or psychologists to the extent applicable;

(D) Waitlist management protocols and practices relevant to outcomes;

(E) Pre-operative management protocols and practices;

(F) Immunosuppression/infection prophylaxis protocols;

(G) Post-transplant monitoring and management protocols and practices;

(H) Quality Assessment and Performance Improvement (QAPI) Program meeting minutes from the most recent four meetings and attendance rosters from the most recent 12 months;

(I) Quality dashboard and other performance indicators; and

(J) The most recent data regarding transplants performed and outcomes in terms of both patient survival and graft survival;

(vi) For mitigating factors requests based on innovative practice:

(A) A description of the innovations that have been implemented and identification of the specific cases for which the innovative practices are relevant so as to enable the patient and graft survival data for such cases to be compared with all other transplants for at least the period covered by the latest available SRTR report.

(B) The literature, research, or other evidentiary basis that supports consideration of the practice(s) as innovative.

(vii) For requests based on natural disasters or public health emergency:

(A) A description of the disaster or emergency, the specific impact on the program, the time periods of the event(s) and of its immediate recovery aftermath;

(B) Identification of the transplants that occurred during the period for which the request is being made; and

(C) The approximate date when the program believes it substantially recovered from the event(s), or believes it will recover if substantial recovery has not been accomplished at the time of the request.

(3) Timing. Within 10 days after CMS has issued formal written notice of a condition-level deficiency to the program, CMS must receive notification of the program's intent to seek approval or re-approval based on mitigating factors, and CMS must receive all information for consideration of mitigating factors within 120 days of the CMS written notification, for a deficiency due to data submission, clinical experience, or outcomes at § 482.80 or § 482.82 of this chapter. Failure to meet these timeframes may be the basis for denial of mitigating factors. However, CMS may permit an extension of the timeline for good cause, such as a declared public health emergency.

(g) Results of mitigating factors review.

(1) Actions. Upon review of the request to consider mitigating factors, CMS may take the following actions:

(i) Approve initial approval or re-approval of a program's Medicare participation based upon approval of mitigating factors;

(ii) Deny the program's request for Medicare approval or re-approval based on mitigating factors; or

(iii) Offer a time-limited Systems Improvement Agreement, in accordance with paragraph (h) of this section, when a transplant program has waived its appeal rights, has implemented substantial program improvements that address root causes and are institutionally supported by the hospital's governing body on a sustainable basis, and has requested more time to design or implement additional improvements or demonstrate compliance with CMS outcome requirements. Upon completion of the Systems Improvement Agreement or a CMS finding that the hospital has failed to meet the terms of the Agreement, CMS makes a final determination of whether to approve or deny a program's request for Medicare approval or re-approval based on mitigating factors. A Systems Improvement Agreement follows the process specified in paragraph (h) of this section.

(2) Limitation. CMS will not approve any program with a condition-level deficiency. However, CMS may approve a program with a standard-level deficiency upon receipt of an acceptable plan of correction.

(h) Transplant Systems Improvement Agreement. A Systems Improvement Agreement is a binding agreement, entered into voluntarily by the hospital and CMS, through which CMS extends a prospective Medicare termination date and offers the program additional time to achieve compliance with the conditions of participation, contingent on the hospital's agreement to participate in a structured regimen of quality improvement activities, demonstrate improved outcomes, and waive the right to appeal termination based on the identified deficiency or deficiencies (that led to the Agreement) in consideration for more time to demonstrate compliance. In some cases, transplant programs may enter a period of inactivity—voluntarily, or imposed as a condition of the Systems Improvement Agreement.

(1) Content. In exchange for the additional time to initiate or continue activities to achieve compliance with the conditions of participation, the hospital must agree to a regimen of specified activities, including (but not limited to) all of the following:

(i) Patient notification about the degree and type of noncompliance by the program, an explanation of what the program improvement efforts mean for patients, and financial assistance to defray the out-of-pocket costs of copayments and testing expenses for any wait-listed individual who wishes to be listed with another program;

(ii) An external independent peer review team that conducts an onsite assessment of the program. The peer review must include—

(A) Review of policies, staffing, operations, relationship to hospital services, and factors that contribute to program outcomes;

(B) Suggestions for quality improvements the hospital should consider;

(C) Both verbal and written feedback provided directly to the hospital;

(D) Verbal debriefing provided directly to CMS; neither the hospital nor the peer review team is required to provide a written report to CMS; and

(E) Onsite review by a multidisciplinary team that includes a transplant surgeon with expertise in the relevant organ type(s), a transplant administrator, an individual with expertise in transplant QAPI systems, a social worker or psychologist or psychiatrist, and a specialty physician with expertise in conditions particularly relevant to the applicable organ type(s), such as a cardiologist, nephrologist, or hepatologist. Except for the transplant surgeon, CMS may permit substitution of one type of expertise for another individual who has expertise particularly needed for the type of challenges experienced by the program, such as substitution of an infection control specialist in lieu of, or in addition to, a social worker;

(iii) An action plan that addresses systemic quality improvements and is updated after the onsite peer review;

(iv) An onsite consultant whose qualifications are approved by CMS, and who provides services for 8 days per month on average for the duration of the agreement, except that CMS may permit a portion of the time to be spent offsite and may agree to fewer consultant days each month after the first 3 months of the Systems Improvement Agreement;

(v) A comparative effectiveness analysis that compares policies, procedures, and protocols of the transplant program with those of other programs in areas of endeavor that are relevant to the center's current quality improvement needs;

(vi) Development of increased proficiency, or demonstration of current proficiency, with patient-level data from the Scientific Registry of Transplant Recipients and the use of registry data to analyze outcomes and inform quality improvement efforts;

(vii) A staffing analysis that examines the level, type, training, and skill of staff in order to inform transplant center efforts to ensure the engagement and appropriate training and credentialing of staff;

(viii) Activities to strengthen performance of the Quality Assessment and Performance Improvement Program to ensure full compliance with the requirements of § 482.96 and § 482.21 of this chapter;

(ix) Monthly (unless otherwise specified) reporting and conference calls with CMS regarding the status of programmatic improvements, results of the deliverables in the Systems Improvement Agreement, and the number of transplants, deaths, and graft failures that occur within 1 year post-transplant; and

(x) Additional or alternative requirements specified by CMS, tailored to the transplant program type and circumstances. CMS may waive the content elements at paragraphs (h)(1)(v), (h)(1)(vi), (h)(1)(vii), or (h)(1)(viii) of this section if it finds that the program has already adequately conducted the activity, the program is already proficient in the function, or the activity is clearly inapplicable to the deficiencies that led to the Agreement.

(2) Timeframe. A Systems Improvement Agreement will be established for up to a 12-month period, subject to CMS' discretion to determine if a shorter timeframe may suffice. At the hospital's request, CMS may extend the agreement for up to an additional 6-month period.

Dated: July 24, 2014.

Marilyn Tavenner,

Administrator, Centers for Medicare & Medicaid Services.

Dated: July 29, 2014.

Sylvia M. Burwell,

Secretary, Department of Health and Human Services.

Note:

The following Addendum and Appendixes will not appear in the Code of Federal Regulations.

Addendum—Schedule of Standardized Amounts, Update Factors, Rate-of-Increase Percentages Effective with Cost Reporting Periods Beginning on or after October 1, 2014, and Payment Rates for LTCHs Effective for Discharges Occurring on or after October 1, 2014

I. Summary and Background

In this Addendum, we are setting forth a description of the methods and data we used to determine the prospective payment rates for Medicare hospital inpatient operating costs and Medicare hospital inpatient capital-related costs for FY 2015 for acute care hospitals. We also are setting forth the rate-of-increase percentages for updating the target amounts for certain hospitals excluded from the IPPS for FY 2015. We note that, because certain hospitals excluded from the IPPS are paid on a reasonable cost basis subject to a rate-of-increase ceiling (and not by the IPPS), these hospitals are not affected by the figures for the standardized amounts, offsets, and budget neutrality factors. Therefore, in this final rule, we are establishing the rate-of-increase percentages for updating the target amounts for certain hospitals excluded from the IPPS that are effective for cost reporting periods beginning on or after October 1, 2014.

In addition, we are setting forth a description of the methods and data we used to determine the standard Federal rate that will be applicable to Medicare LTCHs for FY 2015.

In general, except for SCHs, MDHs and hospitals located in Puerto Rico, for FY 2015, each hospital's payment per discharge under the IPPS is based on 100 percent of the Federal national rate, also known as the national adjusted standardized amount. This amount reflects the national average hospital cost per case from a base year, updated for inflation.

SCHs are paid based on whichever of the following rates yields the greatest aggregate payment: the Federal national rate (including, as discussed in section IV.F. of the preamble of this final rule, uncompensated care payments under section 1886(r)(2) of the Act); the updated hospital-specific rate based on FY 1982 costs per discharge; the updated hospital-specific rate based on FY 1987 costs per discharge; the updated hospital-specific rate based on FY 1996 costs per discharge; or the updated hospital-specific rate based on FY 2006 costs per discharge.
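
For illustration only, the "whichever yields the greatest aggregate payment" comparison described above can be sketched as follows. This is a simplified, hypothetical example written in Python, not CMS' ratesetting software; the dollar amounts and the treatment of uncompensated care payments as a single amount added to the Federal rate payment are assumptions made solely for the illustration.

    # Simplified sketch of the SCH rate comparison; all amounts are hypothetical.
    def sch_payment(federal_rate_payment, hospital_specific_payments):
        """Return the greater of the Federal rate payment (which, for this
        comparison, includes uncompensated care payments) and the highest
        updated hospital-specific rate payment (FY 1982, 1987, 1996, or 2006)."""
        return max(federal_rate_payment, max(hospital_specific_payments))

    federal = 10_500_000  # hypothetical Federal national rate payment plus uncompensated care
    hospital_specific = {
        "FY 1982": 9_800_000,
        "FY 1987": 10_200_000,
        "FY 1996": 10_900_000,
        "FY 2006": 10_700_000,
    }
    payment = sch_payment(federal, hospital_specific.values())
    # payment == 10_900_000; the FY 1996 hospital-specific rate yields the greatest payment.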

We note that, as discussed in section IV.G. of the preamble of this final rule, section 1106 of Public Law 113-67, enacted on December 26, 2013, extended the MDH program from the end of FY 2013 (that is, for discharges occurring after September 30, 2013) through the first half of FY 2014 (that is, for discharges occurring before April 1, 2014). Subsequently, section 106 of Public Law 113-93, enacted on April 1, 2014, further extended the MDH program through the first half of FY 2015 (that is, for discharges occurring before April 1, 2015). Prior to the enactment of Public Law 113-67, the MDH program was only to be in effect through the end of FY 2013. Under current law, the MDH program will expire for discharges on or after April 1, 2015.

Under section 1886(d)(5)(G) of the Act, MDHs historically have been paid based on the Federal national rate or, if higher, the Federal national rate plus 50 percent of the difference between the Federal national rate and the updated hospital-specific rate based on FY 1982, FY 1987, or FY 2002 costs per discharge, whichever was higher. Section 5003(c) of Public Law 109-171 further required that MDHs be paid based on the Federal national rate or, if higher, the Federal national rate plus 75 percent of the difference between the Federal national rate and the updated hospital-specific rate. Further, based on the provisions of section 5003(d) of Public Law 109-171, MDHs are no longer subject to the 12-percent cap on their DSH payment adjustment factor.

For hospitals located in Puerto Rico, the payment per discharge is based on the sum of 25 percent of an updated Puerto Rico-specific rate based on average costs per case of Puerto Rico hospitals for the base year and 75 percent of the Federal national rate. (We refer readers to section II.D.2. of this Addendum for a complete description.)
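
For illustration only, the MDH and Puerto Rico payment computations described above can be sketched as follows; the rate values are hypothetical, and the sketch omits the many other adjustments (for example, the wage index, MS-DRG relative weights, and add-on payments) that apply in practice.

    # Simplified sketches of the MDH and Puerto Rico computations; values are hypothetical.
    def mdh_payment(federal_rate_payment, hospital_specific_payment):
        """Pay the Federal national rate or, if higher, the Federal national rate
        plus 75 percent of the difference between the two rate payments."""
        return federal_rate_payment + 0.75 * max(hospital_specific_payment - federal_rate_payment, 0)

    def puerto_rico_payment(pr_specific_rate, federal_national_rate):
        """Pay 25 percent of the updated Puerto Rico-specific rate plus
        75 percent of the Federal national rate."""
        return 0.25 * pr_specific_rate + 0.75 * federal_national_rate

    mdh_payment(10_000_000, 10_800_000)      # 10,000,000 + 0.75 * 800,000 = 10,600,000
    puerto_rico_payment(4_000.00, 5_500.00)  # 0.25 * 4,000 + 0.75 * 5,500 = 5,125.00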

As discussed below in section II. of this Addendum, we are making changes in the determination of the prospective payment rates for Medicare inpatient operating costs for acute care hospitals for FY 2015. In section III. of this Addendum, we discuss our policy changes for determining the prospective payment rates for Medicare inpatient capital-related costs for FY 2015. In section IV. of this Addendum, we are setting forth our changes for determining the rate-of-increase limits for certain hospitals excluded from the IPPS for FY 2015. In section V. of this Addendum, we discuss policy changes for determining the standard Federal rate for LTCHs paid under the LTCH PPS for FY 2015. The tables to which we refer in the preamble of this final rule are listed in section VI. of this Addendum and are available via the Internet on the CMS Web site.

II. Changes to Prospective Payment Rates for Hospital Inpatient Operating Costs for Acute Care Hospitals for FY 2015

The basic methodology for determining prospective payment rates for hospital inpatient operating costs for acute care hospitals for FY 2005 and subsequent fiscal years is set forth under § 412.64. The basic methodology for determining the prospective payment rates for hospital inpatient operating costs for hospitals located in Puerto Rico for FY 2005 and subsequent fiscal years is set forth under §§ 412.211 and 412.212. Below we discuss the factors we are using for determining the prospective payment rates for FY 2015.

In summary, the standardized amounts set forth in Tables 1A, 1B, and 1C that are listed and published in section VI. of this Addendum (and available via the Internet) reflect—

  • Equalization of the standardized amounts for urban and other areas at the level computed for large urban hospitals during FY 2004 and onward, as provided for under section 1886(d)(3)(A)(iv)(II) of the Act.
  • The labor-related share that is applied to the standardized amounts and Puerto Rico-specific standardized amounts to give the hospital the highest payment, as provided for under sections 1886(d)(3)(E) and 1886(d)(9)(C)(iv) of the Act.
  • The applicable percentage increase applied to the national standardized amount. For FY 2015, depending on whether a hospital submits quality data under the rules established in accordance with section 1886(b)(3)(B)(viii) of the Act (hereafter referred to as a hospital that submits quality data) and is a meaningful EHR user under section 1886(b)(3)(B)(ix) of the Act (hereafter referred to as a hospital that is a meaningful EHR user), there are four possible applicable percentage increases that can be applied to the national standardized amount. We refer readers to section IV.B. of the preamble of this final rule for a complete discussion on the FY 2015 inpatient hospital update. Below is a table with these four options:
FY 2015 | Hospital submitted quality data and is a meaningful EHR User | Hospital submitted quality data and is NOT a meaningful EHR User | Hospital did NOT submit quality data and is a meaningful EHR User | Hospital did NOT submit quality data and is NOT a meaningful EHR User
Market Basket Rate-of-Increase | 2.9 | 2.9 | 2.9 | 2.9
Adjustment for Failure to Submit Quality Data under Section 1886(b)(3)(B)(viii) of the Act | 0.0 | 0.0 | −0.725 | −0.725
Adjustment for Failure to be a Meaningful EHR User under Section 1886(b)(3)(B)(ix) of the Act | 0.0 | −0.725 | 0.0 | −0.725
MFP Adjustment under Section 1886(b)(3)(B)(xi) of the Act | −0.5 | −0.5 | −0.5 | −0.5
Statutory Adjustment under Section 1886(b)(3)(B)(xii) of the Act | −0.2 | −0.2 | −0.2 | −0.2
Applicable Percentage Increase Applied to Standardized Amount | 2.2 | 1.475 | 1.475 | 0.75
  • An update of 2.2 percent to the Puerto Rico-specific standardized amount (that is, the FY 2015 estimate of the market basket rate-of-increase of 2.9 percent less an adjustment of 0.5 percentage point for MFP and less 0.2 percentage point), in accordance with section 1886(d)(9)(C)(i) of the Act, as amended by section 401(c) of Public Law 108-173, which sets the update to the Puerto Rico-specific standardized amount equal to the applicable percentage increase set forth under section 1886(b)(3)(B)(i) of the Act.
  • An adjustment to the standardized amount to ensure budget neutrality for DRG recalibration and reclassification, as provided for under section 1886(d)(4)(C)(iii) of the Act.
  • An adjustment to ensure the wage index changes are budget neutral, as provided for under section 1886(d)(3)(E)(i) of the Act. We note that section 1886(d)(3)(E)(i) of the Act requires that when we compute such budget neutrality, we assume that the provisions of section 1886(d)(3)(E)(ii) of the Act (requiring a 62 percent labor-related share in certain circumstances) had not been enacted.
  • An adjustment to ensure the effects of geographic reclassification are budget neutral, as provided for under section 1886(d)(8)(D) of the Act, by removing the FY 2014 budget neutrality factor and applying a revised factor.
  • As discussed below and in section III. of the preamble of this final rule, an adjustment to offset the cost of the transitional wage index provisions provided by CMS as a result of the adoption of the new OMB labor market area delineations.
  • An adjustment to ensure the effects of the rural community hospital demonstration program required under section 410A of Public Law 108-173, as amended by sections 3123 and 10313 of Public Law 111-148, which extended the demonstration program for an additional 5 years, are budget neutral as required under section 410A(c)(2) of Public Law 108-173.
  • An adjustment to remove the FY 2014 outlier offset and apply an offset for FY 2015, as provided for under section 1886(d)(3)(B) of the Act.
  • As discussed below and in section II.D. of the preamble of this final rule, a recoupment to meet the requirements of section 631 of ATRA to adjust the standardized amount to offset the estimated amount of the increase in aggregate payments as a result of not completing the prospective adjustment authorized under section 7(b)(1)(A) of Public Law 110-90 until FY 2013.

Beginning in FY 2008, we applied the budget neutrality adjustment for the rural floor to the hospital wage indexes rather than the standardized amount. As we did for FY 2014, for FY 2015, consistent with current law, we are continuing to apply the rural floor budget neutrality adjustment to hospital wage indexes rather than the standardized amount. Also, consistent with section 3141 of the Affordable Care Act, instead of applying a State level rural floor budget neutrality adjustment to the wage index, we are applying a uniform, national budget neutrality adjustment to the FY 2015 wage index for the rural floor. We note that, in section III.G.2.b. of the preamble to this final rule, we are extending the imputed floor policy (both the original methodology and alternative methodology) for another year, through September 30, 2015.

Therefore, for FY 2015, in this final rule, we are continuing to include the imputed floor (calculated under the original and alternative methodologies) in calculating the uniform, national rural floor budget neutrality adjustment, which will be reflected in the FY 2015 wage index.

A. Calculation of the Adjusted Standardized Amount

1. Standardization of Base-Year Costs or Target Amounts

In general, the national standardized amount is based on per discharge averages of adjusted hospital costs from a base period (section 1886(d)(2)(A) of the Act), updated and otherwise adjusted in accordance with the provisions of section 1886(d) of the Act. For Puerto Rico hospitals, the Puerto Rico-specific standardized amount is based on per discharge averages of adjusted target amounts from a base period (section 1886(d)(9)(B)(i) of the Act), updated and otherwise adjusted in accordance with the provisions of section 1886(d)(9) of the Act. The September 1, 1983 interim final rule (48 FR 39763) contained a detailed explanation of how base-year cost data (from cost reporting periods ending during FY 1981) were established for urban and rural hospitals in the initial development of standardized amounts for the IPPS. The September 1, 1987 final rule (52 FR 33043 and 33066) contains a detailed explanation of how the target amounts were determined and how they are used in computing the Puerto Rico rates.

Sections 1886(d)(2)(B) and 1886(d)(2)(C) of the Act require us to update base-year per discharge costs for FY 1984 and then standardize the cost data in order to remove the effects of certain sources of cost variations among hospitals. These effects include case-mix, differences in area wage levels, cost-of-living adjustments for Alaska and Hawaii, IME costs, and costs to hospitals serving a disproportionate share of low-income patients.

In accordance with section 1886(d)(3)(E) of the Act, the Secretary estimates, from time-to-time, the proportion of hospitals' costs that are attributable to wages and wage-related costs. In general, the standardized amount is divided into labor-related and nonlabor-related amounts; only the proportion considered to be the labor-related amount is adjusted by the wage index. Section 1886(d)(3)(E) of the Act requires that 62 percent of the standardized amount be adjusted by the wage index, unless doing so would result in lower payments to a hospital than would otherwise be made. (Section 1886(d)(9)(C)(iv)(II) of the Act extends this provision to the labor-related share for hospitals located in Puerto Rico.)

For FY 2015, we are using the national and Puerto Rico-specific labor-related and nonlabor-related shares established for FY 2014, using the FY 2010-based hospital market basket. Specifically, under section 1886(d)(3)(E) of the Act, the Secretary estimates, from time to time, the proportion of payments that are labor-related: “[T]he Secretary shall adjust the proportion, (as estimated by the Secretary from time to time) of hospitals' costs which are attributable to wages and wage-related costs, of the DRG prospective payment rates . . . .” We refer to the proportion of hospitals' costs that are attributable to wages and wage-related costs as the “labor-related share.” For FY 2015, as discussed in section III. of the preamble of this final rule, we are continuing to use a labor-related share of 69.6 percent for the national standardized amounts, and 63.2 percent for the Puerto Rico-specific standardized amount, if the hospital has a wage index value that is greater than 1.0000. Consistent with section 1886(d)(3)(E) of the Act, we are applying the wage index to a labor-related share of 62 percent of the national standardized amount for all IPPS hospitals whose wage index values are less than or equal to 1.0000. For all IPPS hospitals whose wage indexes are greater than 1.0000, we are applying the wage index to a labor-related share of 69.6 percent of the national standardized amount.

For FY 2015, all Puerto Rico hospitals have a wage index value that is less than 1.0000 because the average hourly rate of every hospital in Puerto Rico divided by the national average hourly rate (the sum of all salaries and hours for all hospitals in the 50 United States and Puerto Rico) results in a wage index that is below 1.0000. However, when we divide the average hourly rate of every hospital located in Puerto Rico by the Puerto Rico-specific national average hourly rate (the sum of all salaries and hours for all hospitals located only in Puerto Rico), we determine a Puerto Rico-specific wage index value for some hospitals that is either above or below 1.0000, depending on the hospital's location within Puerto Rico. Therefore, for hospitals located in Puerto Rico, we are applying a labor-related share of 63.2 percent if the hospital's Puerto Rico-specific wage index is greater than 1.0000. For hospitals located in Puerto Rico whose Puerto Rico-specific wage index values are less than or equal to 1.0000, we are applying a labor-related share of 62 percent.
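
For illustration only, the application of the labor-related share and wage index described above can be sketched as follows; the standardized amount and wage index values are hypothetical, and the sketch omits all other payment adjustments.

    # Simplified sketch of the labor-related share logic; inputs are hypothetical.
    def wage_adjusted_standardized_amount(standardized_amount, wage_index, puerto_rico=False):
        """Adjust only the labor-related portion of the standardized amount by the
        wage index: a 62 percent labor-related share applies when the wage index is
        1.0000 or less; otherwise 69.6 percent applies (63.2 percent for the
        Puerto Rico-specific standardized amount)."""
        if wage_index > 1.0000:
            labor_share = 0.632 if puerto_rico else 0.696
        else:
            labor_share = 0.62
        labor_portion = standardized_amount * labor_share
        nonlabor_portion = standardized_amount * (1 - labor_share)
        return labor_portion * wage_index + nonlabor_portion

    wage_adjusted_standardized_amount(5_500.00, 0.9500)  # 62 percent labor-related share applies
    wage_adjusted_standardized_amount(5_500.00, 1.1000)  # 69.6 percent labor-related share applies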

The standardized amounts for operating costs appear in Tables 1A, 1B, and 1C that are listed and published in section VI. of the Addendum to this final rule and are available via the Internet on the CMS Web site.

2. Computing the National Average Standardized Amount and Puerto Rico-Specific Standardized Amount

Section 1886(d)(3)(A)(iv)(II) of the Act requires that, beginning with FY 2004 and thereafter, an equal standardized amount be computed for all hospitals at the level computed for large urban hospitals during FY 2003, updated by the applicable percentage update. Section 1886(d)(9)(A)(ii)(II) of the Act equalizes the Puerto Rico-specific urban and rural area rates. Accordingly, we are calculating the FY 2015 national average standardized amount and Puerto Rico-specific standardized amount irrespective of whether a hospital is located in an urban or rural location.

3. Updating the National Average Standardized Amount and Puerto Rico-Specific Standardized Amount

Section 1886(b)(3)(B) of the Act specifies the applicable percentage increase used to update the standardized amount for payment for inpatient hospital operating costs. We note that, in compliance with section 404 of the MMA, in this final rule, we are using the revised and rebased FY 2010-based IPPS operating and capital market baskets for FY 2015 (which replaced the FY 2006-based IPPS operating and capital market baskets in FY 2014). As discussed in section IV.B. of the preamble of this final rule, in accordance with section 1886(b)(3)(B) of the Act, as amended by section 3401(a) of the Affordable Care Act, we are reducing the FY 2015 applicable percentage increase (which is based on IHS Global Insight, Inc.'s (IGI's) second quarter 2014 forecast of the FY 2010-based IPPS market basket) by the MFP adjustment (the 10-year moving average of MFP for the period ending FY 2015) of 0.5 percentage point, which is calculated based on IGI's second quarter 2014 forecast.

In addition, in accordance with section 1886(b)(3)(B)(i) of the Act, as amended by sections 3401(a) and 10319(a) of the Affordable Care Act, we are further updating the standardized amount for FY 2015 by the estimated market basket percentage increase less 0.2 percentage point for hospitals in all areas. Sections 1886(b)(3)(B)(xi) and (xii) of the Act, as added and amended by sections 3401(a) and 10319(a) of the Affordable Care Act, further state that these adjustments may result in the applicable percentage increase being less than zero. The percentage increase in the market basket reflects the average change in the price of goods and services comprising routine, ancillary, and special care unit hospital inpatient services.

Based on IGI's 2014 second quarter forecast of the hospital market basket increase (as discussed in Appendix B of this final rule), the most recent forecast of the hospital market basket increase for FY 2015 is 2.9 percent. As discussed above, for FY 2015, depending on whether a hospital submits quality data under the rules established in accordance with section 1886(b)(3)(B)(viii) of the Act and is a meaningful EHR user under section 1886(b)(3)(B)(ix) of the Act, there are four possible applicable percentage increases that could be applied to the standardized amount. We refer readers to section IV. of the preamble of this final rule for a complete discussion on the FY 2015 inpatient hospital update to the standardized amount. We also refer readers to the table above for the four possible applicable percentage increases that would be applied to update the national standardized amount. The standardized amounts shown in Tables 1A through 1C that are published in section VI. of this Addendum and that are available via the Internet on the CMS Web site reflect these differential amounts.

Section 401(c) of Public Law 108-173 amended section 1886(d)(9)(C)(i) of the Act and states that, for discharges occurring in a fiscal year (beginning with FY 2004), the Secretary shall compute an average standardized amount for hospitals located in any area of Puerto Rico that is equal to the average standardized amount computed under subclause (I) for FY 2003 for hospitals in a large urban area (or, beginning with FY 2005, for all hospitals in the previous fiscal year) increased by the applicable percentage increase under subsection (b)(3)(B) for the fiscal year involved. Therefore, the update to the Puerto Rico-specific operating standardized amount is subject to the applicable percentage increase set forth under section 1886(b)(3)(B)(i) of the Act, as amended by sections 3401(a) and 10319(a) of the Affordable Care Act (that is, the same update factor as for all other hospitals subject to the IPPS). Accordingly, we are establishing an applicable percentage increase to the Puerto Rico-specific standardized amount of 2.2 percent for FY 2015.
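
For illustration only, the arithmetic behind the four applicable percentage increases shown in the table above, and the 2.2 percent Puerto Rico-specific update, is restated below in a short Python sketch (values in percentage points); this restates figures already given in this Addendum and is not an additional policy.

    # Arithmetic behind the FY 2015 applicable percentage increases (percentage points).
    market_basket = 2.9
    mfp_adjustment = -0.5          # multifactor productivity adjustment
    statutory_adjustment = -0.2    # section 1886(b)(3)(B)(xii) adjustment
    quality_penalty = -0.725       # applies only if quality data are not submitted
    ehr_penalty = -0.725           # applies only if the hospital is not a meaningful EHR user

    full_update = round(market_basket + mfp_adjustment + statutory_adjustment, 3)  # 2.2
    no_quality = round(full_update + quality_penalty, 3)                           # 1.475
    no_ehr = round(full_update + ehr_penalty, 3)                                   # 1.475
    neither = round(full_update + quality_penalty + ehr_penalty, 3)                # 0.75

    puerto_rico_update = full_update  # the same 2.2 percent update applies under section 1886(d)(9)(C)(i)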

Although the update factors for FY 2015 are set by law, we are required by section 1886(e)(4) of the Act to recommend, taking into account MedPAC's recommendations, appropriate update factors for FY 2015 for both IPPS hospitals and hospitals and hospital units excluded from the IPPS. Section 1886(e)(5)(A) of the Act requires that we publish our proposed recommendations in the Federal Register for public comment. Our recommendation on the update factors is set forth in Appendix B of this final rule.

4. Other Adjustments to the Average Standardized Amount

As in the past, we are adjusting the FY 2015 standardized amount to remove the effects of the FY 2014 geographic reclassifications and outlier payments before applying the FY 2015 updates. We then apply budget neutrality offsets for outliers and geographic reclassifications to the standardized amount based on FY 2015 payment policies.

We do not remove the prior year's budget neutrality adjustments for reclassification and recalibration of the DRG relative weights and for updated wage data because, in accordance with sections 1886(d)(4)(C)(iii) and 1886(d)(3)(E) of the Act, estimated aggregate payments after updates in the DRG relative weights and wage index should equal estimated aggregate payments prior to the changes. If we removed the prior year's adjustment, we would not satisfy these conditions.

Budget neutrality is determined by comparing aggregate IPPS payments before and after making changes that are required to be budget neutral (for example, changes to MS-DRG classifications, recalibration of the MS-DRG relative weights, updates to the wage index, and different geographic reclassifications). We include outlier payments in the simulations because they may be affected by changes in these parameters.
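
For illustration only, the comparison described above can be sketched as the ratio of estimated aggregate payments before and after a change; the simulated payment amounts below are hypothetical, and the actual calculations model payments claim by claim with all applicable adjustments.

    # Simplified sketch of a budget neutrality factor; simulated payments are hypothetical.
    def budget_neutrality_factor(estimated_payments_before, estimated_payments_after):
        """Ratio applied so that estimated aggregate payments after the change
        (for example, new MS-DRG relative weights or wage indexes) equal estimated
        aggregate payments before the change. Outlier payments are included
        in both simulations."""
        return sum(estimated_payments_before) / sum(estimated_payments_after)

    before = [1_250_000_000, 980_000_000, 2_110_000_000]  # hypothetical simulation, prior policy
    after = [1_262_000_000, 975_000_000, 2_120_000_000]   # hypothetical simulation, new policy
    factor = budget_neutrality_factor(before, after)
    # factor is slightly below 1.0 here, offsetting the estimated payment increase from the change.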

In order to appropriately estimate aggregate payments in our modeling, we make several inclusions and exclusions so that the appropriate universe of claims and charges is included. We discuss IME Medicare Advantage payment amounts, fee-for-service only claims, and charges for anti-hemophilic blood factor and organ acquisition below.

Consistent with our methodology established in the FY 2011 IPPS/LTCH PPS final rule (75 FR 50422 through 50433), because IME Medicare Advantage payments are made to IPPS hospitals under section 1886(d) of the Act, we believe these payments must be part of these budget neutrality calculations. However, we note that it is not necessary to include Medicare Advantage IME payments in the outlier threshold calculation or the outlier offset to the standardized amount because the statute requires that outlier payments be not less than 5 percent nor more than 6 percent of total “operating DRG payments,” which does not include IME and DSH payments. We refer readers to the FY 2011 IPPS/LTCH PPS final rule for a complete discussion on our methodology of identifying and adding the total Medicare Advantage IME payment amount to the budget neutrality adjustments.

In addition, consistent with the methodology in the FY 2012 IPPS/LTCH PPS final rule, in order to ensure that we capture only fee-for-service claims, we are only including claims with a “Claim Type” of 60 (which is a field on the MedPAR file that indicates a claim is a fee-for-service claim).

Finally, consistent with our methodology established in the FY 2011 IPPS/LTCH PPS final rule (75 FR 50422 through 50423), we examined the MedPAR file and removed pharmacy charges for anti-hemophilic blood factor (which are paid separately under the IPPS) with an indicator of “3” for blood clotting with a revenue code of “0636” from the covered charge field for the budget neutrality adjustments. We also removed organ acquisition charges from the covered charge field for the budget neutrality adjustments because organ acquisition is a pass-through payment not paid under the IPPS.

The Bundled Payments for Care Improvement (BPCI) initiative, developed under the authority of section 3021 of the Affordable Care Act (codified at section 1115A of the Act), comprises four broadly defined models of care, which link payments for multiple services beneficiaries receive during an episode of care. Under the BPCI initiative, organizations enter into payment arrangements that include financial and performance accountability for episodes of care. On January 31, 2013, CMS announced the health care organizations selected to participate in the BPCI initiative. For additional information on the BPCI initiative, we refer readers to the CMS Center for Medicare and Medicaid Innovation's Web site at: http://innovation.cms.gov/initiatives/Bundled-Payments/index.html.

In the FY 2013 IPPS/LTCH PPS final rule (77 FR 53341 through 53343), for FY 2013 and subsequent fiscal years, we finalized a methodology to treat hospitals that participate in the BPCI initiative in the same manner as in prior fiscal years for the IPPS payment modeling and ratesetting process (which includes recalibration of the MS-DRG relative weights, ratesetting, calculation of the budget neutrality factors, and the impact analysis) without regard to a hospital's participation within these bundled payment models (that is, as if they are not participating in those models under the BPCI initiative). Therefore, for FY 2015, as discussed in section II.H.4. of the preamble to this final rule, as we proposed, we are continuing to include all applicable data from subsection (d) hospitals participating in BPCI Models 1, 2, and 4 in our IPPS payment modeling and ratesetting calculations. We refer readers to the FY 2013 IPPS/LTCH PPS final rule for a complete discussion on our final policy for the treatment of hospitals in the BPCI initiative in our ratesetting process.

The Affordable Care Act established the Hospital Readmissions Reduction Program and the Hospital VBP Program, which adjust payments to certain IPPS hospitals beginning with discharges on or after October 1, 2012. Because the adjustments made under these programs affect the estimation of aggregate IPPS payments, in this final rule, consistent with our methodology established in the FY 2013 IPPS/LTCH PPS final rule (77 FR 53687 through 53688), we believe that it is appropriate to include adjustments for these programs within our budget neutrality calculations. We discuss the treatment of these two programs in the context of budget neutrality adjustments below.

Section 1886(q) of the Act establishes the “Hospital Readmissions Reduction Program” effective for discharges from an “applicable hospital” beginning on or after October 1, 2012, under which payments to those hospitals under section 1886(d) of the Act are reduced to account for certain excess readmissions. Under the Hospital Readmissions Reduction Program, beginning with discharges on October 1, 2012, discharges from an “applicable hospital” are paid at an amount equal to the product of the “base operating DRG payment amount” and an “adjustment factor” that accounts for excess readmissions for the hospital for the fiscal year, plus any applicable add-on payments. We refer readers to section IV.H. of the preamble of this final rule for full details of our implementation of and FY 2015 policy changes to the Hospital Readmissions Reduction Program. We also note that the Hospital Readmissions Reduction Program provided for under section 1886(q) of the Act is not budget neutral.

Section 1886(o) of the Act requires the Secretary to establish a Hospital VBP Program under which, for discharges beginning on October 1, 2012, value-based incentive payments are made in a fiscal year to eligible subsection (d) hospitals that meet performance standards established for a performance period for that fiscal year. As specified under section 1886(o)(7)(B)(i) of the Act, these value-based incentive payments are funded by a reduction applied to each eligible hospital's base-operating DRG payment amount, for each discharge occurring in the fiscal year. As required by section 1886(o)(7)(A) of the Act, the total amount of allocated funds available for value-based incentive payments with respect to a fiscal year is equal to the total amount of base-operating DRG payment reductions, as estimated by the Secretary. In a given fiscal year, hospitals may earn a value-based incentive payment amount for a fiscal year that is greater than, equal to, or less than the reduction amount, based on their performance on quality measures under the Hospital VBP Program. Thus, the Hospital VBP Program is estimated to have no net effect on overall payments. We refer readers to section IV.I. of the preamble of this final rule for full details regarding the Hospital VBP Program.

Both the hospital readmissions payment adjustment (reduction) and the hospital VBP payment adjustment (redistribution) are applied on a claim-by-claim basis by adjusting, as applicable, the base-operating DRG payment amount for individual subsection (d) hospitals, which affects the overall sum of aggregate payments on each side of the comparison within the budget neutrality calculations. For example, when we calculate the budget neutrality factor for MS-DRG reclassification and recalibration of the relative weights, we compare aggregate payments estimated using the prior year's GROUPER and relative weights to estimated payments using the new GROUPER and relative weights. (We refer readers to section II.A.4.a. of this Addendum for full details.) Other factors, such as the DSH and IME payment adjustments, are the same on both sides of the comparison because we are only seeking to ensure that aggregate payments do not increase or decrease as a result of the changes of MS-DRG reclassification and recalibration.

In order to properly determine aggregate payments on each side of the comparison, as we did for FY 2014, for FY 2015 and subsequent years, we are continuing to apply the hospital readmissions payment adjustment and the hospital VBP payment adjustment on each side of the comparison, consistent with the methodology that we adopted in the FY 2013 IPPS/LTCH PPS final rule (77 FR 53687 through 53688). That is, we are applying the readmissions payment adjustment factor and the hospital VBP payment adjustment factor on both sides of our comparison of aggregate payments when determining all budget neutrality factors described in section II.A.4. of this Addendum.

For the purpose of calculating the FY 2015 readmissions payment adjustment factors, we are using excess readmission ratios and aggregate payments for excess readmissions based on admissions from the prior fiscal year's applicable period because hospitals have had the opportunity to review and correct these data before the data were made public under the policy we adopted regarding the reporting of hospital-specific readmission rates, consistent with section 1886(q)(6) of the Act. For FY 2015, in this final rule, we are calculating the readmissions payment adjustment factors using excess readmission ratios and aggregate payments for excess readmissions based on admissions from the finalized applicable period for FY 2015 as hospitals have had the opportunity to review and correct these data under our policy regarding the reporting of hospital-specific readmission rates consistent with section 1886(q)(6) of the Act. We discuss our policy regarding the reporting of hospital-specific readmission rates for FY 2015 in section IV.H.3.f. of the preamble of this final rule. (For additional information on our general policy for the reporting of hospital-specific readmission rates, consistent with section 1886(q)(6) of the Act, we refer readers to the FY 2013 IPPS/LTCH PPS final rule (77 FR 53399 through 53400).)

In addition, for FY 2015, in this final rule, for the purpose of modeling aggregate payments when determining all budget neutrality factors, we are using proxy hospital VBP payment adjustment factors for FY 2015 that are based on data from a historical period because hospitals have not yet had an opportunity to review and submit corrections for their data from the FY 2015 performance period. (For additional information on our policy regarding the review and correction of hospital-specific measure rates under the Hospital VBP Program, consistent with section 1886(o)(10)(A)(ii) of the Act, we refer readers to the FY 2013 IPPS/LTCH PPS final rule (77 FR 53578 through 53581), the CY 2012 OPPS/ASC final rule with comment period (76 FR 74544 through 74547), and the Hospital Inpatient VBP final rule (76 FR 26534 through 26536).)

The Affordable Care Act also established section 1886(r) of the Act, which modifies the methodology for computing the Medicare DSH payment adjustment beginning in FY 2014. Beginning in FY 2014, IPPS hospitals receiving Medicare DSH payment adjustments will receive an empirically justified Medicare DSH payment equal to 25 percent of the amount that would previously have been received under the current statutory formula set forth under section 1886(d)(5)(F) of the Act governing the Medicare DSH payment adjustment. In accordance with section 1886(r)(2) of the Act, the remaining amount, equal to an estimate of 75 percent of what otherwise would have been paid as Medicare DSH payments, reduced to reflect changes in the percentage of individuals under age 65 who are uninsured, will be available to make additional payments to Medicare DSH hospitals based on their share of the total amount of uncompensated care reported by Medicare DSH hospitals for a given time period. In order to properly determine aggregate payments on each side of the comparison for budget neutrality, prior to FY 2014, we included estimated Medicare DSH payments on both sides of our comparison of aggregate payments when determining all budget neutrality factors described in section II.A.4. of this Addendum.

To do this for FY 2015 and subsequent years (as we did for FY 2014), we are including estimated empirically justified Medicare DSH payments that will be paid in accordance with section 1886(r)(1) of the Act and estimates of the additional uncompensated care payments made to hospitals receiving Medicare DSH payment adjustments as described by section 1886(r)(2) of the Act. That is, we are considering estimated empirically justified Medicare DSH payments at 25 percent of what would otherwise have been paid, and also the estimated additional uncompensated care payments for hospitals receiving Medicare DSH payment adjustments on both sides of our comparison of aggregate payments when determining all budget neutrality factors described in section II.A.4. of this Addendum.
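
For illustration only, the 25 percent/75 percent Medicare DSH split described above can be sketched as follows; the dollar amounts, the uninsured reduction factor, and the hospital uncompensated care figures are all hypothetical assumptions made for the example.

    # Simplified sketch of the Medicare DSH / uncompensated care split; inputs are hypothetical.
    def uncompensated_care_payments(estimated_full_dsh_total, uninsured_reduction_factor, hospital_uc):
        """Distribute the uncompensated care pool (75 percent of estimated DSH payments,
        reduced to reflect the change in the uninsured rate) in proportion to each
        hospital's share of reported uncompensated care."""
        pool = 0.75 * estimated_full_dsh_total * uninsured_reduction_factor
        total_uc = sum(hospital_uc.values())
        return {hospital: pool * uc / total_uc for hospital, uc in hospital_uc.items()}

    # Empirically justified DSH payment for a single hypothetical hospital:
    empirically_justified = 0.25 * 4_000_000  # 25 percent of what would otherwise have been paid

    uc_payments = uncompensated_care_payments(
        estimated_full_dsh_total=12_000_000_000,  # hypothetical national estimate
        uninsured_reduction_factor=0.76,          # hypothetical reduction factor
        hospital_uc={"Hospital A": 25_000_000, "Hospital B": 10_000_000, "Hospital C": 5_000_000},
    )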

We note that, when calculating total payments for budget neutrality, to determine total payments for SCHs we model total hospital-specific rate payments and total Federal rate payments and then include whichever one of the total payments is greater. As discussed in section IV.F. of the preamble to this final rule and below, we are continuing the FY 2014 finalized methodology under which we will take into consideration uncompensated care payments in the comparison of payments under the Federal rate and the hospital-specific rate for SCHs. Therefore, we are including estimated uncompensated care payments in this comparison.

Similarly, for MDHs, as discussed in section IV. of the preamble to this final rule, when computing payments under the Federal national rate plus 75 percent of the difference between the payments under the Federal national rate and the payments under the updated hospital-specific rate, we are continuing to take into consideration uncompensated care payments in the computation of payments under the Federal rate and the hospital-specific rate for MDHs.

Also, for FY 2015, as of the time of development of this final rule, CMS has yet to finalize a list of hospitals that are not meaningful EHR users under section 1886(b)(3)(B)(ix) of the Act. Therefore, we are not including this adjustment to the standardized amount (for those hospitals that are not meaningful EHR users) in our modeling of aggregate payments for budget neutrality for FY 2015. CMS intends to release a final list of hospitals that are not meaningful EHR users in September 2014. Hospitals identified on this list will be paid based on the applicable standardized amount in Tables 1A and 1B for discharges occurring in FY 2015.

We finally note that the wage index value is calculated and assigned to a hospital based on the hospital's labor market area. Under section 1886(d)(3)(E) of the Act, beginning with FY 2005, we delineate hospital labor market areas based on the Core-Based Statistical Areas (CBSAs) established by the Office of Management and Budget (OMB). The current statistical areas used in FY 2014 are based on OMB standards published on December 27, 2000 (65 FR 82228) and Census 2000 data and Census Bureau population estimates for 2007 and 2008 (OMB Bulletin No. 10-02). For purposes of determining all of the FY 2014 budget neutrality factors, we determined aggregate payments on each side of the comparison for our budget neutrality calculations using wage indexes based on the current CBSAs.

As stated in the FY 2014 IPPS/LTCH PPS proposed rule (78 FR 27552) and final rule (78 FR 50586), on February 28, 2013, OMB issued OMB Bulletin No. 13-01, which established revised delineations for Metropolitan Statistical Areas, Micropolitan Statistical Areas, and Combined Statistical Areas, and provided guidance on the use of the delineations of these statistical areas. In order to implement these changes for the IPPS, it was necessary to identify the new OMB labor market area delineation for each county and hospital in the country. In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50586), we stated that we intended to propose changes to the wage index policy based on the new OMB delineations in the FY 2015 IPPS/LTCH PPS proposed rule. As discussed in section III. of the preamble of this final rule, as we proposed, we are adopting the new OMB labor market area delineations as described in the February 28, 2013 OMB Bulletin No. 13-01, effective for the FY 2015 IPPS wage index.

Consistent with our policy to adopt the new OMB delineations, in order to properly determine aggregate payments on each side of the comparison for our budget neutrality calculations, we are using wage indexes based on the new OMB delineations in the determination of all of the budget neutrality factors discussed below (with the exception of the transitional budget neutrality factor and outlier threshold as explained below). We also note that, consistent with past practice as finalized in the FY 2005 IPPS final rule (69 FR 49034), we are not adopting the new OMB delineations themselves in a budget neutral manner. We continue to believe that the revision to the labor market areas in and of itself does not constitute an “adjustment or update” to the adjustment for area wage differences, as provided under section 1886(d)(3)(E) of the Act.

a. Recalibration of MS-DRG Relative Weights and Updated Wage Index—Budget Neutrality Adjustment

Section 1886(d)(4)(C)(iii) of the Act specifies that, beginning in FY 1991, the annual DRG reclassification and recalibration of the relative weights must be made in a manner that ensures that aggregate payments to hospitals are not affected. As discussed in section II.H. of the preamble of this final rule, we normalized the recalibrated MS-DRG relative weights by an adjustment factor so that the average case relative weight after recalibration is equal to the average case relative weight prior to recalibration. However, equating the average case relative weight after recalibration to the average case relative weight before recalibration does not necessarily achieve budget neutrality with respect to aggregate payments to hospitals because payments to hospitals are affected by factors other than average case relative weight. Therefore, as we have done in past years, we are making a budget neutrality adjustment to ensure that the requirement of section 1886(d)(4)(C)(iii) of the Act is met.
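As an illustration of the normalization step described above, the following minimal sketch (in Python) uses hypothetical case counts and relative weights; it is not the CMS recalibration model.

```python
# Minimal sketch of relative-weight normalization (hypothetical data, not CMS's model).
# Each entry: (number of cases, relative weight) for an MS-DRG.
old_weights = [(1000, 0.9), (500, 1.8), (250, 3.2)]   # prior-year weights (hypothetical)
new_weights = [(1000, 0.95), (500, 1.7), (250, 3.4)]  # recalibrated weights (hypothetical)

def avg_case_weight(drgs):
    """Case-weighted average relative weight across MS-DRGs."""
    total_cases = sum(cases for cases, _ in drgs)
    return sum(cases * weight for cases, weight in drgs) / total_cases

# The normalization factor equates the average case weight after recalibration
# to the average case weight before recalibration.
normalization_factor = avg_case_weight(old_weights) / avg_case_weight(new_weights)
normalized = [(cases, weight * normalization_factor) for cases, weight in new_weights]

print(round(avg_case_weight(old_weights), 6), round(avg_case_weight(normalized), 6))
```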

Section 1886(d)(3)(E)(i) of the Act requires us to update the hospital wage index on an annual basis beginning October 1, 1993. This provision also requires us to make any updates or adjustments to the wage index in a manner that ensures that aggregate payments to hospitals are not affected by the change in the wage index. Section 1886(d)(3)(E)(i) of the Act requires that we implement the wage index adjustment in a budget neutral manner. However, section 1886(d)(3)(E)(ii) of the Act sets the labor-related share at 62 percent for hospitals with a wage index less than or equal to 1.0000, and section 1886(d)(3)(E)(i) of the Act provides that the Secretary shall calculate the budget neutrality adjustment for the adjustments or updates made under that provision as if section 1886(d)(3)(E)(ii) of the Act had not been enacted. In other words, this section of the statute requires that we implement the updates to the wage index in a budget neutral manner, but that our budget neutrality adjustment should not take into account the requirement that we set the labor-related share for hospitals with wage indexes less than or equal to 1.0000 at the more advantageous level of 62 percent. Therefore, for purposes of this budget neutrality adjustment, section 1886(d)(3)(E)(i) of the Act prohibits us from taking into account the fact that hospitals with a wage index less than or equal to 1.0000 are paid using a labor-related share of 62 percent. Consistent with current policy, for FY 2015, we are adjusting 100 percent of the wage index factor for occupational mix. We describe the occupational mix adjustment in section III.F. of the preamble of this final rule.

For FY 2015, to comply with the requirement that MS-DRG reclassification and recalibration of the relative weights be budget neutral for the Puerto Rico standardized amount and the hospital-specific rates, we used FY 2013 discharge data to simulate payments and compared the following:

  • Aggregate payments using the FY 2014 labor-related share percentages, the new OMB labor market area delineations for FY 2015, the FY 2014 relative weights, and the FY 2014 pre-reclassified wage data, and applied the FY 2015 hospital readmissions payment adjustments and estimated FY 2015 hospital VBP payment adjustments; and
  • Aggregate payments using the FY 2014 labor-related share percentages, the new OMB labor market area delineations for FY 2015, the FY 2015 relative weights, and the FY 2014 pre-reclassified wage data, and applied the same hospital readmissions payment adjustments and estimated hospital VBP payment adjustments applied above.

Based on this comparison, we computed a budget neutrality adjustment factor equal to 0.997543. As discussed in section IV. of this Addendum, we also are applying the MS-DRG reclassification and recalibration budget neutrality factor of 0.997543 to the hospital-specific rates that are effective for cost reporting periods beginning on or after October 1, 2014.
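As a schematic illustration only, the sketch below shows the general form of the comparison described above: the budget neutrality adjustment factor is the ratio of aggregate simulated payments on the baseline side of the comparison to aggregate simulated payments on the updated side. The payment totals shown are hypothetical and are not drawn from the CMS payment simulation.

```python
# Schematic illustration of a budget neutrality factor (hypothetical totals, not CMS data).
# Aggregate FY 2013 case payments simulated with the prior year's relative weights (baseline side).
aggregate_payments_prior_weights = 105_000_000_000.0
# Aggregate FY 2013 case payments simulated with the recalibrated relative weights.
aggregate_payments_recalibrated_weights = 105_258_700_000.0

# The factor scales the rates so that aggregate payments are unchanged by recalibration.
budget_neutrality_factor = (aggregate_payments_prior_weights
                            / aggregate_payments_recalibrated_weights)
print(round(budget_neutrality_factor, 6))  # a value near 1.0 (about 0.997542 for these placeholder totals)
```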

Comment: Several commenters stated that CMS miscalculated the MS-DRG reclassification and recalibration budget neutrality adjustment factor presented in the proposed rule. The commenters noted that the budget neutrality adjustment factor of 0.992938 presented in the proposed rule was much lower than historical levels. The commenters also noted that, for the last 5 years, the budget neutrality adjustment factor has been between 0.996731 (FY 2011) and 0.998431 (FY 2013). In addition, the commenters informed CMS that they attempted to replicate the calculation of this budget neutrality adjustment factor, but were unable to do so. The commenters added that in May of 2014, CMS posted a revised set of MS-DRG relative weights on the CMS Web site via the Internet because a number of postacute care transfer-adjusted cases for certain MS-DRGs presented in the FY 2015 IPPS/LTCH PPS proposed rule were inadvertently miscalculated. However, the commenters stated that they were still not able to verify the budget neutrality adjustment factor using the updated MS-DRG relative weights. The commenters stated that, by using the revised MS-DRGs, they calculated a revised budget neutrality adjustment factor of 1.000301. The commenters recommended that CMS examine the calculation of the budget neutrality adjustment factor and, if necessary, revise the budget neutrality adjustment factor for the FY 2015 IPPS/LTCH PPS final rule.

One commenter recommended that CMS perform an analysis to confirm whether the proposed MS-DRG reclassification and recalibration budget neutrality adjustment factor is accurate and correct. The commenter also recommended that CMS discuss the results of its analysis in the final rule and afford interested parties a further opportunity to review and comment on the final budget neutrality adjustment factor before it becomes effective on October 1, 2014.

Response: We appreciate the commenters' input. As the commenters requested, we examined the calculation of the budget neutrality adjustment factor presented in the proposed rule. We agree with the commenters that the MS-DRG reclassification and recalibration budget neutrality adjustment factor was calculated incorrectly during the development of the proposed rule due to the inadvertent miscalculation of a number of postacute care transfer-adjusted cases for certain MS-DRGs. Using the updated MS-DRG relative weights, we calculated a revised proposed budget neutrality adjustment factor similar to the factor calculated by the commenters. For FY 2015, in this final rule, using accurate postacute care transfer-adjusted cases for these MS-DRGs, we have calculated an MS-DRG reclassification and recalibration budget neutrality factor of 0.997543, which is consistent with historical levels.

In response to the commenters' concerns regarding verifying the accuracy of the budget neutrality adjustment factor, we announced through information posted via the Internet on the CMS Web site at http://www.cms.gov/Medicare/Medicare-Fee-for-Service-Payment/AcuteInpatientPPS/FY2015-IPPS-Proposed-Rule-Home-Page-Items/FY2015-IPPS-Proposed-Rule-Data-Files.html that there was an inadvertent miscalculation of a number of postacute care transfer-adjusted cases for certain MS-DRGs. Therefore, after the publication of the FY 2015 IPPS/LTCH PPS proposed rule, we also posted via the Internet on the CMS Web site a revised table of the proposed MS-DRG relative weights for FY 2015. It is our goal to strive for accuracy in regard to our adjustment factor calculations, and we appreciate that the commenters recognized the mistake and pointed out the effects of the miscalculation during the comment period. However, we believe that the 60-day comment period affords the public an appropriate opportunity to review and comment on all of the proposals presented throughout the entire FY 2015 IPPS/LTCH PPS proposed rule. We are not changing our proposed policy in calculating this budget neutrality adjustment, but rather are using corrected information. Therefore, we do not believe that an additional opportunity for comment is necessary.

Comment: The commenter also noted that CMS did not explicitly state which labor-related share percentages were used in the calculation of the MS-DRG reclassification and recalibration budget neutrality adjustment factor. In addition, the commenter did not believe that it was appropriate to use the new OMB delineations in the calculation of the MS-DRG reclassification and recalibration budget neutrality adjustment factor. The commenter requested that CMS address why it is appropriate to apply the new OMB delineations in the MS-DRG reclassification and recalibration budget neutrality adjustment factor and how and whether the new OMB delineations impact the calculation of the final budget neutrality adjustment factor. The commenter also requested that CMS identify which labor-related share percentages were used in each component of the payment simulation model used to calculate the final budget neutrality adjustment factor.

Response: As discussed in section III.B.(2)(e)(6) of the preamble of this final rule and consistent with past practice (69 FR 49034), we are not adopting the new OMB delineations, in and of themselves, in a budget neutral manner. However, we are adopting the transitional policies we have effectuated in a budget neutral manner as we describe below. We do not believe that the revision to the labor market areas in and of itself constitutes an “adjustment or update” to the adjustment for area wage differences, as provided under section 1886(d)(3)(E) of the Act. Therefore, the new OMB delineations did not impact the calculation of the final budget neutrality adjustment factor. Also, as stated in the FY 2015 IPPS/LTCH PPS proposed rule and above, consistent with our policy to adopt the new OMB delineations, in order to properly determine aggregate payments on each side of the comparison for our budget neutrality adjustment factor calculations, we are using wage indexes based on the new OMB delineations in the determination of all of the budget neutrality adjustment factors discussed below (with the exception of the transitional budget neutrality factor and outlier fixed-loss threshold as explained below).

We acknowledge that the proposed rule did not identify the labor-related share percentages used in the calculation of the proposed MS-DRG reclassification and recalibration budget neutrality adjustment factor. For FY 2015, in this final rule, as requested by the commenters and in response to the public comments we received, we identify in the discussion below the labor-related share percentages used in the calculation of the budget neutrality adjustment factor, which are the same labor-related share percentages used for the proposed rule.

In order to meet the statutory requirement that we not take into account the labor-related share of 62 percent when computing the wage index budget neutrality adjustment factor, it was necessary to use a three-step process to comply with the requirements that MS-DRG reclassification and recalibration of the relative weights and the updated wage index and labor-related share have no effect on aggregate payments for IPPS hospitals. Under the first step, we determined an MS-DRG reclassification and recalibration budget neutrality adjustment factor of 0.997543 (by using the same methodology described above to determine the MS-DRG reclassification and recalibration budget neutrality factor for the Puerto Rico standardized amount and hospital-specific rates). Under the second step, to compute a budget neutrality adjustment factor for wage index and labor-related share percentage changes, we used FY 2013 discharge data to simulate payments and compared the following:

  • Aggregate payments using the new OMB labor market area delineations for FY 2015, FY 2015 relative weights and the FY 2014 pre-reclassified wage indexes, applied the FY 2014 labor-related share of 69.6 percent to all hospitals (regardless of whether the hospital's wage index was above or below 1.0000), and applied the FY 2015 hospital readmissions payment adjustment and the FY 2015 estimated hospital VBP payment adjustment; and
  • Aggregate payments using the new OMB labor market area delineations for FY 2015, FY 2015 relative weights and the FY 2015 pre-reclassified wage indexes, applied the labor-related share for FY 2015 of 69.6 percent to all hospitals (regardless of whether the hospital's wage index was above or below 1.0000), and applied the same FY 2015 hospital readmissions payment adjustments and estimated FY 2015 hospital VBP payment adjustments applied above.

In addition, we applied the MS-DRG reclassification and recalibration budget neutrality adjustment factor (derived in the first step) to the payment rates that were used to simulate payments for this comparison of aggregate payments from FY 2014 to FY 2015. By applying this methodology, we determined a budget neutrality adjustment factor of 1.001443 for changes to the wage index. Finally, we multiplied the MS-DRG reclassification and recalibration budget neutrality adjustment factor of 0.997543 (derived in the first step) by the budget neutrality adjustment factor of 1.001443 for changes to the wage index (derived in the second step) to determine the MS-DRG reclassification and recalibration and updated wage index budget neutrality adjustment factor of 0.998982.
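The chaining of the two factors described above is simple multiplication; the brief check below reproduces it using the factors stated in this final rule.

```python
# Check of the chained budget neutrality factors stated above.
msdrg_recalibration_factor = 0.997543   # step one
wage_index_update_factor = 1.001443     # step two
combined_factor = msdrg_recalibration_factor * wage_index_update_factor
print(round(combined_factor, 6))  # 0.998982, the combined factor applied to the standardized amount
```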

b. Reclassified Hospitals—Budget Neutrality Adjustment

Section 1886(d)(8)(B) of the Act provides that certain rural hospitals are deemed urban. In addition, section 1886(d)(10) of the Act provides for the reclassification of hospitals based on determinations by the MGCRB. Under section 1886(d)(10) of the Act, a hospital may be reclassified for purposes of the wage index.

Under section 1886(d)(8)(D) of the Act, the Secretary is required to adjust the standardized amount to ensure that aggregate payments under the IPPS after implementation of the provisions of sections 1886(d)(8)(B) and (C) and 1886(d)(10) of the Act are equal to the aggregate prospective payments that would have been made absent these provisions. We note that the wage index adjustments provided for under section 1886(d)(13) of the Act are not budget neutral. Section 1886(d)(13)(H) of the Act provides that any increase in a wage index under section 1886(d)(13) shall not be taken into account in “applying any budget neutrality adjustment with respect to such index” under section 1886(d)(8)(D) of the Act. To calculate the budget neutrality adjustment factor for FY 2015, we used FY 2013 discharge data to simulate payments and compared the following:

  • Aggregate payments using the FY 2014 labor-related share percentages, the new OMB labor market area delineations for FY 2015, FY 2015 relative weights, and FY 2015 wage data prior to any reclassifications under sections 1886(d)(8)(B) and (C) and 1886(d)(10) of the Act, and applied the FY 2015 hospital readmissions payment adjustments and the estimated FY 2015 hospital VBP payment adjustments; and
  • Aggregate payments using the FY 2014 labor-related share percentages, the new OMB labor market area delineations for FY 2015, FY 2015 relative weights, and FY 2015 wage data after such reclassifications, and applied the same hospital readmissions payment adjustments and the estimated hospital VBP payment adjustments applied above.

We note that the reclassifications applied under the second simulation and comparison are those listed in Tables 9A2 and 9C2, which are posted on the CMS Web site. These tables reflect reclassification crosswalks based on the new OMB labor market area delineations for FY 2015, and apply the policies explained in section III. of the preamble to this final rule. Based on these simulations, we calculated a budget neutrality adjustment factor of 0.990406 to ensure that the effects of these provisions are budget neutral, consistent with the statute.

The FY 2015 budget neutrality adjustment factor was applied to the standardized amount after removing the effects of the FY 2014 budget neutrality adjustment factor. We note that the FY 2015 budget neutrality adjustment reflects FY 2015 wage index reclassifications approved by the MGCRB or the Administrator.

c. Rural Floor Budget Neutrality Adjustment

Under § 412.64(e)(4), we make an adjustment to the wage index to ensure that aggregate payments after implementation of the rural floor under section 4410 of the BBA (Pub. L. 105-33) and the imputed floor under § 412.64(h)(4) are equal to the aggregate prospective payments that would have been made in the absence of such provisions. Consistent with section 3141 of the Affordable Care Act and as discussed in section III.G. of the preamble of this final rule and codified at § 412.64(e)(4)(ii), the budget neutrality adjustment for the rural and imputed floor is a national adjustment to the wage index.

As noted above and as discussed in section III.G.2.b. of the preamble of this final rule, in the FY 2012 IPPS/LTCH PPS final rule, we extended the imputed floor calculated under the original methodology through FY 2013 (76 FR 51594). In the FY 2013 IPPS/LTCH PPS final rule, we established an alternative methodology for calculating the imputed floor and established a policy that the minimum wage index value for an all-urban state would be the higher of the value determined under the original methodology or the value computed using the alternative methodology (77 FR 53368 through 53369). Consistent with the methodology for treating the imputed floor, similar to the methodology we used in the FY 2013 IPPS/LTCH PPS final rule (77 FR 53368 through 53369), we included this alternative methodology for computing the imputed floor index in the calculation of the uniform, national rural floor budget neutrality adjustment for FY 2014. For FY 2015, as discussed in section III.G.2.b. of the preamble of this final rule, we are extending the imputed floor using the higher of the value determined under the original methodology or the alternative methodology for FY 2015. Therefore, in order to ensure that aggregate payments to hospitals are not affected, similar to prior years, we will follow our policy of including the imputed floor in the rural floor budget neutrality adjustment to the wage index.

As discussed above, for FY 2015, we are implementing the new OMB delineations as described in the February 28, 2013 OMB Bulletin No. 13-01, effective for the FY 2015 IPPS wage index. Therefore, the budget neutrality adjustment for the rural floor and imputed floor will be calculated using the new OMB delineations.

Under the OMB delineations used for FY 2014, the imputed floor (both the original methodology and alternative methodology) was applied to New Jersey and Rhode Island because these were the only two all-urban States. Under the new OMB delineations, which are based on 2010 Census data, Delaware becomes an all-urban State in addition to New Jersey and Rhode Island. Therefore, for FY 2015, the imputed floor will be applied to the wage index for hospitals located in New Jersey, Rhode Island, and Delaware.

Similar to our calculation in the FY 2012 IPPS/LTCH PPS final rule (76 FR 51593 and 51788), the FY 2013 IPPS/LTCH PPS final rule (77 FR 53689), and the FY 2014 IPPS/LTCH PPS final rule (78 FR 50975 through 50976), for FY 2015, we are calculating a national rural Puerto Rico wage index (used to adjust the labor-related share of the national standardized amount for hospitals located in Puerto Rico which receive 75 percent of the national standardized amount) and a rural Puerto Rico-specific wage index (which is used to adjust the labor-related share of the Puerto Rico-specific standardized amount for hospitals located in Puerto Rico that receive 25 percent of the Puerto Rico-specific standardized amount). Because there are no rural Puerto Rico hospitals with established wage data, our calculation of the FY 2015 rural Puerto Rico wage index is based on the policy adopted in the FY 2008 IPPS final rule with comment period (72 FR 47323). That is, we will use the unweighted average of the wage indexes from all CBSAs (urban areas) that are contiguous to (that is, share a border with) the rural counties to compute the rural floor (72 FR 47323; 76 FR 51594). Under the new OMB labor market area delineations, except for Arecibo, Puerto Rico (CBSA 11640), all other Puerto Rico urban areas are contiguous to a rural area. Therefore, based on our existing policy, the FY 2015 rural Puerto Rico wage index is calculated based on the average of the FY 2015 wage indexes for the following urban areas: Aguadilla-Isabela, PR (CBSA 10380); Guayama, PR (CBSA 25020); Mayaguez, PR (CBSA 32420); Ponce, PR (CBSA 38660); San German, PR (CBSA 41900); and San Juan-Carolina-Caguas, PR (CBSA 41980).
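As an illustration of the unweighted-average computation described above, the sketch below averages placeholder wage index values for the listed CBSAs; the values shown are hypothetical, not the FY 2015 wage indexes for these areas.

```python
# Unweighted average of the wage indexes for the urban CBSAs contiguous to rural
# Puerto Rico counties (the wage index values below are hypothetical placeholders).
# CBSA 11640 (Arecibo, PR) is excluded because it is not contiguous to a rural area.
contiguous_urban_wage_indexes = {
    "10380 Aguadilla-Isabela, PR": 0.40,
    "25020 Guayama, PR": 0.35,
    "32420 Mayaguez, PR": 0.38,
    "38660 Ponce, PR": 0.36,
    "41900 San German, PR": 0.37,
    "41980 San Juan-Carolina-Caguas, PR": 0.45,
}
rural_puerto_rico_wage_index = (sum(contiguous_urban_wage_indexes.values())
                                / len(contiguous_urban_wage_indexes))
print(round(rural_puerto_rico_wage_index, 4))
```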

To calculate the national rural floor and imputed floor budget neutrality adjustment factors and the Puerto Rico-specific rural floor budget neutrality adjustment factor, we used FY 2013 discharge data to simulate payments, the FY 2015 new OMB labor market area delineations, and post-reclassified national and Puerto Rico-specific wage indexes and compared the following:

  • The national and Puerto Rico-specific simulated payments without the national rural floor and imputed floor and Puerto Rico-specific rural floor applied; and
  • The national and Puerto Rico-specific simulated payments with the national rural floor and imputed floor and Puerto Rico-specific rural floor applied.

Based on this comparison, we determined a national rural floor budget neutrality adjustment factor of 0.989507 and a Puerto Rico-specific rural floor budget neutrality adjustment factor of 0.991291. The national adjustment was applied to the national wage indexes to produce a national rural floor budget neutral wage index and the Puerto Rico-specific adjustment was applied to the Puerto Rico-specific wage indexes to produce a Puerto Rico-specific rural floor budget neutral wage index.
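As a simplified illustration of how the rural floor and the national rural floor budget neutrality adjustment operate on the wage index, the sketch below combines the national adjustment factor of 0.989507 determined above with the Massachusetts rural floor value of 1.3477 from the example discussed later in this section; the other values (the hypothetical California floor and the area wage indexes) are placeholders, and the sketch omits the imputed floor and the treatment of reclassified hospitals discussed below.

```python
# Hypothetical illustration of applying the rural floor and the national rural floor
# budget neutrality adjustment to the wage index (not actual FY 2015 payment data).
state_rural_floor = {"MA": 1.3477, "CA": 1.2034}   # MA value from this rule; CA value hypothetical
national_rural_floor_bn_factor = 0.989507          # national adjustment determined above

def floor_adjusted_wage_index(area_wage_index, state):
    """Raise the wage index to the State rural floor, then apply the national adjustment."""
    floored = max(area_wage_index, state_rural_floor[state])
    return floored * national_rural_floor_bn_factor

print(round(floor_adjusted_wage_index(1.2318, "MA"), 4))  # raised to 1.3477, then adjusted
print(round(floor_adjusted_wage_index(1.2500, "CA"), 4))  # above the hypothetical CA floor; only adjusted
```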

Comment: Many commenters opposed the continued application of a nationwide rural floor budget neutrality adjustment. Some commenters noted that under the current rural floor policy, all hospitals located in Massachusetts are eligible for the rural floor wage index as a result of one rural hospital, which resulted in an approximate 4.9 percent increase in payments for hospitals located in Massachusetts and creates a disparity when considering the wage index of other hospitals around the country. The commenters also noted that under the rural floor policy, hospitals located in California will also receive an increase in payments of approximately $196 million as a result of the application of the rural floor policy. The commenters stated that the adverse consequences of applying a nationwide rural floor budget neutrality adjustment have been recognized by CMS, MedPAC, and many others over the past several years. The commenters believed that the Medicare wage index system cannot accomplish its objective of ensuring that payments for the wage component of labor accurately reflect actual wage costs until this policy is corrected. Other commenters recommended that CMS consider applying the rural floor budget neutrality adjustment through a State-specific budget neutrality adjustment factor, as CMS has previously applied.

Response: We appreciate the commenters' input and thank them for informing us of their concerns. Section 3141 of Public Law 111-148 requires that a national budget neutrality adjustment be applied in implementing the rural floor policy. Therefore, absent a legislative change enacted by Congress, we are unable to change the rural floor budget neutrality adjustment from a national to a State-specific adjustment.

Comment: Some commenters recommended that CMS consider implementing a policy under the IPPS and the OPPS that would result in only hospitals located in rural areas being included in the statewide rural floor wage index used for urban hospitals located in areas with wage indexes that are lower than the statewide rural wage index. The commenters believed that such a policy would prevent urban hospitals from reclassifying to rural status simply to improve the rural wage index, which might be used as a floor for urban hospitals located in areas of a State that have lower wage index values. The commenters added that they believed that CMS has the regulatory authority to make such a policy change without the enactment of Congressional legislation.

Another commenter recommended that, for FY 2015, CMS require States to have at least 5 percent of their PPS hospitals physically located in rural areas as a prerequisite for establishing a rural floor wage index for each State. The commenter believed that this would ensure the original intent of the rural floor policy, which is to serve as an equalizer, and would protect the policy from being used as a manipulation tool that allows a handful of hospitals in one isolated area of the State to dictate the wage index for a major Metropolitan area.

In addition, the same commenter urged CMS to create a national hospital wage index floor of 0.91. The commenter explained that this would reduce current disparities between hospitals. The commenter also stated that the purpose of the hospital area wage index is to fairly account for labor costs incurred by providers, and not to reward “winners” or punish “losers” as a result of reclassifications and a proliferation of other modifications. The commenter further noted that it recognized that there is growing interest from MedPAC and others regarding revising the hospital area wage index system, but acknowledged that such revisions take time. Therefore, the commenter believed that a hospital rural floor wage index is appropriate until CMS creates a system that better reflects the realities of today's healthcare system and levels the playing field for all Medicare providers.

Response: We appreciate the commenters' input. We did not make any proposals to change the rural floor wage index policy. Any changes to this policy would first need to be proposed through rulemaking. Consequently, we are not making any changes to address the commenters' concerns at this time. With respect to the commenter who recommended that CMS establish a national hospital wage index floor of 0.91, we do not believe that there is any statistical basis to support this calculation. In addition, it is unclear to us how such a wage index floor policy could be implemented, nor do we believe that this suggestion meets the requirement of the statute. With respect to the other commenters' suggestions, we first need to determine if the revised policy that the commenters suggested would be inconsistent with any longstanding policy or statutory requirement. We will consider the commenters' suggestions in future rulemaking.

Comment: One commenter requested that CMS provide an updated, detailed, State-specific analysis of the effect of a nationwide rural floor budget neutrality adjustment. The commenter specifically noted the estimated “windfall” expected to be received by hospitals located in Massachusetts as a result of the rural floor policy, and requested that CMS provide data and additional analysis of the impacts of a national rural floor budget neutrality adjustment. In addition, commenters questioned whether the addition of one rural hospital located in Franklin County, Massachusetts reduced the impact of the Massachusetts rural floor wage index from FY 2014 to FY 2015.

Response: We have provided an updated State-specific analysis of the effect of the rural floor budget neutrality adjustment in Appendix A of the Addendum to this final rule. We also discuss in Appendix A to this final rule the increase in payments the hospitals in Massachusetts are expected to receive as a result of the rural floor wage index policy.

We discuss below the reduced impact of the rural floor wage index policy for hospitals located in Massachusetts from FY 2014 to FY 2015. In FY 2014, CMS calculated that 60 hospitals would benefit from the Massachusetts rural floor wage index, resulting in an estimated $167.6 million being received by hospitals located in Massachusetts as a result of the national rural floor budget neutrality adjustment. In FY 2015, fewer hospitals located in Massachusetts (51) have been identified as benefitting from the rural floor wage index, and the fiscal impact of the rural floor budget neutrality adjustment has been reduced. Below we explain why nine providers (60 minus 51) received the Massachusetts rural floor wage index in FY 2014, but not in FY 2015.

The commenters are correct that the addition of one rural hospital located in Franklin County, Massachusetts reduced the impact of the rural floor wage index in FY 2015, as compared to the impact of the rural floor wage index in FY 2014. To further clarify, in FY 2014, there was only one hospital in Massachusetts geographically located in a rural area (in Nantucket County). Therefore, the Massachusetts pre- and post-reclassified rural wage index used in the calculation of the reclassification budget neutrality adjustment, and in the application of the rural floor budget neutrality adjustment, was established based on wage data from that one hospital located in Nantucket County, Massachusetts. For FY 2015, another hospital, which was located in an area defined as “urban” under the current delineations, is now considered to be geographically located in a rural area under the new OMB delineations. Specifically, this hospital is located in Franklin County, Massachusetts, which is no longer considered to be part of CBSA 44140 (Springfield, MA) under the new OMB delineations and is now considered to be geographically rural. However, under the new OMB delineations, Franklin County meets the requirements under section 1886(d)(8)(B) of the Act for reclassification. Therefore, in FY 2015, any hospital located within Franklin County is deemed to be located in an “urban” labor market area (that is, such hospitals are considered “Lugar” hospitals). The FY 2015 Massachusetts pre-reclassified rural wage index, which is used in the calculation of the reclassification budget neutrality adjustment, is calculated based on the wage data of the two hospitals geographically located in rural areas (one in Franklin County and one in Nantucket County). The average hourly wage of the Franklin County hospital is lower than the average hourly wage of the Nantucket County hospital, lowering the pre-reclassified rural wage index for FY 2015 relative to FY 2014.

With respect to budget neutrality, as described earlier in this Addendum, we first calculate and apply the MS-DRG and wage index budget neutrality adjustment, then the reclassification budget neutrality adjustment, and then the rural floor budget neutrality adjustment. This analysis focuses on the reclassification and rural floor budget neutrality adjustments and applies the requirement of section 1886(d)(8)(C)(iii) of the Act, which specifies that an area's post-reclassified wage index (without application of the rural floor budget neutrality adjustment) may not be reduced below the State's post-reclassified rural wage index value (without application of the rural floor budget neutrality adjustment), as a result of reclassification. As stated in the FY 1992 IPPS final rule (56 FR 43220 through 43221), if reclassification (either to or from an area) would lower an area's post-reclassified wage index (without application of the rural floor budget neutrality adjustment) below the State's post-reclassified rural wage index (without application of the rural floor budget neutrality adjustment), CMS assigns those areas the post-reclassified rural wage index value for that State (without application of the rural floor budget neutrality adjustment). For this to occur, the area's pre-reclassified wage index value must be greater than or equal to the State's pre-reclassified rural wage index value prior to calculating the effects of the reclassification budget neutrality adjustment.

As discussed above in section II.A.4.b. of this Addendum regarding the reclassification budget neutrality adjustment, to ensure that the effects of applying sections 1886(d)(8)(B) and (C) and 1886(d)(10) of the Act are budget neutral, we compare FY 2015 wage data prior to any reclassifications under sections 1886(d)(8)(B) and (C) and 1886(d)(10) of the Act (that is, pre-reclassified wage data) to FY 2015 wage data after such reclassifications (that is, the post-reclassified wage data). Specifically, we compared the Massachusetts pre-reclassified rural wage index (Column C in the table below) to the pre-reclassified area wage index (Column B in the table below). (We note that the Massachusetts pre-reclassified rural wage index is comprised from the wage data of two rural hospitals, one located in Franklin County, Massachusetts and one located in Nantucket County, Massachusetts.) If a hospital's pre-reclassified area wage index (Column B in the table below) is greater than or equal to the Massachusetts pre-reclassified rural wage index (Column C in the table below), then we compare the Massachusetts post-reclassified rural wage index (Column F in the table below, which is based only on the wage data from one rural hospital in Nantucket County, and does not include the hospital located in Franklin County because it has been reclassified as an urban Lugar hospital) to the post-reclassified area wage index (Column E in the table below). For hospitals that receive reclassification in FY 2015, if the hospital's post-reclassified area wage index (Column E in the table below) is less than the Massachusetts post-reclassified rural wage index (Column F in the table below, which is based on the wage data from one rural hospital located in Nantucket County), then we assign the hospital the Massachusetts post-reclassified rural wage index (Column F in the table below) prior to application of the rural floor budget neutrality adjustment. The nine hospitals were reclassified for FY 2015, and their post-reclassified area wage index (Column E in the table below) is less than the Massachusetts post-reclassified rural wage index (Column F in the table below). Therefore, although there are other hospitals located in Massachusetts that also have been reclassified, only the nine hospitals meet both conditions and are being assigned the Massachusetts post-reclassified rural wage index (without application of the rural floor budget neutrality adjustment).

Specifically, when we compared the Massachusetts pre-reclassified wage index to the Massachusetts post-reclassified wage index in the calculation of the reclassification budget neutrality adjustment, the area's pre-reclassified wage index value for the nine hospitals is greater than or equal to the Massachusetts pre-reclassified rural wage index value of 1.1447 (which is calculated based on the wage data from the two rural hospitals). After application of the reclassifications, the area's post-reclassified wage index value for these nine hospitals is lower than the Massachusetts post-reclassified rural wage index value of 1.3477 (which only includes wage data from one rural hospital located in Nantucket County, Massachusetts). Therefore, in accordance with our reclassification hold-harmless methodology, these nine hospitals are assigned the Massachusetts post-reclassified rural wage index value of 1.3477 within the calculation of the reclassification budget neutrality adjustment, prior to the calculation and application of the rural floor budget neutrality adjustment. The impact of this increase in payments (Column B compared to Column F for the nine hospitals) is factored into the reclassification budget neutrality adjustment factor, which is applied to the standardized amount. The table below illustrates the various wage indexes in each step of the process described above and why these nine hospitals were assigned the Massachusetts post-reclassified rural wage index prior to the application of the rural floor budget neutrality adjustment.

Provider | (A) Pre-reclassified CBSA | (B) Area pre-reclassified wage index | (C) Massachusetts pre-reclassified rural wage index | (D) Post-reclassified CBSA | (E) Area post-reclassified wage index | (F) Massachusetts post-reclassified rural wage index
220001 | 49340 | 1.1728 | 1.1447 | 14454 | 1.2318 | 1.3477
220016 | 22 | 1.1447 | 1.1447 | 44140 | 1.0379 | 1.3477
220019 | 49340 | 1.1728 | 1.1447 | 14454 | 1.2318 | 1.3477
220058 | 49340 | 1.1728 | 1.1447 | 14454 | 1.2318 | 1.3477
220062 | 49340 | 1.1728 | 1.1447 | 14454 | 1.2318 | 1.3477
220090 | 49340 | 1.1728 | 1.1447 | 14454 | 1.2318 | 1.3477
220095 | 49340 | 1.1728 | 1.1447 | 14454 | 1.2318 | 1.3477
220163 | 49340 | 1.1728 | 1.1447 | 14454 | 1.2318 | 1.3477
220176 | 49340 | 1.1728 | 1.1447 | 14454 | 1.2318 | 1.3477
Note: All wage indexes in this table do not include application of the rural floor budget neutrality adjustment.

The next step in the sequence of our calculation of the budget neutrality adjustment factor is to calculate the rural floor budget neutrality adjustment, which is applied to the wage index. For the 51 hospitals located in Massachusetts, their post-reclassified area wage index is compared to the Massachusetts rural floor wage index (which is based on the wage data of the one rural hospital located in Nantucket County). Because their post-reclassified area wage index value is lower than the Massachusetts rural floor wage index value, the hospitals are assigned the Massachusetts rural floor wage index value of 1.3477. Therefore, a rural floor budget neutrality adjustment factor is applied to the wage indexes of the 51 hospitals to account for the increase in payments as a result of the application of the rural floor wage index policy. However, with regard to the nine reclassified hospitals, they have already been assigned a post-reclassified wage index value of 1.3477, which is equal to the Massachusetts rural wage index. Accordingly, there is no need to make any further adjustments to ensure budget neutrality. As a result, the nine hospitals are excluded from and have no effect on the rural floor budget neutrality adjustment for FY 2015, and the impact of the FY 2015 rural floor budget neutrality adjustment for Massachusetts is lower than that of the FY 2014 rural floor budget neutrality adjustment. Thus, while the overall impact of the rural floor budget neutrality adjustment has decreased for hospitals located in Massachusetts in FY 2015, the same total number of hospitals (60, that is, the 51 hospitals receiving the rural floor plus the 9 reclassified hospitals) will still receive the Massachusetts rural wage index value in FY 2015 (determined by using the wage data from one rural hospital located in Nantucket County, Massachusetts), albeit under two different policies.
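To make the two-part hold-harmless test described above concrete, the simplified sketch below applies it to the values shown in the table (a pre-reclassified area wage index of 1.1728, a Massachusetts pre-reclassified rural value of 1.1447, a post-reclassified area wage index of 1.2318, and a Massachusetts post-reclassified rural value of 1.3477). It is a simplified restatement of the rule described above, not the full payment simulation.

```python
# Simplified restatement of the reclassification hold-harmless test described above.
# Applies only to hospitals that are reclassified for FY 2015.
def assigned_post_reclassified_wage_index(area_pre, state_rural_pre,
                                          area_post, state_rural_post):
    """Assign the State post-reclassified rural wage index when both conditions hold."""
    if area_pre >= state_rural_pre and area_post < state_rural_post:
        return state_rural_post
    return area_post

# Values from the table above for one of the nine reclassified Massachusetts hospitals.
print(assigned_post_reclassified_wage_index(1.1728, 1.1447, 1.2318, 1.3477))  # 1.3477
```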

d. Wage Index Transition Budget Neutrality

As discussed in section III. of the preamble of this final rule, in the past, we have provided for transition periods when adopting changes that have significant payment implications, particularly large negative impacts.

Similar to FY 2005, for FY 2015, we have determined that the transition to using the new OMB delineations will have the largest impact on hospitals that are currently located in an urban county that became rural under the new OMB delineations. To alleviate the decreased payments associated with having a rural wage index, in calculating the area wage index, similar to the transition provided in the FY 2005 IPPS final rule, we are finalizing, as we proposed, a policy to generally assign these counties, for FYs 2015, 2016, and 2017, the urban wage index value of the CBSA in which they were physically located for FY 2014.

In addition to the 3-year transition period for hospitals being transitioned from urban to rural status as discussed above, we are finalizing, as we proposed, a 1-year blended wage index transitional policy for all hospitals that will experience any decrease in their wage index value (that is, a hospital's actual wage index value used for payment, which accounts for all applicable effects of reclassification and redesignation) exclusively as a result of the implementation of the new OMB delineations. Similar to the policy adopted in the FY 2005 IPPS final rule (69 FR 49033), a post-reclassified wage index with the rural and imputed floor applied is computed based on the hospital's FY 2014 CBSA (that is, using all of its FY 2014 constituent county/ies), and another post-reclassified wage index with the rural and imputed floor applied is computed based on the hospital's new FY 2015 CBSA (that is, the FY 2015 constituent county/ies). We then compare these two wage indexes. If the FY 2015 wage index using the FY 2015 CBSAs is lower than the FY 2015 wage index using the FY 2014 CBSAs, we compute a blended wage index consisting of 50 percent of each of the two wage indexes added together. This blended wage index is the hospital's wage index for FY 2015. Hospitals that benefit from the adoption of the new OMB delineations are assigned their new wage index based on the new OMB delineations. We refer readers to section III. of the preamble of this final rule for a complete discussion of the transitional wage index policy.
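A minimal sketch of the 1-year 50/50 blend described above follows, using hypothetical wage index values; the blend applies only when the wage index computed under the FY 2015 delineations is lower than the wage index computed under the FY 2014 delineations.

```python
# Hypothetical illustration of the FY 2015 transitional 50/50 blended wage index.
def fy2015_transitional_wage_index(wi_using_fy2014_cbsa, wi_using_fy2015_cbsa):
    """Blend only when the new OMB delineations would lower the hospital's wage index."""
    if wi_using_fy2015_cbsa < wi_using_fy2014_cbsa:
        return 0.5 * wi_using_fy2014_cbsa + 0.5 * wi_using_fy2015_cbsa
    return wi_using_fy2015_cbsa  # hospitals that benefit keep the new-delineation value

print(fy2015_transitional_wage_index(1.0200, 0.9800))  # blended average: 1.0
print(fy2015_transitional_wage_index(0.9800, 1.0200))  # no blend: 1.02
```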

In the past, CMS has budget neutralized transitional wage indexes. Because we are establishing a policy that allows for the application of a transitional wage index only when it benefits the hospital, we believe that it would be appropriate to ensure that such a transitional policy does not increase aggregate Medicare payments beyond the payments that would be made had we simply adopted the new OMB delineations without any transitional provisions. Therefore, for FY 2015, we proposed to use our exceptions and adjustments authority under section 1886(d)(5)(I)(i) of the Act to make an adjustment to the national and Puerto Rico-specific standardized amounts to ensure that total payments, including the effect of the transitional wage index provisions, will equal what payments would have been if we had fully adopted the new OMB delineations without any transitional provisions. We did not receive any public comments on this proposal and are finalizing our proposal to make this adjustment under section 1886(d)(5)(I)(i) of the Act.

Also, because we did not receive any public comments on this proposal, in this final rule we are finalizing our proposal to use the same methodology presented in the proposed rule to calculate the transitional wage index budget neutrality adjustment factor. We discuss the calculation of this adjustment factor below.

As stated above, the 50/50 blended wage indexes use post-reclassified wage index data, with the rural and imputed floor applied, computed based on FY 2014 CBSAs. Because the 50/50 blended methodology uses data based on FY 2014 CBSAs, in order to properly calculate the transitional budget neutrality factor, it was first necessary to calculate the following four budget neutrality factors based on the FY 2014 CBSAs: an MS-DRG budget neutrality factor, a wage index budget neutrality factor, a reclassification budget neutrality factor, and a rural floor budget neutrality factor. It was necessary to compute the first three of these factors (MS-DRG, wage index, and reclassification budget neutrality, which are applied to the standardized amount) to ensure that the calculation of the rural and imputed floor budget neutrality factor applied to the wage index based on FY 2014 CBSAs is accurate. We calculated these four budget neutrality factors using the same methodology stated above, but used the FY 2014 CBSAs instead of the FY 2015 CBSAs on both sides of the comparison.

After calculating all of the budget neutrality factors using FY 2014 and FY 2015 CBSAs, to calculate the transitional wage index budget neutrality factor for FY 2015, we used FY 2013 discharge data to simulate payments and compared the following:

  • Aggregate payments using new OMB delineations for FY 2015, the FY 2015 relative weights, FY 2015 wage data after such reclassifications under sections 1886(d)(8)(B) and (C) and 1886(d)(10) of the Act (using the new OMB delineations), applied the rural floor budget neutrality adjustment factor to the wage index (using the new OMB delineations), and applied the FY 2015 hospital readmissions payment adjustments and the estimated FY 2015 hospital VBP payment adjustments; and
  • Aggregate payments using FY 2015 relative weights, FY 2015 wage data after applying the transitional wage indexes, and applied the same hospital readmissions payment adjustments and the estimated hospital VBP payment adjustments applied above. We note that hospitals that did not receive the transitional 50/50 blended wage index were assigned the post-reclassified wage index values with the rural floor budget neutrality adjustment based on the FY 2015 new OMB delineations.

Based on these simulations, we calculated a budget neutrality adjustment factor of 0.998859. Therefore, for FY 2015, we are applying a transitional wage index budget neutrality adjustment factor of 0.998859 to the national average and Puerto Rico-specific standardized amounts to ensure that the effects of these transitional wage indexes are budget neutral.

We note that the budget neutrality adjustment factor calculated above is based on the increase in payments in FY 2015 that would result from the transitional wage indexes. Therefore, we are applying this budget neutrality adjustment factor as a one-time adjustment to the FY 2015 national and Puerto Rico-specific standardized amounts in order to offset the increase in payments in FY 2015 as a result of these transitional wage indexes. For subsequent fiscal years, we will not take into consideration the adjustment factor applied to the national and Puerto Rico-specific standardized amounts in the previous fiscal year's update when calculating the current fiscal year transitional wage index budget neutrality adjustment factor (that is, this adjustment will not be applied cumulatively). Because we are establishing a 3-year transitional wage index policy for urban hospitals that became rural as a result of the adoption of the new OMB delineations, we intend to establish transitional wage index budget neutrality adjustment factors to apply to the FY 2016 and FY 2017 national and Puerto Rico-specific standardized amounts during those respective rulemaking cycles. Similar to the policy for FY 2015, we intend to propose that the FYs 2016 and 2017 adjustments would be applied as “one-time” adjustments and not cumulative adjustments applied each fiscal year.

e. Case-Mix Budget Neutrality Adjustment

(1) Background

Below we summarize the recoupment adjustment to the FY 2015 payment rates, as required by section 631 of ATRA, to account for the increase in aggregate payments as a result of not completing the prospective adjustment authorized under section 7(b)(1)(A) of Public Law 110-90 until FY 2013. We refer readers to section II.D. of the preamble of this final rule for a complete discussion regarding our policies for FY 2015 in this final rule and previously finalized policies (including our historical adjustments to the payment rates) relating to the effect of changes in documentation and coding that do not reflect real changes in case-mix.

(2) Recoupment or Repayment Adjustment Authorized by Section 631 of the American Taxpayer Relief Act of 2012 (ATRA) to the National Standardized Amount

Section 631 of the ATRA amended section 7(b)(1)(B) of Public Law 110-90 to require the Secretary to make a recoupment adjustment totaling $11 billion by FY 2017. Our actuaries estimated that if CMS were to fully account for the $11 billion recoupment required by section 631 of ATRA in FY 2014, a one-time −9.3 percent adjustment to the standardized amount would be necessary. It is often our practice to delay or phase in payment rate adjustments over more than 1 year, in order to moderate the effect on payment rates in any 1 year. Therefore, consistent with the policies that we have adopted in many similar cases, for FY 2014, we applied a −0.8 percent adjustment to the standardized amount. In this final rule, as we proposed, we are applying an additional −0.8 percent adjustment to the standardized amount for FY 2015. We note that, as section 631 of the ATRA instructs the Secretary to make a recoupment adjustment only to the standardized amount, this adjustment would not apply to the Puerto Rico-specific standardized amount and hospital-specific payment rates.
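As a hypothetical illustration only, the sketch below applies the two −0.8 percent adjustments adopted to date to a placeholder standardized amount; the dollar figure shown is not an actual IPPS rate.

```python
# Hypothetical illustration of the section 631 ATRA recoupment adjustments adopted so far.
standardized_amount = 5_500.00   # placeholder national standardized amount (not an actual rate)
fy2014_adjustment = 1 - 0.008    # -0.8 percent applied in FY 2014
fy2015_adjustment = 1 - 0.008    # additional -0.8 percent applied in FY 2015

adjusted = standardized_amount * fy2014_adjustment * fy2015_adjustment
print(round(adjusted, 2))        # cumulative effect of roughly -1.6 percent to date
```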

f. Rural Community Hospital Demonstration Program Adjustment

As discussed in section IV.L. of the preamble of this final rule, section 410A of Public Law 108-173 originally required the Secretary to establish a demonstration program that modifies reimbursement for inpatient services for up to 15 small rural hospitals. Section 410A(c)(2) of Public Law 108-173 requires that “[i]n conducting the demonstration program under this section, the Secretary shall ensure that the aggregate payments made by the Secretary do not exceed the amount which the Secretary would have paid if the demonstration program under this section was not implemented.”

Sections 3123 and 10313 of the Affordable Care Act extended the demonstration program for an additional 5-year period, and allowed up to 30 hospitals to participate in 20 States with low population densities determined by the Secretary. (In determining which States to include in the expansion, the Secretary is required to use the same criteria and data that the Secretary used to determine the States for purposes of the initial 5-year period.) In the FY 2013 IPPS/LTCH PPS final rule (77 FR 53449 through 53453), in order to achieve budget neutrality, we adjusted the national IPPS payment rates by an amount sufficient to account for the added costs of this demonstration program as described in section IV.K. of that final rule. In other words, we applied budget neutrality across the payment system as a whole rather than merely across the participants of this demonstration program, consistent with past practice. We stated that we believe the language of the statutory budget neutrality requirement permits the agency to implement the budget neutrality provision in this manner. The statutory language requires that “aggregate payments made by the Secretary do not exceed the amount which the Secretary would have paid if the demonstration . . . was not implemented,” but does not identify the range across which aggregate payments must be held equal.

As we did for FY 2014, for FY 2015, we are adjusting the national IPPS payment rates according to the same methodology that we used for FY 2013, as set forth in section IV.L. of the preamble of this final rule, to account for the estimated additional costs of the demonstration program for FY 2015. For FY 2015, in this final rule, the estimated budget neutrality offset amount applied to the national IPPS payment rates for FY 2015 is $54,177,144. In addition, similar to previous years, we are including in the budget neutrality offset amount the amount by which the actual demonstration costs corresponding to an earlier given year (which would be determined once we have finalized cost reports for that year) exceeded the budget neutrality offset amount finalized in the corresponding year's IPPS final rule. For this FY 2015 IPPS/LTCH PPS final rule, we have calculated the amount by which the actual costs of the demonstration in FY 2008 (that is, the costs of the demonstration for the 10 hospitals that participated in FY 2008, as shown in these hospitals' finalized cost reports for the cost report period beginning in that calendar year) exceeded the amount that was finalized in the FY 2008 IPPS final rule. For FY 2015, in this final rule, we are establishing a budget neutrality offset amount of $10,389,771 for FY 2008.

We also are currently working with the MACs that service the hospitals participating in the demonstration to obtain finalized cost reports for FYs 2009, 2010, 2011, and 2012. These data were unavailable for this final rule. However, depending on our progress in obtaining these cost reports, we may include in the FY 2016 IPPS final rule the difference between the demonstration costs for one or more of these years and the amounts that were finalized in the respective fiscal years' final rules.

Therefore, the final total budget neutrality offset amount that we are applying to the FY 2015 IPPS payment rates is $64,566,915. This amount is the sum of two separate components: (1) the difference between the total estimated FY 2014 reasonable cost amount to be paid under the demonstration to the 22 hospitals participating in the demonstration program for covered inpatient services, and the total estimated amount that would otherwise be paid to the participating hospitals in FY 2014 without the demonstration ($54,177,144); and (2) the amount by which the actual costs of the demonstration for FY 2008, which are calculated in accordance with the finalized cost reports for the hospitals that participated in the demonstration during FY 2008, exceed the budget neutrality offset amount that was finalized in the FY 2008 IPPS final rule ($10,389,771).
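The total offset amount stated above is the sum of the two components; the brief check below reproduces that arithmetic.

```python
# Check of the FY 2015 rural community hospital demonstration budget neutrality offset amount.
estimated_demonstration_cost_component = 54_177_144   # component (1)
fy2008_reconciliation_component = 10_389_771          # component (2)
total_offset_amount = estimated_demonstration_cost_component + fy2008_reconciliation_component
print(total_offset_amount)  # 64566915, i.e., $64,566,915
```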

Accordingly, using the most recent data available to account for the estimated costs of the demonstration program, for FY 2015, we computed a factor of 0.99931 for the rural community hospital demonstration program budget neutrality adjustment that will be applied to the IPPS standard Federal payment rate.

g. Outlier Payments

Section 1886(d)(5)(A) of the Act provides for payments in addition to the basic prospective payments for “outlier” cases involving extraordinarily high costs. To qualify for outlier payments, a case must have costs greater than the sum of the prospective payment rate for the DRG, any IME and DSH payments, any new technology add-on payments, and the “outlier threshold” or “fixed-loss” amount (a dollar amount by which the costs of a case must exceed payments in order to qualify for an outlier payment). We refer to the sum of the prospective payment rate for the DRG, any IME and DSH payments, any new technology add-on payments, and the outlier threshold as the outlier “fixed-loss cost threshold.” To determine whether the costs of a case exceed the fixed-loss cost threshold, a hospital's CCR is applied to the total covered charges for the case to convert the charges to estimated costs. Payments for eligible cases are then made based on a marginal cost factor, which is a percentage of the estimated costs above the fixed-loss cost threshold. The marginal cost factor for FY 2015 is 80 percent, the same marginal cost factor we have used since FY 1995 (59 FR 45367).
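As an illustration of the case-level computation described above, the sketch below uses hypothetical values for the DRG payment, add-on amounts, cost-to-charge ratio, covered charges, and fixed-loss amount; the actual FY 2015 fixed-loss amount is established in the discussion that follows.

```python
# Hypothetical case-level outlier payment calculation (illustrative values only).
drg_payment = 12_000.00        # prospective payment for the MS-DRG (hypothetical)
ime_and_dsh = 1_500.00         # IME and DSH payments for the case (hypothetical)
new_tech_add_on = 0.00         # new technology add-on payments, if any (hypothetical)
fixed_loss_amount = 24_000.00  # hypothetical fixed-loss amount, not the FY 2015 figure
marginal_cost_factor = 0.80    # FY 2015 marginal cost factor

ccr = 0.30                     # hospital cost-to-charge ratio (hypothetical)
covered_charges = 180_000.00   # total covered charges for the case (hypothetical)

estimated_cost = ccr * covered_charges
fixed_loss_cost_threshold = drg_payment + ime_and_dsh + new_tech_add_on + fixed_loss_amount
outlier_payment = max(0.0, marginal_cost_factor * (estimated_cost - fixed_loss_cost_threshold))
print(round(outlier_payment, 2))
```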

In accordance with section 1886(d)(5)(A)(iv) of the Act, outlier payments for any year are projected to be not less than 5 percent nor more than 6 percent of total operating DRG payments (which does not include IME and DSH payments) plus outlier payments. When setting the outlier threshold, we compute the 5.1 percent target by dividing the total operating outlier payments by the total operating DRG payments plus outlier payments. We do not include any other payments such as IME and DSH within the outlier target amount. Therefore, it is not necessary to include Medicare Advantage IME payments in the outlier threshold calculation. Section 1886(d)(3)(B) of the Act requires the Secretary to reduce the average standardized amount by a factor to account for the estimated proportion of total DRG payments made to outlier cases. Similarly, section 1886(d)(9)(B)(iv) of the Act requires the Secretary to reduce the average standardized amount applicable to hospitals located in Puerto Rico to account for the estimated proportion of total DRG payments made to outlier cases. More information on outlier payments may be found on the CMS Web site at: http://www.cms.gov/Medicare/Medicare-Fee-for-Service-Payment/AcuteInpatientPPS/outlier.htm.
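The target computation described above is a simple ratio; the sketch below illustrates it with hypothetical aggregate amounts.

```python
# Hypothetical illustration of the 5.1 percent outlier target computation.
total_operating_outlier_payments = 5.1e9   # hypothetical aggregate outlier payments
total_operating_drg_payments = 94.9e9      # hypothetical aggregate operating DRG payments

outlier_share = (total_operating_outlier_payments
                 / (total_operating_drg_payments + total_operating_outlier_payments))
print(round(outlier_share, 3))  # 0.051, i.e., the 5.1 percent target
```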

(1) FY 2015 Outlier Fixed-Loss Cost Threshold

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50977 through 50983), in response to public comments on the FY 2014 IPPS/LTCH PPS proposed rule, we made changes to our methodology for projecting the outlier fixed-loss cost threshold for FY 2014. We refer readers to the FY 2014 IPPS/LTCH PPS final rule for a detailed discussion of the changes.

For FY 2015, we proposed to continue to use the same methodology that we used in FY 2014. As we have done in the past, to calculate the proposed FY 2015 outlier threshold, we simulated payments by applying proposed FY 2015 payment rates and policies using cases from the FY 2013 MedPAR file. Therefore, in order to determine the proposed FY 2015 outlier threshold, we inflated the charges on the MedPAR claims by 2 years, from FY 2013 to FY 2015. As discussed in the FY 2014 IPPS/LTCH PPS final rule, we believe a methodology based on 1 year of charge data provides a more stable measure for projecting the average charge per case than our prior methodology, which used a 6-month measure; a 6-month measure inherently uses fewer claims than a 1-year measure and is therefore more susceptible to fluctuations in the average charge per case as a result of any significant charge increases or decreases by hospitals. Under this new methodology, to compute the 1-year average annualized rate-of-change in charges per case for FY 2015, we proposed to compare the second quarter of FY 2012 through the first quarter of FY 2013 (January 1, 2012, through December 31, 2012) to the second quarter of FY 2013 through the first quarter of FY 2014 (January 1, 2013, through December 31, 2013). This rate-of-change is 5.6 percent (1.055736), or 11.5 percent (1.114579) over 2 years.
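Because the FY 2013 MedPAR charges must be carried forward 2 years to FY 2015, the 2-year factor quoted above is the 1-year rate-of-change applied twice. A minimal check of the proposed-rule figures follows.

```python
# The two-year charge inflation factor is the one-year factor squared,
# consistent with the proposed-rule figures quoted above.

one_year_factor = 1.055736          # 5.6 percent (proposed rule)
two_year_factor = one_year_factor ** 2
print(round(two_year_factor, 6))    # 1.114579, i.e., 11.5 percent over 2 years
```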

Comment: Commenters were concerned that they were unable to replicate the calculation of the charge inflation factor that CMS used in the proposed rule. The commenters stated that the first quarter of the FY 2014 MedPAR claims was not released to the public. The commenters requested that CMS release the claims data used to calculate the charge inflation factor used in the proposed rule. One commenter, a provider, requested that CMS reevaluate the calculation of the inflation factor because it far exceeds the inflation factors in the labor markets in which the provider operates.

Response: In the FY 2014 IPPS/LTCH PPS proposed rule, we proposed to adopt a new methodology that compares 1-year periods of the most recent charge data in order to inflate charges. Commenters supported this proposal, and it was adopted for FY 2014 and future years in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50978). We note that we did not provide additional data for the first quarter of FY 2013, which was used to inflate charges in the FY 2014 IPPS/LTCH PPS proposed rule, nor were such data requested during the comment period for that proposed rule. We further note that our charge inflation policy from FY 2005 through FY 2013 also compared the most recent periods of charge data, using a 6-month comparison instead of a 1-year comparison. Similarly, we did not provide additional data for the first quarter of the applicable fiscal year, nor were such data requested during the comment periods for those years.

Consistent with this policy, for FY 2015, we proposed to compare the most recent charge data from the second quarter of FY 2012 through the first quarter of FY 2013 (January 1, 2012, through December 31, 2012) to the second quarter of FY 2013 through the first quarter of FY 2014 (January 1, 2013, through December 31, 2013).

In response to the commenters who requested a restructuring of the limited data set files for the FY 2015 IPPS/LTCH PPS proposed and final rule to provide an additional quarter of MedPAR claims data, we did not have sufficient time to restructure the files as the commenters requested prior to the publication of the proposed and this final rule. Consistent with our longstanding policy since FY 2005, we continue to believe that it is optimal to use the most recent period of charge data available to measure charge inflation. We will consider how best to provide additional information on the charge inflation factor for future years.

With respect to the commenter requesting that CMS reevaluate the calculation of the inflation factor, we believe that our measure of charge inflation accurately reflects national charge inflation. Our charge inflation factor represents the average percentage increase in charges for all hospitals. We recognize that charge inflation may vary geographically, but we do not believe that it is appropriate to base the charge inflation factor on selected labor markets because we apply this charge inflation factor to all claims for all hospitals.

As we have done in the past, in the FY 2015 IPPS/LTCH PPS proposed rule we proposed to establish the FY 2015 outlier threshold using hospital CCRs from the December 2013 update to the Provider-Specific File (PSF)—the most recent available data at the time of the proposed rule. We also proposed that, if more recent data became available, we would use those data to calculate the final FY 2015 outlier threshold. For FY 2015, we also proposed to continue to apply an adjustment factor to the CCRs to account for cost and charge inflation (as explained below).

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50979), we adopted a new methodology to adjust the CCRs. Specifically, we finalized a policy to compare the national average case-weighted operating and capital CCR from the most recent update of the PSF to the national average case-weighted operating and capital CCR from the same period of the prior year.

Therefore, as we did for FY 2014, for FY 2015, we proposed to adjust the CCRs from the December 2013 update of the PSF by comparing the percentage change in the national average case-weighted operating CCR and capital CCR from the December 2012 update of the PSF to the national average case-weighted operating CCR and capital CCR from the December 2013 update of the PSF. We note that in the proposed rule we used total transfer-adjusted cases from FY 2013 to determine the national average case-weighted CCRs for both sides of the comparison. As stated in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50979), we believe that it is appropriate to use the same case count on both sides of the comparison because this will produce the true percentage change in the average case-weighted operating and capital CCR from one year to the next without any effect from a change in case count on different sides of the comparison.

Using the proposed methodology above, for the proposed rule we calculated a December 2012 operating national average case-weighted CCR of 0.295101 and a December 2013 operating national average case-weighted CCR of 0.289587. We then calculated the percentage change between the two national operating case-weighted CCRs by subtracting the December 2012 operating national average case-weighted CCR from the December 2013 operating national average case-weighted CCR and then dividing the result by the December 2012 national operating average case-weighted CCR. This resulted in a proposed national operating CCR adjustment factor of 0.981315.

We used the same methodology proposed above to adjust the capital CCRs. Specifically, for the proposed rule we calculated a December 2012 capital national average case-weighted CCR of 0.025079 and a December 2013 capital national average case-weighted CCR of 0.024868. We then calculated the percentage change between the two national capital case-weighted CCRs by subtracting the December 2012 capital national average case-weighted CCR from the December 2013 capital national average case-weighted CCR and then dividing the result by the December 2012 capital national average case-weighted CCR. This resulted in a proposed national capital CCR adjustment factor of 0.991587.
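The CCR adjustment factors above follow directly from the percentage-change calculation described in the text. The following sketch reproduces both proposed-rule factors from the national average case-weighted CCRs quoted above.

```python
# Sketch of the CCR adjustment factor calculation: the percentage change
# between the prior-year and current-year national average case-weighted
# CCRs, applied as a one-year adjustment factor (equivalently, the ratio of
# the current-year CCR to the prior-year CCR).

def ccr_adjustment_factor(prior_year_ccr, current_year_ccr):
    pct_change = (current_year_ccr - prior_year_ccr) / prior_year_ccr
    return 1 + pct_change

print(round(ccr_adjustment_factor(0.295101, 0.289587), 6))  # operating: 0.981315
print(round(ccr_adjustment_factor(0.025079, 0.024868), 6))  # capital:   0.991587
```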

Consistent with our methodology used in the past and as stated in the FY 2009 IPPS final rule (73 FR 48763), we continue to believe that it is appropriate to apply only a 1-year adjustment factor to the CCRs. On average, it takes approximately 9 months for a fiscal intermediary or MAC to tentatively settle a cost report from the fiscal year end of a hospital's cost reporting period. The average “age” of hospitals' CCRs from the time the fiscal intermediary or the MAC inserts the CCR in the PSF until the beginning of FY 2015 is approximately 1 year. Therefore, as stated above, we believe a 1-year adjustment factor to the CCRs is appropriate.

Comment: One commenter matched the CCRs used for the proposed rule impact file to the December 2013 PSF and found that 126 providers' CCRs did not match. The commenter noted that although an extremely high percentage of providers' CCRs matched the data in the December 2013 update, the average percent difference for those CCRs that did not match is much higher than any other comparison from prior years. The commenter stated that this difference could lead to differences in the calculated fixed-loss threshold. The commenter further stated that the data demonstrated that CMS used significantly outdated CCRs to make projections for the FY 2015 fixed-loss threshold. The commenter recommended that this error be rectified in the final rule, which would result in a substantially reduced threshold. In addition, the commenter recommended that CMS use the most recently updated PSF file for the final rule.

Response: With regard to the commenter's finding of 126 providers with CCRs from the proposed rule impact file that did not match the data in the December 2013 PSF, as stated in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50979), we apply the following edits to providers' CCRs in the PSF. We believe these edits are appropriate in order to accurately model the outlier threshold. We first search for Indian Health Service providers and those providers assigned the statewide average CCR from the current fiscal year. We then replace these CCRs with the statewide average CCR for the upcoming fiscal year. We also assign the statewide average CCR (for the upcoming fiscal year) to those providers that have no value in the CCR field in the PSF. We believe that the edits above are the reason why the commenter found that 126 providers had CCRs in the impact file that did not match the CCRs in the December 2013 PSF, and contributed to the average percentage difference for those CCRs that did not match. We also believe that we have accurately calculated and applied these statewide average CCRs and will continue to monitor any large variances in the future. With regard to using the most recently updated PSF file for the final rule, we responded to a similar comment below.

As stated above, for FY 2015, we applied the proposed FY 2015 payment rates and policies using cases from the FY 2013 MedPAR files in calculating the outlier threshold.

As discussed above, for FY 2015, we are applying transitional wage indexes because of the adoption of the new OMB labor market area delineations. Also, as discussed in section III.B.3. of the preamble to the FY 2011 IPPS/LTCH PPS final rule (75 FR 50160 and 50161) and in section III.G.3. of the preamble of this final rule, in accordance with section 10324(a) of the Affordable Care Act, beginning in FY 2011, we created a wage index floor of 1.0000 for all hospitals located in States determined to be frontier States. We noted that the frontier State floor adjustments will be calculated and applied after rural and imputed floor budget neutrality adjustments are calculated for all labor market areas, in order to ensure that no hospital in a frontier State will receive a wage index less than 1.0000 due to the rural and imputed floor adjustment. In accordance with section 10324(a) of the Affordable Care Act, the frontier State adjustment will not be subject to budget neutrality, and will only be extended to hospitals geographically located within a frontier State. However, for purposes of estimating the proposed outlier threshold for FY 2015, it was necessary to apply the transitional wage indexes and adjust the wage index of those eligible hospitals in a frontier State when calculating the outlier threshold that results in outlier payments being 5.1 percent of total payments for FY 2015. If we did not take the above into account, our estimate of total FY 2015 payments would be too low, and, as a result, our proposed outlier threshold would be too high, such that estimated outlier payments would be less than our projected 5.1 percent of total payments.

As we did in establishing the FY 2009 outlier threshold (73 FR 57891), in our projection of FY 2015 outlier payments, we proposed not to make any adjustments for the possibility that hospitals' CCRs and outlier payments may be reconciled upon cost report settlement. We stated that we continue to believe that, due to the policy implemented in the June 9, 2003 Outlier final rule (68 FR 34494), CCRs will no longer fluctuate significantly and, therefore, few hospitals will actually have these ratios reconciled upon cost report settlement. In addition, it is difficult to predict the specific hospitals that will have CCRs and outlier payments reconciled in any given year. We also note that, in accordance with our reconciliation criteria, reconciliation occurs in instances where a hospital's actual CCR for the cost reporting period fluctuates plus or minus 10 percentage points compared to the interim CCR used to calculate outlier payments when a bill is processed. Our simulations assume that CCRs accurately measure hospital costs based on information available to us at the time we set the outlier threshold. For these reasons, we proposed not to make any assumptions regarding the effects of reconciliation on the outlier threshold calculation.

Comment: One commenter expressed concern that CMS did not consider outlier reconciliation in the development of the outlier threshold. The commenter stated that CMS did not provide any objective data concerning the number of hospitals that have been subject to outlier reconciliation and the amounts recovered. The commenter further stated that, in February 2003, the Secretary signed what the commenter described as an emergency interim final rule that would have corrected the outlier threshold and included outlier reconciliation payments (in the calculation of the outlier threshold), but that rule was not issued because of objections from the Office of Management and Budget. The commenter asserted that if it was possible to account for outlier reconciliation payments in the calculation of the threshold at the initial implementation of the outlier reconciliation policy, it should be possible to do so 10 years later. The commenter also searched cost reports from the HCRIS database for the years 2003 through 2010 (Form CMS-2552-96 and CMS-2552-10) and, based on these data, provided its estimate that the amounts recovered by CMS through reconciliation for those years totaled $108,934,425. The commenter believed that these data can be used to provide a baseline and trend information to assess whether outlier reconciliation is a significant factor to be considered in the development of the outlier threshold. The commenter noted that it was unable to extract outlier reconciliation payment information from cost reports filed under Form CMS-2552-10 and was unsure why these data were not being captured. Therefore, the commenter requested that CMS disclose, in the final rule and in future rulemaking, the amount CMS has recovered through reconciliation by year.

Another commenter cited a report issued by the Office of Inspector General (OIG) on June 28, 2012, entitled “The Centers for Medicare & Medicaid Services Did Not Reconcile Medicare Outlier Payments in Accordance With Federal Regulations and Guidance” (A-07-10-02764), which reviewed the reconciliation process for outlier payments under the IPPS. The commenter stated that the 2012 OIG Report identified approximately $664 million in unreconciled outlier payments, which is a material amount in relation to total outlier payments. Moreover, the commenter further stated that CMS now has approximately 10 full fiscal years of experience with reconciliation from which to project the impact of its reconciliation in the upcoming fiscal year. As such, the commenter asserted that CMS' policy of refusing to account for the impact of reconciliation in setting the FY 2015 outlier fixed-loss cost threshold is neither reasonable nor consistent with the outlier provisions of the statute.

The same commenter cited the OIG report issued on November 13, 2013, entitled “Medicare Hospital Outlier Payments Warrant Increased Scrutiny” (OEI-06-10-00520). The commenter stated that the proposed outlier fixed-loss cost threshold appears improperly inflated and, therefore, overstated because CMS does not report, and has not taken any actions to report, any steps to account for “high-outlier” payments identified in the report. The commenter further stated that it is neither consistent with the outlier provisions of the statute nor reasonable for CMS, in modeling outlier payments for the upcoming fiscal year, to include outlier payments that were based on excessively high charges for particular MS-DRGs and not based on truly unusually high costs. The commenter concluded that such payments will presumably be recouped by CMS following audit and reconciliation. However, CMS has not disclosed or discussed what, if anything, it has done to address this issue in setting the outlier fixed-loss cost threshold for FY 2015.

Response: A similar comment was received in response to the policies presented in last year's rule. We appreciate the commenter's input and its concern regarding our policy of not including outlier reconciliation within the development of the outlier fixed-loss cost threshold. The commenter provided data from HCRIS indicating that total outlier reconciliation payments from 2003 through 2010 were $108,934,425, which equates to approximately $13,616,803 annually. Assuming that the totals provided by the commenter are correct, we do not believe that this relatively small annual amount would have an impact on the outlier threshold because total outlier payments are approximately $4.3 billion. Further, with regard to the draft interim final rule referenced by the commenter, that draft document was never finalized or published in the Federal Register (on either a proposed or an interim basis), nor was it implemented in any way. We also disagree with the commenter's characterization of the draft interim final rule. That draft rule would not have adjusted the outlier threshold by accounting for payment changes as a result of outlier reconciliation, as the commenter suggested. Rather, the draft interim final rule merely would have calculated a new fixed-loss threshold to be applied for the remainder of Federal fiscal year 2003 using the same data that originally had been used for that purpose, but excluding data from 123 hospitals whose percentage of outlier payments relative to total DRG payments increased by at least 5 percentage points between FY 1999 and FY 2001, and whose case-mix-adjusted charges (case-mix being the average DRG relative weight value for all of a hospital's Medicare cases) increased at a rate at or above the 95th percentile rate of charge increase for all hospitals (46.63 percent) over the same period. As previously stated, this draft rule was never finalized or published in the Federal Register. Therefore, that document has little, if any, relevance to the current discussion. With respect to the commenter citing the 2012 OIG Report, which identified approximately $664 million in unreconciled outlier payments, we cannot substantiate this amount until all of the outlier reconciliations are performed. As the MACs continue to perform these outlier reconciliations, they record these amounts on the cost report, which are then publicly available through the HCRIS database. Also, CMS has requested that the MACs submit to CMS the reconciled outlier amounts. We will continue to track these outlier reconciliations as stated in our response to the OIG report.
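For context, the magnitude comparison made in this response can be checked directly from the figures above; the only assumption in the sketch is that the commenter's 2003 through 2010 total spans 8 fiscal years.

```python
# Quick check of the figures discussed in this response. The 8-year divisor
# reflects FYs 2003 through 2010 inclusive; the $4.3 billion figure is the
# approximate total of annual outlier payments cited above.

total_reconciliation_2003_2010 = 108_934_425
annual_reconciliation = total_reconciliation_2003_2010 / 8
print(round(annual_reconciliation))                    # 13,616,803 per year
print(round(annual_reconciliation / 4.3e9 * 100, 2))   # roughly 0.32 percent of outlier payments
```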

As stated in prior final rules, we continue to believe that, as a result of the policy implemented in the June 9, 2003 outlier final rule (68 FR 34494), CCRs will no longer fluctuate significantly and, therefore, few hospitals will actually have these ratios reconciled upon cost report settlement as demonstrated by the total outlier payments provided by the commenter. In addition, it is difficult to predict the specific hospitals that will have CCRs and outlier payments reconciled in any given year. We also note that, in accordance with our outlier reconciliation criteria, reconciliation occurs in instances where a hospital's actual CCR for the cost reporting period fluctuates plus or minus 10 percentage points compared to the interim CCR used to calculate outlier payments when a bill is processed. Our simulations assume that CCRs accurately measure hospital costs based on information available to us at the time we set the outlier threshold. For these reasons, as we proposed, we are finalizing our proposal not to make any assumptions regarding the effects of reconciliation on the outlier threshold calculation.

Also, outlier reconciliation is a function of the cost report and MACs record the outlier reconciliation amount on each provider's cost report (and are not required to report these data to CMS outside of the cost report settlement process). Therefore, the outlier reconciliation data that the commenter requested should be publicly available through the cost report. With regard to the commenter not being able to retrieve the data for outlier reconciliation payments from cost reports filed under Form CMS-2552-10, we received a similar comment in response to last year's proposed rule, as summarized in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50980). We will continue to follow up with our information systems team to ensure this information is readily available to the public. Since the effective date of Change Request 7192 on April 1, 2011, we have approved the reconciliation of outlier payments for some hospitals. Other hospitals that were flagged for outlier reconciliation are still under review for approval. In addition, some hospitals flagged for outlier reconciliation may experience a delay in reconciling their outlier payments due to circumstances that prevent the MACs from finalizing the hospital's cost report (such as other payments that may need to be reconciled aside from outlier payments).

We disagree with the commenter who stated that we should not include outlier payments that were based on excessively high charges for particular MS-DRGs, rather than on truly unusually high costs, because such payments will presumably be recouped by CMS following audit and reconciliation. The purpose of the CCR is to measure a hospital's costs relative to its charges. We believe that the CCRs will reflect the low costs and high charges that the commenter referred to and, when applied to the charges on the claim, will result in lower outlier payments for such cases because the costs of the case will be lower when compared to the total MS-DRG payments excluding outlier payments. Also, the commenter appears to assume that providers with high charges will be eligible for outlier reconciliation and that CMS will recoup these funds at a later time. We disagree with the commenter. If a hospital's interim CCR is consistent with the charges on its claims, then, no matter how high or low the hospital's charges, the hospital probably will not meet the outlier reconciliation criteria.

As described in sections IV.H. and IV.I., respectively, of the preamble of this final rule, sections 1886(q) and 1886(o) of the Act establish the Hospital Readmissions Reduction Program and the Hospital VBP Program, respectively. We do not believe that it is appropriate to include the hospital VBP payment adjustments and the hospital readmissions payment adjustments in the outlier threshold calculation or the outlier offset to the standardized amount. Specifically, consistent with our definition of the base operating DRG payment amount for the Hospital Readmissions Reduction Program under § 412.152 and the Hospital VBP Program under § 412.160, outlier payments under section 1886(d)(5)(A) of the Act are not affected by these payment adjustments. Therefore, outlier payments will continue to be calculated based on the unadjusted base DRG payment amount (as opposed to using the base-operating DRG payment amount adjusted by the hospital readmissions payment adjustment and the hospital VBP payment adjustment). Consequently, we proposed to exclude the hospital VBP payment adjustments and the hospital readmissions payment adjustments from the calculation of the outlier fixed-loss cost threshold.

We noted that, to the extent section 1886(r) of the Act modifies the existing DSH payment methodology under section 1886(d)(5)(F), the new uncompensated care payment under section 1886(r)(2), like the empirically justified Medicare DSH payment under section 1886(r)(1), may be considered an amount payable under section 1886(d)(5)(F) of the Act such that it would be reasonable to include the payment in the outlier determination under section 1886(d)(5)(A). As we did for FY 2014, for FY 2015, we stated that we also are proposing to allocate an estimated per-discharge uncompensated care payment amount to all cases for the hospitals eligible to receive the uncompensated care payment amount in the calculation of the outlier fixed-loss cost threshold methodology. We stated that we continue to believe that allocating an eligible hospital's estimated uncompensated care payment to all cases equally in the calculation of the outlier fixed-loss cost threshold would best approximate the amount we would pay in uncompensated care payments during the year because, when we make claim payments to a hospital eligible for such payments, we would be making estimated per-discharge uncompensated care payments to all cases equally. Furthermore, we stated that we continue to believe that using the estimated per-claim uncompensated care payment amount to determine outlier estimates provides predictability as to the amount of uncompensated care payments included in the calculation of outlier payments. Therefore, consistent with the methodology used in FY 2014 to calculate the outlier fixed-loss cost threshold, for FY 2015, we stated that we are proposing to include estimated FY 2015 uncompensated care payments in the computation of the proposed outlier fixed-loss cost threshold. Specifically, we stated we are proposing to use the estimated per-discharge uncompensated care payments to hospitals eligible for the uncompensated care payment for all cases in the calculation of the outlier fixed-loss cost threshold methodology.

Using this methodology, we proposed an outlier fixed-loss cost threshold for FY 2015 equal to the prospective payment rate for the MS-DRG, plus any IME, empirically justified Medicare DSH payments, estimated uncompensated care payment, and any add-on payments for new technology, plus $25,799.

In the proposed rule we noted that the proposed FY 2015 fixed-loss cost threshold is higher than the FY 2014 final outlier fixed-loss cost threshold of $21,748. We stated that we believe that the increase in the charge inflation factor (compared to the FY 2014 charge inflation factor) contributed to a higher proposed outlier fixed-loss threshold for FY 2015. As charges increase, so do outlier payments. As a result, it was necessary for us to raise the proposed outlier fixed-loss cost threshold to decrease the amount of outlier payments expended in order to reach the 5.1 percent target.

Comment: Some commenters were surprised by the magnitude of the increase of the outlier threshold in the proposed rule compared to the threshold of $21,748 for FY 2014. The commenters explained that, for FY 2013, CMS currently estimates that outliers are 4.81 percent of total MS-DRG payments. The commenters asserted that, given that the threshold for FY 2013 of $21,821 was similar to the outlier threshold for FY 2014, they find little justification for a dramatic increase in the threshold for FY 2015.

The commenters also stated that it is important that CMS be aware of the magnitude of inaccuracies when estimating the actual outlier payout for prior years or calculating the current outlier threshold. The commenters noted that, in prior years, CMS has estimated outlier payments for a fiscal year in one year and then, the next year, revised the estimate to a much lower number than the initial estimate. The commenters cited the FY 2013 outlier estimate as an example: CMS estimated total outlier payments for FY 2013 in the FY 2014 IPPS/LTCH PPS proposed rule as 5.17 percent and then revised this number in the FY 2015 IPPS/LTCH PPS proposed rule to 4.81 percent.

The commenters also noted that, with each rulemaking, the final outlier threshold established by CMS is always lower than the threshold set forth in the proposed rule. One commenter speculated that this is most likely a result of the use of updated CCRs or other data in calculating the final outlier threshold. As a result, the commenter emphasized the need for CMS to use the most recent data available when calculating the outlier threshold. The commenter stated that, with regard to the current rulemaking, CMS used data from the December 2013 PSF in the proposed rule, when the March 2014 PSF was available at the time the proposed rule was issued. Using the March 2014 PSF, the commenter calculated an outlier threshold of $25,375 (compared to the threshold presented in the proposed rule of $25,799, which used the December 2013 PSF).

Response: When we conduct our modeling to determine the outlier threshold, we factor in all payments and policies that would affect actual payments for the current year in order to estimate that outlier payments are 5.1 percent of total MS-DRG payments. While we recognize that outlier payments have been below the 5.1 percent target in prior fiscal years, we do not believe that these lower payouts are relevant to the current fiscal year because they do not lend greater accuracy to the estimate of payments that are 5.1 percent of total MS-DRG payments for FY 2015. We also note that, in response to commenters' concerns, last year we modified our outlier threshold calculation by changing the way we adjust the CCRs. We also changed the measure of inflation from using 6 months of claims data to 1 year of claims data. CMS shares the commenters' goal that outlier payments in every fiscal year meet the 5.1 percent target, and we made these changes to improve our methodology for calculating the outlier threshold. As in prior years, CMS will continue to consider any suggestions made by commenters to improve the accuracy of the calculation of the outlier threshold.

CMS' historical policy is to use the best available data when setting the payment rates and factors in both the proposed and final rules. Sometimes variables change between the proposed and final rules as a result of the availability of more recent data, such as the charge inflation factor and the CCR adjustment factors, and these changes can cause fluctuations in the threshold amount. Other factors, such as changes to the wage indexes and the market basket increase, can also cause the outlier fixed-loss cost threshold to fluctuate between the proposed rule and the final rule each year. We use the latest data that are available at the time of the development of the proposed and final rules, such as the most recent update of MedPAR claims data and CCRs from the most recent update of the PSF. With regard to the commenter noting the availability of the March 2014 PSF at the time the proposed rule was issued, this file was not available when we calculated the proposed outlier fixed-loss cost threshold as part of the development of the proposed rule. Therefore, for the proposed rule, we used the latest update available, which was the December 2013 PSF. If we were to wait for the March 2014 PSF to become available, this would further delay publication of the proposed rule, leading to a possible delay in issuing the final rule in a timely fashion.

Comment: One commenter stated that, as a result of the large increase in the proposed outlier threshold, it suspected that CMS is duplicating charge increases through the use of the charge inflation factor. The commenter believed that this duplication is compounded by the fact that the CCRs would also reflect high charges. The commenter believed that these two issues are artificially inflating the threshold while hospitals have lower costs. The commenter offered an alternative threshold of $24,340, derived by measuring the change in the outlier payment percentage of 5.1 percent for FY 2015 compared to the FY 2014 outlier estimate of 5.79 percent (5.1 percent minus 5.79 percent = −0.69 percentage points; −0.69 divided by 5.79 = −11.92 percent). The commenter recommended using a forecast correction of 100 percent plus 11.92 percent based on its calculation above.

Response: We disagree with the commenter. We believe that our measure and application of the charge inflation factor are accurate and appropriate, as explained in the proposed rule. As discussed, we apply a 2-year charge inflation factor because we use claims from FY 2013 to model FY 2015 payments. Also, the CCRs we use come directly from the PSF, which in turn is derived directly from hospitals' cost reports. We believe that these CCRs are accurate. We also are unsure which “high charges” the commenter was referring to. Further, as noted above, section 1886(d)(5)(A)(iv) of the Act requires outlier payments to be not less than 5 percent nor more than 6 percent of total estimated or projected payments in that year. Therefore, we cannot adopt the commenter's suggestion to use a forecast correction to compute the outlier threshold. When we calculate the threshold, we use the latest data that are available at the time of the proposed and final rules in order to estimate that outlier payments are 5.1 percent of total payments.

Comment: One commenter noted that CMS did not indicate if it has made any additional changes to its methodology to exclude the charges for hemophilia clotting factors from the calculation of the fixed-loss outlier threshold. The commenter noted that CMS provides a methodology for excluding such charges from MedPAR data for the budget neutrality calculation. The commenter wanted to ensure that such efforts also resulted in the exclusion of such charges from MedPAR data used for the calculation of the fixed-loss threshold as well.

Response: We appreciate the commenter's input and its request for clarification regarding the calculation of the fixed-loss outlier threshold. Similar to what is done in the budget neutrality calculation, CMS excludes the charges for hemophilia clotting factors from the calculation of the fixed-loss outlier threshold.

Comment: One commenter stated that in its public comment submitted in response to the FY 2014 IPPS/LTCH PPS proposed rule regarding outliers, it explained why uncompensated care payments should be included as part of the fixed-loss threshold calculation. The commenter noted that it is clear why CMS considered this in the FY 2015 IPPS/LTCH PPS proposed rule. The commenter wanted to ensure that updates to the uncompensated care payment calculation are also considered in the final rule.

Response: As discussed above, we included updates to the uncompensated care payment calculation as part of the fixed-loss outlier threshold calculation in this final rule.

After consideration of the public comments we received, we are not making any changes to our methodology in this final rule for FY 2015. Therefore, we are using the same methodology we proposed to calculate the final outlier threshold.

As we have done in the past, to calculate the final FY 2015 outlier threshold, we simulated payments by applying FY 2015 payment rates and policies using cases from the FY 2013 MedPAR file. Therefore, in order to determine the final FY 2015 outlier threshold, we inflated the charges on the MedPAR claims by 2 years, from FY 2013 to FY 2015. As discussed in the FY 2014 IPPS/LTCH PPS final rule, we believe that a methodology based on 1 year of charge data provides a more stable measure for projecting the average charge per case. To compute the 1-year average annualized rate-of-change in charges per case for FY 2015, we compared the third quarter of FY 2012 through the second quarter of FY 2013 (April 1, 2012, through March 31, 2013) to the third quarter of FY 2013 through the second quarter of FY 2014 (April 1, 2013, through March 31, 2014). This rate-of-change is 5.1 percent (1.050917), or 10.4 percent (1.104427) over 2 years.

As we have done in the past, we are establishing the FY 2015 outlier threshold using hospital CCRs from the March 2014 update to the Provider-Specific File (PSF)—the most recent available data at the time of development of this final rule. For FY 2015, we also are continuing to apply an adjustment factor to the CCRs to account for cost and charge inflation (as explained below). In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50979), we adopted a new methodology to adjust the CCRs. Specifically, we finalized a policy to compare the national average case-weighted operating and capital CCR from the most recent update of the PSF to the national average case-weighted operating and capital CCR from the same period of the prior year.

Therefore, as we did for FY 2014, for FY 2015, we are adjusting the CCRs from the March 2014 update of the PSF by comparing the percentage change in the national average case-weighted operating CCR and capital CCR from the March 2013 update of the PSF to the national average case-weighted operating CCR and capital CCR from the March 2014 update of the PSF. We note that we used total transfer-adjusted cases from FY 2013 to determine the national average case-weighted CCRs for both sides of the comparison. As stated in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50979), we believe that it is appropriate to use the same case count on both sides of the comparison as this will produce the true percentage change in the average case-weighted operating and capital CCR from one year to the next without any effect from a change in case count on different sides of the comparison.

Using the methodology above, we calculated a March 2013 operating national average case-weighted CCR of 0.292377 and a March 2014 operating national average case-weighted CCR of 0.28714. We then calculated the percentage change between the two national operating case-weighted CCRs by subtracting the March 2013 operating national average case-weighted CCR from the March 2014 operating national average case-weighted CCR and then dividing the result by the March 2013 national operating average case-weighted CCR. This resulted in a national operating CCR adjustment factor of 0.982088.

We also used the same methodology above to adjust the capital CCRs. Specifically, we calculated a March 2013 capital national average case-weighted CCR of 0.025143 and a March 2014 capital national average case-weighted CCR of 0.024849. We then calculated the percentage change between the two national capital case-weighted CCRs by subtracting the March 2013 capital national average case-weighted CCR from the March 2014 capital national average case-weighted CCR and then dividing the result by the March 2013 capital national average case-weighted CCR. This resulted in a national capital CCR adjustment factor of 0.988307.

Consistent with our methodology in the past and as stated in the FY 2009 IPPS final rule (73 FR 48763), we continue to believe that it is appropriate to apply only a 1-year adjustment factor to the CCRs. On average, it takes approximately 9 months for a fiscal intermediary or MAC to tentatively settle a cost report from the fiscal year end of a hospital's cost reporting period. The average “age” of hospitals' CCRs from the time the fiscal intermediary or the MAC inserts the CCR in the PSF until the beginning of FY 2015 is approximately 1 year. Therefore, as stated above, we believe a 1-year adjustment factor to the CCRs is appropriate.

As stated above, for FY 2015, we applied the FY 2015 payment rates and policies using cases from the FY 2013 MedPAR files in calculating the outlier threshold.

As discussed above, for FY 2015, we are applying transitional wage indexes because of the adoption of the new OMB labor market area delineations. Also, as discussed in section III.B.3. of the preamble to the FY 2011 IPPS/LTCH PPS final rule (75 FR 50160 and 50161) and in section III.G.3. of the preamble of this final rule, in accordance with section 10324(a) of the Affordable Care Act, beginning in FY 2011, we created a wage index floor of 1.0000 for all hospitals located in States determined to be frontier States. We note that the frontier State floor adjustments are calculated and applied after rural and imputed floor budget neutrality adjustments are calculated for all labor market areas, in order to ensure that no hospital in a frontier State receives a wage index less than 1.0000 due to the rural and imputed floor adjustment. In accordance with section 10324(a) of the Affordable Care Act, the frontier State adjustment is not subject to budget neutrality, and is only extended to hospitals geographically located within a frontier State. However, for purposes of estimating the outlier threshold for FY 2015, it was necessary to apply the transitional wage indexes and adjust the wage index of those eligible hospitals in a frontier State when calculating the outlier threshold that results in outlier payments being 5.1 percent of total payments for FY 2015. If we did not take the above into account, our estimate of total FY 2015 payments would be too low, and, as a result, our outlier threshold would be too high, such that estimated outlier payments would be less than our projected 5.1 percent of total payments.

As we did in establishing the FY 2009 outlier threshold (73 FR 57891), as we proposed and for the reasons discussed above, in our projection of FY 2015 outlier payments, we are not making any adjustments for the possibility that hospitals' CCRs and outlier payments may be reconciled upon cost report settlement.

As described in sections IV.H. and IV.I., respectively, of the preamble of this final rule, sections 1886(q) and 1886(o) of the Act establish the Hospital Readmissions Reduction Program and the Hospital VBP Program, respectively. We do not believe that it is appropriate to include the hospital VBP payment adjustments and the hospital readmissions payment adjustments in the outlier threshold calculation or the outlier offset to the standardized amount. Specifically, consistent with our definition of the base operating DRG payment amount for the Hospital Readmissions Reduction Program under § 412.152 and the Hospital VBP Program under § 412.160, outlier payments under section 1886(d)(5)(A) of the Act are not affected by these payment adjustments. Therefore, outlier payments will continue to be calculated based on the unadjusted base DRG payment amount (as opposed to using the base-operating DRG payment amount adjusted by the hospital readmissions payment adjustment and the hospital VBP payment adjustment). Consequently, we are excluding the hospital VBP payment adjustments and the hospital readmissions payment adjustments from the calculation of the outlier fixed-loss cost threshold.

We note that, to the extent section 1886(r) of the Act modifies the existing DSH payment methodology under section 1886(d)(5)(F), the new uncompensated care payment under section 1886(r)(2), like the empirically justified Medicare DSH payment under section 1886(r)(1), may be considered an amount payable under section 1886(d)(5)(F) of the Act such that it would be reasonable to include the payment in the outlier determination under section 1886(d)(5)(A). As we did for FY 2014, for FY 2015, for the reasons discussed above, we also are allocating an estimated per-discharge uncompensated care payment amount to all cases for the hospitals eligible to receive the uncompensated care payment amount in the calculation of the outlier fixed-loss cost threshold methodology. Specifically, we are using the estimated per-discharge uncompensated care payments to hospitals eligible for the uncompensated care payment for all cases in the calculation of the outlier fixed-loss cost threshold methodology.

Using this methodology, we calculated a final outlier fixed-loss cost threshold for FY 2015 equal to the prospective payment rate for the MS-DRG, plus any IME, empirically justified Medicare DSH payments, estimated uncompensated care payment, and any add-on payments for new technology, plus $24,758.

We note that the final FY 2015 fixed-loss cost threshold is higher than the FY 2014 final outlier fixed-loss cost threshold of $21,748. We believe that the increase in the charge inflation factor (compared to the FY 2014 charge inflation factor) contributed to a higher outlier fixed-loss threshold for FY 2015. As charges increase, so do outlier payments. As a result, it was necessary for us to raise the outlier fixed-loss cost threshold to decrease the amount of outlier payments expended in order to reach the 5.1 percent target.

Also, the final FY 2015 fixed-loss cost threshold is lower than the proposed FY 2015 outlier fixed-loss cost threshold of $25,799. As discussed above, the proposed MS-DRG reclassification and recalibration budget neutrality factor was calculated incorrectly in the proposed rule as a result of the inadvertent miscalculation of the number of postacute care transfer-adjusted cases for certain MS-DRGs. We believe that the corrected factor, which offsets less money from the standardized amount, results in lower outlier payments. Therefore, it was necessary to lower the outlier threshold from the proposed rule to the final rule in order to reach the 5.1 percent target.

(2) Other Changes Concerning Outliers

As stated in the FY 1994 IPPS final rule (58 FR 46348), we establish an outlier threshold that is applicable to both hospital inpatient operating costs and hospital inpatient capital-related costs. When we modeled the combined operating and capital outlier payments, we found that using a common threshold resulted in a lower percentage of outlier payments for capital-related costs than for operating costs. We project that the thresholds for FY 2015 will result in outlier payments that will equal 5.1 percent of operating DRG payments and 6.27 percent of capital payments based on the Federal rate.

In accordance with section 1886(d)(3)(B) of the Act, we are reducing the FY 2015 standardized amount by the same percentage to account for the projected proportion of payments paid as outliers.

The outlier adjustment factors that will be applied to the standardized amount based on the FY 2015 outlier threshold are as follows:

               Operating standardized amounts    Capital Federal rate
National       0.948998                          0.937327
Puerto Rico    0.926575                          0.915412

We are applying the outlier adjustment factors to the FY 2015 payment rates after removing the effects of the FY 2014 outlier adjustment factors on the standardized amount.
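The following sketch illustrates the step described in the preceding sentence: the FY 2014 operating outlier offset is backed out of the standardized amount before the FY 2015 offset is applied. The starting amount is hypothetical; the two factors are the FY 2014 national operating outlier offset (0.948995) and the FY 2015 national operating outlier adjustment factor (0.948998) shown in this Addendum.

```python
# Hypothetical illustration of removing the FY 2014 outlier offset and
# applying the FY 2015 outlier adjustment factor to a standardized amount.

FY2014_OPERATING_OUTLIER_OFFSET = 0.948995
FY2015_OPERATING_OUTLIER_FACTOR = 0.948998

fy2014_standardized_amount = 5_000.00   # hypothetical, already reflects the FY 2014 offset
restored_amount = fy2014_standardized_amount / FY2014_OPERATING_OUTLIER_OFFSET
fy2015_adjusted_amount = restored_amount * FY2015_OPERATING_OUTLIER_FACTOR
print(round(fy2015_adjusted_amount, 2))  # nearly unchanged because the two factors are so close
```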

To determine whether a case qualifies for outlier payments, we apply hospital-specific CCRs to the total covered charges for the case. Estimated operating and capital costs for the case are calculated separately by applying separate operating and capital CCRs. These costs are then combined and compared with the outlier fixed-loss cost threshold.
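A short sketch of the qualification test just described follows. The figures are hypothetical, and applying both the operating and capital CCRs to the case's total covered charges is an illustrative simplification of the separate operating and capital cost estimates.

```python
# Sketch of the outlier qualification test: estimate operating and capital
# costs separately with the hospital's operating and capital CCRs, combine
# them, and compare the result with the fixed-loss cost threshold.
# All figures below are hypothetical.

def case_qualifies(covered_charges, operating_ccr, capital_ccr,
                   fixed_loss_cost_threshold):
    operating_cost = operating_ccr * covered_charges
    capital_cost = capital_ccr * covered_charges
    combined_cost = operating_cost + capital_cost
    return combined_cost > fixed_loss_cost_threshold

print(case_qualifies(150_000, 0.29, 0.025, 40_000))  # True: 47,250 > 40,000
```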

Under our current policy at § 412.84, we calculate operating and capital CCR ceilings and assign a statewide average CCR for hospitals whose CCRs exceed 3.0 standard deviations from the mean of the log distribution of CCRs for all hospitals. Based on this calculation, for hospitals for which the fiscal intermediary or MAC computes operating CCRs greater than 1.23 or capital CCRs greater than 0.172, or hospitals for which the fiscal intermediary or MAC is unable to calculate a CCR (as described under § 412.84(i)(3) of our regulations), statewide average CCRs are used to determine whether a hospital qualifies for outlier payments. Table 8A listed in section VI. of this Addendum (and available only via the Internet on the CMS Web site) contains the statewide average operating CCRs for urban hospitals and for rural hospitals for which the fiscal intermediary or MAC is unable to compute a hospital-specific CCR within the above range. Effective for discharges occurring on or after October 1, 2014, these statewide average ratios will replace the ratios posted on our Web site at http://www.cms.gov/Medicare/Medicare-Fee-for-Service-Payment/AcuteInpatientPPS/FY-2014-IPPS-Final-Rule-Home-Page-Items/FY-2014-IPPS-Final-Rule-CMS-1599-F-Tables.html. Table 8B listed in section VI. of this Addendum (and available via the Internet on the CMS Web site) contains the comparable statewide average capital CCRs. As previously stated, the CCRs in Tables 8A and 8B will be used during FY 2015 when hospital-specific CCRs based on the latest settled cost report are either not available, or are outside the range noted above. Table 8C listed in section VI. of this Addendum (and available via the Internet on the CMS Web site) contains the statewide average total CCRs used under the LTCH PPS as discussed in section V. of this Addendum.
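The CCR edits described above can be summarized as a simple assignment rule. In the sketch below, the statewide average values are hypothetical placeholders for the Table 8A and 8B amounts, which are available only on the CMS Web site.

```python
# Sketch of the CCR edit at section 412.84: a hospital's own CCR is used
# unless it cannot be computed or exceeds the applicable ceiling (operating
# CCR above 1.23 or capital CCR above 0.172), in which case the statewide
# average CCR is assigned. Statewide averages below are hypothetical.

OPERATING_CCR_CEILING = 1.23
CAPITAL_CCR_CEILING = 0.172

def effective_ccr(hospital_ccr, ceiling, statewide_average_ccr):
    if hospital_ccr is None or hospital_ccr > ceiling:
        return statewide_average_ccr
    return hospital_ccr

print(effective_ccr(1.40, OPERATING_CCR_CEILING, 0.30))   # statewide average assigned
print(effective_ccr(0.29, OPERATING_CCR_CEILING, 0.30))   # hospital-specific CCR retained
print(effective_ccr(None, CAPITAL_CCR_CEILING, 0.024))    # no computable CCR: statewide average
```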

We finally note that we published a manual update (Change Request 3966) to our outlier policy on October 12, 2005, which updated Chapter 3, Section 20.1.2 of the Medicare Claims Processing Manual. The manual update covered an array of topics, including CCRs, reconciliation, and the time value of money. We encourage hospitals that are assigned the statewide average operating and/or capital CCRs to work with their fiscal intermediary or MAC on a possible alternative operating and/or capital CCR as explained in Change Request 3966. Use of an alternative CCR developed by the hospital in conjunction with the fiscal intermediary or MAC can avoid possible overpayments or underpayments at cost report settlement, thereby ensuring better accuracy when making outlier payments and negating the need for outlier reconciliation. We also note that a hospital may request an alternative operating or capital CCR ratio at any time as long as the guidelines of Change Request 3966 are followed. In addition, as mentioned above, we published an additional manual update (Change Request 7192) to our outlier policy on December 3, 2010, which also updated Chapter 3, Section 20.1.2 of the Medicare Claims Processing Manual. The manual update outlines the outlier reconciliation process for hospitals and Medicare contractors. To download and view the manual instructions on outlier reconciliation, we refer readers to the CMS Web site: http://www.cms.hhs.gov/manuals/downloads/clm104c03.pdf.

(3) FY 2013 and FY 2014 Outlier Payments

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50983 through 50984), we stated that, based on available data, we estimated that actual FY 2013 outlier payments would be approximately 4.77 percent of actual total MS-DRG payments. This estimate was computed based on simulations using the FY 2012 MedPAR file (discharge data for FY 2012 claims). That is, the estimate of actual outlier payments did not reflect actual FY 2013 claims, but instead reflected the application of FY 2013 payment rates and policies to available FY 2012 claims.

Our current estimate, using available FY 2013 claims data, is that actual outlier payments for FY 2013 were approximately 4.86 percent of actual total MS-DRG payments. Therefore, the data indicate that, for FY 2013, the percentage of actual outlier payments relative to actual total payments is lower than we projected for FY 2013. Consistent with the policy and statutory interpretation we have maintained since the inception of the IPPS, we do not make retroactive adjustments to outlier payments to ensure that total outlier payments for FY 2013 are equal to 5.1 percent of total MS-DRG payments.

We currently estimate that, using the latest CCRs from the March 2014 update of the PSF, actual outlier payments for FY 2014 will be approximately 5.71 percent of actual total MS-DRG payments, approximately 0.61 percentage point higher than the 5.1 percent we projected when setting the outlier policies for FY 2014. This estimate of 5.71 percent is based on simulations using the FY 2013 MedPAR file (discharge data for FY 2013 claims).

5. FY 2015 Standardized Amount

The adjusted standardized amount is divided into labor-related and nonlabor-related portions. Tables 1A and 1B listed and published in section VI. of this Addendum (and available via the Internet) contain the national standardized amounts that we are applying to all hospitals, except hospitals located in Puerto Rico, for FY 2015. The Puerto Rico-specific amounts are shown in Table 1C listed and published in section VI. of this Addendum (and available via the Internet on the CMS Web site). The amounts shown in Tables 1A and 1B differ only in that the labor-related share applied to the standardized amounts in Table 1A is 69.6 percent, and the labor-related share applied to the standardized amounts in Table 1B is 62 percent. In accordance with sections 1886(d)(3)(E) and 1886(d)(9)(C)(iv) of the Act, we are applying a labor-related share of 62 percent, unless application of that percentage would result in lower payments to a hospital than would otherwise be made. In effect, the statutory provision means that we will apply a labor-related share of 62 percent for all hospitals whose wage indexes are less than or equal to 1.0000.
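The labor-related share rule described above can be illustrated with a brief sketch. The wage index value is hypothetical; the standardized amounts are the column (1) FY 2015 national amounts from the comparison table below; and multiplying only the labor-related portion by the wage index reflects the standard wage adjustment discussed elsewhere in this rule rather than anything specific to this paragraph.

```python
# Sketch of the labor-related share selection: the 62 percent share applies
# when the wage index is less than or equal to 1.0000 (where it yields higher
# payments); otherwise the 69.6 percent share applies. Only the labor-related
# portion is multiplied by the wage index.

def wage_adjusted_standardized_amount(wage_index,
                                      labor_696, nonlabor_304,
                                      labor_62, nonlabor_38):
    if wage_index > 1.0000:
        labor, nonlabor = labor_696, nonlabor_304   # 69.6/30.4 split
    else:
        labor, nonlabor = labor_62, nonlabor_38     # 62/38 split
    return labor * wage_index + nonlabor

# Hypothetical wage index of 0.9500 with the FY 2015 column (1) amounts:
print(round(wage_adjusted_standardized_amount(
    0.9500, 3780.13, 1651.09, 3367.36, 2063.86), 2))  # 5262.85
```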

In addition, Tables 1A and 1B include the standardized amounts reflecting the applicable percentage increases for FY 2015.

Under section 1886(d)(9)(A)(ii) of the Act, the Federal portion of the Puerto Rico payment rate is based on the discharge-weighted average of the national large urban standardized amount (this amount is set forth in Table 1A). The labor-related and nonlabor-related portions of the national average standardized amounts for Puerto Rico hospitals for FY 2015 are set forth in Table 1C listed and published in section VI. of this Addendum (and available via the Internet on the CMS Web site). This table also includes the Puerto Rico-specific standardized amounts. The labor-related share applied to the Puerto Rico-specific standardized amount is the labor-related share of 63.2 percent, or 62 percent, depending on which provides higher payments to the hospital. (Section 1886(d)(9)(C)(iv) of the Act, as amended by section 403(b) of Public Law 108-173, provides that the labor-related share for hospitals located in Puerto Rico be 62 percent, unless the application of that percentage would result in lower payments to the hospital.)

The following table illustrates the changes from the FY 2014 national standardized amount. The second through fifth columns display the changes from the FY 2014 standardized amounts for each applicable FY 2015 standardized amount. The first row of the table shows the updated (through FY 2014) average standardized amount after restoring the FY 2014 offsets for outlier payments, demonstration budget neutrality, the geographic reclassification budget neutrality, and the retrospective documentation and coding adjustment under section 7(b)(1)(B) of Public Law 110-90. The MS-DRG reclassification and recalibration and wage index budget neutrality adjustment factors are cumulative. Therefore, those FY 2014 adjustment factors are not removed from this table.

Comparison of FY 2014 Standardized Amounts to the FY 2015 Standardized Amounts

 | Hospital submitted quality data and is a meaningful EHR user | Hospital submitted quality data and is NOT a meaningful EHR user | Hospital did NOT submit quality data and is a meaningful EHR user | Hospital did NOT submit quality data and is NOT a meaningful EHR user
FY 2014 Base Rate, after removing: (1) FY 2014 Geographic Reclassification Budget Neutrality (0.990718); (2) FY 2014 Rural Community Hospital Demonstration Program Budget Neutrality (0.999415); (3) Cumulative Factor: FY 2008, FY 2009, FY 2012, FY 2013, and FY 2014 Documentation and Coding Adjustment as Required under Sections 7(b)(1)(A) and 7(b)(1)(B) of Pub. L. 110-90 and Documentation and Coding Recoupment Adjustment as Required under Section 631 of the American Taxpayer Relief Act of 2012 (0.9403); and (4) FY 2014 Operating Outlier Offset (0.948995) | All four columns: If Wage Index is Greater Than 1.0000, Labor (69.6%): $4,230.38 and Nonlabor (30.4%): $1,847.75; If Wage Index is Less Than or Equal to 1.0000, Labor (62%): $3,768.45 and Nonlabor (38%): $2,309.70
FY 2015 Update Factor | 1.022 | 1.01475 | 1.01475 | 1.0075
FY 2015 MS-DRG Recalibration and Wage Index Budget Neutrality Factor | 0.998982 | 0.998982 | 0.998982 | 0.998982
FY 2015 Reclassification Budget Neutrality Factor | 0.990406 | 0.990406 | 0.990406 | 0.990406
FY 2015 Rural Community Demonstration Program Budget Neutrality Factor | 0.99931 | 0.99931 | 0.99931 | 0.99931
FY 2015 Operating Outlier Factor | 0.948998 | 0.948998 | 0.948998 | 0.948998
Cumulative Factor: FY 2008, FY 2009, FY 2012, FY 2013, FY 2014, and FY 2015 Documentation and Coding Adjustment as Required under Sections 7(b)(1)(A) and 7(b)(1)(B) of Pub. L. 110-90 and Documentation and Coding Recoupment Adjustment as Required under Section 631 of the American Taxpayer Relief Act of 2012 | 0.9329 | 0.9329 | 0.9329 | 0.9329
FY 2015 New Labor Market Delineation Wage Index Transition Budget Neutrality Factor | 0.998859 | 0.998859 | 0.998859 | 0.998859
National Standardized Amount for FY 2015 if Wage Index is Greater Than 1.0000; Labor/Nonlabor Share Percentage (69.6/30.4) | Labor: $3,780.13; Nonlabor: $1,651.09 | Labor: $3,753.31; Nonlabor: $1,639.38 | Labor: $3,753.31; Nonlabor: $1,639.38 | Labor: $3,726.50; Nonlabor: $1,627.66
National Standardized Amount for FY 2015 if Wage Index is Less Than or Equal to 1.0000; Labor/Nonlabor Share Percentage (62/38) | Labor: $3,367.36; Nonlabor: $2,063.86 | Labor: $3,343.47; Nonlabor: $2,049.22 | Labor: $3,343.47; Nonlabor: $2,049.22 | Labor: $3,319.58; Nonlabor: $2,034.58
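For illustration, the following sketch multiplies the restored FY 2014 base rate by the FY 2015 factors shown in the table for a hospital that submitted quality data and is a meaningful EHR user, and then splits the result using the 69.6/30.4 labor/nonlabor shares. The result differs from the published amounts by at most a few cents because CMS carries unrounded factors; the same chain, applied with the Puerto Rico-specific factors and the 63.2/36.8 (or 62/38) split, reproduces the Puerto Rico-specific rates shown in the table that follows.

```python
# Approximate reproduction of the FY 2015 national standardized amount for a
# hospital that submitted quality data and is a meaningful EHR user, using the
# factors from the table above. CMS carries unrounded factors, so the result
# may differ from the published figures by a few cents.

base_total = 4230.38 + 1847.75          # restored FY 2014 base rate (labor + nonlabor)

fy2015_factors = [
    1.022,      # FY 2015 update factor (quality data submitted, meaningful EHR user)
    0.998982,   # MS-DRG recalibration and wage index budget neutrality
    0.990406,   # reclassification budget neutrality
    0.99931,    # rural community hospital demonstration budget neutrality
    0.948998,   # operating outlier factor
    0.9329,     # cumulative documentation and coding adjustment
    0.998859,   # new labor market delineation wage index transition budget neutrality
]

total = base_total
for factor in fy2015_factors:
    total *= factor

labor = total * 0.696                   # labor-related share (wage index > 1.0000)
nonlabor = total * 0.304
print(f"Labor: ${labor:,.2f}  Nonlabor: ${nonlabor:,.2f}")
# Published values: Labor $3,780.13, Nonlabor $1,651.09
```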

The following table illustrates the changes from the FY 2014 Puerto Rico-specific payment rate for hospitals located in Puerto Rico. The second column shows the changes from the FY 2014 Puerto Rico specific payment rate for hospitals with a Puerto Rico-specific wage index greater than 1.0000. The third column shows the changes from the FY 2014 Puerto Rico specific payment rate for hospitals with a Puerto Rico-specific wage index less than or equal to 1.0000. The first row of the table shows the updated (through FY 2014) Puerto Rico-specific payment rate after restoring the FY 2014 offsets for Puerto Rico-specific outlier payments, rural community hospital demonstration program budget neutrality, and the geographic reclassification budget neutrality. The MS-DRG recalibration budget neutrality adjustment factor is cumulative and is not removed from this table.

Comparison of FY 2014 Puerto Rico-Specific Payment Rate to the FY 2015 Puerto Rico-Specific Payment Rate

 | Update (2.2 percent); wage index is greater than 1.0000; labor/nonlabor share percentage (63.2/36.8) | Update (2.2 percent); wage index is less than or equal to 1.0000; labor/nonlabor share percentage (62/38)
FY 2014 Puerto Rico Base Rate, after removing: (1) FY 2014 Geographic Reclassification Budget Neutrality (0.990718); (2) FY 2014 Rural Community Hospital Demonstration Program Budget Neutrality (0.999415); and (3) FY 2014 Puerto Rico Operating Outlier Offset (0.943455) | Labor: $1,722.31; Nonlabor: $1,002.86 | Labor: $1,689.61; Nonlabor: $1,035.56
FY 2015 Update Factor | 1.022 | 1.022
FY 2015 MS-DRG Recalibration Budget Neutrality Factor | 0.997543 | 0.997543
FY 2015 Reclassification Budget Neutrality Factor | 0.990406 | 0.990406
FY 2015 Rural Community Hospital Demonstration Program Budget Neutrality Factor | 0.99931 | 0.99931
FY 2015 New Labor Market Delineation Wage Index Transition Budget Neutrality Factor | 0.998859 | 0.998859
FY 2015 Puerto Rico Operating Outlier Factor | 0.926575 | 0.926575
Puerto Rico-Specific Payment Rate for FY 2015 | Labor: $1,608.39; Nonlabor: $936.54 | Labor: $1,577.86; Nonlabor: $967.07

B. Adjustments for Area Wage Levels and Cost-of-Living

Tables 1A through 1C, as published in section VI. of this Addendum (and available via the Internet), contain the labor-related and nonlabor-related shares that we used to calculate the prospective payment rates for hospitals located in the 50 States, the District of Columbia, and Puerto Rico for FY 2015. This section addresses two types of adjustments to the standardized amounts that are made in determining the prospective payment rates as described in this Addendum.

1. Adjustment for Area Wage Levels

Sections 1886(d)(3)(E) and 1886(d)(9)(C)(iv) of the Act require that we make an adjustment to the labor-related portion of the national and Puerto Rico prospective payment rates, respectively, to account for area differences in hospital wage levels. This adjustment is made by multiplying the labor-related portion of the adjusted standardized amounts by the appropriate wage index for the area in which the hospital is located. In section III. of the preamble of this final rule, we discuss the data and methodology for the FY 2015 wage index.

2. Adjustment for Cost-of-Living in Alaska and Hawaii

Section 1886(d)(5)(H) of the Act provides discretionary authority to the Secretary to make “such adjustments . . . as the Secretary deems appropriate to take into account the unique circumstances of hospitals located in Alaska and Hawaii.” Higher labor-related costs for these two States are taken into account in the adjustment for area wages described above. To account for higher nonlabor-related costs for these two States, we multiply the nonlabor-related portion of the standardized amount for hospitals located in Alaska and Hawaii by an adjustment factor.

In the FY 2013 IPPS/LTCH PPS final rule, we established a methodology to update, every 4 years beginning in FY 2014 (at the same time as the update to the labor-related share of the IPPS market basket), the COLA factors for Alaska and Hawaii that were published by the U.S. Office of Personnel Management (OPM). We refer readers to the FY 2013 IPPS/LTCH PPS proposed and final rules for additional background and a detailed description of this methodology (77 FR 28145 through 28146 and 77 FR 53700 through 53701, respectively).

For FY 2014, in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50985 through 50987), we updated the COLA factors published by OPM for 2009 (as these are the last COLA factors OPM published prior to transitioning from COLAs to locality pay) using the methodology that we finalized in the FY 2013 IPPS/LTCH PPS final rule.

Based on the policy finalized in the FY 2013 IPPS/LTCH PPS final rule, we are using the same COLA factors established in FY 2014 for FY 2015 to adjust the nonlabor-related portion of the standardized amount for hospitals located in Alaska and Hawaii. Below is a table listing the COLA factors for FY 2015.

Final FY 2015 Cost-of-Living Adjustment Factors: Alaska and Hawaii Hospitals

Area | Cost-of-living adjustment factor
Alaska:
  City of Anchorage and 80-kilometer (50-mile) radius by road | 1.23
  City of Fairbanks and 80-kilometer (50-mile) radius by road | 1.23
  City of Juneau and 80-kilometer (50-mile) radius by road | 1.23
  Rest of Alaska | 1.25
Hawaii:
  City and County of Honolulu | 1.25
  County of Hawaii | 1.19
  County of Kauai | 1.25
  County of Maui and County of Kalawao | 1.25

Based on the policy finalized in the FY 2013 IPPS/LTCH PPS final rule, the next update to the COLA factors for Alaska and Hawaii would occur in FY 2018.

C. Calculation of the Prospective Payment Rates

General Formula for Calculation of the Prospective Payment Rates for FY 2015

In general, the operating prospective payment rate for all hospitals paid under the IPPS located outside of Puerto Rico, except SCHs and MDHs, for FY 2015 equals the Federal rate (which includes uncompensated care payments).

We note that, as discussed in section IV.G. of the preamble of this final rule, section 1106 of Public Law 113-67, enacted on December 26, 2013, extended the MDH program from the end of FY 2013 through the first half of FY 2014 (that is, for discharges occurring before April 1, 2014). Subsequently, section 106 of Public Law 113-93, enacted on April 1, 2014, further extended the MDH program through the first half of FY 2015 (that is, for discharges occurring before April 1, 2015). Prior to the enactment of Public Law 113-67, the MDH program was only to be in effect through the end of FY 2013. Under current law, the MDH program will expire for discharges beginning on April 1, 2015.

SCHs are paid based on whichever of the following rates yields the greatest aggregate payment: The Federal national rate (which, as discussed in section IV.F. of the preamble of this final rule, includes uncompensated care payments); the updated hospital-specific rate based on FY 1982 costs per discharge; the updated hospital-specific rate based on FY 1987 costs per discharge; the updated hospital-specific rate based on FY 1996 costs per discharge; or the updated hospital-specific rate based on FY 2006 costs per discharge.

The prospective payment rate for SCHs for FY 2015 equals the higher of the applicable Federal rate, or the hospital-specific rate as described below.

The prospective payment rate for MDHs for FY 2015 discharges occurring before April 1, 2015 equals the higher of the Federal rate or the Federal rate plus 75 percent of the difference between the Federal rate and the hospital-specific rate as described below. For MDHs, the updated hospital-specific rate is based on FY 1982, FY 1987 or FY 2002 costs per discharge, whichever yields the greatest aggregate payment.

The prospective payment rate for hospitals located in Puerto Rico for FY 2015 equals 25 percent of the Puerto Rico-specific payment rate plus 75 percent of the applicable national rate.

1. Federal Rate

The Federal rate is determined as follows:

Step 1—Select the applicable average standardized amount depending on whether the hospital submitted qualifying quality data and is a meaningful EHR user, as described above.

Step 2—Multiply the labor-related portion of the standardized amount by the applicable wage index for the geographic area in which the hospital is located or the area to which the hospital is reclassified.

Step 3—For hospitals located in Alaska and Hawaii, multiply the nonlabor-related portion of the standardized amount by the applicable cost-of-living adjustment factor.

Step 4—Add the amount from Step 2 and the nonlabor-related portion of the standardized amount (adjusted, if applicable, under Step 3).

Step 5—Multiply the final amount from Step 4 by the relative weight corresponding to the applicable MS-DRG (Table 5 listed in section VI. of this Addendum and available via the Internet).

The Federal payment rate as determined in Step 5 may then be further adjusted if the hospital qualifies for either the IME or DSH adjustment. In addition, for hospitals that qualify for a low-volume payment adjustment under section 1886(d)(12) of the Act and 42 CFR 412.101(b), the payment in Step 5 would be increased by the formula described in section IV.D. of the preamble of this final rule. The base-operating DRG payment amount may be further adjusted by the hospital readmissions payment adjustment and the hospital VBP payment adjustment as described under sections 1886(q) and 1886(o) of the Act, respectively. Finally, we add the uncompensated care payment to the total claim payment amount. We note that, as discussed above, we take uncompensated care payments into consideration when calculating outlier payments.
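For illustration only, Steps 1 through 5 can be sketched as follows; the wage index and MS-DRG relative weight in the usage example are hypothetical, and the further adjustments described above (IME, DSH, the low-volume adjustment, the readmissions and VBP adjustments, and the uncompensated care payment) are noted but not computed.

```python
def fy2015_federal_rate(labor_portion, nonlabor_portion, wage_index, drg_weight, cola=1.0):
    """Steps 1 through 5 of the FY 2015 operating Federal rate computation.

    labor_portion / nonlabor_portion: the applicable standardized amount split (Step 1).
    wage_index: wage index for the hospital's geographic (or reclassified) area (Step 2).
    cola: cost-of-living adjustment factor, 1.0 outside Alaska and Hawaii (Step 3).
    drg_weight: MS-DRG relative weight from Table 5 (Step 5).
    """
    adjusted_labor = labor_portion * wage_index        # Step 2
    adjusted_nonlabor = nonlabor_portion * cola        # Step 3
    rate = adjusted_labor + adjusted_nonlabor          # Step 4
    return rate * drg_weight                           # Step 5


# Hypothetical example: wage index above 1.0000, so the 69.6/30.4 amounts apply.
step5_payment = fy2015_federal_rate(3780.13, 1651.09, wage_index=1.0500, drg_weight=1.5000)
print(round(step5_payment, 2))
# The Step 5 result may then be adjusted for IME, DSH, the low-volume add-on, the
# readmissions and VBP adjustments, and the uncompensated care payment, as described above.
```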

2. Hospital-Specific Rate (Applicable Only to SCHs and MDHs)

a. Calculation of Hospital-Specific Rate

Section 1886(b)(3)(C) of the Act provides that SCHs are paid based on whichever of the following rates yields the greatest aggregate payment: The Federal rate (which, as discussed in section IV.F. of the preamble of this final rule, includes uncompensated care payments); the updated hospital-specific rate based on FY 1982 costs per discharge; the updated hospital-specific rate based on FY 1987 costs per discharge; the updated hospital-specific rate based on FY 1996 costs per discharge; or the updated hospital-specific rate based on FY 2006 costs per discharge.

As discussed previously, currently MDHs are paid based on the Federal national rate or, if higher, the Federal national rate plus 75 percent of the difference between the Federal national rate and the greater of the updated hospital-specific rates based on either FY 1982, FY 1987, or FY 2002 costs per discharge.
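For illustration only, the SCH and MDH comparisons can be sketched as follows. Note that the statutory comparison is made on an aggregate-payment basis across a hospital's discharges; the sketch compares single per-discharge rates, and all dollar figures are hypothetical.

```python
def sch_payment_rate(federal_rate, hospital_specific_rates):
    """SCHs are paid the greater of the Federal rate or the highest updated
    hospital-specific rate (FY 1982, FY 1987, FY 1996, or FY 2006 base)."""
    return max(federal_rate, max(hospital_specific_rates))


def mdh_payment_rate(federal_rate, hospital_specific_rates):
    """MDHs receive the Federal rate plus 75 percent of the amount (if any) by which
    the highest updated hospital-specific rate (FY 1982, FY 1987, or FY 2002 base)
    exceeds the Federal rate."""
    hsp = max(hospital_specific_rates)
    return federal_rate + 0.75 * max(0.0, hsp - federal_rate)


# Hypothetical rates for illustration only:
print(sch_payment_rate(6000.0, [5800.0, 6400.0, 6100.0, 5900.0]))  # -> 6400.0
print(mdh_payment_rate(6000.0, [5800.0, 6400.0, 6100.0]))          # -> 6300.0
```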

For a more detailed discussion of the calculation of the hospital-specific rates, we refer readers to the FY 1984 IPPS interim final rule (48 FR 39772); the April 20, 1990 final rule with comment period (55 FR 15150); the FY 1991 IPPS final rule (55 FR 35994); and the FY 2001 IPPS final rule (65 FR 47082). We also refer readers to section IV.F. of the preamble of this final rule for a complete discussion on empirically justified Medicare DSH and uncompensated care payments.

b. Updating the FY 1982, FY 1987, FY 1996 and FY 2006 Hospital-Specific Rate for FY 2015

Section 1886(b)(3)(B)(iv) of the Act provides that the applicable percentage increase applicable to the hospital-specific rates for SCHs and MDHs equals the applicable percentage increase set forth in section 1886(b)(3)(B)(i) of the Act (that is, the same update factor as for all other hospitals subject to the IPPS). Because the Act sets the update factor for SCHs and MDHs equal to the update factor for all other IPPS hospitals, the update to the hospital-specific rates for SCHs and MDHs is subject to the amendments to section 1886(b)(3)(B) of the Act made by sections 3401(a) and 10319(a) of the Affordable Care Act. Accordingly, the applicable percentage increases to the hospital-specific rates applicable to SCHs and MDHs are the following:

FY 2015 | Hospital submitted quality data and is a meaningful EHR user | Hospital submitted quality data and is not a meaningful EHR user | Hospital did not submit quality data and is a meaningful EHR user | Hospital did not submit quality data and is not a meaningful EHR user
Market Basket Rate-of-Increase | 2.9 | 2.9 | 2.9 | 2.9
Adjustment for Failure to Submit Quality Data under Section 1886(b)(3)(B)(viii) of the Act | 0.0 | 0.0 | −0.725 | −0.725
Adjustment for Failure to be a Meaningful EHR User under Section 1886(b)(3)(B)(ix) of the Act | 0.0 | −0.725 | 0.0 | −0.725
MFP Adjustment under Section 1886(b)(3)(B)(xi) of the Act | −0.5 | −0.5 | −0.5 | −0.5
Statutory Adjustment under Section 1886(b)(3)(B)(xii) of the Act | −0.2 | −0.2 | −0.2 | −0.2
Applicable Percentage Increase Applied to Hospital-Specific Rate | 2.2 | 1.475 | 1.475 | 0.75
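The four columns of the table are straightforward sums of the rows above them; the following sketch reproduces the arithmetic.

```python
market_basket = 2.9
quality_penalty = -0.725     # failure to submit quality data (section 1886(b)(3)(B)(viii))
ehr_penalty = -0.725         # failure to be a meaningful EHR user (section 1886(b)(3)(B)(ix))
mfp_adjustment = -0.5        # multifactor productivity adjustment (section 1886(b)(3)(B)(xi))
statutory_adjustment = -0.2  # section 1886(b)(3)(B)(xii)


def applicable_percentage_increase(submitted_quality, meaningful_ehr):
    update = market_basket + mfp_adjustment + statutory_adjustment
    if not submitted_quality:
        update += quality_penalty
    if not meaningful_ehr:
        update += ehr_penalty
    return round(update, 3)


print(applicable_percentage_increase(True, True))    # 2.2
print(applicable_percentage_increase(True, False))   # 1.475
print(applicable_percentage_increase(False, True))   # 1.475
print(applicable_percentage_increase(False, False))  # 0.75
```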

For a complete discussion of the applicable percentage increase applied to the hospital-specific rates for SCHs and MDHs, we refer readers to section IV.B. of the preamble of this final rule.

In addition, because SCHs and MDHs use the same MS-DRGs as other hospitals when they are paid based in whole or in part on the hospital-specific rate, the hospital-specific rate is adjusted by a budget neutrality factor to ensure that changes to the MS-DRG classifications and the recalibration of the MS-DRG relative weights are made in a manner so that aggregate IPPS payments are unaffected. Therefore, a SCH's and MDH's hospital-specific rate is adjusted by the MS-DRG reclassification and recalibration budget neutrality factor of 0.997543, as discussed in section III. of this Addendum. The resulting rate is used in determining the payment rate that an SCH will receive for its discharges beginning on or after October 1, 2014, and the payment rate that an MDH will receive for its discharges beginning on or after October 1, 2014, and before April 1, 2015. We note that, in this final rule, for FY 2015, we are not making a documentation and coding adjustment to the hospital-specific rate. We refer readers to section II.D. of the preamble of this final rule for a complete discussion regarding our policies and previously finalized policies (including our historical adjustments to the payment rates) relating to the effect of changes in documentation and coding that do not reflect real changes in case-mix.

3. General Formula for Calculation of Prospective Payment Rates for Hospitals Located in Puerto Rico Beginning on or After October 1, 2014, and Before October 1, 2015

Section 1886(d)(9)(E)(iv) of the Act provides that, effective for discharges occurring on or after October 1, 2004, hospitals located in Puerto Rico are paid based on a blend of 75 percent of the national prospective payment rate and 25 percent of the Puerto Rico-specific rate.

a. Puerto Rico-Specific Rate

The Puerto Rico-specific prospective payment rate is determined as follows:

Step 1—Select the applicable average standardized amount considering the applicable wage index (obtained from Table 1C published in section VI. of this Addendum and available via the Internet).

Step 2—Multiply the labor-related portion of the standardized amount by the applicable Puerto Rico-specific wage index.

Step 3—Add the amount from Step 2 and the nonlabor-related portion of the standardized amount.

Step 4—Multiply the amount from Step 3 by the applicable MS-DRG relative weight (obtained from Table 5 listed in section VI. of this Addendum and available via the Internet).

Step 5—Multiply the result in Step 4 by 25 percent.

b. National Prospective Payment Rate

The national prospective payment rate is determined as follows:

Step 1—Select the applicable national average standardized amount.

Step 2—Multiply the labor-related portion of the national average standardized amount by the applicable wage index for the geographic area in which the hospital is located or the area to which the hospital is reclassified.

Step 3—Add the amount from Step 2 and the nonlabor-related portion of the national average standardized amount.

Step 4—Multiply the amount from Step 3 by the applicable MS-DRG relative weight (obtained from Table 5 listed in section VI. of this Addendum and available via the Internet on the CMS Web site).

Step 5—Multiply the result in Step 4 by 75 percent.

The sum of the Puerto Rico-specific rate and the national prospective payment rate computed above equals the prospective payment rate for a given discharge for a hospital located in Puerto Rico. This payment rate is then further adjusted if the hospital qualifies for either the IME or DSH adjustment.

Finally, we add the uncompensated care payment to the total claim payment amount. We note that, as discussed above, we take uncompensated care payments into consideration when calculating outlier payments.
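For illustration, the two computations combine as a 25/75 weighted sum. In the sketch below, the standardized amounts are the published FY 2015 amounts (62/38 split) for a hospital that submitted quality data and is a meaningful EHR user, while the wage indexes and MS-DRG relative weight are hypothetical; the IME, DSH, and uncompensated care adjustments described above are not included.

```python
def puerto_rico_blended_payment(
    pr_labor, pr_nonlabor, pr_wage_index,      # Puerto Rico-specific amounts and wage index
    nat_labor, nat_nonlabor, nat_wage_index,   # national amounts and wage index
    drg_weight,                                # MS-DRG relative weight from Table 5
):
    """Blend of 25 percent Puerto Rico-specific rate and 75 percent national rate."""
    pr_rate = (pr_labor * pr_wage_index + pr_nonlabor) * drg_weight       # PR Steps 1-4
    nat_rate = (nat_labor * nat_wage_index + nat_nonlabor) * drg_weight   # national Steps 1-4
    return 0.25 * pr_rate + 0.75 * nat_rate                               # Step 5 of each


# Published FY 2015 amounts for the 62/38 split; wage indexes and DRG weight are hypothetical.
print(round(puerto_rico_blended_payment(
    pr_labor=1577.86, pr_nonlabor=967.07, pr_wage_index=0.8500,
    nat_labor=3367.36, nat_nonlabor=2063.86, nat_wage_index=0.9200,
    drg_weight=1.2000), 2))
```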

III. Changes to Payment Rates for Acute Care Hospital Inpatient Capital-Related Costs for FY 2015

The PPS for acute care hospital inpatient capital-related costs was implemented for cost reporting periods beginning on or after October 1, 1991. Effective with that cost reporting period, over a 10-year transition period (which extended through FY 2001) the payment methodology for Medicare acute care hospital inpatient capital-related costs changed from a reasonable cost-based methodology to a prospective methodology (based fully on the Federal rate).

The basic methodology for determining Federal capital prospective rates is set forth in the regulations at 42 CFR 412.308 through 412.352. Below we discuss the factors that we used to determine the capital Federal rate for FY 2015, which is effective for discharges occurring on or after October 1, 2014.

The 10-year transition period ended with hospital cost reporting periods beginning on or after October 1, 2001 (FY 2002). Therefore, for cost reporting periods beginning in FY 2002, all hospitals (except “new” hospitals under § 412.304(c)(2)) are paid based on the capital Federal rate. For FY 1992, we computed the standard Federal payment rate for capital-related costs under the IPPS by updating the FY 1989 Medicare inpatient capital cost per case by an actuarial estimate of the increase in Medicare inpatient capital costs per case. Each year after FY 1992, we update the capital standard Federal rate, as provided at § 412.308(c)(1), to account for capital input price increases and other factors. The regulations at § 412.308(c)(2) also provide that the capital Federal rate be adjusted annually by a factor equal to the estimated proportion of outlier payments under the capital Federal rate to total capital payments under the capital Federal rate. In addition, § 412.308(c)(3) requires that the capital Federal rate be reduced by an adjustment factor equal to the estimated proportion of payments for exceptions under § 412.348. (We note that, as discussed in the FY 2013 IPPS/LTCH PPS final rule (77 FR 53705), there is generally no longer a need for an exceptions payment adjustment factor.) However, in limited circumstances, an additional payment exception for extraordinary circumstances is provided for under § 412.348(f) for qualifying hospitals. Therefore, in accordance with § 412.308(c)(3), an exceptions payment adjustment factor may need to be applied if such payments are made. Section 412.308(c)(4)(ii) requires that the capital standard Federal rate be adjusted so that the effects of the annual DRG reclassification and the recalibration of DRG weights and changes in the geographic adjustment factor (GAF) are budget neutral.

Section 412.374 provides for blended payments to hospitals located in Puerto Rico under the IPPS for acute care hospital inpatient capital-related costs. Accordingly, under the capital PPS, we compute a separate payment rate specific to hospitals located in Puerto Rico using the same methodology used to compute the national Federal rate for capital-related costs. In accordance with section 1886(d)(9)(A) of the Act, under the IPPS for acute care hospital operating costs, hospitals located in Puerto Rico are paid for operating costs under a special payment formula. Effective October 1, 2004, in accordance with section 504 of Public Law 108-173, the methodology for operating payments made to hospitals located in Puerto Rico under the IPPS was revised to make payments based on a blend of 25 percent of the applicable standardized amount specific to Puerto Rico hospitals and 75 percent of the applicable national average standardized amount. In conjunction with this change to the operating blend percentage, effective with discharges occurring on or after October 1, 2004, we also revised the methodology for computing capital payments made to hospitals located in Puerto Rico to be based on a blend of 25 percent of the Puerto Rico capital rate and 75 percent of the national capital Federal rate (69 FR 49185).

A. Determination of the Federal Hospital Inpatient Capital-Related Prospective Payment Rate Update

In the discussion that follows, we explain the factors that we used to determine the capital Federal rate for FY 2015. In particular, we explain why the FY 2015 capital Federal rate increases approximately 1.2 percent compared to the FY 2014 capital Federal rate. As discussed in the impact analysis in Appendix A to this final rule, we estimate that capital payments per discharge will increase approximately 1.5 percent during that same period. Because capital payments constitute about 10 percent of hospital payments, a 1 percent change in the capital Federal rate yields only about a 0.1 percent change in actual payments to hospitals.

1. Projected Capital Standard Federal Rate Update

a. Description of the Update Framework

Under § 412.308(c)(1), the capital standard Federal rate is updated on the basis of an analytical framework that takes into account changes in a capital input price index (CIPI) and several other policy adjustment factors. Specifically, we adjust the projected CIPI rate-of-increase as appropriate each year for case-mix index-related changes, for intensity, and for errors in previous CIPI forecasts. The update factor for FY 2015 under that framework is 1.5 percent based on the best data available at this time. The update factor under that framework is based on a projected 1.5 percent increase in the FY 2010-based CIPI, a 0.0 percentage point adjustment for intensity, a 0.0 percentage point adjustment for case-mix, a 0.0 percentage point adjustment for the FY 2013 DRG reclassification and recalibration, and a forecast error correction of 0.0 percentage point. As discussed below in section III.C. of this Addendum, we continue to believe that the CIPI is the most appropriate input price index for capital costs to measure capital price changes in a given year. We also explain the basis for the FY 2015 CIPI projection in that same section of this Addendum. Below we describe the policy adjustments that we are applying in the update framework for FY 2015.

The case-mix index is the measure of the average DRG weight for cases paid under the IPPS. Because the DRG weight determines the prospective payment for each case, any percentage increase in the case-mix index corresponds to an equal percentage increase in hospital payments.

The case-mix index can change for any of several reasons:

  • The average resource use of Medicare patients changes (“real” case-mix change);
  • Changes in hospital documentation and coding of patient records result in higher-weighted DRG assignments (“coding effects”); and
  • The annual DRG reclassification and recalibration changes may not be budget neutral (“reclassification effect”).

We define real case-mix change as actual changes in the mix (and resource requirements) of Medicare patients as opposed to changes in documentation and coding behavior that result in assignment of cases to higher-weighted DRGs, but do not reflect higher resource requirements. The capital update framework includes the same case-mix index adjustment used in the former operating IPPS update framework (as discussed in the May 18, 2004 IPPS proposed rule for FY 2005 (69 FR 28816)). (We no longer use an update framework to make a recommendation for updating the operating IPPS standardized amounts as discussed in section II. of Appendix B to the FY 2006 IPPS final rule (70 FR 47707).)

For FY 2015, we are projecting a 0.5 percent total increase in the case-mix index. We estimate that the real case-mix increase will also equal 0.5 percent for FY 2015. The net adjustment for change in case-mix is the difference between the projected real increase in case-mix and the projected total increase in case-mix. Therefore, as we proposed, the net adjustment for case-mix change in FY 2015 is 0.0 percentage point.

The capital update framework also contains an adjustment for the effects of DRG reclassification and recalibration. This adjustment is intended to remove the effect on total payments of prior year's changes to the DRG classifications and relative weights, in order to retain budget neutrality for all case-mix index-related changes other than those due to patient severity of illness. Due to the lag time in the availability of data, there is a 2-year lag in data used to determine the adjustment for the effects of DRG reclassification and recalibration. For example, we have data available to evaluate the effects of the FY 2013 DRG reclassification and recalibration as part of our update for FY 2015. We estimate that FY 2013 DRG reclassification and recalibration resulted in no change in the case-mix when compared with the case-mix index that would have resulted if we had not made the reclassification and recalibration changes to the DRGs. Therefore, as we proposed, we are making a 0.0 percentage point adjustment for reclassification and recalibration in the update framework for FY 2015.

The capital update framework also contains an adjustment for forecast error. The input price index forecast is based on historical trends and relationships ascertainable at the time the update factor is established for the upcoming year. In any given year, there may be unanticipated price fluctuations that may result in differences between the actual increase in prices and the forecast used in calculating the update factors. In setting a prospective payment rate under the framework, we make an adjustment for forecast error only if our estimate of the change in the capital input price index for any year is off by 0.25 percentage point or more. There is a 2-year lag between the forecast and the availability of data to develop a measurement of the forecast error. A forecast error of 0.0 percentage point was calculated for the FY 2015 update. Historically, when forecast error of the CIPI is greater than 0.25 percentage point in absolute terms, it is reflected in the update recommended under this framework. Current historical data indicate that the forecasted FY 2013 rate-of-increase of the FY 2006-based CIPI (1.2 percent) used in calculating the FY 2013 update factor slightly understated the actual realized FY 2013 price increases of the FY 2006-based CIPI (1.3 percent) by 0.1 percentage point because the prices associated with both the depreciation and other capital-related cost categories grew more quickly than anticipated. Because this forecast error does not exceed the 0.25 percentage point threshold, as we proposed, we are making a 0.0 percentage point adjustment for forecast error in the update for FY 2015.
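The forecast error rule reduces to a threshold test, as sketched below; for FY 2015, the FY 2013 error of 0.1 percentage point (1.3 percent actual versus 1.2 percent forecasted) falls below the 0.25 percentage point threshold, so no correction is applied.

```python
def forecast_error_correction(actual_cipi, forecasted_cipi, threshold=0.25):
    """Return the forecast error adjustment (in percentage points) applied in the
    capital update framework: the full error if its absolute value meets or
    exceeds the threshold, otherwise zero."""
    error = actual_cipi - forecasted_cipi
    return round(error, 2) if abs(error) >= threshold else 0.0


# FY 2013 CIPI: forecasted 1.2 percent, actual 1.3 percent -> no correction for FY 2015.
print(forecast_error_correction(actual_cipi=1.3, forecasted_cipi=1.2))  # 0.0
```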

Under the capital IPPS update framework, we also make an adjustment for changes in intensity. Historically, we calculated this adjustment using the same methodology and data that were used under the former operating IPPS update framework. The intensity factor for the operating update framework reflected how hospital services are utilized to produce the final product, that is, the discharge. This component accounts for changes in the use of quality-enhancing services, for changes within DRG severity, and for expected modification of practice patterns to remove noncost-effective services. Our intensity measure is based on a 5-year average.

We calculate case-mix constant intensity as the change in total cost per discharge, adjusted for price level changes (the CIPI for hospital and related services) and changes in real case-mix. Without reliable estimates of the proportions of the overall annual intensity increases that are due, respectively, to ineffective practice patterns and the combination of quality-enhancing new technologies and complexity within the DRG system, we assume that one-half of the annual increase is due to each of these factors. The capital update framework thus provides an add-on to the input price index rate of increase of one-half of the estimated annual increase in intensity, to allow for increases within DRG severity and the adoption of quality-enhancing technology.

In this final rule, we are continuing to use a Medicare-specific intensity measure that is based on a 5-year adjusted average of cost per discharge for FY 2015 (we refer readers to the FY 2011 IPPS/LTCH PPS final rule (75 FR 50436) for a full description of our Medicare-specific intensity measure). Specifically, for FY 2015, we are using an intensity measure that is based on an average of cost per discharge data from the 5-year period beginning with FY 2007 and extending through FY 2012. Based on these data, we estimated that case-mix constant intensity declined during FYs 2007 through 2012. In the past, when we found intensity to be declining, we believed a zero (rather than a negative) intensity adjustment was appropriate. Consistent with this approach, because we estimate that intensity declined during that 5-year period, we believe it is appropriate to continue to apply a zero intensity adjustment for FY 2015. Therefore, as we proposed, we are making a 0.0 percentage point adjustment for intensity in the update for FY 2015.

Above, we described the basis of the components used to develop the 1.5 percent capital update factor under the capital update framework for FY 2015 as shown in the table below.

CMS FY 2015 Update Factor to the Capital Federal Rate

Capital Input Price Index* | 1.5
Intensity | 0.0
Case-Mix Adjustment Factors:
  Real Across DRG Change | −0.5
  Projected Case-Mix Change | 0.5
Subtotal | 1.5
Effect of FY 2013 Reclassification and Recalibration | 0.0
Forecast Error Correction | 0.0
Total Update | 1.5
* The capital input price index is based on the FY 2010-based CIPI.

b. Comparison of CMS and MedPAC Update Recommendation

In its March 2014 Report to Congress, MedPAC did not make a specific update recommendation for capital IPPS payments for FY 2015. (We refer readers to MedPAC's Report to the Congress: Medicare Payment Policy, March 2014, Chapter 3.)

2. Outlier Payment Adjustment Factor

Section 412.312(c) establishes a unified outlier payment methodology for inpatient operating and inpatient capital-related costs. A single set of thresholds is used to identify outlier cases for both inpatient operating and inpatient capital-related payments. Section 412.308(c)(2) provides that the standard Federal rate for inpatient capital-related costs be reduced by an adjustment factor equal to the estimated proportion of capital-related outlier payments to total inpatient capital-related PPS payments. The outlier thresholds are set so that operating outlier payments are projected to be 5.1 percent of total operating IPPS DRG payments.

For FY 2014, we estimated that outlier payments for capital would equal 6.07 percent of inpatient capital-related payments based on the capital Federal rate in FY 2014. Based on the thresholds set forth in section II.A. of this Addendum, we estimate that outlier payments for capital-related costs will equal 6.27 percent of inpatient capital-related payments based on the capital Federal rate in FY 2015. Therefore, we are applying an outlier adjustment factor of 0.9373 in determining the capital Federal rate for FY 2015. Thus, we estimate that the percentage of capital outlier payments to total capital Federal rate payments for FY 2015 will be slightly higher than the percentage for FY 2014.

The outlier reduction factors are not built permanently into the capital rates; that is, they are not applied cumulatively in determining the capital Federal rate. The FY 2015 outlier adjustment of 0.9373 is a −0.21 percent change from the FY 2014 outlier adjustment of 0.9393. Therefore, the net change in the outlier adjustment to the capital Federal rate for FY 2015 is 0.9979 (0.9373/0.9393). Thus, the outlier adjustment will decrease the FY 2015 capital Federal rate by 0.21 percent compared to the FY 2014 outlier adjustment.
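Because the outlier reduction is not applied cumulatively, the year-over-year effect is the ratio of the two factors, as the following sketch shows.

```python
# Capital outlier adjustment factors (1 minus the estimated outlier proportion):
fy2015_outlier_factor = 1 - 0.0627   # 0.9373
fy2014_outlier_factor = 1 - 0.0607   # 0.9393

net_change = fy2015_outlier_factor / fy2014_outlier_factor
print(round(net_change, 4))                      # ~0.9979
print(f"{(net_change - 1) * 100:.2f} percent")   # ~ -0.21 percent
```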

3. Budget Neutrality Adjustment Factor for Changes in DRG Classifications and Weights and the GAF

Section 412.308(c)(4)(ii) requires that the capital Federal rate be adjusted so that aggregate payments for the fiscal year based on the capital Federal rate after any changes resulting from the annual DRG reclassification and recalibration and changes in the GAF are projected to equal aggregate payments that would have been made on the basis of the capital Federal rate without such changes. Because we implemented a separate GAF for Puerto Rico, we apply separate budget neutrality adjustments for the national GAF and the Puerto Rico GAF. We apply the same budget neutrality factor for DRG reclassifications and recalibration nationally and for Puerto Rico. Because the separate GAF for Puerto Rico was implemented in FY 1998, separate adjustments were unnecessary prior to that time.

To determine the factors for FY 2015, we compared (separately for the national capital rate and the Puerto Rico capital rate) estimated aggregate capital Federal rate payments based on the FY 2014 MS-DRG classifications and relative weights and the FY 2014 GAF to estimated aggregate capital Federal rate payments based on the FY 2014 MS-DRG classifications and relative weights and the FY 2015 GAFs. To achieve budget neutrality for the changes in the national GAFs, based on calculations using updated data, we are applying an incremental budget neutrality adjustment factor of 0.9999 for FY 2015 to the previous cumulative FY 2014 adjustment factor of 0.9891, yielding an adjustment factor of 0.9890 through FY 2015. For the Puerto Rico GAFs, we are applying an incremental budget neutrality adjustment factor of 1.0012 for FY 2015 to the previous cumulative FY 2014 adjustment factor of 1.0076, yielding a cumulative adjustment factor of 1.0088 through FY 2015.

We then compared estimated aggregate capital Federal rate payments based on the FY 2014 MS-DRG relative weights and the FY 2015 GAFs to estimated aggregate capital Federal rate payments based on the cumulative effects of the FY 2015 MS-DRG classifications and relative weights and the FY 2015 GAFs. The incremental adjustment factor for DRG classifications and changes in relative weights is 0.9987 both nationally and for Puerto Rico. The cumulative adjustment factors for MS-DRG classifications and changes in relative weights and for changes in the GAFs through FY 2015 are 0.9877 nationally and 1.0075 for Puerto Rico. (We note that all the values are calculated with unrounded numbers.) The GAF/DRG budget neutrality adjustment factors are built permanently into the capital rates; that is, they are applied cumulatively in determining the capital Federal rate. This follows the requirement under § 412.308(c)(4)(ii) that estimated aggregate payments each year be no more or less than they would have been in the absence of the annual DRG reclassification and recalibration and changes in the GAFs.
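The cumulative GAF/DRG budget neutrality factors are running products of the incremental factors; the sketch below reproduces the figures cited above (small differences from the published values reflect CMS's use of unrounded numbers).

```python
# National capital rate: cumulative budget neutrality through FY 2015.
national_gaf_cumulative = 0.9891 * 0.9999                         # ~0.9890
national_gaf_drg_cumulative = national_gaf_cumulative * 0.9987    # ~0.9877

# Puerto Rico capital rate: cumulative budget neutrality through FY 2015.
pr_gaf_cumulative = 1.0076 * 1.0012                               # ~1.0088
pr_gaf_drg_cumulative = pr_gaf_cumulative * 0.9987                # ~1.0075

for label, value in [
    ("National GAF (through FY 2015)", national_gaf_cumulative),
    ("National GAF/DRG (through FY 2015)", national_gaf_drg_cumulative),
    ("Puerto Rico GAF (through FY 2015)", pr_gaf_cumulative),
    ("Puerto Rico GAF/DRG (through FY 2015)", pr_gaf_drg_cumulative),
]:
    print(f"{label}: {value:.4f}")
```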

The methodology used to determine the recalibration and geographic adjustment factor (GAF/DRG) budget neutrality adjustment is similar to the methodology used in establishing budget neutrality adjustments under the IPPS for operating costs. One difference is that, under the operating IPPS, the budget neutrality adjustments for the effect of geographic reclassifications are determined separately from the effects of other changes in the hospital wage index and the MS-DRG relative weights. Under the capital IPPS, there is a single GAF/DRG budget neutrality adjustment factor (the national capital rate and the Puerto Rico capital rate are determined separately) for changes in the GAF (including geographic reclassification) and the MS-DRG relative weights. In addition, there is no adjustment for the effects that geographic reclassification has on the other payment parameters, such as the payments for DSH or IME.

The cumulative adjustment factor accounts for the MS-DRG reclassifications and recalibration and for changes in the GAFs. It also incorporates the effects on the GAFs of FY 2015 geographic reclassification decisions made by the MGCRB compared to FY 2014 decisions. However, it does not account for changes in payments due to changes in the DSH and IME adjustment factors.

4. Capital Federal Rate for FY 2015

For FY 2014, we established a capital Federal rate of $429.31 (78 FR 50990). We are establishing an update of 1.5 percent in determining the FY 2015 capital Federal rate for all hospitals. As a result of this update and the budget neutrality factors discussed above, we are establishing a national capital Federal rate of $434.26 for FY 2015. The national capital Federal rate for FY 2015 was calculated as follows:

  • The FY 2015 update factor is 1.015, that is, the update is 1.5 percent.
  • The FY 2015 budget neutrality adjustment factor that is applied to the capital Federal rate for changes in the MS-DRG classifications and relative weights and changes in the GAFs is 0.9986.
  • The FY 2015 outlier adjustment factor is 0.9373.

(We note that, as discussed in section VI.C. of the preamble of this final rule, we are not making an additional MS-DRG documentation and coding adjustment to the capital IPPS Federal rates for FY 2015.)

Because the FY 2015 capital Federal rate has already been adjusted for differences in case-mix, wages, cost-of-living, indirect medical education costs, and payments to hospitals serving a disproportionate share of low-income patients, we are not making additional adjustments in the capital Federal rate for these factors, other than the budget neutrality factor for changes in the MS-DRG classifications and relative weights and for changes in the GAFs.

We are providing the following chart that shows how each of the factors and adjustments for FY 2015 affects the computation of the FY 2015 national capital Federal rate in comparison to the FY 2014 national capital Federal rate. The FY 2015 update factor has the effect of increasing the capital Federal rate by 1.5 percent compared to the FY 2014 capital Federal rate. The GAF/DRG budget neutrality adjustment factor has the effect of decreasing the capital Federal rate by 0.14 percent. The FY 2015 outlier adjustment factor has the effect of decreasing the capital Federal rate by 0.21 percent compared to the FY 2014 capital Federal rate. The combined effect of all the changes will increase the national capital Federal rate by 1.15 percent compared to the FY 2014 national capital Federal rate.

Comparison of Factors and Adjustments: FY 2014 Capital Federal Rate and FY 2015 Capital Federal Rate

 | FY 2014 | FY 2015 | Change | Percent change
Update Factor | 1.0090 | 1.0150 | 1.0150 | 1.50
GAF/DRG Adjustment Factor | 0.9987 | 0.9986 | 0.9986 | −0.14
Outlier Adjustment Factor | 0.9393 | 0.9373 | 0.9979 | −0.21
Capital Federal Rate | 429.31 | 434.26 | 1.0115 | 1.15
The update factor and the GAF/DRG budget neutrality adjustment factors are built permanently into the capital Federal rates. Thus, for example, the incremental change from FY 2014 to FY 2015 resulting from the application of the 0.9986 GAF/DRG budget neutrality adjustment factor for FY 2015 is a net change of 0.9986 (or -0.14 percent).
The outlier reduction factor is not built permanently into the capital Federal rate; that is, the factor is not applied cumulatively in determining the capital Federal rate. Thus, for example, the net change resulting from the application of the FY 2015 outlier adjustment factor is 0.9373/0.9393, or 0.9979 (or −0.21 percent).
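For illustration, the factors in the table combine as sketched below. Because the outlier factor is not cumulative, the FY 2014 outlier factor is divided out before the FY 2015 factor is applied; the result differs from the published $434.26 by a few cents because CMS carries unrounded factors.

```python
fy2014_capital_rate = 429.31

update_factor = 1.0150     # FY 2015 update (+1.5 percent)
gaf_drg_factor = 0.9986    # FY 2015 incremental GAF/DRG budget neutrality (applied cumulatively)
fy2015_outlier = 0.9373    # FY 2015 outlier adjustment (not cumulative)
fy2014_outlier = 0.9393    # FY 2014 outlier adjustment, divided out

fy2015_capital_rate = (fy2014_capital_rate * update_factor * gaf_drg_factor
                       * fy2015_outlier / fy2014_outlier)
print(round(fy2015_capital_rate, 2))   # ~434.21 with rounded factors; published rate is $434.26
```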

In this final rule, we also are providing the following chart that shows how the final FY 2015 capital Federal rate differs from the proposed FY 2015 capital Federal rate as presented in the FY 2015 IPPS/LTCH PPS proposed rule.

Comparison of Factors and Adjustments: Proposed FY 2015 Capital Federal Rate and Final FY 2015 Capital Federal Rate

 | Proposed | Final | Change | Percent change
Update Factor | 1.0150 | 1.0150 | 1.0000 | 0.00
GAF/DRG Adjustment Factor | 0.9957 | 0.9986 | 1.0030 | 0.30
Outlier Adjustment Factor | 0.9374 | 0.9373 | 0.9999 | −0.01
Capital Federal Rate | 433.01 | 434.26 | 1.0029 | 0.29

5. Special Capital Rate for Puerto Rico Hospitals

Section 412.374 provides for the use of a blended payment system for payments made to hospitals located in Puerto Rico under the PPS for acute care hospital inpatient capital-related costs. Accordingly, under the capital PPS, we compute a separate payment rate specific to hospitals located in Puerto Rico using the same methodology used to compute the national Federal rate for capital-related costs. Under the broad authority of section 1886(g) of the Act, beginning with discharges occurring on or after October 1, 2004, capital payments made to hospitals located in Puerto Rico are based on a blend of 25 percent of the Puerto Rico capital rate and 75 percent of the capital Federal rate. The Puerto Rico capital rate is derived from the costs of Puerto Rico hospitals only, while the capital Federal rate is derived from the costs of all acute care hospitals participating in the IPPS (including Puerto Rico).

To adjust hospitals' capital payments for geographic variations in capital costs, we apply a GAF to both portions of the blended capital rate. The GAF is calculated using the operating IPPS wage index, and varies depending on the labor market area or rural area in which the hospital is located. We use the Puerto Rico wage index to determine the GAF for the Puerto Rico part of the capital-blended rate and the national wage index to determine the GAF for the national part of the blended capital rate.

Because we implemented a separate GAF for Puerto Rico in FY 1998, we also apply separate budget neutrality adjustment factors for the national GAF and for the Puerto Rico GAF. However, we apply the same budget neutrality adjustment factor for MS-DRG reclassifications and recalibration nationally and for Puerto Rico. The budget neutrality adjustment factors for the national GAF and for the Puerto Rico GAF and the budget neutrality factor for MS-DRG reclassifications and recalibration (which is the same nationally and for Puerto Rico) are discussed in section III.A.3. of this Addendum.

In computing the payment for a particular Puerto Rico hospital, the Puerto Rico portion of the capital rate (25 percent) is multiplied by the Puerto Rico-specific GAF for the labor market area in which the hospital is located, and the national portion of the capital rate (75 percent) is multiplied by the national GAF for the labor market area in which the hospital is located (which is computed from national data for all hospitals in the United States and Puerto Rico).

For FY 2014, the special capital rate for hospitals located in Puerto Rico was $209.82 (78 FR 50991). With the changes we are making to the factors used to determine the capital Federal rate, the FY 2015 special capital rate for hospitals in Puerto Rico is $209.10.

Comment: One commenter noted that the proposed capital standard Federal rate for Puerto Rico is less than half of the proposed national capital standard Federal rate. The commenter asserted that this "disparity" is "not consistent with the basic reality of Puerto Rico" because average capital costs in Puerto Rico are not that dissimilar from those in the United States.

Response: We appreciate the commenter's attention to the proposed capital Federal rates for Puerto Rico hospitals. While it is not clear what the commenter was specifically requesting, we believe the commenter may have been suggesting that CMS increase the Puerto Rico-specific capital Federal rate to reduce the difference between it and the national capital Federal rate. The commenter is correct that the proposed Puerto Rico capital standard Federal rate is approximately half of the proposed national capital standard Federal rate, a relationship that has held consistently since those rates were established. The Puerto Rico capital rate is derived from the costs of Puerto Rico hospitals only, while the national capital Federal rate is derived from the costs of all acute care hospitals participating in the IPPS, including Puerto Rico. The commenter did not provide any empirical data to demonstrate that the capital-related costs in Puerto Rico are similar to those in the United States, nor that the blended payment methodology for capital-related costs to hospitals located in Puerto Rico at § 412.374 (that is, 25 percent of the Puerto Rico capital rate and 75 percent of the national capital Federal rate) does not result in appropriate capital IPPS payments for Puerto Rico hospitals. Consequently, we are unable to assess and directly respond to the statements included in the comment. Therefore, in this final rule, we have determined that the Puerto Rico capital Federal rate for FY 2015 is consistent with our current policy.

B. Calculation of the Inpatient Capital-Related Prospective Payments for FY 2015

For purposes of calculating payments for each discharge during FY 2015, the capital Federal rate is adjusted as follows: (Standard Federal Rate) × (DRG weight) × (GAF) × (COLA for hospitals located in Alaska and Hawaii) × (1 + DSH Adjustment Factor + IME Adjustment Factor, if applicable). The result is the adjusted capital Federal rate.

Hospitals also may receive outlier payments for those cases that qualify under the thresholds established for each fiscal year. Section 412.312(c) provides for a single set of thresholds to identify outlier cases for both inpatient operating and inpatient capital-related payments. The outlier thresholds for FY 2015 are in section II.A. of this Addendum. For FY 2015, a case would qualify as a cost outlier if the cost for the case plus the (operating) IME and DSH payments (including both the empirically justified Medicare DSH payment and the estimated uncompensated care payment, as discussed in section II.A.4.g.(1) of this Addendum) is greater than the prospective payment rate for the MS-DRG plus the fixed-loss amount of $24,758.
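For illustration only, the adjusted capital Federal rate formula and the cost-outlier test described above can be sketched as follows; the GAF, DSH and IME adjustment factors, MS-DRG relative weight, and case costs in the usage example are hypothetical.

```python
def adjusted_capital_federal_rate(standard_federal_rate, drg_weight, gaf,
                                  cola=1.0, dsh_factor=0.0, ime_factor=0.0):
    """(Standard Federal Rate) x (DRG weight) x (GAF) x (COLA, Alaska/Hawaii only)
    x (1 + DSH adjustment factor + IME adjustment factor)."""
    return (standard_federal_rate * drg_weight * gaf * cola
            * (1 + dsh_factor + ime_factor))


def qualifies_as_cost_outlier(case_cost, operating_ime_dsh_payments,
                              drg_payment, fixed_loss_amount=24758.0):
    """FY 2015 cost-outlier test as described above: cost plus operating IME and DSH
    payments (including empirically justified DSH and estimated uncompensated care)
    must exceed the MS-DRG prospective payment plus the fixed-loss amount."""
    return case_cost + operating_ime_dsh_payments > drg_payment + fixed_loss_amount


# Hypothetical illustration (adjustment factors and costs are not published values):
capital_payment = adjusted_capital_federal_rate(434.26, drg_weight=1.5000, gaf=0.9800,
                                                dsh_factor=0.0300, ime_factor=0.0200)
print(round(capital_payment, 2))
print(qualifies_as_cost_outlier(case_cost=60000.0, operating_ime_dsh_payments=2500.0,
                                drg_payment=30000.0))   # True
```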

Currently, as provided under § 412.304(c)(2), we pay a new hospital 85 percent of its reasonable costs during the first 2 years of operation unless it elects to receive payment based on 100 percent of the capital Federal rate. Effective with the third year of operation, we pay the hospital based on 100 percent of the capital Federal rate (that is, the same methodology used to pay all other hospitals subject to the capital PPS).

C. Capital Input Price Index

1. Background

Like the operating input price index, the capital input price index (CIPI) is a fixed-weight price index that measures the price changes associated with capital costs during a given year. The CIPI differs from the operating input price index in one important aspect—the CIPI reflects the vintage nature of capital, which is the acquisition and use of capital over time. Capital expenses in any given year are determined by the stock of capital in that year (that is, capital that remains on hand from all current and prior capital acquisitions). An index measuring capital price changes needs to reflect this vintage nature of capital. Therefore, the CIPI was developed to capture the vintage nature of capital by using a weighted-average of past capital purchase prices up to and including the current year.

We periodically update the base year for the operating and capital input price indexes to reflect the changing composition of inputs for operating and capital expenses. In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50603 through 50607), we rebased and revised the CIPI to a FY 2010 base year to reflect the more current structure of capital costs in hospitals. For a complete discussion of this rebasing, we refer readers to the FY 2014 IPPS/LTCH PPS final rule.

2. Forecast of the CIPI for FY 2015

Based on the latest forecast by IHS Global Insight, Inc. (second quarter of 2014), we are forecasting the FY 2010-based CIPI to increase 1.5 percent in FY 2015. This reflects a projected 2.0 percent increase in vintage-weighted depreciation prices (building and fixed equipment, and movable equipment), and a projected 2.7 percent increase in other capital expense prices in FY 2015, partially offset by a projected 1.1 percent decline in vintage-weighted interest expenses in FY 2015. The weighted average of these three factors produces the forecasted 1.5 percent increase for the FY 2010-based CIPI as a whole in FY 2015.
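Conceptually, the CIPI forecast is a weighted average of the three component price changes. The cost weights in the sketch below are hypothetical placeholders (they are not the published FY 2010-based CIPI weights), chosen only so the arithmetic lands near the published 1.5 percent.

```python
# Hypothetical cost weights (NOT the published FY 2010-based CIPI weights), used only
# to illustrate how the component forecasts combine into the overall CIPI forecast:
components = {
    "vintage-weighted depreciation": (2.0, 0.62),
    "other capital expenses":        (2.7, 0.18),
    "vintage-weighted interest":     (-1.1, 0.20),
}

cipi_forecast = sum(price_change * weight for price_change, weight in components.values())
print(f"Illustrative FY 2015 CIPI forecast: {cipi_forecast:.1f} percent")   # ~1.5 percent
```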

IV. Changes to Payment Rates for Excluded Hospitals: Rate-of-Increase Percentages for FY 2015

Payments for services furnished in children's hospitals, 11 cancer hospitals, and hospitals located outside the 50 States, the District of Columbia and Puerto Rico (that is, short-term acute care hospitals located in the U.S. Virgin Islands, Guam, the Northern Mariana Islands, and American Samoa) that are excluded from the IPPS are made on the basis of reasonable costs based on the hospital's own historical cost experience, subject to a rate-of-increase ceiling. A per discharge limit (the target amount as defined in § 413.40(a) of the regulations) is set for each hospital based on the hospital's own cost experience in its base year, and updated annually by a rate-of-increase percentage. (We note that, in accordance with § 403.752(a), RNHCIs are also subject to the rate-of-increase limits established under § 413.40 of the regulations.)

In the FY 2015 IPPS/LTCH PPS proposed rule, we proposed that the FY 2015 rate-of-increase percentage for updating the target amounts for the 11 cancer hospitals, children's hospitals, and the short-term acute care hospitals located in the U.S. Virgin Islands, Guam, the Northern Mariana Islands, and American Samoa, as well as RNHCIs would be the estimated percentage increase in the FY 2015 IPPS operating market basket, in accordance with applicable regulations at § 413.40. As we did in FY 2014, we proposed to use the percentage increase in the FY 2010-based IPPS operating market basket to update these target amounts. Based on IHS Global Insight, Inc.'s 2014 first quarter forecast, we estimated that the FY 2010-based IPPS operating market basket update for FY 2015 was 2.7 percent (that is, the estimate of the market basket rate-of-increase). However, we proposed that if more recent data become available for the final rule, we would use them to calculate the IPPS operating market basket update for FY 2015.

We did not receive any public comments on our proposals.

Based on updated data from IHS Global Insight, Inc.'s 2014 second quarter forecast, we estimate that the final FY 2010-based IPPS operating market basket update for FY 2015 is 2.9 percent (that is, the estimate of the market basket rate-of-increase).

The IRF PPS, the IPF PPS, and the LTCH PPS are updated annually. We refer readers to section VII. of the preamble of this final rule and section V. of the Addendum to this final rule for the update changes to the Federal payment rates for LTCHs under the LTCH PPS for FY 2015. The annual updates for the IRF PPS and the IPF PPS are issued by the agency in separate Federal Register documents.

V. Updates to the Payment Rates for the LTCH PPS for FY 2015

A. LTCH PPS Standard Federal Rate for FY 2015

1. Background

In section VII. of the preamble of this final rule, we discuss our updates to the payment rates, factors, and specific policies under the LTCH PPS for FY 2015.

Under § 412.523(c)(3)(ii) of the regulations, for LTCH PPS rate years beginning RY 2004 through RY 2006, we updated the standard Federal rate annually by a factor to adjust for the most recent estimate of the increases in prices of an appropriate market basket of goods and services for LTCHs. We established this policy of annually updating the standard Federal rate because, at that time, we believed that was the most appropriate method for updating the LTCH PPS standard Federal rate for years after the initial implementation of the LTCH PPS in FY 2003. Therefore, under § 412.523(c)(3)(ii), for RYs 2004 through 2006, the annual update to the LTCH PPS standard Federal rate was equal to the previous rate year's Federal rate updated by the most recent estimate of increases in the appropriate market basket of goods and services included in covered inpatient LTCH services.

In determining the annual update to the standard Federal rate for RY 2007, based on our ongoing monitoring activity, we believed that, rather than solely using the most recent estimate of the LTCH PPS market basket update as the basis of the annual update factor, it was appropriate to adjust the standard Federal rate to account for the effect of documentation and coding in a prior period that was unrelated to patients' severity of illness (71 FR 27818). Accordingly, we established under § 412.523(c)(3)(iii) that the annual update to the standard Federal rate for RY 2007 was zero percent based on the most recent estimate of the LTCH PPS market basket at that time, offset by an adjustment to account for changes in case-mix in prior periods due to the effect of documentation and coding that were unrelated to patients' severity of illness. For RY 2008 through FY 2011, we also made an adjustment for the effect of documentation and coding that was unrelated to patients' severity of illness in establishing the annual update to the standard Federal rate as set forth in the regulations at §§ 412.523(c)(3)(iv) through (c)(3)(vii). For FYs 2012, 2013, and 2014, we updated the standard Federal rate by the most recent estimate of the LTCH PPS market basket at that time, including additional statutory adjustments required by section 1886(m)(3)(A) of the Act as set forth in the regulations at §§ 412.523(c)(3)(viii) through (c)(3)(x).

Section 1886(m)(3)(A) of the Act, as added by section 3401(c) of the Affordable Care Act, specifies that, for rate year 2010 and each subsequent rate year, any annual update to the standard Federal rate shall be reduced:

  • For rate year 2010 through 2019, by the other adjustment specified in section 1886(m)(3)(A)(ii) and (m)(4) of the Act; and
  • For rate year 2012 and each subsequent year, by the productivity adjustment described in section 1886(b)(3)(B)(xi)(II) of the Act (which we refer to as “the multifactor productivity (MFP) adjustment”) as discussed in section VII.C.2. of the preamble of this final rule.

Section 1886(m)(3)(B) of the Act provides that the application of paragraph (3) of section 1886(m) of the Act may result in the annual update being less than zero for a rate year, and may result in payment rates for a rate year being less than such payment rates for the preceding rate year. (As noted in section VII.C.2.a. of the preamble of this final rule, the annual update to the LTCH PPS occurs on October 1 and we have adopted the term “fiscal year” (FY) rather than “rate year” (RY) under the LTCH PPS beginning October 1, 2010. Therefore, for purposes of clarity, when discussing the annual update for the LTCH PPS, including the provisions of the Affordable Care Act, we use the term “fiscal year” rather than “rate year” for 2011 and subsequent years.)

For FY 2014, consistent with our historical practice, we established an update to the LTCH PPS standard Federal rate based on the full estimated LTCH PPS market basket increase of 2.5 percent, reduced by a total of 0.8 percentage point as required by sections 1886(m)(3)(A)(i) and 1886(m)(3)(A)(ii) in conjunction with section 1886(m)(4)(C) of the Act (that is, a 0.5 percentage point MFP adjustment and an additional 0.3 percentage point statutory reduction). Accordingly, at § 412.523(c)(3)(x) of the regulations, we established an annual update of 1.7 percent to the standard Federal rate for FY 2014 (78 FR 50761 through 50763).

For FY 2015, as discussed in greater detail in section VII.C.2. of the preamble of this final rule, consistent with our proposal, we are establishing an annual update to the LTCH PPS standard Federal rate based on the full estimated increase in the LTCH PPS market basket, less the MFP adjustment consistent with section 1886(m)(3)(A)(i) of the Act, and less the 0.2 percentage point required by sections 1886(m)(3)(A)(ii) and (m)(4)(E) of the Act. In addition, as discussed in greater detail in section VII.C.2. of the preamble of this final rule, beginning in FY 2014, the annual update will be further reduced by 2.0 percentage points for LTCHs that fail to submit quality reporting data in accordance with the requirements of the LTCHQR Program under section 1886(m)(5) of the Act.

Specifically, in this final rule, based on the best available data, we are establishing an annual update to the standard Federal rate of 2.2 percent, which is based on the full estimated increase in the LTCH PPS market basket of 2.9 percent, less the MFP adjustment of 0.5 percentage point consistent with section 1886(m)(3)(A)(i) of the Act, and less the 0.2 percentage point required by sections 1886(m)(3)(A)(ii) and (m)(4)(E) of the Act. As discussed in greater detail in section VII.C.2.c. of the preamble of this final rule, for LTCHs that fail to submit the required quality reporting data for FY 2015 in accordance with the LTCHQR Program, the annual update is further reduced by 2.0 percentage points as required by section 1886(m)(5) of the Act. Accordingly, we are establishing an annual update to the LTCH PPS standard Federal rate of 0.2 percent for LTCHs that fail to submit the required quality reporting data for FY 2015. This 0.2 percent update is calculated based on the full estimated increase in the LTCH PPS market basket of 2.9 percent, less a MFP adjustment of 0.5 percentage point, less an additional adjustment of 0.2 percentage point required by the statute, and less 2.0 percentage points for failure to submit quality reporting data as required by section 1886(m)(5) of the Act.
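The two FY 2015 update percentages follow directly from the figures above. The short computation below simply restates that arithmetic; the variable names are illustrative and are not drawn from the statute or regulations.

    # FY 2015 LTCH PPS annual update, restated from the figures above.
    market_basket = 2.9          # full estimated LTCH PPS market basket increase (percent)
    mfp_adjustment = 0.5         # multifactor productivity adjustment (percentage points)
    statutory_adjustment = 0.2   # section 1886(m)(3)(A)(ii) and (m)(4)(E) adjustment (percentage points)
    quality_reduction = 2.0      # reduction for failure to submit LTCHQR Program data (percentage points)

    full_update = market_basket - mfp_adjustment - statutory_adjustment
    reduced_update = full_update - quality_reduction

    print(f"{full_update:.1f} percent")     # 2.2 percent for LTCHs that submit quality reporting data
    print(f"{reduced_update:.1f} percent")  # 0.2 percent for LTCHs that do not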

2. Development of the FY 2015 LTCH PPS Standard Federal Rate

We continue to believe that the annual update to the LTCH PPS standard Federal rate should be based on the most recent estimate of the increase in the LTCH PPS market basket, including any statutory adjustments. Consistent with our historical practice and as we proposed, for FY 2015, we are applying the annual update to the standard Federal rate in effect for the previous year (that is, the FY 2014 standard Federal rate). Furthermore, in determining the standard Federal rate for FY 2015, consistent with our proposal, we also are making certain regulatory adjustments. Specifically, we are applying an adjustment factor for the final year of the 3-year phase-in of the one-time prospective adjustment to the standard Federal rate under § 412.523(d)(3), as discussed in greater detail in section VII.C.3. of the preamble of this final rule. In addition, in determining the FY 2015 standard Federal rate, we are applying a budget neutrality adjustment factor for the changes related to the area wage level adjustment (that is, changes to the wage data, including the policy to adopt the new OMB delineations, and the labor-related share) in accordance with § 412.523(d)(4).

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50993), we established an annual update to the LTCH PPS standard Federal rate of 1.7 percent for FY 2014 based on the full estimated LTCH PPS market basket increase of 2.5 percent, less the MFP adjustment of 0.5 percentage point consistent with section 1886(m)(3)(A)(i) of the Act and less the 0.3 percentage point required by sections 1886(m)(3)(A)(ii) and (m)(4)(C) of the Act. Accordingly, at § 412.523(c)(3)(x), we established an annual update to the standard Federal rate for FY 2014 of 1.7 percent. That is, we applied an update factor of 1.017 to the FY 2013 standard Federal rate of $40,397.96 to determine the FY 2014 standard Federal rate. We also adjusted the standard Federal rate for FY 2014 by the one-time prospective adjustment factor for FY 2014 of 0.98734 under § 412.523(d)(3)(ii). Furthermore, for FY 2014, we applied an area wage level budget neutrality factor of 1.0010531 to the standard Federal rate to ensure that any changes to the area wage level adjustment (that is, the annual update of the wage index values and labor-related share) would not result in any change (increase or decrease) in estimated aggregate LTCH PPS payments. Consequently, we established a standard Federal rate for FY 2014 of $40,607.31 (calculated as $40,397.96 × 1.017 × 0.98734 × 1.0010531).

In this final rule, we are establishing an annual update to the LTCH PPS standard Federal rate of 2.2 percent (that is, an update factor of 1.022) for FY 2015, based on the full estimated increase in the LTCH PPS market basket of 2.9 percent, less the MFP adjustment of 0.5 percentage point, consistent with section 1886(m)(3)(A)(i) of the Act, and less the 0.2 percentage point required by sections 1886(m)(3)(A)(ii) and (m)(4)(E) of the Act. Therefore, consistent with our proposal, under § 412.523(c)(3)(xi), we are applying a factor of 1.022 to the FY 2014 standard Federal rate of $40,607.31 to determine the FY 2015 standard Federal rate. These factors are based on IGI's second quarter 2014 forecast, which reflects the best available data at this time. For LTCHs that fail to submit quality reporting data for FY 2015 under the LTCHQR Program, consistent with our proposal, under § 412.523(c)(3)(xi) in conjunction with § 412.523(c)(4), we are reducing the annual update to the LTCH PPS standard Federal rate by an additional 2.0 percentage points consistent with section 1886(m)(5) of the Act. Therefore, we are establishing an annual update to the LTCH PPS standard Federal rate of 0.2 percent (that is, 2.2 percent minus 2.0 percentage points, or an update factor of 1.002) for FY 2015 for LTCHs that fail to submit the required quality reporting data for FY 2015 under the LTCHQR Program. We also are establishing that the standard Federal rate for FY 2015 will be further adjusted by an adjustment factor of 0.98734 for FY 2015 under the final year of the 3-year phase-in of the one-time prospective adjustment at § 412.523(d)(3)(ii). In addition, for FY 2015, we are applying an area wage level budget neutrality factor of 1.0016703 to the standard Federal rate to ensure that any changes to the area wage level adjustment (that is, the annual update of the wage index values and labor-related share) will not result in any change (increase or decrease) in estimated aggregate LTCH PPS payments. Accordingly, we are establishing a standard Federal rate of $41,043.71 (calculated as $40,607.31 × 1.022 × 0.98734 × 1.0016703) for FY 2015. The standard Federal rate of $41,043.71 will apply in determining the payments for FY 2015 discharges from LTCHs that submit quality reporting data for FY 2015 in accordance with the requirements of the LTCHQR Program under section 1886(m)(5) of the Act. For LTCHs that fail to submit quality reporting data for FY 2015 in accordance with the requirements of the LTCHQR Program under section 1886(m)(5) of the Act, we are establishing a standard Federal rate of $40,240.51 (calculated as $40,607.31 × 1.002 × 0.98734 × 1.0016703) for FY 2015.
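For illustration only, the FY 2015 standard Federal rate calculations described above can be reproduced as follows. The rounding of the printed results to the nearest cent is an assumption about presentation, not a statement of the payment system's internal rounding conventions.

    # FY 2015 LTCH PPS standard Federal rate, restated from the figures above.
    fy2014_rate = 40607.31               # FY 2014 standard Federal rate (dollars)
    update_factor = 1.022                # 2.2 percent annual update
    reduced_update_factor = 1.002        # 0.2 percent update for LTCHs that fail to submit LTCHQR data
    one_time_adjustment = 0.98734        # final year of the one-time prospective adjustment
    wage_budget_neutrality = 1.0016703   # area wage level adjustment budget neutrality factor

    full_rate = fy2014_rate * update_factor * one_time_adjustment * wage_budget_neutrality
    reduced_rate = fy2014_rate * reduced_update_factor * one_time_adjustment * wage_budget_neutrality

    print(round(full_rate, 2))     # approximately 41043.71
    print(round(reduced_rate, 2))  # approximately 40240.51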

B. Adjustment for Area Wage Levels under the LTCH PPS for FY 2015

1. Background

Under the authority of section 123 of the BBRA, as amended by section 307(b) of the BIPA, we established an adjustment to the LTCH PPS standard Federal rate to account for differences in LTCH area wage levels under § 412.525(c). The labor-related share of the LTCH PPS standard Federal rate is adjusted to account for geographic differences in area wage levels by applying the applicable LTCH PPS wage index. The applicable LTCH PPS wage index is computed using wage data from inpatient acute care hospitals without regard to reclassification under section 1886(d)(8) or section 1886(d)(10) of the Act.

When we implemented the LTCH PPS, we established a 5-year transition to the full area wage level adjustment. The area wage level adjustment was completely phased-in for cost reporting periods beginning in FY 2007. Therefore, for cost reporting periods beginning on or after October 1, 2006, the applicable LTCH area wage index values are the full LTCH PPS area wage index values calculated based on acute care hospital inpatient wage index data without taking into account geographic reclassification under section 1886(d)(8) and section 1886(d)(10) of the Act. For additional information on the phase-in of the area wage level adjustment under the LTCH PPS, we refer readers to the August 30, 2002 LTCH PPS final rule (67 FR 56015 through 56019) and the RY 2008 LTCH PPS final rule (72 FR 26891).

2. Geographic Classifications (Labor Market Areas) Based on the New OMB Delineations

In adjusting for the differences in area wage levels under the LTCH PPS, the labor-related portion of an LTCH's Federal prospective payment is adjusted by using an appropriate area wage index based on the geographic classification (labor market area) in which the LTCH is located. Specifically, the application of the LTCH PPS area wage level adjustment under existing § 412.525(c) is made based on the location of the LTCH—either in an “urban area” or a “rural area,” as defined in § 412.503. Under § 412.503, an “urban area” is defined as a Metropolitan Statistical Area (MSA) (which includes a Metropolitan division, where applicable), as defined by the Executive Office of Management and Budget (OMB), and a “rural area” is defined as any area outside of an urban area.

The CBSA-based geographic classification (labor market area) definitions currently used under the LTCH PPS, effective for discharges occurring on or after July 1, 2005, are based on the OMB's CBSA definitions that were developed based on 2000 U.S. Census data. As discussed in greater detail in section VII.D. of the preamble of this final rule, OMB announced revisions to the statistical boundaries of its labor market areas for MSAs, Micropolitan Statistical Areas, and Combined Statistical Areas, and provided guidance on the uses of the delineations of these areas in OMB Bulletin No. 13-01, issued on February 28, 2013 (hereinafter referred to as the “new OMB delineations”). As previously stated, at that time, the FY 2014 IPPS/LTCH PPS proposed rule was in the advanced stages of development, and the proposed FY 2014 LTCH PPS area wage indexes had already been developed based on the previous OMB CBSA-based labor market area definitions currently used under the LTCH PPS (hereinafter referred to as “CBSA designations”). Therefore, we did not implement changes to the CBSA designations under the LTCH PPS for FY 2014 based on the new OMB labor market area delineations that were developed based on 2010 Decennial Census data. Rather, to allow for sufficient time to assess the new changes and their ramifications, we stated that we intended to propose to adopt the new OMB delineations, and the corresponding changes to the area wage index values based on those delineations, under the LTCH PPS for FY 2015 through notice and comment rulemaking. This approach was consistent with the approach used under the IPPS. (We refer readers to the FY 2014 IPPS/LTCH PPS final rule (78 FR 50994 through 50995).)

As discussed in section VII.D. of the preamble of this final rule, under the authority of section 123 of the BBRA, as amended by section 307(b) of the BIPA, we are adopting the new OMB delineations beginning in FY 2015. We believe that these new OMB delineations are based on the best available data that reflect the local economies and area wage levels of the hospitals that are currently located in these geographic areas. We also believe that the new OMB delineations will ensure that the LTCH PPS area wage level adjustment most appropriately accounts for and reflects the relative hospital wage levels in the geographic area of the hospital as compared to the national average hospital wage level. We note that this policy is consistent with the IPPS policy discussed in section III.B. of the preamble of this final rule. For additional details on our policy to adopt the new OMB delineations, we refer readers to section VII.D. of the preamble of this final rule.

3. LTCH PPS Labor-Related Share

Under the payment adjustment for the differences in area wage levels under § 412.525(c), the labor-related share of an LTCH's PPS Federal prospective payment is adjusted by the applicable wage index for the labor market area in which the LTCH is located. The LTCH PPS labor-related share currently represents the sum of the labor-related portion of operating costs (Wages and Salaries; Employee Benefits; Professional Fees: Labor-Related; Administrative and Business Support Services; and All Other: Labor-Related Services) and a labor-related portion of capital costs using the applicable LTCH PPS market basket. Additional background information on the historical development of the labor-related share under the LTCH PPS and the development of the RPL market basket can be found in the RY 2007 LTCH PPS final rule (71 FR 27810 through 27817 and 27829 through 27830) and the FY 2012 IPPS/LTCH PPS final rule (76 FR 51766 through 51769 and 51808).

For FY 2013, we revised and rebased the market basket used under the LTCH PPS by adopting the newly created FY 2009-based LTCH-specific market basket. In addition, we determined the labor-related share for FY 2013 as the sum of the FY 2013 relative importance of each labor-related cost category of the FY 2009-based LTCH-specific market basket. For more details, we refer readers to the FY 2013 IPPS/LTCH PPS final rule (77 FR 53477 through 53479).

Consistent with our historical practice, in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50995 through 50996), we determined the LTCH PPS labor-related share for FY 2014 based on the FY 2014 relative importance of each labor-related cost category, which reflected the different rates of price change for these cost categories between the base year (FY 2009) and FY 2014. Specifically, based on IGI's second quarter 2013 forecast of the FY 2009-based LTCH-specific market basket, we established a labor-related share under the LTCH PPS for FY 2014 of 62.537 percent.

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28335), we proposed to establish a labor-related share under the LTCH PPS for FY 2015 of 62.571 percent based on IGI's first quarter 2014 forecast of the FY 2009-based LTCH-specific market basket. Consistent with our historical practice, we also proposed that if more recent data became available, we would use that data to determine the final FY 2015 labor-related share under the LTCH PPS. We did not receive any public comments on this proposal. Therefore, we are adopting the policy as final without modification.

For FY 2015, in this final rule, we are establishing a labor-related share under the LTCH PPS of 62.306 percent based on IGI's second quarter 2014 forecast of the FY 2009-based LTCH-specific market basket. The table below shows the FY 2015 labor-related share relative importance using IGI's second quarter 2014 forecast of the FY 2009-based LTCH-specific market basket. The sum of the relative importance for FY 2015 for operating costs (Wages and Salaries; Employee Benefits; Professional Fees: Labor-Related; Administrative and Business Support Services; and All Other: Labor-Related Services) is 58.116 percent. We are establishing that the portion of capital-related costs that is influenced by the local labor market will continue to be estimated to be 46 percent. Because the relative importance for capital-related costs will be 9.109 percent of the FY 2009-based LTCH-specific market basket in FY 2015, we are taking 46 percent of 9.109 percent to determine the labor-related share of capital-related costs for FY 2015, which will result in 4.190 percent (0.46 × 9.109). We then added that 4.190 percent for the capital-related cost amount to the 58.116 percent for the operating cost amount to determine the total labor-related share for FY 2015. Therefore, under the broad authority of section 123 of the BBRA, as amended by section 307(b) of the BIPA, to determine appropriate payment adjustments under the LTCH PPS, we are establishing a labor-related share under the LTCH PPS for FY 2015 of 62.306 percent. This labor-related share is determined using the same methodology as used in calculating the LTCH labor-related shares for all previous fiscal years.

FY 2015 Labor-Related Share Relative Importance Based on the FY 2009-Based LTCH-Specific Market Basket

FY 2015 labor-related share relative importance (percent):
  Wages and Salaries: 44.865
  Employee Benefits: 8.072
  Professional Fees: Labor-Related: 2.198
  Administrative and Business Support Services: 0.500
  All Other: Labor-Related Services: 2.481
  Subtotal: 58.116
  Labor-Related Portion of Capital Costs (46%): 4.190
  Total Labor-Related Share: 62.306
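As a check on the table above, the FY 2015 labor-related share can be recomputed from its components. This restates the arithmetic described in the preceding paragraph; it adds no new policy.

    # FY 2015 LTCH PPS labor-related share, recomputed from the table above.
    operating_components = {
        "Wages and Salaries": 44.865,
        "Employee Benefits": 8.072,
        "Professional Fees: Labor-Related": 2.198,
        "Administrative and Business Support Services": 0.500,
        "All Other: Labor-Related Services": 2.481,
    }
    operating_subtotal = sum(operating_components.values())               # 58.116 percent

    capital_relative_importance = 9.109                                   # percent of the market basket
    labor_related_capital = round(0.46 * capital_relative_importance, 3)  # 4.190 percent

    total_labor_related_share = round(operating_subtotal + labor_related_capital, 3)
    print(total_labor_related_share)                                      # 62.306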

4. LTCH PPS Wage Index for FY 2015

Historically, we have established LTCH PPS area wage index values calculated from acute care IPPS hospital wage data without taking into account geographic reclassification under sections 1886(d)(8) and 1886(d)(10) of the Act (67 FR 56019). The area wage level adjustment established under the LTCH PPS is based on an LTCH's actual location without regard to the “urban” or “rural” designation of any related or affiliated provider.

In the FY 2014 LTCH PPS final rule (78 FR 50996 through 50997), we calculated the FY 2014 LTCH PPS area wage index values using the same data used for the FY 2014 acute care hospital IPPS (that is, data from cost reporting periods beginning during FY 2010), without taking into account geographic reclassification under sections 1886(d)(8) and 1886(d)(10) of the Act, as these were the most recent complete data available at that time. In that same final rule, we indicated that we computed the FY 2014 LTCH PPS area wage index values consistent with the urban and rural geographic classifications (labor market areas) that were in place at that time, and consistent with the pre-reclassified IPPS wage index policy (that is, our historical policy of not taking into account IPPS geographic reclassifications in determining payments under the LTCH PPS). As with the IPPS wage index, wage data for multicampus hospitals with campuses located in different labor market areas (CBSAs) are apportioned to each CBSA where the campus (or campuses) are located. We also continued to use our existing policy for determining area wage index values for areas where there are no IPPS wage data.

Consistent with our historical methodology, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28336 through 28337), to determine the applicable area wage index values under the LTCH PPS for FY 2015, under the broad authority of section 123 of the BBRA, as amended by section 307(b) of the BIPA, to determine appropriate payment adjustments under the LTCH PPS, we proposed to use wage data collected from cost reports submitted by IPPS hospitals for cost reporting periods beginning during FY 2011, without taking into account geographic reclassification under sections 1886(d)(8) and 1886(d)(10) of the Act. We proposed to use FY 2011 wage data because these data are the most recent complete data available. We also noted that these are the same data used to compute the proposed FY 2015 acute care hospital inpatient wage index, as discussed in section III. of the preamble of that proposed rule. We proposed to compute the FY 2015 LTCH PPS area wage index values consistent with the proposed “urban” and “rural” geographic classifications (that is, using the proposed new OMB labor market area delineations), and consistent with our historical policy of not taking into account IPPS geographic reclassifications under sections 1886(d)(8) and 1886(d)(10) of the Act in determining payments under the LTCH PPS. We also proposed to continue to apportion wage data for multicampus hospitals with campuses located in different labor market areas to each CBSA where the campus or campuses are located, consistent with the IPPS policy. Lastly, under our proposed methodology for determining the FY 2015 LTCH PPS area wage index values, we proposed to continue to use our existing policy for determining area wage index values for areas where there are no IPPS wage data. (We refer readers to section V.B.4. of the Addendum to the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28336 through 28337) for additional details regarding our proposals pertaining to the development of the LTCH PPS wage index values for FY 2015, which we are adopting as final without modification in this final rule, as discussed below.)

Comment: One commenter provided an analysis prepared by a contractor it retained that attempted to replicate the proposed FY 2015 LTCH wage index values using the IPPS wage index data from the FY 2011 cost report data that CMS made available on its Web site. As part of that analysis, the contractor also explored the variance between the FY 2014 LTCH PPS wage index values and the proposed FY 2015 LTCH PPS wage index values for certain LTCHs that were projected to experience a relatively significant change in their wage index. In particular, the analysis prepared by the commenter's contractor focused on specific CBSAs (particularly CBSA 23540 and CBSA 34740) that were projected to experience “a significant decline” in their wage index values for FY 2015 when compared to FY 2014, although there had been no change in the constituent hospitals used to compute the wage index values for these areas. The commenter requested that CMS reexamine the wage data used to calculate the FY 2015 LTCH PPS wage index values for CBSAs that would experience a decrease in their wage index values for FY 2015 when compared to the FY 2014 LTCH PPS wage index values for these CBSAs, and explain the cause for those decreases.

Response: As requested by the commenter, we reexamined the IPPS wage data used to calculate the FY 2015 LTCH PPS wage index values for CBSAs that were projected to experience a decrease in their wage index values for FY 2015 when compared to the FY 2014 LTCH PPS wage index values for these CBSAs, focusing our attention on the CBSAs referenced by the commenter. We found no issues with the IPPS hospital wage data from the FY 2011 cost reports, or with the calculation of the FY 2015 LTCH PPS wage index values. In exploring the cause for the decrease in the wage index values for CBSAs projected to experience “a significant decline” in their FY 2015 wage index values when compared to the FY 2014 LTCH PPS wage index values for these CBSAs, we found that many of these CBSAs were composed of three or fewer hospitals. A labor market area's wage index value is calculated as the ratio of the labor market area's average hourly wage to the national average hourly wage. Labor market areas (CBSAs) with fewer providers are generally subject to less stability in year-to-year wage index values because there is less of an averaging effect, such that even relatively minor changes in one provider's wage data can produce a relatively “significant” effect on the wage index value for that area. This is because such a change in one provider's wage data has a relatively greater effect on the CBSA's average hourly wage (based solely on the limited number of hospitals in that area) when compared to the effect that such a change has on the national average hourly wage (which is based on wage data from all hospitals). We note that there also are CBSAs that were projected to experience a “significant increase” in their wage index values for the same reason. We believe that these wage index changes are appropriate because these values are based on the most recent data available that reflect the relative hospital wage level in a geographic area (CBSA) in comparison to the national average hospital wage level.
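The sensitivity described in this response can be illustrated with a simplified, hypothetical example. None of the wage figures below come from the actual wage data, the average is unweighted rather than hours-weighted, and the national average is treated as fixed (in practice it would also move very slightly); the point is only that one hospital's change moves a small CBSA's average far more than it moves the national average.

    # Hypothetical illustration of wage index volatility in a CBSA with few hospitals.
    national_avg_hourly_wage = 38.00   # invented national average hourly wage

    def wage_index(area_hourly_wages):
        """Wage index = area average hourly wage / national average hourly wage."""
        area_avg = sum(area_hourly_wages) / len(area_hourly_wages)
        return area_avg / national_avg_hourly_wage

    # A two-hospital CBSA: a change at one hospital shifts the index noticeably.
    before = [36.00, 40.00]   # average hourly wages reported by the two hospitals
    after = [36.00, 36.00]    # the second hospital's reported average wage declines
    print(round(wage_index(before), 4))  # 1.0
    print(round(wage_index(after), 4))   # 0.9474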

After consideration of the public comments we received, in this final rule, we are finalizing our proposals pertaining to the development of the LTCH PPS wage index values for FY 2015, without modification. Therefore, consistent with our historical methodology, to determine the applicable area wage index values under the LTCH PPS for FY 2015, under the broad authority of section 123 of the BBRA, as amended by section 307(b) of the BIPA, to determine appropriate payment adjustments under the LTCH PPS, we are using wage data collected from cost reports submitted by IPPS hospitals for cost reporting periods beginning during FY 2011, without taking into account geographic reclassification under sections 1886(d)(8) and 1886(d)(10) of the Act. We are using FY 2011 wage data because these data are the most recent complete data available. These are the same data used to compute the FY 2015 acute care hospital inpatient wage index values, as discussed in section III. of the preamble of this final rule. (For our rationale for using IPPS hospital wage data as a proxy for determining the area wage index values used under the LTCH PPS, we refer readers to the FY 2010 IPPS/RY 2010 LTCH PPS final rule (74 FR 44024 through 44025).) The FY 2015 LTCH PPS area wage index values were computed consistent with the “urban” and “rural” geographic classifications (that is, using the new OMB labor market area delineations), as discussed in section VII.D. of the preamble of this final rule, and consistent with the pre-reclassified IPPS wage index policy (that is, our historical policy of not taking into account IPPS geographic reclassifications under sections 1886(d)(8) and 1886(d)(10) of the Act in determining payments under the LTCH PPS). As with the IPPS wage index, we are continuing to apportion wage data for multicampus hospitals with campuses located in different labor market areas to each CBSA where the campus or campuses are located, as discussed in section III.G. of the preamble of this final rule. Furthermore, in determining the FY 2015 LTCH PPS area wage index values, we are continuing to use our existing policy for determining area wage index values for areas where there are no IPPS wage data using the methodology we established in the RY 2009 LTCH PPS final rule. For more information about this methodology, including an explanation of and rationale for our policy for determining LTCH PPS wage index values for areas that have no IPPS wage data, we refer readers to the RY 2009 LTCH PPS final rule (73 FR 26817 through 26818).

There are currently no LTCHs located in labor market areas without IPPS hospital wage data (or IPPS hospitals). However, as discussed in the proposed rule, if an LTCH were to open in one of these labor market areas, LTCH PPS wage index values for such an area would be calculated using our established methodology. Under our existing methodology, the LTCH PPS wage index value for urban CBSAs with no IPPS wage data is determined by using an average of all of the urban areas within the State, and the LTCH PPS wage index value for rural areas with no IPPS wage data is determined by using the unweighted average of the wage indices from all of the CBSAs that are contiguous to the rural counties of the State.

Based on the FY 2011 IPPS wage data that we are using to determine the FY 2015 LTCH PPS area wage index values in this final rule, there are no IPPS wage data for the urban area Hinesville, GA (CBSA 25980). Consistent with the methodology discussed above, we calculated the FY 2015 wage index value for CBSA 25980 as the average of the wage index values for all of the other urban areas within the State of Georgia (that is, CBSAs 10500, 12020, 12060, 12260, 15260, 16860, 17980, 19140, 23580, 31420, 40660, 42340, 46660, and 47580), as shown in Table 12A, which is listed in section VI. of the Addendum to this final rule and available via the Internet on the CMS Web site. We note that, as IPPS wage data are dynamic, it is possible that urban areas without IPPS wage data will vary in the future.
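A minimal sketch of the fill-in calculation for an urban area with no IPPS wage data follows. The wage index values assigned to the other Georgia urban CBSAs below are placeholders for illustration; the actual values appear in Table 12A.

    # Hypothetical sketch: the wage index for an urban CBSA with no IPPS wage data is the
    # average of the wage index values of all other urban CBSAs in the same State.
    georgia_urban_wage_indexes = {
        # CBSA code: FY 2015 wage index value (placeholder values, not Table 12A values)
        10500: 0.85, 12020: 0.93, 12060: 0.97, 12260: 0.88, 15260: 0.84,
        16860: 0.86, 17980: 0.83, 19140: 0.82, 23580: 0.89, 31420: 0.87,
        40660: 0.81, 42340: 0.90, 46660: 0.84, 47580: 0.86,
    }
    hinesville_cbsa_25980 = sum(georgia_urban_wage_indexes.values()) / len(georgia_urban_wage_indexes)
    print(round(hinesville_cbsa_25980, 4))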

Based on the FY 2011 IPPS wage data that we are using to determine the FY 2015 LTCH PPS area wage index values in this final rule, there are no rural areas without IPPS hospital wage data. Therefore, as discussed in the proposed rule, it is not necessary to use our established methodology to calculate an LTCH PPS wage index value for rural areas with no IPPS wage data for FY 2015. We note that, as IPPS wage data are dynamic, it is possible that rural areas without IPPS wage data will vary in the future.

For FY 2015, we are adopting the new OMB delineations under the LTCH PPS, as discussed in greater detail in section VII.D. of the preamble of this final rule. Under this policy, there will be some changes to the current CBSA compositions as a result of the new OMB delineations, including the creation of new CBSAs, counties that change from “urban” to “rural” status, counties that change from “rural” to “urban” status, and existing CBSAs that are split into separate CBSAs. Under existing § 412.503, an “urban area” is defined as a Metropolitan Statistical Area as defined by OMB, and a “rural area” is defined as any area outside of an urban area. We are not making any changes to the current definitions of “urban area” and “rural area” because our policy to use the new OMB delineations under the LTCH PPS is consistent with the definitions in existing § 412.503.

As discussed in section VII.D.2.e. of the preamble of this final rule, overall we believe that using the new OMB delineations will result in LTCH PPS area wage index values being more representative of the actual costs of labor in a given area. However, we also recognize that, as a result of our policy to adopt the new OMB delineations, some LTCHs will experience decreases in area wage index values, while other LTCHs will experience increases in area wage index values. Therefore, to mitigate any short-term instability in LTCH PPS payments that could result from our policy to adopt the new OMB delineations, in section VII.D.2.e. of the preamble of this final rule, we are finalizing our proposed transitional wage index policy. Under our transitional wage index policy, any LTCH that will experience a decrease in its area wage index solely as a result of the policy to adopt the new OMB delineations under the LTCH PPS will receive a blended area wage index for FY 2015. That is, for purposes of determining an LTCH's area wage index for FY 2015, we are computing LTCH PPS area wage index values using the area wage data discussed above under both the current (FY 2014) CBSA designations and the new OMB delineations. If the area wage index value under the new OMB delineations is lower than the area wage index value under the FY 2014 CBSA designations, the LTCH will be paid based on a blended area wage index for FY 2015, which will be computed as the sum of 50 percent of each wage index value (referred to as the 50/50 blended wage index), as described below.

Specifically, under the transitional wage index policy that we are establishing in this final rule, to determine the applicable area wage index value for each LTCH that will be effective for discharges occurring on or after October 1, 2014, through September 30, 2015, we computed the following two area wage index values: (1) the wage index values calculated using the new OMB delineations; and (2) the wage index values calculated using the current (FY 2014) CBSA designations. The FY 2015 LTCH area wage index values calculated using the new OMB delineations are presented in Table 12A (for urban areas) and Table 12B (for rural areas) associated with this final rule, which are available via the Internet on the CMS Web site. The FY 2015 LTCH area wage index values calculated using the current (FY 2014) CBSA designations are presented in Table 12C (for urban areas) and Table 12D (for rural areas) associated with this final rule, which are available via the Internet on the CMS Web site. Where applicable, the wage index values in Tables 12C and 12D will be used to calculate an LTCH's 50/50 blended wage index value under the transitional wage index policy. Under our transitional wage index policy, an LTCH will only receive the 50/50 blended area wage index value for FY 2015 if the LTCH's area wage index value under the new OMB delineations (shown in Table 12A or 12B) is lower than the area wage index value under the FY 2014 CBSA designations (shown in Table 12C or 12D). If an LTCH's area wage index under the new OMB delineations (shown in Table 12A or 12B) is higher than the wage index under the FY 2014 CBSA designations (shown in Table 12C or 12D), we will pay the LTCH based on 100 percent of the area wage index under the new OMB delineations shown in Table 12A or 12B (that is, the LTCH will not receive the 50/50 blended area wage index). Furthermore, as discussed below and in section VII.D.2.e. of the preamble of this final rule, we are applying this transitional wage index policy in a budget neutral manner. Each LTCH's labor market area under the new OMB delineations and the current (FY 2014) CBSA-based labor market area designation can be found in the LTCH PPS impact file for this final rule, which is available via the Internet on the CMS Web site.
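The decision rule for the transitional (50/50 blended) wage index described above reduces to a single comparison. The function below restates that rule; the example inputs are hypothetical and are not values from Tables 12A through 12D.

    # FY 2015 transitional wage index, restated from the policy above.
    def fy2015_wage_index(new_delineation_value, fy2014_designation_value):
        """Return the area wage index used for FY 2015 payment.

        If the value under the new OMB delineations is lower than the value under the
        current (FY 2014) CBSA designations, a 50/50 blend applies; otherwise the LTCH
        is paid on 100 percent of the new-delineation value.
        """
        if new_delineation_value < fy2014_designation_value:
            return 0.5 * new_delineation_value + 0.5 * fy2014_designation_value
        return new_delineation_value

    print(round(fy2015_wage_index(0.90, 0.96), 4))  # 0.93 -- blended, because the new value is lower
    print(round(fy2015_wage_index(1.05, 1.00), 4))  # 1.05 -- paid fully on the new-delineation value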

5. Budget Neutrality Adjustment for Changes to the Area Wage Level Adjustment

Historically, the LTCH PPS wage index and labor-related share are updated annually based on the latest available data. Under § 412.525(c)(2), any changes to the area wage index values or labor-related share are to be made in a budget neutral manner such that estimated aggregate LTCH PPS payments are unaffected; that is, will be neither greater than nor less than estimated aggregate LTCH PPS payments without such changes to the area wage level adjustment. Under this policy, we determine an area wage-level adjustment budget neutrality factor that will be applied to the standard Federal rate to ensure that any changes to the area wage level adjustments are budget neutral such that any changes to the area wage index values or labor-related share would not result in any change (increase or decrease) in estimated aggregate LTCH PPS payments. Accordingly, under § 412.523(d)(4), we apply an area wage level adjustment budget neutrality factor in determining the standard Federal rate, and we also established a methodology for calculating an area wage level adjustment budget neutrality factor. (For additional information on the establishment of our budget neutrality policy for changes to the area wage level adjustment, we refer readers to the FY 2012 IPPS/LTCH PPS final rule (76 FR 51771 through 51773 and 51809).)

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28337 through 28338), in accordance with § 412.523(d)(4), we proposed to apply an area wage level adjustment budget neutrality factor to adjust the standard Federal rate to account for the estimated effect of the adjustments or updates to the area wage level adjustment under § 412.525(c)(1) on estimated aggregate LTCH PPS payments using our existing methodology. In determining the area wage level adjustment budget neutrality factor for FY 2015 under § 412.523(d)(4), we also proposed to include the proposed transitional wage index policy under the proposed adoption of the new OMB delineations (that is, the proposed 50/50 blended area wage index values for LTCHs that would experience a decrease in their wage index solely as a result of the proposed adoption of the new OMB delineations under the LTCH PPS) to ensure that the proposed changes to the area wage level adjustments would be budget neutral. We did not receive any public comments on our proposals pertaining to the FY 2015 budget neutrality adjustment for changes to the area wage level adjustment. Therefore, in this final rule, we are adopting our proposal as final without modification.

In this final rule, for FY 2015, in accordance with § 412.523(d)(4), we are applying an area wage level adjustment budget neutrality factor to adjust the standard Federal rate to account for the estimated effect of the adjustments or updates to the area wage level adjustment under § 412.525(c)(1) on estimated aggregate LTCH PPS payments using a methodology that is consistent with the methodology we established in the FY 2012 IPPS/LTCH PPS final rule (76 FR 51773). In addition to the updates for FY 2015 to the area wage index data and labor-related share discussed above, as discussed above and in section VII.D.2.e. of the preamble of this final rule, we are establishing a transitional wage index policy to mitigate the impacts of adopting changes to the LTCH PPS labor market areas (CBSAs) based on the new OMB delineations. Because our transitional wage index policy for LTCHs that will experience a decrease in their area wage index solely as a result of the adoption of the new OMB delineations under the LTCH PPS will result in estimated aggregate LTCH PPS payments that are greater than they would be in the absence of that policy, we are including the 50/50 blended area wage index when determining the area wage level adjustment budget neutrality factor that we are applying to the standard Federal rate under § 412.523(d)(4) to ensure that any changes to the area wage level adjustments are budget neutral.
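Conceptually, the area wage level adjustment budget neutrality factor is the ratio of estimated aggregate LTCH PPS payments without the FY 2015 wage level changes to estimated aggregate payments with those changes (including the 50/50 blended wage index). The sketch below is a simplified restatement of that concept using invented aggregate totals; it is not the case-level payment simulation used to derive the actual factor of 1.0016703.

    # Simplified sketch of an area wage level adjustment budget neutrality factor.
    # The aggregate payment totals below are invented for illustration only.
    payments_without_wage_changes = 5_400_000_000.0   # prior wage data, labor-related share, CBSAs
    payments_with_wage_changes = 5_380_000_000.0      # FY 2015 wage data, labor-related share,
                                                      # new OMB delineations, and the 50/50 blend

    budget_neutrality_factor = payments_without_wage_changes / payments_with_wage_changes
    print(round(budget_neutrality_factor, 7))         # factor applied to the standard Federal rate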

For this final rule, using the steps in the methodology described in section VII.D.2.e. of this preamble, we determined a FY 2015 area wage level adjustment budget neutrality factor of 1.0016703. Accordingly, in section V.A.2. of the Addendum to this final rule, to determine the FY 2015 LTCH PPS standard Federal rate, we are applying an area wage level adjustment budget neutrality factor of 1.0016703, in accordance with § 412.523(d)(4). The FY 2015 LTCH PPS standard Federal rate shown in Table 1E of the Addendum to this final rule reflects this adjustment factor.

C. LTCH PPS Cost-of-Living Adjustment (COLA) for LTCHs Located in Alaska and Hawaii

Under § 412.525(b), a cost-of-living adjustment (COLA) is provided for LTCHs located in Alaska and Hawaii to account for the higher costs incurred in those States. Specifically, we apply a COLA to payments to LTCHs located in Alaska and Hawaii by multiplying the nonlabor-related portion of the standard Federal payment rate by the applicable COLA factors established annually by CMS. Higher labor-related costs for LTCHs located in Alaska and Hawaii are taken into account in the adjustment for area wage levels described above.

Prior to FY 2014, we used the most recent updated COLA factors obtained from the U.S. Office of Personnel Management (OPM) Web site at http://www.opm.gov/oca/cola/rates.asp to adjust the LTCH PPS payments for LTCHs located in Alaska and Hawaii. Statutory changes have transitioned the Alaska and Hawaii COLAs to locality pay (phased in over a 3-year period beginning in January 2010, with COLA rates being frozen as of October 28, 2009, and then proportionately reduced to reflect the phase-in of locality pay). For FY 2013, we believed that it was appropriate to use “frozen” COLA factors to adjust payments, while we explored alternatives for updating the COLA factors in the future, and we continued to use the same “frozen” COLA factors used in FY 2012 to adjust the nonlabor-related portion of the standard Federal rate for LTCHs located in Alaska and Hawaii in FY 2013 under § 412.525(b). We also established a methodology to update the COLA factors for Alaska and Hawaii every 4 years (at the same time as the update to the labor-related share of the IPPS market basket), beginning in FY 2014 (77 FR 53712 through 53713). The methodology we established to update the COLA factors is based on a comparison of the growth in the CPIs for Anchorage, Alaska, and Honolulu, Hawaii, relative to the growth in the CPI for the average U.S. city as published by the Bureau of Labor Statistics (BLS). It also incorporates a 25-percent cap on the CPI-updated COLA factors, which is consistent with a statutorily mandated 25-percent cap that was applied to OPM's published COLA factors. We believe that determining updated COLA factors using this methodology would appropriately adjust the nonlabor-related portion of the standard Federal rate for LTCHs located in Alaska and Hawaii. (For additional details on the methodology we established in the FY 2013 IPPS/LTCH PPS final rule to update the COLA factors for Alaska and Hawaii beginning in FY 2014, we refer readers to section VII.D.3. of the preamble of that final rule (77 FR 53481 through 53482).)

For FY 2014, we updated the COLA factors published for Alaska and Hawaii by OPM for 2009 (as these are the last COLA factors OPM published prior to transitioning from COLAs to locality pay) using the methodology that we finalized in the FY 2013 IPPS/LTCH PPS final rule. Under our finalized methodology, we used FY 2014 COLA factors of 1.23 for the three specified urban areas of Alaska (Anchorage, Fairbanks, and Juneau); 1.25 for the City and County of Honolulu, the County of Kauai, the County of Maui, the County of Kalawao, and all other areas of Alaska; and 1.19 for the County of Hawaii. For additional details on our policy, we refer readers to the FY 2014 IPPS/LTCH PPS final rule (78 FR 50997 through 50998).

Under our finalized policy, we update the COLA factors using the methodology described above every 4 years; the first year began in FY 2014 (77 FR 53482). Therefore, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28338), for FY 2015, under the broad authority conferred upon the Secretary by section 123 of the BBRA, as amended by section 307(b) of the BIPA, to determine appropriate payment adjustments under the LTCH PPS, we proposed to continue to use the COLA factors based on the 2009 OPM COLA factors updated through 2012 by the comparison of the growth in the CPIs for Anchorage, Alaska, and Honolulu, Hawaii, relative to the growth in the CPI for the average U.S. city as established in the FY 2014 IPPS/LTCH PPS final rule. We did not receive any public comments on this proposal. Therefore, in this final rule, we are adopting the policy as final without modification.

Accordingly, in this final rule, for FY 2015, under the broad authority conferred upon the Secretary by section 123 of the BBRA, as amended by section 307(b) of the BIPA, to determine appropriate payment adjustments under the LTCH PPS, we are continuing to use the COLA factors established in the FY 2014 IPPS/LTCH PPS final rule, which were based on the 2009 OPM COLA factors updated through 2012 by the comparison of the growth in the CPIs for Anchorage, Alaska, and Honolulu, Hawaii, relative to the growth in the CPI for the average U.S. city. (We refer readers to the FY 2014 IPPS/LTCH PPS final rule (78 FR 50998) for a discussion of the FY 2014 COLA factors.) Consistent with our historical practice, we are establishing that the COLA factors shown in the table below will be used to adjust the nonlabor-related portion of the standard Federal rate for LTCHs located in Alaska and Hawaii under § 412.525(b).

Cost-of-Living Adjustment Factors for Alaska and Hawaii Hospitals Under the LTCH PPS for FY 2015

Alaska:
  City of Anchorage and 80-kilometer (50-mile) radius by road: 1.23
  City of Fairbanks and 80-kilometer (50-mile) radius by road: 1.23
  City of Juneau and 80-kilometer (50-mile) radius by road: 1.23
  All other areas of Alaska: 1.25
Hawaii:
  City and County of Honolulu: 1.25
  County of Hawaii: 1.19
  County of Kauai: 1.25
  County of Maui and County of Kalawao: 1.25
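The sketch below shows, in simplified form, how a COLA factor enters the payment calculation: the COLA multiplies only the nonlabor-related portion of the standard Federal rate, while the labor-related portion is wage-index adjusted as described in section V.B. The wage index value is a placeholder, and the formula is a conceptual outline rather than a complete LTCH PPS payment calculation.

    # Simplified sketch: COLA applied to the nonlabor-related portion of the FY 2015
    # standard Federal rate for an LTCH located in the County of Hawaii.
    standard_federal_rate = 41043.71   # FY 2015 standard Federal rate (dollars)
    labor_related_share = 0.62306      # FY 2015 LTCH PPS labor-related share
    area_wage_index = 1.10             # placeholder area wage index value
    cola_factor = 1.19                 # County of Hawaii COLA factor from the table above

    labor_portion = standard_federal_rate * labor_related_share * area_wage_index
    nonlabor_portion = standard_federal_rate * (1 - labor_related_share) * cola_factor
    print(round(labor_portion + nonlabor_portion, 2))   # wage- and COLA-adjusted rate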

D. Adjustment for LTCH PPS High-Cost Outlier (HCO) Cases

1. Background

Under the broad authority conferred upon the Secretary by section 123 of the BBRA as amended by section 307(b) of the BIPA, in the regulations at § 412.525(a), we established an adjustment for additional payments for outlier cases that have extraordinarily high costs relative to the costs of most discharges. We refer to these cases as high cost outliers (HCOs). Providing additional payments for outliers strongly improves the accuracy of the LTCH PPS in determining resource costs at the patient and hospital level. These additional payments reduce the financial losses that would otherwise be incurred when treating patients who require more costly care and, therefore, reduce the incentives to underserve these patients. We set the outlier threshold before the beginning of the applicable rate year so that total estimated outlier payments are projected to equal 8 percent of total estimated payments under the LTCH PPS.

Under § 412.525(a) in the regulations (in conjunction with § 412.503), we make outlier payments for any discharge for which the estimated cost of the case exceeds the adjusted LTCH PPS payment for the MS-LTC-DRG plus a fixed-loss amount. Specifically, in accordance with § 412.525(a)(3) (in conjunction with § 412.503), we make an additional payment for an HCO case that is equal to 80 percent of the difference between the estimated cost of the patient case and the outlier threshold, which is the sum of the adjusted Federal prospective payment for the MS-LTC-DRG and the fixed-loss amount. The fixed-loss amount is the amount used to limit the loss that a hospital will incur under the outlier policy for a case with unusually high costs. This results in Medicare and the LTCH sharing financial risk in the treatment of extraordinarily costly cases. Under the LTCH PPS HCO policy, the LTCH's loss is limited to the fixed-loss amount and a fixed percentage of costs above the outlier threshold (the adjusted MS-LTC-DRG payment plus the fixed-loss amount). The fixed percentage of costs is called the marginal cost factor. We calculate the estimated cost of a case by multiplying the Medicare allowable covered charges by the hospital's overall cost-to-charge ratio (CCR).
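The HCO payment calculation described in this paragraph can be restated as a short function. The 80 percent marginal cost factor and the cost-estimation step (covered charges multiplied by the overall CCR) follow the text above; the input values in the example are hypothetical.

    # LTCH PPS high-cost outlier (HCO) payment, restated from the policy above.
    def hco_payment(covered_charges, ccr, adjusted_drg_payment, fixed_loss_amount,
                    marginal_cost_factor=0.80):
        """Additional payment = 80 percent of (estimated cost - outlier threshold), if positive."""
        estimated_cost = covered_charges * ccr
        outlier_threshold = adjusted_drg_payment + fixed_loss_amount
        return max(0.0, marginal_cost_factor * (estimated_cost - outlier_threshold))

    # Hypothetical case (not drawn from the MedPAR data):
    print(hco_payment(covered_charges=250_000, ccr=0.30,
                      adjusted_drg_payment=45_000, fixed_loss_amount=15_730))   # 11416.0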

Under the LTCH PPS HCO policy at § 412.525(a), we determine a fixed-loss amount, that is, the maximum loss that an LTCH can incur under the LTCH PPS for a case with unusually high costs before the LTCH will receive any additional payments. We calculate the fixed-loss amount by estimating aggregate payments with and without an outlier policy. The fixed-loss amount results in estimated total outlier payments being projected to be equal to 8 percent of projected total LTCH PPS payments. Currently, MedPAR claims data and CCRs based on data from the most recent Provider-Specific File (PSF) (or from the applicable statewide average CCR if an LTCH's CCR data are faulty or unavailable) are used to establish a fixed-loss threshold amount under the LTCH PPS.

2. Determining LTCH CCRs Under the LTCH PPS

a. Background

The following is a discussion of CCRs that are used in determining payments for HCO and SSO cases under the LTCH PPS, at § 412.525(a) and § 412.529, respectively. Although this section is specific to HCO cases, because CCRs and the policies and methodologies pertaining to them are used in determining payments for both HCO and SSO cases (to determine the estimated cost of the case at § 412.529(d)(2)), we are discussing the determination of CCRs under the LTCH PPS for both of these types of cases simultaneously.

In determining both HCO payments (at § 412.525(a)) and SSO payments (at § 412.529), we calculate the estimated cost of the case by multiplying the LTCH's overall CCR by the Medicare allowable charges for the case. In general, we use the LTCH's overall CCR, which is computed based on either the most recently settled cost report or the most recent tentatively settled cost report, whichever is from the latest cost reporting period, in accordance with § 412.525(a)(4)(iv)(B) and § 412.529(f)(4)(ii) for HCOs and SSOs, respectively. (We note that, in some instances, we use an alternative CCR, such as the statewide average CCR in accordance with the regulations at § 412.525(a)(4)(iv)(C) and § 412.529(f)(4)(iii), or a CCR that is specified by CMS or that is requested by the hospital under the provisions of the regulations at § 412.525(a)(4)(iv)(A) and § 412.529(f)(4)(i).) Under the LTCH PPS, a single prospective payment per discharge is made for both inpatient operating and capital-related costs. Therefore, we compute a single “overall” or “total” LTCH-specific CCR based on the sum of LTCH operating and capital costs (as described in Section 150.24, Chapter 3, of the Medicare Claims Processing Manual (Pub. 100-4)) as compared to total charges. Specifically, an LTCH's CCR is calculated by dividing an LTCH's total Medicare costs (that is, the sum of its operating and capital inpatient routine and ancillary costs) by its total Medicare charges (that is, the sum of its operating and capital inpatient routine and ancillary charges).
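The overall CCR described above is a single ratio of total Medicare costs to total Medicare charges. The sketch below restates that computation with invented cost report figures; the ceiling check reflects the FY 2015 ceiling discussed in the next subsection.

    # Overall LTCH cost-to-charge ratio (CCR), restated from the description above.
    # The cost report figures are invented for illustration.
    operating_costs, capital_costs = 28_000_000.0, 3_500_000.0
    operating_charges, capital_charges = 90_000_000.0, 11_000_000.0

    overall_ccr = (operating_costs + capital_costs) / (operating_charges + capital_charges)
    print(round(overall_ccr, 4))   # approximately 0.3119

    # If the computed CCR exceeded the FY 2015 LTCH total CCR ceiling of 1.346, the
    # applicable statewide average CCR would generally be assigned instead (see below).
    if overall_ccr > 1.346:
        print("assign the applicable statewide average CCR")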

b. LTCH Total CCR Ceiling

Generally, an LTCH is assigned the applicable statewide average CCR if, among other things, an LTCH's CCR is found to be in excess of the applicable maximum CCR threshold (that is, the LTCH CCR ceiling). This is because CCRs above this threshold are most likely due to faulty data reporting or entry, and CCRs based on erroneous data should not be used to identify and make payments for outlier cases. Therefore, under our established policy, generally, if an LTCH's calculated CCR is above the applicable ceiling, the applicable LTCH PPS statewide average CCR is assigned to the LTCH instead of the CCR computed from its most recent (settled or tentatively settled) cost report data.

In this final rule, using our established methodology for determining the LTCH total CCR ceiling (described above), based on IPPS total CCR data from the March 2014 update of the PSF, consistent with our proposal, we are establishing a total CCR ceiling of 1.346 under the LTCH PPS for FY 2015 in accordance with § 412.525(a)(4)(iv)(C)(2) for HCOs and § 412.529(f)(4)(iii)(B) for SSOs.

c. LTCH Statewide Average CCRs

Our general methodology established for determining the statewide average CCRs used under the LTCH PPS is similar to our established methodology for determining the LTCH total CCR ceiling (described above) because it is based on “total” IPPS CCR data. Under the LTCH PPS HCO policy at § 412.525(a)(4)(iv)(C) and the SSO policy at § 412.529(f)(4)(iii), the MAC may use a statewide average CCR, which is established annually by CMS, if it is unable to determine an accurate CCR for an LTCH in one of the following circumstances: (1) new LTCHs that have not yet submitted their first Medicare cost report (for this purpose, consistent with current policy, a new LTCH is defined as an entity that has not accepted assignment of an existing hospital's provider agreement in accordance with § 489.18); (2) LTCHs whose CCR is in excess of the LTCH CCR ceiling; and (3) other LTCHs for whom data with which to calculate a CCR are not available (for example, missing or faulty data). (Other sources of data that the MAC may consider in determining an LTCH's CCR include data from a different cost reporting period for the LTCH, data from the cost reporting period preceding the period in which the hospital began to be paid as an LTCH (that is, the period of at least 6 months that it was paid as a short-term, acute care hospital), or data from other comparable LTCHs, such as LTCHs in the same chain or in the same region.)

Consistent with our historical practice of using the best available data, in this final rule, using our established methodology for determining the LTCH statewide average CCRs, based on the most recent complete IPPS “total CCR” data from the March 2014 update of the PSF, consistent with our proposal, we are establishing LTCH PPS statewide average total CCRs for urban and rural hospitals that are effective for discharges occurring on or after October 1, 2014, through September 30, 2015, in Table 8C listed in section VI. of the Addendum to this final rule (and available via the Internet).

Under the changes to the LTCH PPS labor market areas based on the new OMB delineations, all areas in Delaware, the District of Columbia, New Jersey, and Rhode Island would be classified as urban. Therefore, there are no rural statewide average total CCRs listed for those jurisdictions in Table 8C. This policy is consistent with the policy that we established when we revised our methodology for determining the applicable LTCH statewide average CCRs in the FY 2007 IPPS final rule (71 FR 48119 through 48121) and is the same as the policy applied under the IPPS. In addition, although Connecticut and Massachusetts have areas that are designated as rural, there are no short-term, acute care IPPS hospitals or LTCHs located in those areas as of March 2014. Therefore, consistent with our existing methodology, we are using the national average total CCR for rural IPPS hospitals for rural Connecticut and Massachusetts in Table 8C listed in section VI. of the Addendum to this final rule (and available via the Internet).

In addition, consistent with our existing methodology, in determining the urban and rural statewide average total CCRs for Maryland LTCHs paid under the LTCH PPS, consistent with our proposal, we are continuing to use, as a proxy, the national average total CCR for urban IPPS hospitals and the national average total CCR for rural IPPS hospitals, respectively. We are using this proxy because we believe that the CCR data in the PSF for Maryland hospitals may not be entirely accurate (as discussed in greater detail in the FY 2007 IPPS final rule (71 FR 48120)).

d. Reconciliation of LTCH HCO and SSO Payments

We note that under the LTCH PPS HCO policy at § 412.525(a)(4)(iv)(D) and the LTCH PPS SSO policy at § 412.529(f)(4)(iv), the payments for HCO and SSO cases, respectively, are subject to reconciliation. Specifically, any reconciliation of outlier payments is based on the CCR that is calculated based on a ratio of cost-to-charge data computed from the relevant cost report determined at the time the cost report coinciding with the discharge is settled. For additional information, we refer readers to sections 150.26 through 150.28 of the Medicare Claims Processing Manual (Pub. 100-4) as added by Change Request 7192 (Transmittal 2111; December 3, 2010) and the RY 2009 LTCH PPS final rule (73 FR 26820 through 26821).

3. Establishment of the LTCH PPS Fixed-Loss Amount for FY 2015

When we implemented the LTCH PPS, as discussed in the August 30, 2002 LTCH PPS final rule (67 FR 56022 through 56026), under the broad authority of section 123 of the BBRA as amended by section 307(b) of BIPA, we established a fixed-loss amount so that total estimated outlier payments are projected to equal 8 percent of total estimated payments under the LTCH PPS. To determine the fixed-loss amount, we estimate outlier payments and total LTCH PPS payments for each case using claims data from the MedPAR files. Specifically, to determine the outlier payment for each case, we estimate the cost of the case by multiplying the Medicare covered charges from the claim by the LTCH's CCR. Under § 412.525(a)(3) (in conjunction with § 412.503), if the estimated cost of the case exceeds the outlier threshold, we make an outlier payment equal to 80 percent of the difference between the estimated cost of the case and the outlier threshold (that is, the sum of the adjusted Federal prospective payment for the MS-LTC-DRG and the fixed-loss amount).
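The calibration described above, choosing a fixed-loss amount so that projected outlier payments equal 8 percent of projected total LTCH PPS payments, can be sketched as a simple search. The case-level data below are invented and the search is a bare-bones bisection; this is an illustrative restatement of the relationship, not the actual ratesetting model.

    # Illustrative search for a fixed-loss amount targeting outlier payments at 8 percent
    # of total estimated LTCH PPS payments. Case data are invented for illustration.
    cases = [  # (estimated cost of the case, adjusted MS-LTC-DRG payment)
        (30_000, 40_000), (55_000, 42_000), (90_000, 45_000), (150_000, 48_000),
    ]

    def outlier_share(fixed_loss_amount):
        outlier = sum(0.80 * max(0.0, cost - (drg + fixed_loss_amount)) for cost, drg in cases)
        total = sum(drg for _, drg in cases) + outlier
        return outlier / total

    low, high = 0.0, 200_000.0
    for _ in range(60):                    # bisection: outlier_share falls as the threshold rises
        mid = (low + high) / 2
        if outlier_share(mid) > 0.08:
            low = mid                      # share too high -> try a larger fixed-loss amount
        else:
            high = mid
    print(round(mid, 2))                   # fixed-loss amount that hits the 8 percent target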

In the FY 2014 IPPS/LTCH PPS final rule (78 FR 53715), we presented the methodology and data that we used to establish the fixed-loss amount of $13,314 for FY 2014, which was calculated using our existing methodology (based on the data and the rates and policies presented in that final rule) in order to maintain estimated HCO payments at the projected 8 percent of total estimated LTCH PPS payments. Consistent with our historical practice of using the best data available, in determining the fixed-loss amount for FY 2014, we used the most recent available LTCH claims data and CCR data, that is, LTCH claims data from the March 2013 update of the FY 2012 MedPAR file and CCRs from the March 2013 update of the PSF, as these data were the most recent complete LTCH data available at that time.

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28321), we proposed to continue to use our existing methodology to calculate a fixed-loss amount for FY 2015 using the best available data that would maintain estimated HCO payments at the projected 8 percent of total estimated LTCH PPS payments (based on the rates and policies presented in this proposed rule). Specifically, based on the most recent complete LTCH data available at that time (that is, LTCH claims data from the March 2014 update of the FY 2013 MedPAR file and CCRs from the March 2014 update of the PSF), we proposed to determine a fixed-loss amount for FY 2015 that would result in estimated outlier payments projected to be equal to 8 percent of total estimated payments in FY 2015. Under the broad authority of section 123(a)(1) of the BBRA and section 307(b)(1) of the BIPA, we proposed a fixed-loss amount of $15,730 for FY 2015, and also proposed to make an additional payment for an HCO case that is equal to 80 percent of the difference between the estimated cost of the case and the outlier threshold (the sum of the adjusted Federal LTCH payment for the MS-LTC-DRG and the proposed fixed-loss amount of $15,730).

Comment: One commenter expressed support for the proposed fixed-loss amount, and stated that the proposed increase for FY 2015 is justified. That same commenter also requested that CMS provide its most recent estimate of the percentage payout of high-cost outlier payments for the current fiscal year. Another commenter expressed concern that the proposed increase in the fixed-loss amount would result in significant financial losses for hospitals that treat a comparatively high volume of outlier cases, and recommended that the increase be transitioned in over 2 years to reduce the impact of this increase in the fixed-loss amount.

Response: We appreciate the commenter's support for the proposed fixed-loss amount, and agree that the increase is necessary to maintain estimated HCO payments at the projected 8 percent of total estimated LTCH PPS payments (as explained in the proposed rule). In section I.K. of the regulatory impact analysis in the Appendix to this final rule, we state that we currently estimate that HCO payments will be approximately 7.9 percent of the estimated total LTCH PPS payments in FY 2014 based on the most recent data available.

While we understand the commenter's concern regarding the financial impact an increase in the fixed-loss amount may have on the outlier payments to some LTCHs, we do not believe that the increase should be phased in over 2 years. The intent of the HCO policy is to provide an additional payment for LTCH cases that have unusually high costs while at the same time balancing an incentive for LTCHs to treat expensive patients and provide cost-efficient care. (We refer readers to the FY 2003 LTCH PPS final rule (67 FR 56025) for further details regarding the intent of this policy.) Under our historical HCO policy, this balance is achieved by making outlier payments that are intended to approximate the marginal cost of providing care above the fixed-loss threshold. We believe that phasing in the increase to the fixed-loss amount would be inconsistent with the intent of the LTCH PPS HCO policy because such a policy would reduce the incentive to provide cost-efficient care by resulting in estimated outlier payments that are in excess of 8 percent of total estimated payments in FY 2015. (For additional detail on the rationale for setting the HCO payment “target” at 8 percent of total estimated LTCH PPS payments, we refer readers to the FY 2003 LTCH PPS final rule (67 FR 56022 through 56024).) Furthermore, any auxiliary adjustment to the fixed-loss amount, such as a transition, would result in making outlier payments that would not be directly related to the cost of providing care to unusually costly cases in FY 2015. When we determine the annual fixed-loss amount, we include all payments and policies that would affect actual payments for the current fiscal year in order to ensure the most accurate determination of a fixed-loss amount that would result in estimated outlier payments equaling 8 percent of total estimated payments for the fiscal year. Including an auxiliary adjustment, such as a transition, that is not related to the current fiscal year does not lend greater accuracy to the determination of a fixed-loss amount that would result in estimated outlier payments equaling 8 percent of total estimated payments in FY 2015. For these reasons, we continue to believe that our policies are consistent with the original intent of the HCO policy under the LTCH PPS and, therefore, we are not adopting the commenter's suggestion to phase in the increase to the fixed-loss amount for FY 2015.

In this final rule, after consideration of the public comments we received, we are adopting our proposals related to the calculation of the fixed-loss amount for FY 2015 as final without modification. For FY 2015, consistent with our proposal, we are continuing to use our existing methodology to calculate a fixed-loss amount for FY 2015 using the best available data that would maintain estimated HCO payments at the projected 8 percent of total estimated LTCH PPS payments (based on the rates and policies presented in this final rule). Specifically, for this final rule, we used LTCH claims data from the March 2014 update of the FY 2013 MedPAR file and CCRs from the March 2014 update of the PSF to determine a fixed-loss amount that will result in estimated outlier payments projected to be equal to 8 percent of total estimated payments in FY 2015 because these data are the most recent complete LTCH data available at this time. Under the broad authority of section 123(a)(1) of the BBRA and section 307(b)(1) of BIPA, we are establishing a fixed-loss amount of $14,972 for FY 2015. Therefore, we are making an additional payment for an HCO case that is equal to 80 percent of the difference between the estimated cost of the case and the outlier threshold (the sum of the adjusted Federal LTCH payment for the MS-LTC-DRG and the fixed-loss amount of $14,972).

We note that the fixed-loss amount of $14,972 for FY 2015 is lower than the proposed FY 2015 fixed-loss amount of $15,730. This decrease is primarily a result of updated data used to calculate the fixed-loss amount in this final rule, such as the most recent available LTCH claims data in the MedPAR file, CCRs in the PSF, and the estimate of the LTCH PPS market basket update factors. We also note that the fixed-loss amount of $14,972 for FY 2015 is higher than the FY 2014 fixed-loss amount of $13,314. Based on our payment simulations using the most recent available data at this time, the final increase in the fixed-loss amount for FY 2015 is necessary to maintain the existing requirement that estimated outlier payments equal 8 percent of estimated total LTCH PPS payments. Maintaining the fixed-loss amount at the current level would result in HCO payments that exceed the current regulatory 8-percent requirement because a lower fixed-loss amount would result in more cases qualifying as outlier cases, as well as higher outlier payments for qualifying HCO cases, because the maximum loss that an LTCH must incur before receiving an HCO payment (that is, the fixed-loss amount) would be smaller. For these reasons, we believe that raising the fixed-loss amount is appropriate and necessary to maintain estimated outlier payments at 8 percent of estimated total LTCH PPS payments, as required under § 412.525(a). (As noted above, for further information on the existing 8 percent HCO “target” requirement, we refer readers to the August 30, 2002 LTCH PPS final rule (67 FR 56022 through 56024).)

4. Application of the Outlier Policy to SSO Cases

As we discussed in the August 30, 2002 final rule (67 FR 56026), under some rare circumstances, an LTCH discharge could qualify as an SSO case (as defined in the regulations at § 412.529 in conjunction with § 412.503) and also as an HCO case. In this scenario, a patient could be hospitalized for less than five-sixths of the geometric average length of stay for the specific MS-LTC-DRG, and yet incur extraordinarily high treatment costs. If the estimated cost of the case exceeds the HCO threshold (that is, the SSO payment plus the fixed-loss amount), the discharge is eligible for payment as an HCO. Therefore, for an SSO case in FY 2015, the HCO payment would be 80 percent of the difference between the estimated cost of the case and the outlier threshold (the sum of the fixed-loss amount of $14,972 and the amount paid under the SSO policy as specified in § 412.529).
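As an illustration under the same assumptions as the earlier sketch, the only change for a case that also qualifies as an SSO case is that the outlier threshold uses the amount paid under the SSO policy rather than the adjusted Federal payment for the MS-LTC-DRG; the function name and example figures below are hypothetical.

```python
def hco_payment_for_sso_case(covered_charges, ccr, sso_payment, fixed_loss_amount,
                             marginal_cost_factor=0.80):
    """Illustrative HCO payment when the discharge also qualifies as an SSO case.

    The outlier threshold is the amount paid under the SSO policy (§ 412.529)
    plus the fixed-loss amount ($14,972 for FY 2015).
    """
    estimated_cost = covered_charges * ccr
    outlier_threshold = sso_payment + fixed_loss_amount
    return max(0.0, marginal_cost_factor * (estimated_cost - outlier_threshold))

# Hypothetical example: an estimated cost of $60,000 (covered charges of
# $120,000 and a CCR of 0.50) against an SSO payment of $20,000 yields
# 0.80 * (60,000 - 34,972) = $20,022.40.
print(round(hco_payment_for_sso_case(120_000, 0.50, 20_000, 14_972), 2))
```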

E. Update to the IPPS Comparable/Equivalent Amounts To Reflect the Statutory Changes to the IPPS DSH Payment Adjustment Methodology

In the FY 2014 IPPS/LTCH PPS final rule, we established a policy for reflecting the changes to the Medicare IPPS DSH payment adjustment methodology provided for by section 3133 of the Affordable Care Act in the calculation of the “IPPS comparable amount” under the SSO policy at § 412.529 and the “IPPS equivalent amount” under the 25-percent threshold payment adjustment policy at § 412.534 and § 412.536. Historically, the determination of both the “IPPS comparable amount” and the “IPPS equivalent amount” includes an amount for inpatient operating costs “for the costs of serving a disproportionate share of low-income patients.” Under the statutory changes to the Medicare DSH payment adjustment methodology that began in FY 2014, in general, eligible IPPS hospitals receive an empirically justified Medicare DSH payment equal to 25 percent of the amount they otherwise would have received under the statutory formula for Medicare DSH payments prior to the amendments made by the Affordable Care Act. The remaining amount, equal to an estimate of 75 percent of the amount that otherwise would have been paid as Medicare DSH payments, reduced to reflect changes in the percentage of individuals under the age of 65 who are uninsured, is made available to make additional payments to each hospital that qualifies for Medicare DSH payments and that has uncompensated care. The additional uncompensated care payments are based on the hospital's amount of uncompensated care for a given time period relative to the total amount of uncompensated care for that same time period reported by all IPPS hospitals that receive Medicare DSH payments.

To reflect the statutory changes to the Medicare DSH payment adjustment methodology in the calculation of the “IPPS comparable amount” and the “IPPS equivalent amount” under the LTCH PPS, we stated that we will include a reduced Medicare DSH payment amount that reflects the projected percentage of the payment amount calculated based on the statutory Medicare DSH payment formula prior to the amendments made by the Affordable Care Act that will be paid to eligible IPPS hospitals as empirically justified Medicare DSH payments and uncompensated care payments in that year (that is, a percentage of the operating DSH payment amount that has historically been reflected in the LTCH PPS payments that is based on IPPS rates). We also stated that the projected percentage will be updated annually, consistent with the annual determination of the amount of uncompensated care payments that will be made to eligible IPPS hospitals. As explained in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50766 through 50767), we believe that this approach results in appropriate payments under the LTCH PPS and is consistent with our intention that the “IPPS comparable amount” and the “IPPS equivalent amount” under the LTCH PPS closely resemble what an IPPS payment would have been for the same episode of care, while recognizing that some features of the IPPS cannot be translated directly into the LTCH PPS.

For FY 2014, aggregate Medicare IPPS operating DSH payments are projected to be reduced to 95.7 percent of the amount that would otherwise have been paid under the statutory Medicare DSH payment formula prior to the amendments made by the Affordable Care Act. Accordingly, for FY 2014, the calculation of the “IPPS comparable amount” under § 412.529 and the “IPPS equivalent amount” under § 412.534 and § 412.536 includes an applicable operating Medicare DSH payment amount that is equal to 95.7 percent of the operating Medicare DSH payment amount based on the statutory Medicare DSH payment formula prior to the amendments made by the Affordable Care Act (that is, the operating Medicare DSH payment amount historically included in those calculations). (We refer readers to the FY 2012 IPPS/LTCH PPS final rule (76 FR 50766).)

In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28341 through 28342), we discussed that, for FY 2015, based on the latest data available, we project that the reduction in the amount of Medicare DSH payments pursuant to section 1886(r)(1) of the Act, along with the proposed payments for uncompensated care under section 1886(r)(2) of the Act, would result in overall Medicare DSH payments equaling 85.26 percent of the amount of Medicare DSH payments that would otherwise have been made in the absence of amendments made by the Affordable Care Act. Therefore, we proposed that the calculation of the “IPPS comparable amount” under § 412.529 and the “IPPS equivalent amount” under § 412.534 and § 412.536 for FY 2015 includes an applicable operating Medicare DSH payment amount that would be equal to 85.26 percent of the operating Medicare DSH payment amount based on the statutory Medicare DSH payment formula prior to the amendments made by the Affordable Care Act. We also proposed that, consistent with our historical practice of using the most recent data available, if more recent data became available for the final rule, we would use that data to determine the percentage of the operating Medicare DSH payment amount based on the statutory Medicare DSH payment formula prior to the amendments made by the Affordable Care Act used in the calculation of the “IPPS comparable amount” under § 412.529 and the “IPPS equivalent amount” under § 412.534 and § 412.536 for FY 2015.

As discussed in greater detail in section IV.F.3.d.(2) of the preamble of this final rule, based on the most recent data available, our estimate of 75 percent of the amount that would otherwise have been paid as Medicare DSH payments (under the methodology outlined in section 1886(r)(2) of the Act) will be adjusted to 76.19 percent of that amount to reflect the change in the percentage of individuals that are uninsured. The resulting amount will then be used to determine the amount of uncompensated care payments that will be made to eligible IPPS hospitals in FY 2015. In other words, Medicare DSH payments prior to the amendments made by the Affordable Care Act are adjusted to 57.14 percent (the product of 75 percent and 76.19 percent) and the resulting amount will be used to calculate the uncompensated care payments to eligible hospitals. As a result, for FY 2015, we project that the reduction in the amount of Medicare DSH payments pursuant to section 1886(r)(1) of the Act, along with the payments for uncompensated care under section 1886(r)(2) of the Act, will result in overall Medicare DSH payments of 82.14 percent of the amount of Medicare DSH payments that would otherwise have been made in the absence of amendments made by the Affordable Care Act (that is, 25 percent + 57.14 percent = 82.14 percent).
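To make the arithmetic in the preceding paragraph explicit, the following sketch reproduces the FY 2015 percentages stated above; the variable names are illustrative and are not drawn from the rule.

```python
# Reproducing the FY 2015 Medicare DSH percentage arithmetic described above.
empirically_justified_share = 0.25   # 25 percent paid as empirically justified DSH payments
remaining_share = 0.75               # the other 75 percent of pre-Affordable Care Act DSH payments
uninsured_adjustment = 0.7619        # adjustment for the change in the uninsured (76.19 percent)

uncompensated_care_share = remaining_share * uninsured_adjustment   # ~0.5714 (57.14 percent)
overall_dsh_percentage = empirically_justified_share + uncompensated_care_share

print(f"Uncompensated care share: {uncompensated_care_share:.4f}")   # 0.5714
print(f"Overall DSH percentage:   {overall_dsh_percentage:.4f}")     # 0.8214 (82.14 percent)
```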

We did not receive any public comments on this proposal and therefore we are adopting the policy as final without modification. In this final rule, for FY 2015, we are establishing that the calculation of the “IPPS comparable amount” under § 412.529 and the “IPPS equivalent amount” under § 412.534 and § 412.536 will include an applicable operating Medicare DSH payment amount that will be equal to 82.14 percent of the operating Medicare DSH payment amount based on the statutory Medicare DSH payment formula prior to the amendments made by the Affordable Care Act.

F. Computing the Adjusted LTCH PPS Federal Prospective Payments for FY 2015

Section 412.525 sets forth the adjustments to the LTCH PPS standard Federal rate. Under § 412.525(c), the standard Federal rate is adjusted to account for differences in area wages by multiplying the labor-related share of the standard Federal rate by the applicable LTCH PPS wage index (FY 2015 values are shown in Tables 12A through 12D listed in section VI. of the Addendum of this final rule and are available via the Internet). The standard Federal rate is also adjusted to account for the higher costs of LTCHs located in Alaska and Hawaii by the applicable COLA factors (the FY 2015 factors are shown in the chart in section V.C. of this Addendum) in accordance with § 412.525(b). In this final rule, we are establishing a standard Federal rate for FY 2015 of $41,043.71 (applicable to discharges from LTCHs that submit the required quality reporting data for FY 2015 in accordance with the LTCHQR Program under section 1886(m)(5) of the Act), as discussed above in section V.A.2. of the Addendum to this final rule. We illustrate the methodology to adjust the LTCH PPS Federal standard rate for FY 2015 in the following example:

Example:

During FY 2015, a Medicare patient is in an LTCH located in Chicago, Illinois (CBSA 16974). The FY 2015 LTCH PPS wage index value for CBSA 16974 is 1.0419 (obtained from Table 12A listed in section VI. of the Addendum of this final rule and available via the Internet on the CMS Web site). The Medicare patient is classified into MS-LTC-DRG 189 (Pulmonary Edema & Respiratory Failure), which has a relative weight for FY 2015 of 0.9098 (obtained from Table 11 listed in section VI. of the Addendum of this final rule and available via the Internet on the CMS Web site). The LTCH submitted quality reporting data for FY 2015 in accordance with the LTCHQR Program under section 1886(m)(5) of the Act.

To calculate the LTCH's total adjusted Federal prospective payment for this Medicare patient in FY 2015, we computed the wage-adjusted Federal prospective payment amount by multiplying the unadjusted FY 2015 standard Federal rate ($41,043.71) by the labor-related share (62.306 percent) and the wage index value (1.0419). This wage-adjusted amount was then added to the nonlabor-related portion of the unadjusted standard Federal rate (37.694 percent; adjusted for cost of living, if applicable) to determine the adjusted Federal rate, which was then multiplied by the MS-LTC-DRG relative weight (0.9098) to calculate the total adjusted Federal LTCH PPS prospective payment for FY 2015 ($38,316.42). The table below illustrates the components of the calculations in this example.

Standard Federal Prospective Payment Rate* $41,043.71
Labor-Related Share × 0.62306
Labor-Related Portion of the Federal Rate = $25,572.69
Wage Index (CBSA 16974) × 1.0419
Wage-Adjusted Labor Share of Federal Rate = $26,644.19
Nonlabor-Related Portion of the Federal Rate ($41,043.71 × 0.37694) + $15,471.02
Adjusted Federal Rate Amount = $42,115.21
MS-LTC-DRG 189 Relative Weight × 0.9098
Total Adjusted Federal Prospective Payment = $38,316.42
* LTCH PPS standard Federal rate applicable to discharges from LTCHs that submit the required quality data in accordance with the LTCHQR Program under section 1886(m)(5) of the Act.
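A minimal sketch that reproduces the example calculation above; the figures are taken directly from the example, and the rounding at each step mirrors the table.

```python
# Reproducing the FY 2015 LTCH PPS example payment calculation shown above.
standard_federal_rate = 41_043.71
labor_related_share = 0.62306
nonlabor_related_share = 0.37694
wage_index = 1.0419          # CBSA 16974 (Chicago, Illinois)
relative_weight = 0.9098     # MS-LTC-DRG 189

labor_portion = round(standard_federal_rate * labor_related_share, 2)        # 25,572.69
wage_adjusted_labor = round(labor_portion * wage_index, 2)                   # 26,644.19
nonlabor_portion = round(standard_federal_rate * nonlabor_related_share, 2)  # 15,471.02
adjusted_federal_rate = wage_adjusted_labor + nonlabor_portion               # 42,115.21
total_payment = round(adjusted_federal_rate * relative_weight, 2)            # 38,316.42

print(f"Total adjusted Federal prospective payment: ${total_payment:,.2f}")
```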

VI. Tables Referenced in This Final Rule and Available Only Through the Internet on the CMS Web site

This section lists the tables referred to throughout the preamble of this final rule and in this Addendum. In the past, a majority of these tables were published in the Federal Register as part of the annual proposed and final rules. However, similar to FYs 2012 through 2014, for the FY 2015 rulemaking cycle, the IPPS and LTCH tables will not be published in the Federal Register in the annual IPPS/LTCH PPS proposed and final rules and will be available only through the Internet. Specifically, all IPPS Tables listed below with the exception of IPPS Tables 1A, 1B, 1C, and 1D, and LTCH PPS Table 1E will be available only through the Internet. IPPS Tables 1A, 1B, 1C, and 1D, and LTCH PPS Table 1E are displayed at the end of this section and will continue to be published in the Federal Register as part of the annual proposed and final rules.

As discussed in sections II.G.11. and 13. of the preamble of this final rule, Tables 6A through 6F will not be issued with this FY 2015 final rule because there are no new, revised, or deleted diagnosis or procedure codes for FY 2015. As discussed in section IV.D. of this final rule, section 106 of the Protecting Access to Medicare Act of 2014 (Pub. L. 113-93), enacted on April 1, 2014, extended, through the first half of FY 2015 (that is, for discharges occurring before April 1, 2015), the temporary changes in the low-volume hospital definition and methodology for determining the payment adjustment originally made by the Affordable Care Act (and extended by subsequent legislation). We refer the reader to section IV.D. of the preamble of this final rule for complete details on the low-volume hospital payment adjustment. Therefore, Table 14 associated with this final rule lists the FY 2015 low-volume payment adjustments for potentially eligible hospitals that also meet the distance criterion for low-volume hospital status. As discussed in section IV.H.11. of the preamble of this final rule, we are providing proxy FY 2015 readmission payment adjustment factors in Table 15A issued with this final rule. After the completion of the review and corrections process, we will publish the final FY 2015 readmissions payment adjustment factors in Table 15B on the CMS IPPS Web site. In addition, under the HAC Reduction Program established by section 3008 of the Affordable Care Act, a hospital's total payment may be reduced by 1 percent if it is in the lowest HAC performance quartile. However, as discussed in section IV.J. of the preamble of this final rule, we are not providing the hospital-level data (such as a proxy list of providers subject to the HAC Reduction Program in FY 2015 in Table 17) in conjunction with this final rule. Finally, a hospital's Factor 3 is the proportion of the uncompensated care amount that a DSH hospital will receive under section 3133 of the Affordable Care Act. Factor 3 is the hospital's estimated number of Medicaid days and Medicare SSI days relative to the estimate of all DSHs' Medicaid days and Medicare SSI days. Therefore, Table 18 contains the FY 2015 Medicare DSH uncompensated care payment Factor 3 for all hospitals and identifies whether a hospital is projected to receive DSH payments and, therefore, is eligible to receive the additional payment for uncompensated care for FY 2015.
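As a hedged illustration of the Factor 3 description above, the sketch below computes only the proportion stated in this paragraph (a hospital's Medicaid plus Medicare SSI days relative to the total for all DSH-eligible hospitals); the function name and example day counts are hypothetical.

```python
def factor_3(hospital_medicaid_days, hospital_ssi_days,
             all_dsh_medicaid_days, all_dsh_ssi_days):
    """Illustrative Factor 3: a hospital's Medicaid plus Medicare SSI days
    relative to the Medicaid plus Medicare SSI days of all hospitals that are
    projected to receive Medicare DSH payments."""
    return (hospital_medicaid_days + hospital_ssi_days) / (
        all_dsh_medicaid_days + all_dsh_ssi_days)

# Hypothetical example: 12,000 Medicaid days and 3,000 SSI days against
# national totals of 30,000,000 and 8,000,000 give a Factor 3 of about 0.000395.
print(round(factor_3(12_000, 3_000, 30_000_000, 8_000_000), 6))
```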

Readers who experience any problems accessing any of the tables that are posted on the CMS Web sites identified below should contact Michael Treitel at (410) 786-4552.

The following IPPS tables for this FY 2015 final rule are available only through the Internet on the CMS Web site at: http://www.cms.gov/Medicare/Medicare-Fee-for-Service-Payment/AcuteInpatientPPS/index.html. Click on the link on the left side of the screen titled, “FY 2015 IPPS Final Rule Home Page” or “Acute Inpatient—Files for Download”.

Table 2-1.—Hospital Average Hourly Wages for Federal Fiscal Years 2013 (2009 Wage Data), 2014 (2010 Wage Data), and 2015 (2011 Wage Data); and 3-Year Average of Hospital Average Hourly Wages; Based on CBSA Delineations used in FY 2014

Table 2-2.—Acute Care Hospitals Case-Mix Indexes for Discharges Occurring in Federal Fiscal Year 2012; Hospital Wage Indexes for Federal Fiscal Year 2015; Hospital Average Hourly Wages for Federal Fiscal Years 2013 (2009 Wage Data), 2014 (2010 Wage Data), and 2015 (2011 Wage Data; Based on FY 2015 CBSA Delineations); and 3-Year Average of Hospital Average Hourly Wages

Table 3A-1.—FY 2015 and 3-Year* Average Hourly Wage for Acute Care Hospitals in Urban Areas by CBSA; Based on CBSA Delineations Used in FY 2014

Table 3A-2.—FY 2015 and 3-Year* Average Hourly Wage for Acute Care Hospitals in Urban Areas by CBSA; Based on CBSA Delineations Used in FY 2015

Table 3B-1.—FY 2015 and 3-Year* Average Hourly Wage for Acute Care Hospitals in Rural Areas by CBSA; Based on CBSA Delineations Used in FY 2014

Table 3B-2.—FY 2015 and 3-Year* Average Hourly Wage for Acute Care Hospitals in Rural Areas by CBSA; Based on CBSA Delineations Used in FY 2015

Table 4A-1.—Wage Index and Capital Geographic Adjustment Factor (GAF) for Acute Care Hospitals in Urban Areas by CBSA and by State—FY 2015; Based on CBSA Delineations Used in FY 2014.

Table 4A-2.—Wage Index and Capital Geographic Adjustment Factor (GAF) for Acute Care Hospitals in Urban Areas by CBSA and by State—FY 2015; Based on CBSA Delineations Used in FY 2015.

Table 4B-1.—Wage Index and Capital Geographic Adjustment Factor (GAF) for Acute Care Hospitals in Rural Areas by CBSA and by State—FY 2015; Based on CBSA Delineations Used in FY 2014.

Table 4B-2.—Wage Index and Capital Geographic Adjustment Factor (GAF) for Acute Care Hospitals in Rural Areas by CBSA and by State—FY 2015; Based on CBSA Delineations Used in FY 2015.

Table 4C-1.—Wage Index and Capital Geographic Adjustment Factor (GAF) for Acute Care Hospitals That Are Reclassified by CBSA and by State—FY 2015; Based on CBSA Delineations Used in FY 2014.

Table 4C-2.—Wage Index and Capital Geographic Adjustment Factor (GAF) for Acute Care Hospitals That Are Reclassified by CBSA and by State—FY 2015; Based on CBSA Delineations Used in FY 2015.

Table 4D-1.—States Designated as Frontier, with Acute Care Hospitals Receiving at a Minimum the Frontier State Floor Wage Index; Urban Areas with Acute Care Hospitals Receiving the Statewide Rural Floor or Imputed Floor Wage Index—FY 2015; Based on CBSA Delineations Used in FY 2014.

Table 4D-2.—States Designated as Frontier, with Acute Care Hospitals Receiving at a Minimum the Frontier State Floor Wage Index; Urban Areas with Acute Care Hospitals Receiving the Statewide Rural Floor or Imputed Floor Wage Index—FY 2015; Based on CBSA Delineations Used in FY 2015.

Table 4E-1.—Urban CBSAs and Constituent Counties for Acute Care Hospitals—FY 2015; Based on CBSA Delineations Used in FY 2014.

Table 4E-2.—Urban CBSAs and Constituent Counties for Acute Care Hospitals—FY 2015; Based on CBSA Delineations Used in FY 2015.

Table 4F-1.—Puerto Rico Wage Index and Capital Geographic Adjustment Factor (GAF) for Acute Care Hospitals by CBSA—FY 2015; Based on CBSA Delineations Used in FY 2014.

Table 4F-2.—Puerto Rico Wage Index and Capital Geographic Adjustment Factor (GAF) for Acute Care Hospitals by CBSA—FY 2015; Based on CBSA Delineations Used in FY 2015.

Table 4J.—Out-Migration Adjustment for Acute Care Hospitals—FY 2015

Table 5.—List of Medicare Severity Diagnosis-Related Groups (MS-DRGs), Relative Weighting Factors, and Geometric and Arithmetic Mean Length of Stay—FY 2015

Table 6I.—Major CC List—FY 2015

Table 6J.—Complete CC List—FY 2015

Table 6K.—Complete List of CC Exclusions—FY 2015

Table 7A.—Medicare Prospective Payment System Selected Percentile Lengths of Stay: FY 2013 MedPAR Update—March 2014 GROUPER V31.0 MS-DRGs

Table 7B.—Medicare Prospective Payment System Selected Percentile Lengths of Stay: FY 2013 MedPAR Update—March 2014 GROUPER V32.0 MS-DRGs

Table 8A.—FY 2015 Statewide Average Operating Cost-to-Charge Ratios (CCRs) for Acute Care Hospitals (Urban and Rural)

Table 8B.—FY 2015 Statewide Average Capital Cost-to-Charge Ratios (CCRs) for Acute Care Hospitals

Table 9A-1.—Hospital Reclassifications and Redesignations—FY 2015; Based on CBSA Delineations Used in FY 2014.

Table 9A-2.—Hospital Reclassifications and Redesignations—FY 2015; Based on CBSA Delineations Used in FY 2015.

Table 9C-1.—Hospitals Redesignated as Rural under Section 1886(d)(8)(E) of the Act—FY 2015; Based on CBSA Delineations Used in FY 2014.

Table 9C-2.—Hospitals Redesignated as Rural under Section 1886(d)(8)(E) of the Act—FY 2015; Based on CBSA Delineations Used in FY 2015.

Table 10.—New Technology Add-On Payment Thresholds for Applications for FY 2016

Table 14.—List of Hospitals with Fewer than 1,600 Medicare Discharges Based on the March 2014 Update of the FY 2013 MedPAR File and Potentially Eligible Hospitals' FY 2015 Low-Volume Payment Adjustment for Discharges Occurring Before April 1, 2015 (Eligibility for the low-volume payment adjustment is also dependent upon meeting the mileage criteria specified at § 412.101(b)(2)(ii) of the regulations.)

Table 15A.—FY 2015 Proxy Readmissions Adjustment Factors

Table 16.—Updated Proxy Hospital Inpatient Value-Based Purchasing (VBP) Program Adjustment Factors for FY 2015

Table 18.—FY 2015 Medicare DSH Uncompensated Care Payment Factor 3

The following LTCH PPS tables for this FY 2015 final rule are available only through the Internet on the CMS Web site at http://www.cms.gov/Medicare/Medicare-Fee-for-Service-Payment/LongTermCareHospitalPPS/index.html under the list item for Regulation Number CMS-1607-F.

Table 8C.—FY 2015 Statewide Average Total Cost-to-Charge Ratios (CCRs) for LTCHs (Urban and Rural)

Table 11.—MS-LTC-DRGs, Relative Weights, Geometric Average Length of Stay, Short-Stay Outlier (SSO) Threshold, and “IPPS Comparable Threshold” for Discharges Occurring from October 1, 2014 through September 30, 2015 under the LTCH PPS

Table 12A.—LTCH PPS Wage Index for Urban Areas under the New OMB CBSA Delineations for Discharges Occurring From October 1, 2014 through September 30, 2015

Table 12B.—LTCH PPS Wage Index for Rural Areas under the New OMB CBSA Delineations for Discharges Occurring from October 1, 2014 through September 30, 2015

Table 12C.—LTCH PPS Wage Index for Urban Areas under the Current CBSA Designations for Discharges Occurring from October 1, 2014 through September 30, 2015

Table 12D.—LTCH PPS Wage Index for Rural Areas under the Current CBSA Designations for Discharges Occurring from October 1, 2014 through September 30, 2015

Table 13A.—Composition of Low-Volume Quintiles for MS-LTC-DRGs—FY 2015

Table 13B.—No-Volume MS-LTC-DRG Crosswalk for FY 2015

Table 1A—National Adjusted Operating Standardized Amounts, Labor/Nonlabor (69.6 Percent Labor Share/30.4 Percent Nonlabor Share if Wage Index is Greater Than 1)—FY 2015

Hospital submitted quality data and is a meaningful EHR user (Update = 2.2 percent): Labor $3,780.13; Nonlabor $1,651.09
Hospital did NOT submit quality data and is a meaningful EHR user (Update = 1.475 percent): Labor $3,753.31; Nonlabor $1,639.38
Hospital submitted quality data and is NOT a meaningful EHR user (Update = 1.475 percent): Labor $3,753.31; Nonlabor $1,639.38
Hospital did NOT submit quality data and is NOT a meaningful EHR user (Update = 0.75 percent): Labor $3,726.50; Nonlabor $1,627.66

Table 1B—National Adjusted Operating Standardized Amounts, Labor/Nonlabor (62 Percent Labor Share/38 Percent Nonlabor Share if Wage Index is Less Than or Equal to 1)—FY 2015

Hospital submitted quality data and is a meaningful EHR user (Update = 2.2 percent): Labor $3,367.36; Nonlabor $2,063.86
Hospital did NOT submit quality data and is a meaningful EHR user (Update = 1.475 percent): Labor $3,343.47; Nonlabor $2,049.22
Hospital submitted quality data and is NOT a meaningful EHR user (Update = 1.475 percent): Labor $3,343.47; Nonlabor $2,049.22
Hospital did NOT submit quality data and is NOT a meaningful EHR user (Update = 0.75 percent): Labor $3,319.58; Nonlabor $2,034.58
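For illustration only, the following simplified sketch shows how the Table 1A and Table 1B amounts are selected and wage-adjusted (Table 1A applies when a hospital's wage index exceeds 1; Table 1B applies otherwise). It uses only the full-update column for a hospital that submitted quality data and is a meaningful EHR user, ignores the MS-DRG relative weight and all other adjustments, and the function name and example wage index values are hypothetical.

```python
# Illustrative selection between Tables 1A and 1B and wage adjustment of the
# national adjusted operating standardized amount (full-update column only).
TABLE_1A = {"labor": 3_780.13, "nonlabor": 1_651.09}   # wage index > 1 (69.6/30.4 split)
TABLE_1B = {"labor": 3_367.36, "nonlabor": 2_063.86}   # wage index <= 1 (62/38 split)

def wage_adjusted_standardized_amount(wage_index):
    """Pick the applicable table based on the wage index and wage-adjust only
    the labor-related portion of the standardized amount."""
    rates = TABLE_1A if wage_index > 1 else TABLE_1B
    return rates["labor"] * wage_index + rates["nonlabor"]

# Hypothetical wage index values.
print(round(wage_adjusted_standardized_amount(1.05), 2))   # uses Table 1A
print(round(wage_adjusted_standardized_amount(0.92), 2))   # uses Table 1B
```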

Table 1C—Adjusted Operating Standardized Amounts for Puerto Rico, Labor/Nonlabor (National: 62 Percent Labor Share/38 Percent Nonlabor Share Because Wage Index is Less Than or Equal to 1; Puerto Rico: 63.2 Percent Labor Share/36.8 Percent Nonlabor Share if Wage Index is Greater Than 1 or 62 Percent Labor Share/38 Percent Nonlabor Share if Wage Index is Less Than or Equal to 1)—FY 2015

Rates if wage index is greater than 1:
  National: Labor Not Applicable; Nonlabor Not Applicable
  Puerto Rico: Labor $1,608.39; Nonlabor $936.54
Rates if wage index is less than or equal to 1:
  National: Labor $3,367.36; Nonlabor $2,063.86
  Puerto Rico: Labor $1,577.86; Nonlabor $967.07
For FY 2015, there are no CBSAs in Puerto Rico with a national wage index greater than 1.

Table 1D—Capital Standard Federal Payment Rate—FY 2015

National Rate: $434.26
Puerto Rico Rate: $209.10

Table 1E—LTCH Standard Federal Prospective Payment Rate—FY 2015

Standard Federal Rate, Full Update (2.2 Percent): $41,043.71
Standard Federal Rate, Reduced Update* (0.2 Percent): $40,240.51
* For LTCHs that fail to submit quality reporting data for FY 2015 in accordance with the LTCH Quality Reporting (LTCHQR) Program, the annual update is reduced by 2.0 percentage points as required by section 1886(m)(5) of the Act.

Appendix A: Economic Analyses

I. Regulatory Impact Analysis

A. Introduction

We have examined the impacts of this final rule as required by Executive Order 12866 on Regulatory Planning and Review (September 30, 1993), Executive Order 13563 on Improving Regulation and Regulatory Review (February 2, 2011), the Regulatory Flexibility Act (RFA) (September 19, 1980, Pub. L. 96-354), section 1102(b) of the Social Security Act, section 202 of the Unfunded Mandates Reform Act of 1995 (March 22, 1995, Pub. L. 104-4), Executive Order 13132 on Federalism (August 4, 1999), and the Congressional Review Act (5 U.S.C. 804(2)).

Executive Orders 12866 and 13563 direct agencies to assess all costs and benefits of available regulatory alternatives and, if regulation is necessary, to select regulatory approaches that maximize net benefits (including potential economic, environmental, public health and safety effects, distributive impacts, and equity). Executive Order 13563 emphasizes the importance of quantifying both costs and benefits, of reducing costs, of harmonizing rules, and of promoting flexibility. A regulatory impact analysis (RIA) must be prepared for major rules with economically significant effects ($100 million or more in any 1 year).

We have determined that this final rule is a major rule as defined in 5 U.S.C. 804(2). We estimate that the changes for FY 2015 acute care hospital operating and capital payments will redistribute amounts in excess of $100 million to acute care hospitals. The applicable percentage increase to the IPPS rates required by the statute, in conjunction with other payment changes in this final rule, will result in an estimated $654 million decrease in FY 2015 operating payments (or −0.6 percent change) and an estimated $132 million increase in FY 2015 capital payments (or 1.6 percent change). These changes are relative to payments made in FY 2014. The impact analysis of the capital payments can be found in section I.J. of this Appendix. In addition, as described in section I.K. of this Appendix, LTCHs are expected to experience an increase in payments of $62 million in FY 2015 relative to FY 2014.

Our operating impact estimate includes the −0.8 percent documentation and coding adjustment applied to the IPPS standardized amount, which represents part of the recoupment required under section 631 of the ATRA. In addition, our operating payment impact estimate includes the 2.2 percent hospital update to the standardized amount (which includes the estimated 2.9 percent market basket update less 0.5 percentage point for the multifactor productivity adjustment and less 0.2 percentage point required under the Affordable Care Act). The estimates of IPPS operating payments to acute care hospitals do not reflect any changes in hospital admissions or real case-mix intensity, which will also affect overall payment changes.

The analysis in this Appendix, in conjunction with the remainder of this document, demonstrates that this final rule is consistent with the regulatory philosophy and principles identified in Executive Orders 12866 and 13563, the RFA, and section 1102(b) of the Act. This final rule will affect payments to a substantial number of small rural hospitals, as well as other classes of hospitals, and the effects on some hospitals may be significant. Finally, in accordance with the provisions of Executive Order 12866, the Executive Office of Management and Budget has reviewed this final rule.

B. Statement of Need

This final rule is necessary to make payment and policy changes under the Medicare IPPS for Medicare acute care hospital inpatient services for operating and capital-related costs, as well as for certain hospitals and hospital units excluded from the IPPS. This final rule also is necessary to make payment and policy changes for Medicare hospitals paid under the LTCH PPS.

C. Objectives of the IPPS

The primary objective of the IPPS is to create incentives for hospitals to operate efficiently and minimize unnecessary costs while at the same time ensuring that payments are sufficient to adequately compensate hospitals for their legitimate costs in delivering necessary care to Medicare beneficiaries. In addition, we share national goals of preserving the Medicare Hospital Insurance Trust Fund.

We believe that the changes in this final rule will further each of these goals while maintaining the financial viability of the hospital industry and ensuring access to high quality health care for Medicare beneficiaries. We expect that these changes will ensure that the outcomes of the prospective payment systems are reasonable and equitable while avoiding or minimizing unintended adverse consequences.

D. Limitations of Our Analysis

The following quantitative analysis presents the projected effects of our policy changes, as well as statutory changes effective for FY 2015, on various hospital groups. We estimate the effects of individual policy changes by estimating payments per case while holding all other payment policies constant. We use the best data available, but, generally, we do not attempt to make adjustments for future changes in such variables as admissions, lengths of stay, or case-mix.

E. Hospitals Included in and Excluded From the IPPS

The prospective payment systems for hospital inpatient operating and capital-related costs of acute care hospitals encompass most general short-term, acute care hospitals that participate in the Medicare program. There were 32 Indian Health Service hospitals in our database, which we excluded from the analysis due to the special characteristics of the prospective payment methodology for these hospitals. Among other short-term, acute care hospitals, hospitals in Maryland are paid in accordance with the Maryland All-Payer Model, and hospitals located outside the 50 States, the District of Columbia, and Puerto Rico (that is, 5 short-term acute care hospitals located in the U.S. Virgin Islands, Guam, the Northern Mariana Islands, and American Samoa) receive payment for inpatient hospital services they furnish on the basis of reasonable costs, subject to a rate-of-increase ceiling.

As of March 2014, there were 3,396 IPPS acute care hospitals included in our analysis. This represents approximately 56 percent of all Medicare-participating hospitals. The majority of this impact analysis focuses on this set of hospitals. There also are approximately 1,326 CAHs. These small, limited service hospitals are paid on the basis of reasonable costs rather than under the IPPS. IPPS-excluded hospitals and units include IPFs, IRFs, LTCHs, RNHCIs, children's hospitals, 11 cancer hospitals, and 5 short-term acute care hospitals located in the Virgin Islands, Guam, the Northern Mariana Islands, and American Samoa, which are paid under separate payment systems. Changes in the prospective payment systems for IPFs and IRFs are made through separate rulemaking. Payment impacts for these IPPS-excluded hospitals and units are not included in this final rule. The impact of the update and policy changes to the LTCH PPS for FY 2015 is discussed in section I.K. of this Appendix.

F. Effects on Hospitals and Hospital Units Excluded From the IPPS

As of March 2014, there were 98 children's hospitals, 11 cancer hospitals, 5 short-term acute care hospitals located in the Virgin Islands, Guam, the Northern Mariana Islands, and American Samoa, and 18 RNHCIs being paid on a reasonable cost basis subject to the rate-of-increase ceiling under § 413.40. (In accordance with § 403.752(a) of the regulations, RNHCIs are paid under § 413.40.) Among the remaining providers, 245 rehabilitation hospitals and 897 rehabilitation units are paid the Federal prospective per discharge rate under the IRF PPS, 431 LTCHs are paid the Federal prospective per discharge rate under the LTCH PPS, and 490 psychiatric hospitals and 1,136 psychiatric units are paid the Federal per diem amount under the IPF PPS. As stated above, IRFs and IPFs are not affected by the rate updates discussed in this final rule. The impacts of the changes on LTCHs are discussed in section I.K. of this Appendix.

For children's hospitals, the 11 cancer hospitals, the 5 short-term acute care hospitals located in the Virgin Islands, Guam, the Northern Mariana Islands, and American Samoa, and RNHCIs, the update of the rate-of-increase limit (or target amount) is the estimated FY 2015 percentage increase in the IPPS operating market basket, consistent with section 1886(b)(3)(B)(ii) of the Act, and §§ 403.752(a) and 413.40 of the regulations. As discussed in section IV. of the preamble of the FY 2014 IPPS/LTCH PPS final rule, we rebased the IPPS operating market basket to a FY 2010 base year. Therefore, we are using the percentage increase in the FY 2010-based IPPS operating market basket to update the target amounts for FY 2015 and subsequent fiscal years for children's hospitals, the 11 cancer hospitals, the 5 short-term acute care hospitals located in the Virgin Islands, Guam, the Northern Mariana Islands, and American Samoa, and RNHCIs that are paid based on reasonable costs subject to the rate-of-increase limits. Consistent with current law, based on IHS Global Insight, Inc.'s second quarter 2014 forecast of the FY 2010-based market basket increase, we are estimating that the FY 2015 update based on the IPPS operating market basket is 2.9 percent (that is, the current estimate of the market basket rate-of-increase). However, the Affordable Care Act requires an adjustment for multifactor productivity (currently estimated to be 0.5 percentage point for FY 2015) and a 0.2 percentage point reduction to the market basket update, resulting in a 2.2 percent applicable percentage increase for IPPS hospitals that submit quality data and are meaningful EHR users, as discussed in section IV.B. of the preamble of this final rule. Children's hospitals, the 11 cancer hospitals, the 5 short-term acute care hospitals located in the Virgin Islands, Guam, the Northern Mariana Islands, and American Samoa, and RNHCIs that continue to be paid based on reasonable costs subject to rate-of-increase limits under § 413.40 of the regulations are not subject to the reductions in the applicable percentage increase required under the Affordable Care Act. Therefore, for those hospitals paid under § 413.40 of the regulations, the update is the percentage increase in the FY 2015 IPPS operating market basket, estimated at 2.9 percent, without the reductions required under the Affordable Care Act.
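A brief worked sketch of the update arithmetic described above; the values are those stated in this final rule, and the variable names are illustrative.

```python
# FY 2015 update arithmetic described above (all values in percentage points).
market_basket_update = 2.9
multifactor_productivity_adjustment = 0.5
affordable_care_act_reduction = 0.2

# IPPS hospitals that submit quality data and are meaningful EHR users:
ipps_applicable_percentage_increase = (market_basket_update
                                       - multifactor_productivity_adjustment
                                       - affordable_care_act_reduction)
print(round(ipps_applicable_percentage_increase, 1))   # 2.2

# Hospitals paid under § 413.40 (children's hospitals, the 11 cancer hospitals,
# the 5 territorial hospitals, and RNHCIs) are not subject to these reductions:
rate_of_increase_limit_update = market_basket_update
print(rate_of_increase_limit_update)                   # 2.9
```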

The impact of the update in the rate-of-increase limit on those excluded hospitals depends on the cumulative cost increases experienced by each excluded hospital since its applicable base period. For excluded hospitals that have maintained their cost increases at a level below the rate-of-increase limits since their base period, the major effect is on the level of incentive payments these excluded hospitals receive. Conversely, for excluded hospitals with cost increases above the cumulative update in their rate-of-increase limits, the major effect is the amount of excess costs that will not be paid.

We note that, under § 413.40(d)(3), an excluded hospital that continues to be paid under the TEFRA system and whose costs exceed 110 percent of its rate-of-increase limit receives its rate-of-increase limit plus the lesser of: (1) 50 percent of its reasonable costs in excess of 110 percent of the limit, or (2) 10 percent of its limit. In addition, under the various provisions set forth in § 413.40, hospitals can obtain payment adjustments for justifiable increases in operating costs that exceed the limit.
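A hedged sketch of the § 413.40(d)(3) payment calculation just described; the function name and the example limit and cost figures are hypothetical.

```python
def tefra_payment_with_excess_costs(rate_of_increase_limit, reasonable_costs):
    """Illustrative payment under § 413.40(d)(3) for an excluded hospital whose
    reasonable costs exceed 110 percent of its rate-of-increase limit: the
    hospital receives its limit plus the lesser of (1) 50 percent of its
    reasonable costs in excess of 110 percent of the limit, or (2) 10 percent
    of the limit."""
    threshold = 1.10 * rate_of_increase_limit
    if reasonable_costs <= threshold:
        # Outside the scope of this sketch (the excess-cost provision does not apply).
        return None
    bonus = min(0.50 * (reasonable_costs - threshold), 0.10 * rate_of_increase_limit)
    return rate_of_increase_limit + bonus

# Hypothetical example: a $10,000,000 limit and $11,500,000 in reasonable costs
# give $10,000,000 + min(0.5 * 500,000, 1,000,000) = $10,250,000.
print(tefra_payment_with_excess_costs(10_000_000, 11_500_000))
```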

G. Quantitative Effects of the Policy Changes Under the IPPS for Operating Costs

1. Basis and Methodology of Estimates

In this final rule, we are announcing policy changes and payment rate updates for the IPPS for FY 2015 for operating costs of acute care hospitals. The FY 2015 updates to the capital payments to acute care hospitals are discussed in section I.J. of this Appendix.

Based on the overall percentage change in payments per case estimated using our payment simulation model, we estimate that total FY 2015 operating payments will decrease by 0.6 percent compared to FY 2014. In addition to the applicable percentage increase, this amount reflects the FY 2015 recoupment adjustment for documentation and coding described in section II.D. of the preamble of this final rule of −0.8 percent to the IPPS national standardized amounts. The impacts do not reflect changes in the number of hospital admissions or real case-mix intensity, which will also affect overall payment changes.

We have prepared separate impact analyses of the changes to each system. This section deals with the changes to the operating inpatient prospective payment system for acute care hospitals. Our payment simulation model relies on the most recent available data to enable us to estimate the impacts on payments per case of certain changes in this final rule. However, there are other changes for which we do not have data available that will allow us to estimate the payment impacts using this model. For those changes, we have attempted to predict the payment impacts based upon our experience and other more limited data.

The data used in developing the quantitative analyses of changes in payments per case presented below are taken from the FY 2013 MedPAR file and the most current Provider-Specific File (PSF) that is used for payment purposes. Although the analyses of the changes to the operating PPS do not incorporate cost data, data from the most recently available hospital cost reports were used to categorize hospitals. Our analysis has several qualifications. First, in this analysis, we do not make adjustments for future changes in such variables as admissions, lengths of stay, or underlying growth in real case-mix. Second, due to the interdependent nature of the IPPS payment components, it is very difficult to precisely quantify the impact associated with each change. Third, we use various data sources to categorize hospitals in the tables. In some cases, particularly the number of beds, there is a fair degree of variation in the data from the different sources. We have attempted to construct these variables with the best available source overall. However, for individual hospitals, some miscategorizations are possible.

Using cases from the FY 2013 MedPAR file, we simulated payments under the operating IPPS given various combinations of payment parameters. As described above, Indian Health Service hospitals and hospitals in Maryland were excluded from the simulations. The impact of payments under the capital IPPS, or the impact of payments for costs other than inpatient operating costs, are not analyzed in this section. Estimated payment impacts of the capital IPPS for FY 2015 are discussed in section I.J. of this Appendix.

We discuss the following changes below:

  • The effects of the application of the documentation and coding adjustment and the applicable percentage increase (including the market basket update, the multifactor productivity adjustment and the applicable percentage reduction in accordance with the Affordable Care Act) to the standardized amount and hospital-specific rates.
  • The effects of the changes to the relative weights and MS-DRG grouper.
  • The effects of the changes in hospitals' wage index values reflecting updated wage data from hospitals' cost reporting periods beginning during FY 2011, compared to the FY 2010 wage data, and the adoption of the new OMB delineations to calculate the FY 2015 wage index.
  • The combined effects of the recalibration of the MS-DRG relative weights as required by section 1886(d)(4)(C) of the Act and the wage index (including the updated wage data and the adoption of new OMB labor market area delineations), including the wage and recalibration budget neutrality factors.
  • The effects of the geographic reclassifications by the MGCRB (as of publication of this final rule) and the effects of the adoption of new OMB labor market area delineations on these reclassifications, that will be effective for FY 2015.
  • The effects of the rural floor and imputed floor with the application of the national budget neutrality factor applied to the wage index where the rural floor and imputed floor wage index are calculated based on the adoption of the new OMB labor market area delineations.
  • The effects of the adoption of the new labor market area delineations announced by OMB in February 2013 on hospital redesignations.
  • The effects of the 3-year transition for urban hospitals becoming rural under the new OMB delineations and the 1-year transitional blended wage index for hospitals whose FY 2015 wage indexes decrease solely as a result of adopting the new OMB delineations.
  • The effects of the frontier State wage index adjustment under the statutory provision that requires that hospitals located in States that qualify as frontier States not have a wage index less than 1.0. This provision is not budget neutral.
  • The effects of the implementation of section 1886(d)(13) of the Act, as added by section 505 of Public Law 108-173, which provides for an increase in a hospital's wage index if a threshold percentage of residents of the county where the hospital is located commute to work at hospitals in counties with higher wage indexes.
  • The effects of the policies for implementation of the Hospital Readmissions Reduction Program under section 1886(q) of the Act, as added by section 3025 of the Affordable Care Act, that adjusts a hospital's base operating DRG amount by an adjustment factor to account for a hospital's excess readmissions.
  • The effects of the policies for continued implementation of section 3133 of the Affordable Care Act that reduces Medicare DSH payments to 25 percent of what hospitals otherwise would have been paid under section 1886(d)(5)(F) of the Act and establishes an additional payment to be made to hospitals that receive DSH payments for their relative share of the total amount of uncompensated care.
  • The total estimated change in payments based on the FY 2015 policies relative to payments based on FY 2014 policies that include the applicable percentage increase of 2.2 percent (or 2.9 percent market basket update with a reduction of 0.5 percentage point for the multifactor productivity adjustment, and a 0.2 percentage point reduction, as required under the Affordable Care Act). The total estimated change in payments for FY 2015 reflects the extension of MDH payment status for the first 6 months of FY 2015, in accordance with the Protecting Access to Medicare Act of 2014 (Pub. L. 113-93) enacted on April 1, 2014.

We note that in the FY 2015 IPPS/LTCH PPS proposed rule we provided the effects of section 1886(o) of the Act, as added by section 3008 of the Affordable Care Act, which establishes payment reductions under the HAC Reduction Program. Hospitals ranked in the lowest 25 percent of performance on HACs are subject to a 1-percent reduction in total IPPS payments. We are finalizing policies related to the HAC Reduction Program in this final rule, but as described earlier in this final rule, because the HAC scores are currently undergoing 30-day review and correction by the hospitals, we are not providing hospital-level data or a hospital-level payment impact in conjunction with the FY 2015 IPPS Final Rule. We do provide an estimate of the overall payment impact in section I.H.8. of this Appendix A along with a discussion of the impact of these changes.

To illustrate the impact of the FY 2015 changes, our analysis begins with a FY 2014 baseline simulation model using: the FY 2015 applicable percentage increase of 2.2 percent and the documentation and coding recoupment adjustment of −0.8 percent to the Federal standardized amount; the FY 2014 MS-DRG GROUPER (Version 31.0); the current FY 2014 CBSA designations for hospitals based on the OMB definitions; the FY 2014 wage index; and no MGCRB reclassifications. Outlier payments are set at 5.1 percent of total operating MS-DRG plus outlier payments for modeling purposes.

Section 1886(b)(3)(B)(viii) of the Act, as added by section 5001(a) of Public Law 109-171, as amended by section 4102(b)(1)(A) of the ARRA (Pub. L. 111-5) and by section 3401(a)(2) of the Affordable Care Act (Pub. L. 111-148), provides that, for FY 2007 and each subsequent year through FY 2014, the update factor will include a reduction of 2.0 percentage points for any subsection (d) hospital that does not submit data on measures in a form and manner and at a time specified by the Secretary. Beginning in FY 2015, the reduction is one-quarter of such applicable percentage increase determined without regard to section 1886(b)(3)(B)(ix), (xi), or (xii) of the Act (that is, one-quarter of the market basket update). Therefore, for FY 2015, we are establishing that hospitals that do not submit quality information under rules established by the Secretary and that are meaningful EHR users under section 1886(b)(3)(B)(ix) of the Act will receive an applicable percentage increase of 1.475 percent. At the time this impact analysis was prepared, 56 hospitals did not receive the full market basket rate-of-increase for FY 2014 because they failed the quality data submission process or did not choose to participate. For purposes of the simulations shown below, we modeled the payment changes for FY 2015 using a reduced update for these 56 hospitals. However, we do not have enough information at this time to determine which hospitals will not receive the full update factor for FY 2015.

Beginning in FY 2015, in accordance with section 1886(b)(3)(B)(ix) of the Act, a hospital that has been identified as not a meaningful EHR user will be subject to a reduction of one-quarter of such applicable percentage increase determined without regard to section 1886(b)(3)(B)(ix), (xi), or (xii) of the Act (that is, one-quarter of the market basket update). Therefore, for FY 2015, we are establishing that hospitals that are identified as not meaningful EHR users and do submit quality information under section 1886(b)(3)(B)(viii) of the Act will receive an applicable percentage increase of 1.475 percent. Hospitals that are identified as not meaningful EHR users under section 1886(b)(3)(B)(ix) of the Act and also do not submit quality data under section 1886(b)(3)(B)(viii) of the Act will receive an applicable percentage increase of 0.75 percent, which reflects a one-quarter reduction of the market basket update for failure to submit quality data and a one-quarter reduction of the market basket update for being identified as not a meaningful EHR user. For FY 2015, we have yet to finalize a list of hospitals that are not meaningful EHR users under section 1886(b)(3)(B)(ix) of the Act. Therefore, we are not including this adjustment to the standardized amount (for those hospitals that are not meaningful EHR users) in our modeling of aggregate payments for FY 2015. We intend to release a final list of hospitals that are not meaningful EHR users in September 2014. Hospitals identified on this list will be paid based on the applicable standardized amount in Table 1A and Table 1B for discharges occurring in FY 2015.
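The following sketch works through the update reductions described in the two preceding paragraphs (the 2.2 percent full update, the 2.9 percent market basket update, and the resulting 1.475 percent and 0.75 percent updates are as stated in this final rule; the function itself is illustrative).

```python
# FY 2015 applicable percentage increases described above (percentage points).
market_basket_update = 2.9
full_update = 2.2   # after the productivity and Affordable Care Act reductions
one_quarter_market_basket = market_basket_update / 4   # 0.725

def applicable_percentage_increase(submitted_quality_data, meaningful_ehr_user):
    """Illustrative FY 2015 update: one-quarter of the market basket update is
    deducted for failing to submit quality data and another one-quarter for not
    being a meaningful EHR user."""
    update = full_update
    if not submitted_quality_data:
        update -= one_quarter_market_basket
    if not meaningful_ehr_user:
        update -= one_quarter_market_basket
    return round(update, 3)

print(applicable_percentage_increase(True, True))    # 2.2
print(applicable_percentage_increase(False, True))   # 1.475
print(applicable_percentage_increase(True, False))   # 1.475
print(applicable_percentage_increase(False, False))  # 0.75
```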

Each policy change, statutory or otherwise, is then added incrementally to this baseline, finally arriving at an FY 2015 model incorporating all of the changes. This simulation allows us to isolate the effects of each change.

Our final comparison illustrates the percent change in payments per case from FY 2014 to FY 2015. Three factors not discussed separately have significant impacts here. The first factor is the update to the standardized amount. In accordance with section 1886(b)(3)(B)(i) of the Act, we are updating the standardized amounts for FY 2015 using an applicable percentage increase of 2.2 percent. This includes our forecasted IPPS operating hospital market basket increase of 2.9 percent with a reduction of 0.5 percentage point for the multifactor productivity adjustment and a 0.2 percentage point reduction as required under the Affordable Care Act. (Hospitals that fail to comply with the quality data submission requirements and are meaningful EHR users would receive an update of 1.475 percent. This update includes a reduction of one-quarter of the market basket update for failure to submit these data). We note that hospitals that do comply with the quality data submission requirements but are not meaningful EHR users would receive an update of 1.475 percent, which includes a reduction of one-quarter of the market basket update. Furthermore, hospitals that do not comply with the quality data submission requirements and also are not meaningful EHR users would receive an update of 0.75 percent. However, as discussed earlier, we do not have a list of hospitals that are not meaningful EHR users and have not included this adjustment to the standardized amount (for those hospitals that are not meaningful EHR users) in our modeling of aggregate payments for FY 2015. Under section 1886(b)(3)(B)(iv) of the Act, the updates to the hospital-specific amounts for SCHs and MDHs also are equal to the applicable percentage increase, or 2.2 percent if the hospital submits quality data and is a meaningful EHR user. In addition, we are updating the Puerto Rico-specific amount by an applicable percentage increase of 2.2 percent.

A second significant factor that affects the changes in hospitals' payments per case from FY 2014 to FY 2015 is the change in hospitals' geographic reclassification status from one year to the next. That is, payments may be reduced for hospitals reclassified in FY 2014 that are no longer reclassified in FY 2015. Conversely, payments may increase for hospitals not reclassified in FY 2014 that are reclassified in FY 2015.

A third significant factor is that we currently estimate that actual outlier payments during FY 2014 will be 5.71 percent of total MS-DRG payments. When the FY 2014 IPPS/LTCH PPS final rule was published, we projected FY 2014 outlier payments would be 5.1 percent of total MS-DRG plus outlier payments; the average standardized amounts were offset correspondingly. The effects of the higher than expected outlier payments during FY 2014 (as discussed in the Addendum to this final rule) are reflected in the analyses below comparing our current estimates of FY 2014 payments per case to estimated FY 2015 payments per case (with outlier payments projected to equal 5.1 percent of total MS-DRG payments).

2. Analysis of Table I

Table I displays the results of our analysis of the changes for FY 2015. The table categorizes hospitals by various geographic and special payment consideration groups to illustrate the varying impacts on different types of hospitals. The top row of the table shows the overall impact on the 3,396 acute care hospitals included in the analysis.

The next four rows of Table I contain hospitals categorized according to their geographic location: all urban, which is further divided into large urban and other urban; and rural. There are 2,549 hospitals located in urban areas included in our analysis. Among these, there are 1,401 hospitals located in large urban areas (populations over 1 million), and 1,148 hospitals in other urban areas (populations of 1 million or fewer). In addition, there are 847 hospitals in rural areas. The next two groupings are by bed-size categories, shown separately for urban and rural hospitals. The final groupings by geographic location are by census divisions, also shown separately for urban and rural hospitals.

The second part of Table I shows hospital groups based on hospitals' FY 2015 payment classifications, including any reclassifications under section 1886(d)(10) of the Act. For example, the rows labeled urban, large urban, other urban, and rural show that the numbers of hospitals paid based on these categorizations after consideration of geographic reclassifications (including reclassifications under sections 1886(d)(8)(B) and 1886(d)(8)(E) of the Act that have implications for capital payments) are 2,563; 1,413; 1,150; and 833, respectively.

The next three groupings examine the impacts of the changes on hospitals grouped by whether or not they have GME residency programs (teaching hospitals that receive an IME adjustment) or receive Medicare DSH payments, or some combination of these two adjustments. There are 2,357 nonteaching hospitals in our analysis, 795 teaching hospitals with fewer than 100 residents, and 244 teaching hospitals with 100 or more residents.

In the DSH categories, hospitals are grouped according to their DSH payment status, and whether they are considered urban or rural for DSH purposes. The next category groups together hospitals considered urban or rural, in terms of whether they receive the IME adjustment, the DSH adjustment, both, or neither.

The next five rows examine the impacts of the changes on rural hospitals by special payment groups (SCHs, RRCs, and MDHs). There are 193 RRCs, 325 SCHs, 162 MDHs, 124 hospitals that are both SCHs and RRCs, and 15 hospitals that are both MDHs and RRCs (MDH status is extended only through March 31, 2015, under Pub. L. 113-93).

The next series of groupings are based on the type of ownership and the hospital's Medicare utilization expressed as a percent of total patient days. These data were taken from the FY 2012 or FY 2011 Medicare cost reports.

The next two groupings concern the geographic reclassification status of hospitals. The first grouping displays all urban hospitals that were reclassified by the MGCRB for FY 2015. The second grouping shows the MGCRB rural reclassifications. The final category shows the impact of the policy changes on the 15 cardiac hospitals.

a. Effects of the Hospital Update and Documentation and Coding Adjustment (Column 2)

As discussed in section II.D. of the preamble of this final rule, this column includes the hospital update, including the 2.9 percent market basket update, the reduction of 0.5 percentage point for the multifactor productivity adjustment, and the 0.2 percentage point reduction in accordance with the Affordable Care Act. In addition, this column includes the FY 2015 documentation and coding recoupment adjustment of −0.8 percent to the national standardized amount as part of the recoupment required by section 631 of the ATRA. As a result, we are making a 1.4 percent update to the national standardized amount. This column also includes the 2.2 percent update to the hospital-specific rates, which also reflects the 2.9 percent market basket update, the reduction of 0.5 percentage point for the multifactor productivity adjustment, and the 0.2 percentage point reduction in accordance with the Affordable Care Act.
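For a quick check of how the figures in this column combine, the sketch below nets the 2.2 percent update against the −0.8 percent documentation and coding recoupment adjustment. This is informal arithmetic, not CMS's payment calculation, and the variable names are ours.

```python
# Informal arithmetic check of the update to the national standardized amount.
# Figures come from this final rule; variable names are ours.

annual_update = 2.2               # FY 2015 applicable percentage increase (percent)
doc_and_coding_recoupment = -0.8  # ATRA section 631 recoupment adjustment (percent)

print(round(annual_update + doc_and_coding_recoupment, 1))   # 1.4 percent update to the
                                                             # national standardized amount

# Hospital-specific rates (e.g., for SCHs) receive the 2.2 percent update and are
# not subject to the documentation and coding recoupment adjustment.
print(annual_update)                                         # 2.2
```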

Overall, hospitals will experience a 1.5 percent increase in payments primarily due to the combined effects of the hospital update and documentation and coding adjustment on the national standardized amount and the hospital update to the hospital-specific rate. Hospitals that are paid under the hospital-specific rate, namely SCHs, will experience a 2.2 percent increase in payments; therefore, hospital categories with SCHs paid under the hospital-specific rate will experience increases in payments of more than 1.4 percent.

b. Effects of the Changes to the MS-DRG Reclassifications and Relative Cost-Based Weights With Recalibration Budget Neutrality (Column 3)

Column 3 shows the effects of the changes to the MS-DRGs and relative weights with the application of the recalibration budget neutrality factor to the standardized amounts. Section 1886(d)(4)(C)(i) of the Act requires us annually to make appropriate classification changes in order to reflect changes in treatment patterns, technology, and any other factors that may change the relative use of hospital resources. Consistent with section 1886(d)(4)(C)(iii) of the Act, we calculated a recalibration budget neutrality factor to account for the changes in MS-DRGs and relative weights to ensure that the overall payment impact is budget neutral.

As discussed in section II.E. of the preamble of this final rule, the FY 2015 MS-DRG relative weights will be 100 percent cost-based and 100 percent MS-DRGs. For FY 2015, the MS-DRGs are calculated using the FY 2013 MedPAR data grouped to the Version 32.0 (FY 2015) MS-DRGs. The methodology to calculate the relative weights and the reclassification changes to the GROUPER are described in more detail in section II.H. of the preamble of this final rule.

The “All Hospitals” line in Column 3 indicates that the changes to the MS-DRGs and relative weights will result in a 0.0 percent change in payments with the application of the recalibration budget neutrality factor of 0.997543 to the standardized amount. Hospital categories that generally treat more surgical cases than medical cases will experience increases in their payments due to the changes to the relative weight methodology. Rural hospitals will experience a 0.3 percent decrease in payments because rural hospitals tend to treat fewer surgical cases than medical cases, while teaching hospitals with 100 or more residents will experience a 0.2 percent increase in payments because those hospitals treat more surgical cases than medical cases.
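To illustrate the mechanics of a recalibration budget neutrality factor of this kind, the sketch below compares simulated aggregate payments under old and new relative weights for a hypothetical case mix. It is a simplified illustration under our own assumptions, not CMS's actual payment simulation model; the DRG labels, case counts, relative weights, and base rate are invented.

```python
# Minimal illustrative sketch of a recalibration budget neutrality factor.
# This is NOT CMS's actual payment simulation; the DRG labels, case counts,
# relative weights, and base rate below are hypothetical.

def aggregate_payments(case_counts, relative_weights, base_rate):
    """Sum simplified MS-DRG payments: base rate times relative weight per case."""
    return sum(n * base_rate * relative_weights[drg] for drg, n in case_counts.items())

case_counts = {"DRG_A": 100, "DRG_B": 50}              # hypothetical case mix
old_weights = {"DRG_A": 1.00, "DRG_B": 2.00}           # weights before recalibration
new_weights = {"DRG_A": 1.02, "DRG_B": 1.98}           # weights after recalibration
base_rate = 5_000.0                                    # hypothetical standardized amount

old_total = aggregate_payments(case_counts, old_weights, base_rate)
new_total = aggregate_payments(case_counts, new_weights, base_rate)

# The factor scales the standardized amount so aggregate payments under the new
# weights equal aggregate payments under the old weights (budget neutrality).
recalibration_bn_factor = old_total / new_total
print(round(recalibration_bn_factor, 6))   # 0.995025 for these hypothetical inputs
```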

c. Effects of the Wage Index Changes (Column 4)

Column 4 shows the impact of updated wage data using FY 2011 cost report data and the new OMB labor market area delineations, with the application of the wage budget neutrality factor. The wage index is calculated and assigned to hospitals on the basis of the labor market area in which the hospital is located. Under section 1886(d)(3)(E) of the Act, beginning with FY 2005, we delineate hospital labor market areas based on the Core Based Statistical Areas (CBSAs) established by OMB. The current statistical areas used in FY 2014 were based on OMB standards published on December 27, 2000 (65 FR 82228), Census 2000 data, and Census Bureau population estimates for 2007 and 2008 (OMB Bulletin No. 10-02).

As stated in the FY 2014 IPPS/LTCH PPS proposed rule (78 FR 27552) and final rule (78 FR 50586), on February 28, 2013, OMB issued OMB Bulletin No. 13-01, which established revised delineations for Metropolitan Statistical Areas, Micropolitan Statistical Areas, and Combined Statistical Areas, and provided guidance on the use of the delineations of these statistical areas. In order to implement these changes for the IPPS, it is necessary to identify the new labor market area delineation for each county and hospital in the country. However, because the bulletin was not issued until February 28, 2013, with supporting data not available until later, and because the changes made by the bulletin and their ramifications needed to be extensively reviewed and verified, we were unable to undertake such a lengthy process before publication of the FY 2014 IPPS/LTCH PPS proposed rule and, thus, did not implement changes to the wage index for FY 2014 based on these new OMB delineations. In the FY 2014 IPPS/LTCH PPS final rule (78 FR 50586), we stated that we intended to propose changes to the wage index based on the new OMB delineations in this FY 2015 proposed rule. As discussed below, in this final rule, we are implementing the new OMB delineations as described in the February 28, 2013 OMB Bulletin No. 13-01, effective beginning with the FY 2015 IPPS wage index.

Section 1886(d)(3)(E) of the Act requires that, beginning October 1, 1993, we annually update the wage data used to calculate the wage index. In accordance with this requirement, the wage index for acute care hospitals for FY 2015 is based on data submitted for hospital cost reporting periods beginning on or after October 1, 2010 and before October 1, 2011. The estimated impact of the updated wage data using the FY 2011 cost report data and the new OMB labor market area delineations on hospital payments is isolated in Column 4 by holding the other payment parameters constant in this simulation. That is, Column 4 shows the percentage change in payments when going from a model using the FY 2014 wage index, based on FY 2010 wage data, the labor-related share of 69.6 percent, under the new OMB delineations and having a 100-percent occupational mix adjustment applied, to a model using the FY 2015 pre-reclassification wage index based on FY 2011 wage data with the labor-related share of 69.6 percent, under the new OMB delineations, also having a 100-percent occupational mix adjustment applied (while holding other payment parameters, such as use of the Version 32.0 MS-DRG GROUPER, constant). The FY 2015 occupational mix adjustment is based on the CY 2010 occupational mix survey.

In addition, the column shows the impact of the application of the wage budget neutrality to the national standardized amount. In FY 2010, we began calculating separate wage budget neutrality and recalibration budget neutrality factors, in accordance with section 1886(d)(3)(E) of the Act, which specifies that budget neutrality to account for wage index changes or updates made under that subparagraph must be made without regard to the 62 percent labor-related share guaranteed under section 1886(d)(3)(E)(ii) of the Act. Therefore, for FY 2015, we calculated the wage budget neutrality factor to ensure that payments under updated wage data and the labor-related share of 69.6 percent are budget neutral without regard to the lower labor-related share of 62 percent applied to hospitals with a wage index less than or equal to 1.0. In other words, the wage budget neutrality is calculated under the assumption that all hospitals receive the higher labor-related share of the standardized amount. The wage budget neutrality factor is 1.001443, and the overall payment change is zero percent.

Column 4 shows the impacts of updating the wage data using FY 2011 cost reports. Overall, the new wage data and the labor-related share, combined with the wage budget neutrality adjustment, will lead to a 0.0 percent change for all hospitals as shown in Column 4.

In looking at the wage data itself, the national average hourly wage increased 1.02 percent compared to FY 2014. Therefore, the only manner in which to maintain or exceed the previous year's wage index was to match or exceed the national 1.02 percent increase in average hourly wage. Of the 3,387 hospitals with wage data for both FYs 2014 and 2015, 1,572 or 46.4 percent will experience an average hourly wage increase of 1.02 percent or more.

The following chart compares the shifts in wage index values for hospitals due to changes in the average hourly wage data for FY 2015 relative to FY 2014. Among urban hospitals, 4 will experience a decrease of more than 10 percent, with 2 urban hospitals experiencing an increase of more than 10 percent. Seventy-six urban hospitals will experience an increase or decrease of more than 5 percent but less than or equal to 10 percent. Among rural hospitals, none will experience a decrease of more than 5 percent, but 5 rural hospitals will experience an increase of greater than 5 percent but less than or equal to 10 percent. However, 841 rural hospitals will experience increases or decreases of less than or equal to 5 percent, while 2,220 urban hospitals will experience increases or decreases of less than or equal to 5 percent. Two hundred thirty-nine urban and no rural hospitals will not experience a change in their wage index. These figures reflect changes in the “pre-reclassified, occupational mix-adjusted wage index,” that is, the wage index before the application of geographic reclassification, the rural and imputed floors, the out-migration adjustment, and other wage index exceptions and adjustments. We note that this analysis was performed by applying the new OMB labor market area delineations to the FY 2015 wage data and also by recomputing the FY 2014 final wage data to reflect the new OMB delineations. (We refer readers to sections III.G.2. through III.I. of the preamble of this final rule for a complete discussion of the exceptions and adjustments to the wage index.) We note that the “post-reclassified wage index” or “payment wage index,” the wage index that includes all such exceptions and adjustments (as reflected in Tables 2, 4A, 4B, 4C, and 4F of the Addendum to this final rule, which are available via the Internet on the CMS Web site), is used to adjust the labor-related share of a hospital's standardized amount, either 69.6 percent or 62 percent, depending upon whether a hospital's wage index is greater than 1.0 or less than or equal to 1.0. Therefore, the pre-reclassified wage index figures in the chart below may illustrate a somewhat larger or smaller change than will occur in a hospital's payment wage index and total payment.

The following chart shows the projected impact of changes in the area wage index values for urban and rural hospitals.

Percentage change in area wage index values | Urban hospitals | Rural hospitals
Increase more than 10 percent | 2 | 0
Increase more than 5 percent and less than or equal to 10 percent | 18 | 5
Increase or decrease less than or equal to 5 percent | 2,220 | 841
Decrease more than 5 percent and less than or equal to 10 percent | 58 | 0
Decrease more than 10 percent | 4 | 0
Unchanged | 239 | 0

d. Combined Effects of the MS-DRG and Wage Index Changes (Column 5)

Section 1886(d)(4)(C)(iii) of the Act requires that changes to MS-DRG reclassifications and the relative weights cannot increase or decrease aggregate payments. In addition, section 1886(d)(3)(E) of the Act specifies that any updates or adjustments to the wage index are to be budget neutral. We computed a wage budget neutrality factor of 1.001443 and a recalibration budget neutrality factor of 0.997543 (which is applied to the Puerto Rico-specific standardized amount and the hospital-specific rates). The product of the two budget neutrality factors is the cumulative wage and recalibration budget neutrality factor. The cumulative wage and recalibration budget neutrality adjustment is 0.998982, or approximately a 0.10 percent reduction, which is applied to the national standardized amounts. Because the wage budget neutrality and the recalibration budget neutrality are calculated under different methodologies according to the statute, when the two budget neutralities are combined and applied to the standardized amount, the overall payment impact is not necessarily budget neutral. However, in this final rule, we are estimating that the changes in the MS-DRG relative weights and updated wage data, with wage and recalibration budget neutrality applied, will result in a 0.0 percent change in payments.
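As a quick arithmetic check, using only the factors cited above, the cumulative adjustment is the product of the two factors (variable names are ours):

```python
# Quick arithmetic check using the budget neutrality factors cited above.
wage_bn_factor = 1.001443
recalibration_bn_factor = 0.997543

cumulative_factor = wage_bn_factor * recalibration_bn_factor
print(round(cumulative_factor, 6))               # 0.998982
print(round((1 - cumulative_factor) * 100, 2))   # roughly a 0.10 percent reduction
```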

e. Effects of MGCRB Reclassifications (Column 6)

Our impact analysis to this point has assumed acute care hospitals are paid on the basis of their actual geographic location (with the exception of ongoing policies that provide that certain hospitals receive payments on other bases than where they are geographically located). The changes in Column 6 reflect the per case payment impact of moving from this baseline to a simulation incorporating the MGCRB decisions for FY 2015 and the effects of the adoption of the new OMB labor market area delineations on these reclassifications which affect hospitals' wage index area assignments.

By spring of each year, the MGCRB makes reclassification determinations that will be effective for the next fiscal year, which begins on October 1. The MGCRB may approve a hospital's reclassification request for the purpose of using another area's wage index value. Hospitals may appeal denials of MGCRB decisions to the CMS Administrator. Further, hospitals had 45 days from publication of the IPPS proposed rule in the Federal Register to decide whether to withdraw or terminate an approved geographic reclassification for the following year.

The overall effect of geographic reclassification is required by section 1886(d)(8)(D) of the Act to be budget neutral. Therefore, for purposes of this impact analysis, we are applying an adjustment of 0.990406 to ensure that the effects of the reclassifications under section 1886(d)(10) of the Act are budget neutral (section II.A. of the Addendum to this final rule). Geographic reclassification generally benefits hospitals in rural areas. We estimate that the geographic reclassification will increase payments to rural hospitals by an average of 1.5 percent. By region, all the rural hospital categories will experience increases in payments due to MGCRB reclassifications.

Table 9A listed in section VI. of the Addendum to this final rule and available via the Internet on the CMS Web site reflects the reclassifications for FY 2015.

f. Effects of the Rural and Imputed Floor, Including Application of National Budget Neutrality (Column 7)

As discussed in section III.B. of the preamble of the FY 2009 IPPS final rule, the FY 2010 IPPS/RY 2010 LTCH PPS final rule, the FYs 2011, 2012, 2013, and 2014 IPPS/LTCH PPS final rules, and this final rule, section 4410 of Public Law 105-33 established the rural floor by requiring that the wage index for a hospital in any urban area cannot be less than the wage index received by rural hospitals in the same State. We apply a uniform budget neutrality adjustment to the wage index. The imputed floor, which is also included in the calculation of the budget neutrality adjustment to the wage index, was extended in FY 2012 for 2 additional years. In the past, only urban hospitals in New Jersey received the imputed floor. As discussed in the FY 2013 IPPS/LTCH PPS final rule (77 FR 53369), we established an alternative temporary methodology for the imputed floor, which resulted in an imputed floor for Rhode Island for FY 2013. For FY 2014, we extended the imputed rural floor, as calculated under the original methodology and the alternative methodology. For FY 2015, we are extending the imputed rural floor for one year, as calculated under the original methodology and the alternative methodology. As a result, New Jersey, Rhode Island, and Delaware are able to receive an imputed floor. In New Jersey, 15 out of 64 hospitals will receive the imputed floor, and 4 out of 11 hospitals in Rhode Island will receive the imputed floor for FY 2015. In the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28356), we stated that one out of six hospitals in Delaware would benefit from the imputed floor. However, in this final rule, no hospitals are benefitting from the imputed floor in Delaware because the CBSA wage index for each CBSA in Delaware under the new OMB delineations is equal to or higher than the imputed rural floor.

The Affordable Care Act requires that we apply one rural floor budget neutrality factor to the wage index nationally, and the imputed floor is part of the rural floor budget neutrality factor applied to the wage index nationally. We have calculated an FY 2015 rural floor budget neutrality factor to be applied to the wage index of 0.989507, which reduces wage indexes by approximately 1.0 percent.

Column 7 shows the projected impact of the rural floor and imputed floor with the national rural floor budget neutrality factor applied to the wage index based on the new OMB labor market area delineations. The column compares the post-reclassification FY 2015 wage index of providers before the rural floor and imputed floor adjustment and the post-reclassification FY 2015 wage index of providers with the rural floor and imputed floor adjustment based on the new OMB labor market area delineations. Only urban hospitals can benefit from the rural and imputed floors. Because the provision is budget neutral, all other hospitals (that is, all rural hospitals and those urban hospitals to which the adjustment is not made) will experience a decrease in payments due to the budget neutrality adjustment that is applied nationally to their wage index.
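The mechanics described above can be sketched in a few lines. This is a simplified illustration under our own assumptions (the hospital names, wage index values, and State rural floor are hypothetical), not CMS's wage index calculation; only the budget neutrality factor is taken from this final rule.

```python
# Simplified illustration of the rural/imputed floor with national budget
# neutrality. Hospital names, wage index values, and the State rural floor are
# hypothetical; this is not CMS's wage index calculation.

RURAL_FLOOR_BN_FACTOR = 0.989507   # FY 2015 national rural floor budget neutrality factor

def floored_wage_index(hospital_wage_index, state_rural_floor):
    """An urban hospital's wage index cannot be lower than its State's rural floor value."""
    return max(hospital_wage_index, state_rural_floor)

state_rural_floor = 1.10
hospitals = {"Urban Hospital A": 0.95, "Urban Hospital B": 1.20}

for name, wage_index in hospitals.items():
    # Every hospital's wage index is then scaled by the national budget neutrality
    # factor, whether or not it benefited from the floor.
    final_wi = floored_wage_index(wage_index, state_rural_floor) * RURAL_FLOOR_BN_FACTOR
    print(name, round(final_wi, 4))

# Hospital A is raised to the floor (1.10) before the roughly 1.0 percent reduction;
# Hospital B does not benefit from the floor but still receives the reduction.
```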

We estimate that 422 hospitals will benefit from the rural and imputed floors in FY 2015, while the remaining 2,974 IPPS hospitals in our model have their wage index reduced by the rural floor budget neutrality adjustment of 0.989507 (or 1.0 percent). We project that, in aggregate, rural hospitals will experience a 0.3 percent decrease in payments as a result of the application of the rural floor budget neutrality because the rural hospitals do not benefit from the rural floor, but have their wage indexes downwardly adjusted to ensure that the application of the rural floor is budget neutral overall. We project hospitals located in urban areas will experience no change in payments because increases in payments by hospitals benefitting from the rural floor offset decreases in payments by nonrural floor urban hospitals whose wage index is downwardly adjusted by the rural floor budget neutrality factor. Urban hospitals in the New England region can expect a 2.8 percent increase in payments primarily due to the application of the rural floor in Massachusetts. Fifty-one urban providers in Massachusetts are expected to receive the rural floor wage index value, including rural floor budget neutrality, of 1.3336, increasing payments overall to Massachusetts hospitals by an estimated $156 million. During most past years, there have been no IPPS hospitals located in rural areas in Massachusetts. There was one urban IPPS hospital that was reclassified to rural Massachusetts (under section 1886(d)(8)(E) of the Act), which established the Massachusetts rural floor, but the wage index resulting from that hospital's data was not high enough for any urban hospital to benefit from the rural floor policy. However, for the FY 2012 wage index, the rural floor for Massachusetts was established by a hospital that converted from a CAH to an IPPS hospital and that is geographically located in rural Massachusetts. The rural floor in Massachusetts continues to be set by the wage index of the hospital in rural Massachusetts that converted from CAH to IPPS status. We estimate that Massachusetts hospitals will receive approximately a 4.9 percent increase in IPPS payments due to the application of the rural floor in FY 2015.

We wish to make note of a situation that occurred in the rural floor impact calculation for Massachusetts. In FY 2014, CMS calculated that 60 hospitals would benefit from the Massachusetts rural floor, resulting in an estimated $167.6 million being received by Massachusetts hospitals via the national rural floor budget neutrality adjustment. In FY 2015, fewer Massachusetts hospitals, 51 hospitals, have been identified as benefitting from the rural floor, and the fiscal impact of national budget neutrality has been reduced. We have received inquiries from commenters regarding this reduction, asking whether the addition of one rural hospital in Franklin County, MA reduced the impact of the Massachusetts rural floor. The commenters are correct that the addition of one rural hospital in Massachusetts reduced the impact of the rural floor in FY 2015 as compared to the impact of the rural floor in FY 2014. We refer readers to section II.A.4.(c) of the Addendum to this final rule for a complete discussion of this issue.

Urban Puerto Rico hospitals are expected to experience a 0.0 percent change in payments as a result of the application of a Puerto Rico rural floor together with the Puerto Rico rural floor budget neutrality adjustment. We are applying a rural floor budget neutrality factor of 0.991291 (or −0.87 percent) to the Puerto Rico-specific wage index. The Puerto Rico-specific wage index adjusts the Puerto Rico-specific standardized amount, which represents 25 percent of payments to Puerto Rico hospitals. The increases in payments experienced by the urban Puerto Rico hospitals that benefit from a rural floor are offset by the decreases in payments by the urban Puerto Rico hospitals that do not benefit from the rural floor and that have their wage indexes downwardly adjusted by the rural floor budget neutrality adjustment. As a result, overall, urban Puerto Rico hospitals will experience a 0.0 percent change in payments due to the application of the rural floor with rural floor budget neutrality.

There are 15 hospitals out of the 64 hospitals in New Jersey that benefit from the extension of the imputed floor and will receive the imputed floor wage index value, including rural floor budget neutrality, of 1.121 under the new OMB labor market area delineations, which we estimate will increase payments to those imputed floor hospitals by $24 million (overall, the State will see an increase in payments of approximately $2.7 million because the other hospitals in the State will experience decreases in payments due to the rural floor budget neutrality adjustment). Four Rhode Island hospitals will benefit from the imputed rural floor calculated under the alternative methodology and receive an additional $3.7 million (overall, the State will receive an additional $1.9 million). As mentioned earlier, in the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28356), we stated that one hospital in Delaware would benefit from the imputed floor. However, in this final rule, no hospitals are benefitting from the imputed floor in Delaware because the CBSA wage index for each CBSA in Delaware under the new OMB delineations is equal to or higher than the imputed rural floor.

In response to a public comment addressed in the FY 2012 IPPS/LTCH PPS final rule (76 FR 51593), we are providing the payment impact of the rural floor and imputed floor with budget neutrality at the State level. Column 1 of the table below displays the number of IPPS hospitals located in each State. Column 2 displays the number of hospitals in each State that will receive the rural floor or imputed floor wage index for FY 2015 based on the new OMB labor market area delineations. Column 3 displays the percentage of total payments each State will receive or contribute to fund the rural floor and imputed floor with national budget neutrality based on the new OMB labor market area delineations. The column compares the post-reclassification FY 2015 wage index of providers before the rural floor and imputed floor adjustment and the post-reclassification FY 2015 wage index of providers with the rural floor and imputed floor adjustment with the wage indexes calculated based on the new OMB labor market area delineations. Column 4 displays the estimated payment amount that each State will gain or lose due to the application of the rural floor and imputed floor with national budget neutrality.

Comment: Some commenters requested that CMS include in the FY 2015 IPPS/LTCH PPS final rule an updated detailed State-specific analysis of the effects of nationwide rural floor budget neutrality. In addition, the commenters requested that CMS publish a table showing the cumulative State-specific and aggregate inpatient and outpatient payment impact of the nationwide rural floor with budget neutrality and project the estimated 10-year State-specific effects of continuing the current policy.

Response: We appreciate the commenters' request for additional analysis on the impact of the rural floor on inpatient and outpatient payments. Commenters may request to view the OPPS impacts of the rural floor policy through the public comment period for the CY 2015 OPPS/ASC proposed rule that closes on September 2, 2014. In addition, we are unable to provide a State-by-State impact with a 10-year projection of the rural floor because the rural floor is based on wage data that are updated annually. Therefore, we believe it would be difficult to accurately portray the rural floor in 10-year projections. We have updated our State-by-State rural floor budget neutrality impact analysis for this FY 2015 IPPS/LTCH PPS final rule.

FY 2015 IPPS Estimated Payments Due to Rural Floor and Imputed Floor With National Budget Neutrality

State | Number of hospitals (1) | Number of hospitals that will receive the rural floor or imputed floor (2) | Percent change in payments due to application of rural floor and imputed floor with budget neutrality (3) | Difference, in millions (4)
Alabama | 91 | 2 | −0.5 | −$8.4
Alaska | 6 | 4 | 1.5 | 2.2
Arizona | 57 | 9 | −0.1 | −1.9
Arkansas | 45 | 0 | −0.5 | −5.3
California | 309 | 200 | 1.9 | 188.8
Colorado | 47 | 6 | 0.2 | 2.3
Connecticut | 31 | 8 | −0.4 | −6.5
Delaware | 6 | 0 | −0.6 | −2.4
Washington, D.C. | 7 | 0 | −0.6 | −2.6
Florida | 169 | 25 | −0.3 | −18.6
Georgia | 106 | 0 | −0.5 | −13.3
Hawaii | 12 | 0 | −0.4 | −1.3
Idaho | 14 | 0 | −0.4 | −1.2
Illinois | 127 | 0 | −0.6 | −28.1
Indiana | 91 | 0 | −0.6 | −13.2
Iowa | 34 | 0 | −0.5 | −4.5
Kansas | 53 | 0 | −0.4 | −3.8
Kentucky | 65 | 1 | −0.5 | −7.9
Louisiana | 100 | 0 | −0.5 | −7.0
Maine | 20 | 0 | −0.5 | −2.5
Massachusetts | 61 | 51 | 4.9 | 155.6
Michigan | 95 | 0 | −0.5 | −22.9
Minnesota | 51 | 0 | −0.5 | −10.0
Mississippi | 64 | 0 | −0.5 | −5.3
Missouri | 78 | 0 | −0.5 | −11.2
Montana | 12 | 4 | −0.3 | −0.8
Nebraska | 23 | 0 | −0.4 | −2.6
Nevada | 24 | 6 | 0.7 | 4.6
New Hampshire | 13 | 9 | 2.2 | 10.5
New Jersey | 64 | 15 | 0.1 | 2.7
New Mexico | 25 | 2 | −0.2 | −0.7
New York | 163 | 0 | −0.6 | −47.4
North Carolina | 87 | 0 | −0.5 | −15.8
North Dakota | 6 | 1 | −0.3 | −0.9
Ohio | 135 | 10 | −0.4 | −16.9
Oklahoma | 86 | 2 | −0.5 | −5.7
Oregon | 33 | 0 | −0.5 | −4.7
Pennsylvania | 154 | 10 | −0.5 | −23.3
Puerto Rico | 52 | 11 | 0 | −0.1
Rhode Island | 11 | 4 | 0.5 | 1.9
South Carolina | 55 | 7 | −0.3 | −5.0
South Dakota | 19 | 0 | −0.3 | −1.1
Tennessee | 98 | 16 | −0.2 | −5.6
Texas | 324 | 6 | −0.5 | −30.3
Utah | 33 | 2 | −0.4 | −2.2
Vermont | 6 | 0 | −0.3 | −0.7
Virginia | 79 | 1 | −0.5 | −12.0
Washington | 49 | 8 | −0.2 | −3.0
West Virginia | 30 | 2 | −0.4 | −3.1
Wisconsin | 65 | 0 | −0.5 | −8.6
Wyoming | 11 | 0 | −0.2 | −0.3

g. Impact of the New OMB Delineations (Column 8)

Column 8 shows the effects of the adoption of the new OMB labor market area delineations. This column compares the payments under the rural and imputed floor wage index with rural floor budget neutrality calculated under the new OMB delineations and the payments under the rural and imputed floor wage index with budget neutrality calculated under the current OMB delineations. It does not reflect the 3-year transition for hospitals that are currently located in urban counties that become rural under the new OMB delineations or the 1-year transition to the new OMB delineations, under which the wage indexes are blended such that hospitals receive 50 percent of their wage index based on the new OMB delineations and 50 percent of their wage index based on their current labor market area. Rather, it shows the impact of the new OMB delineations fully implemented for FY 2015. Approximately 609 hospitals will have their wage indexes affected by the new OMB delineations. Urban New England and rural Middle Atlantic hospitals will experience the largest decreases in payments due to the new OMB delineations being fully implemented for FY 2015, with payment decreases of 0.5 and 0.2 percent, respectively. Urban non-DSH hospitals, nonteaching and non-DSH hospitals, and Lugar hospitals will experience the largest increases in payments due to the new OMB delineations being fully implemented for FY 2015, each with payment increases of 0.2 percent.

h. Application of the CBSA Transition Wage Index With Budget Neutrality (Column 9)

As discussed earlier in this final rule, for FY 2015, we are using the most recent labor market area delineations issued by OMB but we established a transition period in certain circumstances. Specifically, we established a 3-year transition for hospitals that are currently located in an urban county that becomes rural under the new OMB labor market area delineations under which such hospitals will be assigned the urban wage index value of the CBSA in which they are physically located for FY 2014 for a period of 3 fiscal years (that is, for FYs 2015, 2016, and 2017). We also are establishing a 1-year blended wage index for all hospitals that experience any decrease in their actual payment wage index (that is, a hospital's actual wage index used for payment, which accounts for all applicable effects of reclassification and redesignation) exclusively due to the implementation of the new OMB labor market area delineations. We are providing that a post-reclassified wage index with the rural and imputed floor applied be computed based on the hospital's FY 2014 CBSA (that is, using all of its FY 2014 constituent county/ies), and another post-reclassified wage index with the rural and imputed floor applied be computed based on the hospital's new FY 2015 CBSA (that is, the FY 2015 constituent county/ies). We compared these two wage indexes. If the FY 2015 wage index with FY 2015 CBSAs was lower than the FY 2015 wage index with FY 2014 CBSAs, we computed a blended wage index, consisting of 50 percent of each of the two wage indexes added together. This blended wage index is the hospital's wage index for FY 2015. This adjustment only applies to hospitals that will experience a decrease in their actual payment wage index exclusively due to the implementation of the new OMB labor market area delineations. Hospitals that benefit from the new OMB labor market area delineations receive their new wage index based on the new OMB labor market area delineations. We refer readers to section III.B. of the preamble to this final rule for a complete discussion on the transition wage indexes. Lastly, we are applying both the 3-year transition and 50/50 blended wage index adjustments in a budget neutral manner. We are making an adjustment to the standardized amount to ensure that the total payments, including the effect of the transition provisions, equal what payments would have been if we had not provided for these transitional wage indexes.
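The 50/50 blend described above can be sketched as follows. This is an illustrative reading of the transition only; the wage index values are hypothetical, and the 3-year hold harmless for urban-to-rural counties is not modeled here.

```python
# Illustrative sketch of the 1-year 50/50 blended transition wage index described
# above. Wage index values are hypothetical, and the 3-year urban-to-rural hold
# harmless provision is not modeled here.

def fy2015_transition_wage_index(wi_with_fy2014_cbsa, wi_with_fy2015_cbsa):
    """If the post-reclassified wage index computed with the new (FY 2015) CBSA
    delineations is lower than the one computed with the hospital's FY 2014 CBSA,
    blend the two 50/50; otherwise, use the new wage index as-is."""
    if wi_with_fy2015_cbsa < wi_with_fy2014_cbsa:
        return 0.5 * wi_with_fy2014_cbsa + 0.5 * wi_with_fy2015_cbsa
    return wi_with_fy2015_cbsa

# Hospital whose wage index falls under the new delineations: the blended value applies.
print(round(fy2015_transition_wage_index(1.0200, 0.9800), 4))   # 1.0

# Hospital that benefits from the new delineations: the new value applies as-is.
print(round(fy2015_transition_wage_index(0.9800, 1.0200), 4))   # 1.02
```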

Column 9 shows the effects of the adoption of the new OMB labor market area delineations, including the 3-year hold harmless provision for hospitals that are currently located in an urban county that becomes rural under the new OMB delineations and the 1-year transition to the new OMB delineations where the wage indexes are blended such that hospitals receive 50 percent of their wage index based on the new OMB delineations and 50 percent of their wage index based on their current labor market area. For FY 2015, we are applying both the 3-year transition and 50/50 blended wage index adjustments in a budget neutral manner, with a budget neutrality factor of 0.998859 (or −0.1 percent) applied to the standardized amount to ensure that the total payments, including the effect of the transition provisions, equal what payments would have been if we had not provided for these transitional wage indexes. This column shows the payment impact of the transitional wage index. For columns 1 through 8, the payment impacts and budget neutrality factors have been calculated under the new OMB delineations. Under the 1-year transition to the new OMB delineations, hospitals that would have experienced a decrease in payments due to the new OMB delineations being fully implemented this year now have those decreases alleviated due to the transition. Urban New England hospitals and Middle Atlantic hospitals will experience increases in payments of 0.2 percent and 0.3 percent, respectively, due to the application of the transitional wage index with budget neutrality, while urban South Atlantic, East North Central, East South Central, West North Central, West South Central, Mountain, and Pacific hospitals will experience a −0.1 percent change in payments due to the transitional budget neutrality adjustment of −0.1 percent applied to the standard Federal rate.

i. Effects of the Application of the Frontier State Wage Index and Out-Migration Adjustment (Column 10)

This column shows the combined effects of the application of section 10324(a) of the Affordable Care Act, which requires that we establish a minimum post-reclassified wage index of 1.00 for all hospitals located in “frontier States,” and the effects of section 1886(d)(13) of the Act, as added by section 505 of Public Law 108-173, which provides for an increase in the wage index for hospitals located in certain counties that have a relatively high percentage of hospital employees who reside in the county but work in a different area with a higher wage index. These two wage index provisions are not budget neutral and increase payments overall by 0.1 percent compared to the provisions not being in effect.

The term “frontier States” is defined in the statute as States in which at least 50 percent of counties have a population density of less than 6 persons per square mile. Based on these criteria, four States (Montana, North Dakota, South Dakota, and Wyoming) are considered frontier States, and 46 hospitals located in those States will receive a frontier wage index of 1.0000. Nevada is also, by definition, a frontier State and was assigned a frontier floor value of 1.0000 for FY 2012, but since then, including in this final rule, its rural floor value has been greater than 1.0000, so it has not been subject to the frontier wage index. Overall, this provision is not budget neutral and is estimated to increase IPPS operating payments by approximately $67 million, or approximately 0.1 percent. Rural hospitals located in the Mountain region and urban hospitals located in the West North Central region will experience increases in payments of 0.6 and 0.8 percent, respectively, because many of the hospitals located in these regions are frontier State hospitals.

In addition, section 1886(d)(13) of the Act, as added by section 505 of Public Law 108-173, provides for an increase in the wage index for hospitals located in certain counties that have a relatively high percentage of hospital employees who reside in the county, but work in a different area with a higher wage index. Hospitals located in counties that qualify for the payment adjustment are to receive an increase in the wage index that is equal to a weighted average of the difference between the wage index of the resident county, post-reclassification and the higher wage index work area(s), weighted by the overall percentage of workers who are employed in an area with a higher wage index. There are an estimated 273 providers that will receive the out-migration wage adjustment in FY 2015. Rural hospitals generally qualify for the adjustment, resulting in a 0.1 percent increase in payments. This provision appears to benefit Section 401 hospitals and RRCs in that they will experience a 2.0 percent and 0.6 percent increase in payments, respectively. This out-migration wage adjustment also is not budget neutral, and we estimate the impact of these providers receiving the out-migration increase to be approximately $53 million.
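The two wage index provisions discussed in this column can be sketched as follows. This is an illustration under our own assumptions: the wage index values and worker shares are hypothetical, and the weighting in the out-migration function is a simplified reading of the provision, not CMS's calculation.

```python
# Illustrative sketch of the frontier State floor and the section 505 out-migration
# adjustment described above. The wage index values and worker shares are
# hypothetical, and the weighting here is a simplified reading of the provision.

def apply_frontier_floor(post_reclassified_wage_index, in_frontier_state):
    """Hospitals in frontier States receive a minimum post-reclassified wage index of 1.00."""
    if in_frontier_state:
        return max(post_reclassified_wage_index, 1.0000)
    return post_reclassified_wage_index

def out_migration_increase(resident_county_wage_index, higher_wage_work_areas):
    """Weighted average of the differences between higher-wage work areas' wage
    indexes and the resident county's wage index, weighted by the share of the
    county's hospital workers employed in each higher-wage area."""
    return sum(share * (work_area_wi - resident_county_wage_index)
               for work_area_wi, share in higher_wage_work_areas)

# Hypothetical county: 20 percent of hospital employees work in an area with a 1.05
# wage index and 10 percent work in an area with a 1.10 wage index.
base_wage_index = 0.95
increase = out_migration_increase(base_wage_index, [(1.05, 0.20), (1.10, 0.10)])
print(round(base_wage_index + increase, 4))                      # 0.985

# Frontier floor: a frontier State hospital with a 0.92 wage index is raised to 1.00.
print(apply_frontier_floor(0.92, in_frontier_state=True))        # 1.0
```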

j. Effects of the Reductions Under the Hospital Readmissions Reduction Program (Column 11)

Column 11 shows our estimates of the effects of the policies for reductions in payments under the Hospital Readmissions Reduction Program, which was established under section 3025 of the Affordable Care Act. The Hospital Readmissions Reduction Program requires a reduction to a hospital's base operating DRG payments to account for excess readmissions, which, for FY 2015, is based on a hospital's risk-adjusted readmission rates during a 3-year period for five applicable conditions: acute myocardial infarction, heart failure, pneumonia, total hip and total knee arthroplasty, and chronic obstructive pulmonary disease. This provision is not budget neutral. A hospital's readmissions adjustment is the higher of a ratio of the hospital's aggregate payments for excess readmissions to its aggregate payments for all discharges, or a floor, which has been defined in the statute as 0.97 (that is, a maximum reduction of 3.0 percent) for FY 2015. A hospital's base operating DRG payment (that is, the wage-adjusted DRG payment amount, as discussed in section IV.G. of the preamble of this final rule) is the portion of the IPPS payment subject to the readmissions payment adjustment (DSH, IME, outlier, and low-volume add-on payments are not subject to the readmissions adjustment). For FY 2015, we have revised the definition of base operating DRG payment for MDHs to include the hospital-specific add-on amount, as discussed earlier in this final rule, such that this hospital-specific add-on amount is also subject to the readmissions payment adjustment. In this final rule, we estimate that 2,638 hospitals will have their base operating DRG payments reduced by their hospital-specific readmissions adjustment, an increase from FY 2014, due to the addition of new readmissions measures in the program. As a result, we estimate that the Hospital Readmissions Reduction Program will result in a 0.2 percent decrease in payments relative to FY 2014. We estimate that the Hospital Readmissions Reduction Program will result in a 0.4 percent decrease in payments relative to no provision (or a decrease of $424 million).
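The sketch below illustrates how the adjustment constrains payment, assuming the adjustment factor is the greater of the 0.97 floor and 1 minus the hospital's ratio of aggregate payments for excess readmissions to aggregate payments for all discharges; the function names and dollar amounts are hypothetical, and this is not CMS's claims processing logic.

```python
# Minimal sketch of the readmissions payment adjustment, assuming the adjustment
# factor is the greater of the 0.97 floor and 1 minus the hospital's ratio of
# aggregate payments for excess readmissions to aggregate payments for all
# discharges. Dollar amounts below are hypothetical.

READMISSIONS_FLOOR_FY2015 = 0.97   # statutory floor (maximum 3.0 percent reduction)

def readmissions_adjustment_factor(excess_readmission_payments, all_discharge_payments):
    ratio = excess_readmission_payments / all_discharge_payments
    return max(READMISSIONS_FLOOR_FY2015, 1.0 - ratio)

def adjusted_base_operating_drg_payment(base_operating_drg_payment, adjustment_factor):
    # Only the base operating DRG payment (including, for FY 2015, the MDH
    # hospital-specific add-on amount) is subject to the adjustment; DSH, IME,
    # outlier, and low-volume add-on payments are not.
    return base_operating_drg_payment * adjustment_factor

factor = readmissions_adjustment_factor(excess_readmission_payments=500_000,
                                        all_discharge_payments=10_000_000)
print(factor)                                                # 0.97 (the floor binds here)
print(adjusted_base_operating_drg_payment(8_000.0, factor))  # 7760.0
```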

Teaching non-DSH hospitals experience a decrease in payments of 0.3 percent relative to last year, while teaching DSH hospitals experience a 0.1 percent decrease in payments relative to last year. Puerto Rico hospitals will show a 0.0 percent change in payments because they are exempt from the provision.

k. Effects of the Changes to Medicare DSH Payments (Column 12)

Column 12 shows the effects of the adjustments to Medicare DSH payments made under section 3133 of the Affordable Care Act. Under section 3133, hospitals that are eligible to receive Medicare DSH payments will receive 25 percent of the amount they previously would have received under the former statutory formula for Medicare DSH payments. The remainder, equal to an estimate of 75 percent of what otherwise would have been paid as Medicare DSH payments, reduced to reflect changes in the percentage of individuals under age 65 who are uninsured and additional statutory adjustments, is available to make additional payments to each hospital that qualifies for Medicare DSH payments. Each Medicare DSH hospital will receive an additional payment based on its estimated share of the total amount of uncompensated care for all Medicare DSH hospitals. The reduction to Medicare DSH payments is not budget neutral.

For FY 2015, we are establishing that the amount to be distributed on the basis of uncompensated care, which is 75 percent of our estimate of what otherwise would have been paid in Medicare DSH payments (that is, Factor 1), be adjusted to 76.19 percent of that amount to reflect changes in the percentage of individuals under age 65 who are uninsured and additional statutory adjustments (that is, Factor 1 multiplied by Factor 2). In the FY 2015 IPPS/LTCH PPS proposed rule, the uncompensated care payment was 75 percent of what otherwise would have been paid for Medicare DSH payment adjustments, adjusted by a Factor 2 of 80.36 percent, and for FY 2014, the uncompensated care payment was 75 percent of what otherwise would have been paid for Medicare DSH payment adjustments, adjusted by a Factor 2 of 94.3 percent. Assuming DSH payments are constant, the FY 2015 uncompensated care amount is approximately 14 percentage points less than the uncompensated care amount that we distributed for FY 2014. As a result, we project that payments for FY 2015 will be reduced overall by 1.3 percent compared to the empirically justified DSH payments and uncompensated care payments made last year, the first year of the implementation of section 3133 of the Affordable Care Act. The uncompensated care payment methodology has redistributive effects based on a Medicare DSH hospital's low-income insured patient days (the sum of Medicaid patient days and Medicare SSI patient days) relative to the total Medicaid patient days and Medicare SSI patient days for all Medicare DSH hospitals, and the final payment amount is not tied to a hospital's discharges.
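The structure of this calculation can be sketched as follows. Only the 25 percent/75 percent split and the FY 2015 Factor 2 come from this final rule; the aggregate former DSH estimate, the hospital's estimated former DSH payment, and its uncompensated care share are hypothetical inputs chosen for illustration.

```python
# Illustrative sketch of the section 3133 DSH split using the FY 2015 factors
# cited above. The aggregate former DSH estimate, the hospital's estimated former
# DSH payment, and its uncompensated care share are hypothetical inputs.

FACTOR_2_FY2015 = 0.7619   # adjustment for changes in the uninsured plus statutory reductions

def empirically_justified_dsh(estimated_former_dsh_payment):
    """Eligible hospitals receive 25 percent of what the former statutory formula would have paid."""
    return 0.25 * estimated_former_dsh_payment

def uncompensated_care_pool(aggregate_former_dsh_payments):
    """Factor 1 (75 percent of estimated former DSH payments) multiplied by Factor 2."""
    return 0.75 * aggregate_former_dsh_payments * FACTOR_2_FY2015

def uncompensated_care_payment(pool, hospital_share_of_uncompensated_care):
    """Each DSH-eligible hospital receives its estimated share of the pool (Factor 3)."""
    return pool * hospital_share_of_uncompensated_care

pool = uncompensated_care_pool(aggregate_former_dsh_payments=10_000_000_000)
print(round(pool))                                        # 5714250000 (about $5.71 billion)
print(round(uncompensated_care_payment(pool, 0.001)))     # 5714250
print(round(empirically_justified_dsh(2_000_000)))        # 500000
```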

Rural West South Central and rural Pacific hospitals will experience a 0.3 percent change in DSH and uncompensated care payments. Hospitals with low Medicare utilization (that is, hospitals for which Medicare days are less than 25 percent of total inpatient days) will experience the largest decrease in payments, 3.0 percent.

l. Effects of All FY 2015 Changes (Column 13)

Column 13 shows our estimate of the changes in payments per discharge from FY 2014 to FY 2015, resulting from all changes reflected in this final rule for FY 2015. It includes the combined effects of the previous columns in the table.

The average decrease in payments under the IPPS for all hospitals is approximately 0.6 percent for FY 2015 relative to FY 2014. As discussed in section II.D. of the preamble of this final rule, this column includes the FY 2015 documentation and coding recoupment adjustment of -0.8 percent on the national standardized amount as part of the recoupment required under section 631 of the ATRA. In addition, this column includes the annual hospital update of 2.2 percent to the national standardized amount. This annual hospital update includes the 2.9 percent market basket update, the reduction of 0.5 percentage point for the multifactor productivity adjustment, and the 0.2 percentage point reduction under section 3401 of the Affordable Care Act. Hospitals paid under the hospital-specific rate will receive a 2.2 percent hospital update described above. As described in Column 2, the annual hospital update with the documentation and coding recoupment adjustment for hospitals paid under the national standardized amount combined with the annual hospital update for hospitals paid under the hospital-specific rate will result in a 1.5 percent increase in payments in FY 2015 relative to FY 2014. Column 11 shows the estimated 0.2 percent decrease in payments due to the reductions in payments under the Hospital Readmissions Reduction Program relative to FY 2014. Column 12 shows the estimated 1.3 percent decrease in Medicare DSH payments due to the changes made under section 3133 of the Affordable Care Act, which reduces Medicare DSH payments by 75 percent and redistributes the remainder, equal to an estimate of 75 percent of what otherwise would have been paid as Medicare DSH payments, reduced to reflect changes in the percentage of individuals under age 65 who are uninsured and an additional statutory adjustment, to each hospital that qualifies for Medicare DSH payments as an uncompensated care payment based on the hospital's relative share of the total amount of uncompensated care. The impact of moving from our estimate of FY 2014 outlier payments, 5.71 percent, to the estimate of FY 2015 outlier payments, 5.1 percent, will result in a decrease of 0.6 percent in FY 2015 payments relative to FY 2014. Lastly, this column reflects the extension of MDH payment status for the first half of FY 2015, under Public Law 113-93, enacted on April 1, 2014. There also might be interactive effects among the various factors comprising the payment system that we are not able to isolate. For these reasons, the values in Column 13 may not equal the sum of the estimated percentage changes described above. (We note that in the FY 2015 IPPS/LTCH PPS proposed rule we provided the effects of section 1886(o) of the Act, as added by section 3008 of the Affordable Care Act, which establishes payment reductions under the HAC Reduction Program. Hospitals ranked in the lowest 25 percent of performance on HACs are subject to a 1-percent reduction in total IPPS payments. We are finalizing policies related to the HAC Reduction Program in this final rule, but as described earlier in this final rule, because the HAC scores are currently undergoing 30-day review and correction by the hospitals, we are not providing hospital-level data or a hospital-level payment impact in conjunction with the FY 2015 IPPS Final Rule. We do provide an estimate of the overall payment impact in section I.H.8. of this Appendix A along with a discussion of the impact of these changes.)

Overall payments to hospitals paid under the IPPS are estimated to decrease by 0.6 percent for FY 2015. Much of the payment change among the hospital categories is attributable to the reduction in Medicare DSH payments and the redistribution of a portion of the Medicare DSH payments as an additional payment for hospitals' relative uncompensated care amounts. Hospitals in urban areas will experience a 0.6 percent decrease in payments per discharge in FY 2015 compared to FY 2014. Hospital payments per discharge in rural areas are estimated to decrease by 0.7 percent in FY 2015.

3. Impact Analysis of Table II

Table II presents the projected impact of the changes for FY 2015 for urban and rural hospitals and for the different categories of hospitals shown in Table I. It compares the estimated average payments per discharge for FY 2014 with the estimated average payments per discharge for FY 2015, as calculated under our models. Therefore, this table presents, in terms of the average dollar amounts paid per discharge, the combined effects of the changes presented in Table I. The estimated percentage changes shown in the last column of Table II equal the estimated percentage changes in average payments per discharge from Column 13 of Table I.

Table II—Impact Analysis of Changes for FY 2015 Acute Care Hospital Operating Prospective Payment System

[Payments per discharge]

Hospital group | Number of hospitals (1) | Estimated average FY 2014 payment per discharge (2) | Estimated average FY 2015 payment per discharge (3) | All FY 2015 changes (percent) (4)
All Hospitals | 3,396 | 11,197 | 11,129 | −0.6
By Geographic Location:
Urban hospitals | 2,549 | 11,566 | 11,496 | −0.6
Large urban areas | 1,401 | 12,296 | 12,226 | −0.6
Other urban areas | 1,148 | 10,677 | 10,608 | −0.6
Rural hospitals | 847 | 8,238 | 8,184 | −0.7
Bed Size (Urban):
0-99 beds | 666 | 9,085 | 9,054 | −0.3
100-199 beds | 787 | 9,730 | 9,661 | −0.7
200-299 beds | 455 | 10,470 | 10,448 | −0.2
300-499 beds | 429 | 11,892 | 11,814 | −0.7
500 or more beds | 212 | 14,185 | 14,075 | −0.8
Bed Size (Rural):
0-49 beds | 328 | 6,778 | 6,695 | −1.2
50-99 beds | 305 | 7,803 | 7,686 | −1.5
100-149 beds | 125 | 8,112 | 8,099 | −0.2
150-199 beds | 50 | 8,856 | 8,808 | −0.5
200 or more beds | 39 | 9,979 | 10,008 | 0.3
Urban by Region:
New England | 120 | 12,688 | 12,684 | 0
Middle Atlantic | 324 | 12,762 | 12,752 | −0.1
South Atlantic | 407 | 10,423 | 10,327 | −0.9
East North Central | 397 | 10,795 | 10,733 | −0.6
East South Central | 153 | 10,044 | 9,911 | −1.3
West North Central | 162 | 11,316 | 11,275 | −0.4
West South Central | 387 | 10,674 | 10,492 | −1.7
Mountain | 162 | 11,895 | 11,793 | −0.9
Pacific | 385 | 14,626 | 14,638 | 0.1
Puerto Rico | 52 | 8,149 | 7,543 | −7.4
Rural by Region:
New England | 22 | 11,180 | 11,080 | −0.9
Middle Atlantic | 57 | 8,289 | 8,216 | −0.9
South Atlantic | 132 | 7,834 | 7,764 | −0.9
East North Central | 116 | 8,474 | 8,484 | 0.1
East South Central | 165 | 7,513 | 7,404 | −1.4
West North Central | 102 | 8,914 | 8,925 | 0.1
West South Central | 168 | 7,108 | 6,974 | −1.9
Mountain | 61 | 9,454 | 9,503 | 0.5
Pacific | 24 | 11,083 | 11,207 | 1.1
By Payment Classification:
Urban hospitals | 2,563 | 11,551 | 11,480 | −0.6
Large urban areas | 1,413 | 12,286 | 12,214 | −0.6
Other urban areas | 1,150 | 10,645 | 10,576 | −0.6
Rural areas | 833 | 8,454 | 8,401 | −0.6
Teaching Status:
Nonteaching | 2,357 | 9,343 | 9,296 | −0.5
Fewer than 100 residents | 795 | 10,941 | 10,879 | −0.6
100 or more residents | 244 | 16,321 | 16,187 | −0.8
Urban DSH:
Non-DSH | 679 | 9,801 | 9,863 | 0.6
100 or more beds | 1,588 | 11,990 | 11,893 | −0.8
Less than 100 beds | 383 | 8,431 | 8,366 | −0.8
Rural DSH:
SCH | 373 | 7,907 | 7,858 | −0.6
RRC | 212 | 9,190 | 9,162 | −0.3
100 or more beds | 24 | 7,390 | 7,297 | −1.3
Less than 100 beds | 137 | 6,328 | 6,247 | −1.3
Urban teaching and DSH:
Both teaching and DSH | 842 | 13,175 | 13,063 | −0.9
Teaching and no DSH | 133 | 11,027 | 11,125 | 0.9
No teaching and DSH | 1,129 | 9,781 | 9,709 | −0.7
No teaching and no DSH | 459 | 9,223 | 9,288 | 0.7
Special Hospital Types:
RRC | 193 | 9,372 | 9,316 | −0.6
SCH | 325 | 9,570 | 9,636 | 0.7
MDH | 162 | 7,073 | 6,700 | −5.3
SCH and RRC | 124 | 10,289 | 10,394 | 1
MDH and RRC | 15 | 9,195 | 8,450 | −8.1
Type of Ownership:
Voluntary | 1,935 | 11,319 | 11,274 | −0.4
Proprietary | 892 | 9,986 | 9,900 | −0.9
Government | 542 | 12,214 | 12,038 | −1.4
Medicare Utilization as a Percent of Inpatient Days:
0-25 | 501 | 14,705 | 14,357 | −2.4
25-50 | 2,081 | 11,311 | 11,261 | −0.4
50-65 | 601 | 9,137 | 9,131 | −0.1
Over 65 | 93 | 8,406 | 8,349 | −0.7
FY 2015 Reclassifications by the Medicare Geographic Classification Review Board:
All Reclassified Hospitals | 719 | 10,791 | 10,771 | −0.2
Non-Reclassified Hospitals | 2,677 | 11,327 | 11,243 | −0.7
Urban Hospitals Reclassified | 450 | 11,446 | 11,429 | −0.1
Urban Nonreclassified Hospitals, FY 2015 | 2,054 | 11,618 | 11,535 | −0.7
All Rural Hospitals Reclassified FY 2015 | 269 | 8,732 | 8,702 | −0.4
Rural Nonreclassified Hospitals FY 2015 | 514 | 7,665 | 7,597 | −0.9
All Section 401 Reclassified Hospitals | 50 | 10,130 | 10,012 | −1.2
Other Reclassified Hospitals (Section 1886(d)(8)(B)) | 64 | 7,812 | 7,658 | −2
Specialty Hospitals:
Cardiac Specialty Hospitals | 15 | 12,303 | 12,567 | 2.1

H. Effects of Other Policy Changes

In addition to those policy changes discussed above that we are able to model using our IPPS payment simulation model, we are making various other changes in this final rule. Generally, we have limited or no specific data available with which to estimate the impacts of these changes. Our estimates of the likely impacts associated with these other changes are discussed below.

1. Effects of Policy on MS-DRGs for Preventable HACs, Including Infections

In section II.F. of the preamble of this final rule, we discuss our implementation of section 1886(d)(4)(D) of the Act, which requires the Secretary to identify conditions that: (1) are high cost, high volume, or both; (2) result in the assignment of a case to an MS-DRG that has a higher payment when present as a secondary diagnosis; and (3) could reasonably have been prevented through the application of evidence-based guidelines. For discharges occurring on or after October 1, 2008, hospitals will not receive additional payment for cases in which one of the selected conditions was not present on admission, unless, based on data and clinical judgment, it cannot be determined at the time of admission whether the condition is present. That is, the case will be paid as though the secondary diagnosis were not present. However, the statute also requires the Secretary to continue counting the condition as a secondary diagnosis that results in a higher IPPS payment when doing the budget neutrality calculations for MS-DRG reclassifications and recalibration. Therefore, we will perform our budget neutrality calculations as though the payment provision did not apply, but Medicare will make a lower payment to the hospital for the specific case that includes the secondary diagnosis. Thus, the provision results in cost savings to the Medicare program.

We note that the provision will only apply when one or more of the selected conditions are the only secondary diagnosis or diagnoses present on the claim that will lead to higher payment. Medicare beneficiaries will generally have multiple secondary diagnoses during a hospital stay, such that beneficiaries having one MCC or CC will frequently have additional conditions that also will generate higher payment. Only a small percentage of the cases will have only one secondary diagnosis that would lead to a higher payment. Therefore, if at least one nonselected secondary diagnosis that leads to higher payment is on the claim, the case will continue to be assigned to the higher paying MS-DRG and there will be no Medicare savings from that case. In addition, as discussed in section II.F.3. of the preamble of this final rule, it is possible to have two severity levels where the HAC does not affect the MS-DRG assignment or for an MS-DRG not to have severity levels. In either of these circumstances, the case will continue to be assigned to the higher paying MS-DRG and there will be no Medicare savings from that case.
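The payment logic described in this and the preceding paragraph can be sketched in simplified form. The diagnosis codes, the HAC list, and the grouping logic below are hypothetical stand-ins for illustration only; they are not the actual GROUPER or the real list of selected conditions.

```python
# Minimal illustrative sketch of the HAC payment policy described above: a selected
# condition that was not present on admission (POA) does not count toward the
# higher-severity MS-DRG unless another reported secondary diagnosis also supports
# the higher payment. The codes, the HAC list, and this logic are hypothetical
# stand-ins, not the actual GROUPER or the real selected HAC list.

SELECTED_HACS = {"HAC_CODE_1", "HAC_CODE_2"}   # hypothetical selected HAC codes

def higher_paying_msdrg_applies(secondary_diagnoses):
    """secondary_diagnoses: list of (code, present_on_admission, is_cc_or_mcc) tuples.

    Returns True if at least one CC/MCC still supports the higher-paying MS-DRG
    after disregarding selected HACs that were not present on admission."""
    for code, present_on_admission, is_cc_or_mcc in secondary_diagnoses:
        if not is_cc_or_mcc:
            continue
        if code in SELECTED_HACS and not present_on_admission:
            continue   # hospital-acquired selected condition: disregarded for payment
        return True
    return False

# Case 1: the only CC/MCC is a selected HAC acquired during the stay -> lower-paying MS-DRG.
print(higher_paying_msdrg_applies([("HAC_CODE_1", False, True)]))   # False

# Case 2: an unrelated CC is also present -> the case keeps the higher-paying MS-DRG.
print(higher_paying_msdrg_applies([("HAC_CODE_1", False, True),
                                   ("OTHER_CC_CODE", True, True)])) # True
```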

The HAC payment provision went into effect on October 1, 2008. Our savings estimates for the next 5 fiscal years are shown below:

Year | Savings (in millions)
FY 2015 | $27
FY 2016 | 29
FY 2017 | 31
FY 2018 | 34
FY 2019 | 36

In section IV.J. of the preamble of this final rule, we are making changes to the HAC Reduction Program for FY 2015. We refer readers to section I.H.6. of this Appendix A for a discussion of the impact of these changes.

2. Effects of Policy Relating to New Medical Service and Technology Add-On Payments

In section II.I. of the preamble to this final rule, we discuss five applications (Dalbavancin, Heli-FXTM EndoAnchor System, CardioMEMSTM HF (Heart Failure) Monitoring System, MitraClip® System, and Responsive Neurostimulator (RNS®) System) for add-on payments for new medical services and technologies for FY 2015, as well as the status of the new technologies that were approved to receive new technology add-on payments in FY 2014. We note that the applicant for one of the technologies discussed in the proposed rule (the Watchman® System) withdrew its application prior to the publication of this final rule.

As explained in the preamble to this final rule, add-on payments for new medical services and technologies under section 1886(d)(5)(K) of the Act are not required to be budget neutral. As discussed in section II.I.4. of the preamble of this final rule, we are approving three of the five applications (CardioMEMSTM HF Monitoring System, MitraClip® System, and RNS® System) for new technology add-on payments for FY 2015. As we proposed, in this final rule, we also are continuing to make new technology add-on payments in FY 2015 for KcentraTM, Argus® II Retinal Prosthesis System, the Zilver® PTX® Drug Eluting Peripheral Stent, Voraxaze®, and the Zenith® F. Graft (because all of these technologies are still within the 3-year anniversary of the product's entry onto the market). We note that new technology add-on payments per case are limited to the lesser of: (1) 50 percent of the costs of the new technology; or (2) 50 percent of the amount by which the costs of the case exceed the standard MS-DRG payment for the case. Because it is difficult to predict the actual new technology add-on payment for each case, our estimates below are based on the increase in add-on payments for FY 2015 as if every claim that would qualify for a new technology add-on payment would receive the maximum add-on payment. Based on the applicant's estimate from FY 2013, we currently estimate that new technology add-on payments for Voraxaze® will increase overall FY 2015 payments by $6,300,000. Based on the applicant's estimate from FY 2013, we currently estimate that new technology add-on payments for the Zenith® F. Graft will increase overall FY 2015 payments by $4,085,750. Based on the applicant's estimate for FY 2014, we currently estimate that new technology add-on payments for KcentraTM will increase overall FY 2015 payments by $5,449,888. Based on the applicant's estimate for FY 2014, we currently estimate that new technology add-on payments for the Argus® II Retinal Prosthesis System will increase overall FY 2015 payments by $3,601,437. Based on the applicant's estimate for FY 2014, we currently estimate that new technology add-on payments for the Zilver® PTX® Drug Eluting Peripheral Stent will increase overall FY 2015 payments by $20,463,000. Based on the applicant's estimate for FY 2015, we currently estimate that new technology add-on payments for the CardioMEMSTM HF Monitoring System will increase overall FY 2015 payments by $11,315,625 (maximum add-on payment of $8,875 * 1,275 patients). Based on the applicant's estimate for FY 2015, we currently estimate that new technology add-on payments for the MitraClip® System will increase overall FY 2015 payments by $27,000,000 (maximum add-on payment of $15,000 * 1,800 patients). Based on the applicant's estimate for FY 2015, we currently estimate that new technology add-on payments for the RNS® System will increase overall FY 2015 payments by $12,932,500 (maximum add-on payment of $18,475 * 700 patients).
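
For illustration only, the per-case cap described above (the lesser of 50 percent of the technology's costs or 50 percent of the amount by which the case's costs exceed the standard MS-DRG payment) can be sketched as follows; the dollar figures are hypothetical, and the second print simply reproduces the CardioMEMSTM arithmetic above.

```python
# Illustrative sketch of the per-case cap on new technology add-on payments
# described above: the lesser of 50 percent of the technology's cost or
# 50 percent of the amount by which the case's costs exceed the standard
# MS-DRG payment. The dollar figures below are made-up inputs.

def new_tech_add_on(case_cost, msdrg_payment, technology_cost):
    excess = max(case_cost - msdrg_payment, 0.0)
    return min(0.5 * technology_cost, 0.5 * excess)

print(new_tech_add_on(case_cost=60_000, msdrg_payment=40_000,
                      technology_cost=17_750))   # 8875.0 (capped by the first test)

# The aggregate impact estimates above assume every qualifying claim receives
# the maximum add-on payment, e.g. $8,875 * 1,275 cases for CardioMEMS.
print(f"{8_875 * 1_275:,}")                      # 11,315,625
```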

3. Effects of Changes to List of MS-DRGs Subject to Postacute Care Transfer and DRG Special Pay Policy

In section IV.A. of the preamble of this final rule, we discuss changes to the list of MS-DRGs subject to the postacute care transfer and DRG special payment policies. As reflected in Table 5 listed in section VI. of the Addendum to this final rule and available via the Internet on the CMS Web site, using criteria set forth in regulation at § 412.4, we evaluated MS-DRG charge, discharge, and transfer data to determine which MS-DRGs qualify for the postacute care transfer and DRG special pay policies. We note that we are making no change to these payment policies in this FY 2015 final rule. We are changing the status of certain MS-DRGs as a result of revisions to the MS-DRGs for FY 2015. We are changing the status of five MS-DRGs to qualify for the postacute care transfer policy in FY 2015. One additional MS-DRG that qualified under the policy in FY 2014 does not qualify in FY 2015, and we are changing the status accordingly. Finally, five MS-DRGs now qualify for the MS-DRG special pay policy in FY 2015 after not qualifying in FY 2014, and we are adding them to the list of qualifying MS-DRGs. Column 4 of Table I in this Appendix A shows the effects of the changes to the MS-DRGs and relative payment weights with the application of the recalibration budget neutrality factor to the standardized amounts. Section 1886(d)(4)(C)(i) of the Act requires us annually to make appropriate classification changes in order to reflect changes in treatment patterns, technology, and any other factors that may change the relative use of hospital resources. The analysis and methods determining the changes due to the MS-DRGs and relative payment weights accounts for and includes changes in MS-DRG postacute care transfer and special pay policy statuses. We refer readers to section I.G. of this Appendix for a more detailed discussion of payment impacts due to MS-DRG reclassification policies.

4. Effects of the Payment Adjustment for Low-Volume Hospitals for FY 2015

In section V.D. of the preamble of this final rule, we discuss the provisions of the Protecting Access to Medicare Act of 2014 (Pub. L. 113-93) that extend for an additional year, through March 31, 2015, the temporary changes to the low-volume hospital definition and the methodology for determining the payment adjustment made by the Affordable Care Act for FYs 2011 and 2012, and extended through FY 2013 by the ATRA, and the first half of FY 2014 by the Pathway for SGR Reform Act (Pub. L. 113-67). Therefore, to qualify for the low-volume hospital payment adjustment for FY 2015 discharges occurring before April 1, 2015, under section 1886(d)(12) of the Act, a hospital must have fewer than 1,600 Medicare discharges and be located more than 15 miles from other IPPS hospitals. The payment adjustment for eligible low-volume hospital FY 2015 discharges occurring before April 1, 2015, is a continuous, linear sliding scale adjustment ranging from an additional 25 percent payment adjustment for qualifying hospitals with 200 or fewer Medicare discharges down to no additional payment for hospitals with 1,600 or more Medicare discharges.
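
For illustration only, the sliding scale described above can be sketched as a linear interpolation between an additional 25 percent at 200 or fewer Medicare discharges and no add-on at 1,600 or more Medicare discharges; the governing formula is set out in the regulations at 42 CFR 412.101.

```python
# Sketch of the continuous, linear sliding-scale adjustment described above for
# FY 2015 discharges occurring before April 1, 2015: an additional 25 percent
# for hospitals with 200 or fewer Medicare discharges, phasing down to no
# add-on at 1,600 or more discharges. This linear interpolation simply follows
# the description in the text; the regulatory formula is at 42 CFR 412.101.

def low_volume_adjustment(medicare_discharges):
    if medicare_discharges <= 200:
        return 0.25
    if medicare_discharges >= 1_600:
        return 0.0
    return 0.25 * (1_600 - medicare_discharges) / (1_600 - 200)

print(low_volume_adjustment(200))    # 0.25
print(low_volume_adjustment(900))    # 0.125
print(low_volume_adjustment(1_600))  # 0.0
```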

Beginning with FY 2015 discharges occurring on or after April 1, 2015, in accordance with section 1886(d)(12) of the Act, the low-volume hospital definition and payment adjustment methodology revert to the statutory requirements that were in effect prior to the amendments made by the Affordable Care Act as amended by subsequent legislation. Therefore, effective for FY 2015 discharges occurring on or after April 1, 2015, and subsequent years, in order to qualify as a low-volume hospital, a subsection (d) hospital must be more than 25 road miles from another subsection (d) hospital and have fewer than 200 discharges (that is, fewer than 200 total discharges, including both Medicare and non-Medicare discharges) during the fiscal year.

Based on the latest available data, we estimate that approximately 593 hospitals will qualify as a low-volume hospital in FY 2014 and in FY 2015 for discharges occurring before April 1, 2015. With the statutory changes to the low-volume hospital payment adjustment, we estimate only approximately five hospitals will continue to qualify as a low-volume hospital for FY 2015 discharges occurring on or after April 1, 2015. We project that the expiration of the temporary changes to the low-volume hospital definition and the payment adjustment methodology originally made by the Affordable Care Act and extended by subsequent legislation will result in a decrease in payments of approximately $152 million in FY 2015 as compared to the low-volume hospital payments in FY 2014. This estimate accounts for our projection of the five IPPS low-volume hospitals in FY 2014 that are expected to continue to receive a low-volume hospital payment adjustment of an additional 25 percent for FY 2015 discharges occurring on or after April 1, 2015.

5. Effects of Policies Related to IME Medicare Part C Add-On Payments to SCHs Paid According to Their Hospital-Specific Rates

In section IV.E.2. of the preamble of this final rule, we discuss our finalized policy related to IME add-on payments for Medicare Part C patients to SCHs that are paid according to their hospital-specific rates. Payments based on the Federal rate are based on the IPPS standardized amount and include all applicable IPPS add-on payments, such as outliers, DSH, and IME, while payments based on the hospital-specific rate include no add-on payments. The hospital-specific rate generally reflects the additional costs incurred by a teaching hospital for its Medicare Part A patients. However, the hospital-specific rate does not reflect the costs associated with Medicare Part C patients and there is currently no payment mechanism for SCHs paid based on their hospital-specific rate to receive the IME add-on payment for Medicare Part C patients. Accordingly, we are providing all SCHs that are subsection (d) teaching hospitals with IME add-on payments for applicable discharges of Medicare Part C patients in accordance with section 1886(d)(11) of the Act, regardless of whether the SCH is paid based on the Federal rate or its hospital-specific rate. In addition, we are establishing that for purposes of the comparison of payments based on the Federal rate and payments based on the hospital-specific rate, IME payments under section 1886(d)(11) of the Act for Medicare Part C patients will no longer be included as part of the Federal rate payment. Because the IPPS Federal rate used in the MDH payment methodology is the same IPPS Federal rate that is used in the SCH payment methodology, this change to the comparison of payments based on the Federal rate and payments based on the hospital-specific rate also applies to the Federal rate payment amount used to determine payment to MDHs that are teaching hospitals (that is, in the determination of the payment amount in addition to the Federal rate payment that is equal to 75 percent of the amount by which the hospital-specific rate payment exceeds the Federal rate payment), as discussed in section IV.E.2. of the preamble of this final rule.

We estimate that the policy at section IV.E.2. of the preamble of this final rule will result in an increase in payments to approximately 45 hospitals that are both SCHs or MDHs and teaching hospitals of approximately $5.3 million in FY 2015.

6. Effects of the Extension of the MDH Program for the First Half of FY 2015

In section V.G. of the preamble of this final rule, we briefly discuss the statutory extension of the MDH program through March 31, 2015, that is, through the first half of FY 2015, by section 106 of the Protecting Access to Medicare Act of 2014 (Pub. L. 113-93). Hospitals that qualify as MDHs receive the higher of operating IPPS payments made under the Federal standardized amount or the payments made under the Federal standardized amount plus 75 percent of the amount by which the hospital-specific rate (a hospital-specific cost-based rate) exceeds the Federal standardized amount. Based on the latest available data we have for 177 MDHs, we project that 166 MDHs will receive the blended payment (that is, the Federal standardized amount plus 75 percent of the amount by which the hospital-specific rate exceeds the Federal standardized amount) for the first half of FY 2015 (that is, for discharges occurring through March 31, 2015). We estimate that those hospitals will experience an overall increase in payments of approximately $70.7 million as compared to our previous estimates of payments to these hospitals for FY 2015 prior to the extension of the MDH program through March 31, 2015, by section 106 of Public Law 113-93.
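
For illustration only, the MDH payment comparison described above can be sketched as follows; the dollar amounts are hypothetical, and actual payments involve additional adjustments.

```python
# Simplified sketch of the MDH payment comparison described above: an MDH is
# paid the Federal standardized amount plus 75 percent of the amount by which
# its hospital-specific rate payment exceeds the Federal rate payment (if any).
# Dollar amounts are hypothetical; real payments involve many more adjustments.

def mdh_payment(federal_rate_payment, hospital_specific_payment):
    excess = max(hospital_specific_payment - federal_rate_payment, 0.0)
    return federal_rate_payment + 0.75 * excess

print(mdh_payment(8_000, 9_000))  # 8750.0 (blended payment)
print(mdh_payment(8_000, 7_500))  # 8000.0 (Federal rate payment is higher)
```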

7. Effects of Changes Under the FY 2015 Hospital Value-Based Purchasing (VBP) Program

Section 1886(o)(1)(B) of the Act directs the Secretary to make value-based incentive payments under the Hospital VBP Program to hospitals that meet performance standards during the performance period for discharges occurring on or after October 1, 2012. These incentive payments will be funded for FY 2015 through a reduction to the FY 2015 base operating DRG payment for each discharge of 1.50 percent, as required by section 1886(o)(7)(B) of the Act. The applicable percentage for FY 2016 is 1.75 percent and for FY 2017 and subsequent years, it is 2 percent. We are required to ensure that the total amount available for value-based incentive payments is equal to the total amount of reduced payments for all hospitals for the fiscal year, as estimated by the Secretary.

We refer readers to the Hospital Inpatient VBP Program final rule (76 FR 26490 through 26547), the CY 2012 OPPS/ASC final rule with comment period (76 FR 74527 through 74547), the FY 2013 IPPS/LTCH PPS final rule (77 FR 53567 through 53614), the FY 2014 IPPS/LTCH PPS final rule (78 FR 50677 through 50707), and the CY 2014 OPPS/ASC final rule with comment period (78 FR 75120 through 75121) for further explanation of the details of the Hospital VBP Program.

We specifically refer readers to the FY 2013 IPPS/LTCH PPS final rule (77 FR 53582 through 53592) and the FY 2014 IPPS/LTCH PPS final rule (78 FR 50678 through 50679), for discussions of the measures and other policies that we adopted for the FY 2015 and FY 2016 Hospital VBP Programs.

In section IV.I. of the preamble of this final rule, we estimate the available pool of funds for value-based incentive payments in the FY 2015 Hospital VBP Program, which, in accordance with section 1886(o)(7)(C)(iii) of the Act, will be 1.50 percent of base operating DRG payments, or a total of approximately $1.4 billion. This estimated available pool for FY 2015 is based on the historical pool of hospitals that were eligible to participate in the FY 2014 Hospital VBP Program and the payment information from the March 2014 update to the FY 2013 MedPAR file.
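
For illustration only, the following sketch (with invented payment amounts) shows how the incentive pool described above is funded: each hospital's base operating DRG payments are reduced by the applicable percent, and the estimated total of those reductions forms the pool available for value-based incentive payments.

```python
# Rough sketch of how the Hospital VBP incentive pool is funded: base operating
# DRG payments are reduced by the applicable percent (1.50 percent for FY 2015),
# and the estimated total of those reductions forms the pool redistributed as
# value-based incentive payments. The payment amounts below are invented.

APPLICABLE_PERCENT_FY2015 = 0.015

base_operating_drg_payments = [25_000_000, 40_000_000, 12_500_000]  # per hospital
reductions = [p * APPLICABLE_PERCENT_FY2015 for p in base_operating_drg_payments]
incentive_pool = sum(reductions)

print(f"Estimated incentive pool: ${incentive_pool:,.0f}")  # $1,162,500
```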

The estimated impacts of the FY 2015 Hospital VBP Program by hospital characteristic, found in the table below, are based on historical TPSs. We used the FY 2014 Hospital VBP Program TPSs to calculate the proxy adjustment factors used for this impact analysis. These are the most recently available scores that hospitals were given an opportunity to review and correct. The proxy adjustment factors use estimated annual base operating DRG payment amounts derived from the March 2014 update to the FY 2013 MedPAR file. The proxy adjustment factors can be found in Table 16 associated with this final rule (available via the Internet on the CMS Web site).

The impact analysis shows that, for the FY 2015 Hospital VBP Program, the number of hospitals that will receive an increase in base operating DRG payment amount is slightly lower than the number of hospitals that will receive a decrease. Among urban hospitals, those in the New England, South Atlantic, East North Central, West North Central, and West South Central regions will have an increase, on average, in base operating DRG payment amount, and among rural hospitals, those in the New England and East North Central regions will have an increase, on average, in base operating DRG payment amounts.

Both urban and rural hospitals in the Middle Atlantic, East South Central, Mountain, and Pacific regions and rural hospitals in the South Atlantic, West North Central, and West South Central regions will receive an average decrease in base operating DRG payment amount. As the percent of DSH payments increases, we see a decrease in base operating DRG payment amount, while as the Medicare utilization (MCR) percent increases, we see an increase in base operating DRG payment amount.

Nonteaching and teaching hospitals will have an average decrease in base operating DRG payment amount.

Impact Analysis of Base Operating DRG Payment Amount Changes Resulting From the FY 2015 Hospital VBP Program

Number of hospitals | Average (percent)
By Geographic Location:
All Hospitals | 2,728 | −0.038
Large Urban | 1,113 | −0.021
Other Urban | 910 | −0.030
Rural Area | 705 | −0.074
Urban hospitals | 2,023 | −0.025
0-99 beds | 307 | 0.025
100-199 beds | 677 | −0.043
200-299 beds | 431 | −0.032
300-499 beds | 401 | −0.033
500 or more beds | 207 | −0.010
Rural hospitals | 705 | −0.074
0-49 beds | 161 | −0.042
50-99 beds | 296 | −0.088
100-149 beds | 148 | −0.074
150-199 beds | 55 | −0.106
200 or more beds | 45 | −0.067
By Region:
Urban By Region | 2,023 | −0.025
New England | 112 | 0.058
Middle Atlantic | 279 | −0.076
South Atlantic | 346 | 0.002
East North Central | 350 | 0.052
East South Central | 121 | −0.043
West North Central | 134 | 0.054
West South Central | 248 | 0.003
Mountain | 130 | −0.086
Pacific | 303 | −0.155
Rural By Region | 705 | −0.074
New England | 21 | 0.044
Middle Atlantic | 64 | −0.150
South Atlantic | 136 | −0.024
East North Central | 114 | 0.036
East South Central | 114 | −0.019
West North Central | 82 | −0.052
West South Central | 101 | −0.178
Mountain | 45 | −0.299
Pacific | 28 | −0.247
By MCR Percent:
0-25 | 260 | −0.119
25-50 | 1,788 | −0.034
50-65 | 605 | −0.016
Over 65 | 46 | 0.003
By DSH Percent:
0-25 | 1,253 | 0.013
25-50 | 1,220 | −0.064
50-65 | 141 | −0.121
Over 65 | 114 | −0.222
By Teaching Status:
Teaching | 933 | −0.041
Non-Teaching | 1,795 | −0.036

Actual FY 2015 Hospital VBP Program TPSs will not be reviewed and corrected by hospitals until after this FY 2015 IPPS/LTCH PPS final rule has been published. Therefore, the same historical universe of eligible hospitals and corresponding TPSs from the FY 2014 Hospital VBP Program are used for this updated impact analysis.

8. Effects of Changes to the HAC Reduction Program for FY 2015

In section IV.J. of the preamble of this final rule, we are establishing measures, scoring, and a risk adjustment methodology to implement the FY 2015 payment reduction under the HAC Reduction Program. Section 1886(p) of the Act, as added under section 3008(a) of the Affordable Care Act, establishes an adjustment to hospital payments for HACs, or a HAC Reduction program, under which payments to applicable hospitals are adjusted to provide an incentive to reduce HACs, effective for discharges occurring on October 1, 2014 and for subsequent program years.

We note that hospitals will have a payment impact for the first time in FY 2015. For FY 2015, we are presenting the overall impact of the HAC Reduction Program provision along with other IPPS payment provision impacts in section I.G. of this Appendix A. The table and analyses that we are presenting below show the distributional effect of the measures and scoring system for the HAC Reduction Program included in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50707 through 50729).

For FY 2015, we note that we finalized a Total HAC Score methodology in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50707 through 50729) that assigns weights for Domain 1 and Domain 2 at 35 percent and 65 percent, respectively. Based on this methodology, the table below presents data on the proportion of hospitals, by structural characteristic, in the worst performing quartile based on the 35/65 weighting scheme.
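
For illustration only, the 35/65 weighting and worst-quartile determination described above can be sketched as follows; the domain scores are hypothetical, and the actual program applies the scoring rules finalized in the FY 2014 IPPS/LTCH PPS final rule.

```python
# Sketch of the Total HAC Score weighting described above: Domain 1 (AHRQ PSI)
# is weighted at 35 percent and Domain 2 (CDC NHSN measures) at 65 percent, and
# hospitals whose Total HAC Score falls in the worst-performing quartile are
# subject to the payment reduction. Domain scores below are hypothetical.
import statistics

def total_hac_score(domain1_score, domain2_score):
    return 0.35 * domain1_score + 0.65 * domain2_score

scores = {"Hospital A": total_hac_score(3, 5),
          "Hospital B": total_hac_score(7, 8),
          "Hospital C": total_hac_score(2, 4),
          "Hospital D": total_hac_score(9, 6)}
cutoff = statistics.quantiles(scores.values(), n=4)[2]  # 75th percentile cut point
penalized = [name for name, score in scores.items() if score > cutoff]
print(penalized)  # ['Hospital B']
```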

The data for this simulation are derived from the AHRQ PSI results based on Medicare FFS discharges from July 2011 through June 2013, using version 4.5a of the AHRQ software, and from CDC measure results based on Standardized Infection Ratios (SIRs) calculated with data reported to the National Healthcare Safety Network for infections occurring between January 2012 and December 2013. To analyze the results by hospital characteristic, the FY 2015 proposed rule impact file was used. Of the 3,352 hospitals included in this analysis, 3,310 hospitals were included for geographic location, bed size, region, DSH percent, and teaching status; 3,270 for ownership; and 3,196 for MCR percent. These differences in denominator are due to the source of the hospital characteristic data. This analysis does not include Maryland hospitals because Maryland hospitals are exempt by waiver from the HAC Reduction Program in FY 2015.

The percentage of hospitals for each characteristic (column 3) indicates the percent of hospitals at each level of the characteristic. For example, with regard to geographic region, 40.4 percent of hospitals (or 1,338 hospitals) are characterized as large urban; 33.8 percent of hospitals (or 1,119 hospitals) are characterized as other urban; and 25.8 percent of hospitals (or 853 hospitals) are characterized as rural. The percentage of hospitals in the worst performing quartile (column 5) indicates the proportion of hospitals for each characteristic that would be penalized. For example, with regard to geographic location, 26.6 percent of hospitals (or 356 hospitals) characterized as large urban will be subject to a payment adjustment; 23.0 percent of hospitals (or 257 hospitals) characterized as other urban will be subject to a payment adjustment; and 13.2 percent of hospitals (or 113 hospitals) characterized as rural will be subject to a payment adjustment.

With regard to geographic location of urban hospitals by bed size, 15.7 percent of hospitals (or 98 hospitals) characterized as urban hospitals with bed size of 0-99 beds will be subject to a payment adjustment; 20.7 percent of hospitals (or 155 hospitals) characterized as urban hospitals with bed size of 100-199 beds will be subject to a payment adjustment; 29.7 percent of hospitals (or 136 hospitals) characterized as urban hospitals with bed size of 200-299 beds will be subject to a payment adjustment; 27.7 percent of hospitals (or 72 hospitals) characterized as urban hospitals with bed size of 300-399 beds will be subject to a payment adjustment; 41.2 percent of hospitals (or 63 hospitals) characterized as urban hospitals with bed size of 400-499 beds will be subject to a payment adjustment; and 42.0 percent of hospitals (or 89 hospitals) characterized as urban hospitals with bed size of 500 or more beds will be subject to a payment adjustment.

With regard to geographical location of rural hospitals by bed size, 11.7 percent of hospitals (or 39 hospitals) characterized as rural hospitals with bed size of 0-49 beds will be subject to a payment adjustment; 12.5 percent of hospitals (or 37 hospitals) characterized as rural hospitals with bed size of 50-99 beds will be subject to a payment adjustment; 12.6 percent of hospitals (or 17 hospitals) characterized as rural hospitals with bed size of 100-149 beds will be subject to a payment adjustment; 18.0 percent of hospitals (or 9 hospitals) characterized as rural hospitals with bed size of 150-199 beds will be subject to a payment adjustment; and 29.7 percent of hospitals (or 11 hospitals) characterized as rural hospitals with bed size of 200 or more beds will be subject to a payment adjustment.

With regard to region of urban hospitals, 30.3 percent of hospitals (or 36 hospitals) characterized as urban in the New England region will be subject to a payment adjustment; 30.2 percent of hospitals (or 96 hospitals) characterized as urban in the Mid-Atlantic region will be subject to a payment adjustment; 24.3 percent of hospitals (or 98 hospitals) characterized as urban in the South Atlantic region will be subject to a payment adjustment; 22.5 percent of hospitals (or 88 hospitals) characterized as urban in the East North Central region will be subject to a payment adjustment; 22.1 percent of hospitals (or 33 hospitals) characterized as urban in the East South Central region will be subject to a payment adjustment; 26.1 percent of hospitals (or 42 hospitals) characterized as urban in the West North Central region will be subject to a payment adjustment; 15.9 percent of hospitals (or 60 hospitals) characterized as urban in the West South Central region will be subject to a payment adjustment; 33.3 percent of hospitals (or 54 hospitals) characterized as urban in the Mountain region will be subject to a payment adjustment; and 28.2 percent of hospitals (or 106 hospitals) characterized as urban in the Pacific region will be subject to a payment adjustment.

With regard to region of rural hospitals, 18.2 percent of hospitals (or 4 hospitals) characterized as rural in the New England region will be subject to a payment adjustment; 12.5 percent of hospitals (or 7 hospitals) characterized as rural in the Mid-Atlantic region will be subject to a payment adjustment; 16.9 percent of hospitals (or 22 hospitals) characterized as rural in the South Atlantic region will be subject to a payment adjustment; 12.2 percent of hospitals (or 14 hospitals) characterized as rural in the East North Central region will be subject to a payment adjustment; 8.8 percent of hospitals (or 14 hospitals) characterized as rural in the East South Central region will be subject to a payment adjustment; 15.0 percent of hospitals (or 16 hospitals) in the West North Central region will be subject to a payment adjustment; 9.6 percent of hospitals (or 16 hospitals) in the West South Central region will be subject to a payment adjustment; 26.8 percent of hospitals (or 19 hospitals) in the Mountain region will be subject to a payment adjustment; and 3.8 percent of hospitals (or 1 hospital) in the Pacific region will be subject to a payment adjustment.

With regard to the DSH percent characteristic, 19.4 percent of hospitals (or 309 hospitals) characterized in the 0-24 DSH percent will be subject to a payment adjustment; 22.0 percent of hospitals (or 304 hospitals) characterized in the 25-49 DSH percent will be subject to a payment adjustment; 38.1 percent of hospitals (or 67 hospitals) characterized in the 50-64 DSH percent will be subject to a payment adjustment; and 28.9 percent of hospitals (or 46 hospitals) characterized in the 65 and over DSH percent will be subject to a payment adjustment.

With regard to the teaching status characteristic, 17.0 percent of hospitals (or 391 hospitals) characterized as nonteaching will be subject to a payment adjustment; 25.7 percent of hospitals (or 198 hospitals) characterized as fewer than 100 residents will be subject to a payment adjustment; and 56.4 percent of hospitals (or 137 hospitals) characterized as 100 or more residents will be subject to a payment adjustment.

With regard to the urban teaching and DSH characteristic, 35.6 percent of hospitals (or 294 hospitals) characterized as teaching and DSH will be subject to a payment adjustment; 25.0 percent of hospitals (or 32 hospitals) characterized as teaching and no DSH will be subject to a payment adjustment; 19.5 percent of hospitals (or 207 hospitals) characterized as no teaching and DSH will be subject to a payment adjustment; 18.2 percent of hospitals (or 80 hospitals) characterized as no teaching and no DSH will be subject to a payment adjustment; and 13.2 percent of hospitals (or 113 hospitals) characterized as nonurban will be subject to a payment adjustment.

With regard to the type of ownership characteristic, 22.7 percent of hospitals (or 429 hospitals) characterized as voluntary will be subject to a payment adjustment; 18.7 percent of hospitals (or 160 hospitals) characterized as proprietary will be subject to a payment adjustment; and 25.0 percent of hospitals (or 131 hospitals) characterized as government will be subject to a payment adjustment.

With regard to the MCR percent characteristic, 37.4 percent of hospitals (or 145 hospitals) characterized in the 0-24 MCR percent will be subject to a payment adjustment; 22.6 percent of hospitals (or 447 hospitals) characterized in the 25-49 MCR percent will be subject to a payment adjustment; 14.4 percent of hospitals (or 101 hospitals) characterized in the 50-64 MCR percent will be subject to a payment adjustment; and 9.4 percent of hospitals (or 12 hospitals) characterized in the 65 and over MCR percent will be subject to a payment adjustment.

Proportion of Hospitals in the Worst Performing Quartile (>75th Percentile) of the Total HAC Score by Hospital Characteristic for the FY 2015 HAC Reduction Program

Hospital characteristics | Hospitals in the worst performing quartile
Characteristic | Number of hospitals | Percent | Number of hospitals | Percent within characteristic
Total | 3,352 | 100.0 | 726 | 21.7
By geographic location:
All hospitals:
Large urban | 1,338 | 40.4 | 356 | 26.6
Other urban | 1,119 | 33.8 | 257 | 23.0
Rural | 853 | 25.8 | 113 | 13.2
Urban hospitals:
0-99 beds | 626 | 25.5 | 98 | 15.7
100-199 beds | 748 | 30.4 | 155 | 20.7
200-299 beds | 458 | 18.6 | 136 | 29.7
300-399 beds | 260 | 10.6 | 72 | 27.7
400-499 beds | 153 | 6.2 | 63 | 41.2
500 or more beds | 212 | 8.6 | 89 | 42.0
Rural hospitals:
0-49 beds | 334 | 39.2 | 39 | 11.7
50-99 beds | 297 | 34.8 | 37 | 12.5
100-149 beds | 135 | 15.8 | 17 | 12.6
150-199 beds | 50 | 5.9 | 9 | 18.0
200 or more beds | 37 | 4.3 | 11 | 29.7
By region:
Urban by region:
New England | 119 | 4.8 | 36 | 30.3
Mid-Atlantic | 318 | 12.9 | 96 | 30.2
South Atlantic | 404 | 16.4 | 98 | 24.3
East North Central | 391 | 15.9 | 88 | 22.5
East South Central | 149 | 6.1 | 33 | 22.1
West North Central | 161 | 6.6 | 42 | 26.1
West South Central | 377 | 15.3 | 60 | 15.9
Mountain | 162 | 6.6 | 54 | 33.3
Pacific | 376 | 15.3 | 106 | 28.2
Rural by region:
New England | 22 | 2.6 | 4 | 18.2
Mid-Atlantic | 56 | 6.6 | 7 | 12.5
South Atlantic | 130 | 15.2 | 22 | 16.9
East North Central | 115 | 13.5 | 14 | 12.2
East South Central | 159 | 18.6 | 14 | 8.8
West North Central | 107 | 12.5 | 16 | 15.0
West South Central | 167 | 19.6 | 16 | 9.6
Mountain | 71 | 8.3 | 19 | 26.8
Pacific | 26 | 3.0 | 1 | 3.8
By DSH percent:
0-24 | 1,592 | 48.1 | 309 | 19.4
25-49 | 1,383 | 41.8 | 304 | 22.0
50-64 | 176 | 5.3 | 67 | 38.1
65 and over | 159 | 4.8 | 46 | 28.9
By teaching status:
Non-teaching | 2,297 | 69.4 | 391 | 17.0
Fewer than 100 residents | 770 | 23.3 | 198 | 25.7
100 or more residents | 243 | 7.3 | 137 | 56.4
By urban teaching and DSH:
Teaching and DSH | 827 | 25.0 | 294 | 35.6
Teaching and no DSH | 128 | 3.9 | 32 | 25.0
No teaching and DSH | 1,062 | 32.1 | 207 | 19.5
No teaching and no DSH | 440 | 13.3 | 80 | 18.2
Non-urban | 853 | 25.8 | 113 | 13.2
By type of ownership:
Voluntary | 1,890 | 57.8 | 429 | 22.7
Proprietary | 857 | 26.2 | 160 | 18.7
Government | 523 | 16.0 | 131 | 25.0
By MCR percent:
0-24 | 388 | 12.1 | 145 | 37.4
25-49 | 1,977 | 61.9 | 447 | 22.6
50-64 | 703 | 22.0 | 101 | 14.4
65 and over | 128 | 4.0 | 12 | 9.4
Source: FY 2015 HAC Reduction Program Final Rule Results provided by R&A contract. Scores are based on AHRQ PSI 90 data from July 2011 through June 2013 and CLABSI and CAUTI results from January 2012 to December 2013. Hospital Characteristics are based on FY 2015 Proposed Rule Impact File released May 20, 2014.
The total numbers of hospitals with hospital characteristic data (3,310 for geographic location, bed size, region, DSH percent, and teaching status; 3,270 for type of ownership; and 3,196 for MCR) do not add up to the total number of hospitals eligible for the HAC Reduction Program (3,352) because 42 hospitals are not included in the FY 2015 impact file and not all hospitals have data for all characteristics.
This column is the percent of all hospitals with each characteristic that were eligible for the program and included in the FY 2015 impact file. Percentages may not sum to 100 due to rounding.
This column is the percent of hospitals within each characteristic that are in the worst performing quartile.
Total excludes the 46 Maryland hospitals.
Large Urban hospitals are hospitals located in large urban areas (populations over 1 million).
A hospital is considered a teaching hospital if it has an IME adjustment factor for Operating PPS (TCHOP) greater than zero and is considered a DSH hospital if it has a DSH patient percentage greater than zero.

9. Effects of Policy Changes Relating to Payments for Direct GME and IME

Under section IV.K.2. of the preamble of this final rule, we discuss our revisions to simplify and streamline the timing of CMS's policies related to when the FTE resident caps, the 3-year rolling average, and the IRB ratio cap would become effective for new teaching hospitals, by stating that the FTE resident caps, rolling average, and IRB ratio cap will be effective simultaneously, beginning with the applicable hospital's cost reporting period that coincides with or follows the start of the sixth program year of the first new program started. We are specifying that this policy regarding the effective dates of the FTE residency caps, rolling average, and IRB ratio cap for FTE residents in new programs is consistent with the methodology for calculation of the FTE resident caps as described in the FY 2013 IPPS/LTCH PPS final rule, and implemented at 42 CFR 413.79(e)(1) and (3). That is, this policy is effective for urban hospitals that have not yet had FTE resident caps established under § 413.79(e)(1), and for rural hospitals, on or after October 1, 2012. This policy will increase the amount of time that the new programs will be exempt from the FTE resident caps by several months, depending on the cost reporting period of the new teaching hospital. We estimate that the cost of this policy will be less than $5 million a year and, therefore, negligible.

In section IV.K.3.a. of the preamble of this final rule, we discuss our policies related to the effect of new OMB labor market area delineations on certain teaching hospitals training residents in rural areas. Under existing regulations a new teaching hospital has 5 years from when it first begins training residents in its first new program to grow its cap. If the teaching hospital is a rural teaching hospital, it can continue to receive permanent cap adjustments even after the initial 5-year cap-building period ends if it trains residents in a new program. As a result of the implementation of the new OMB delineations, some teaching hospitals may be redesignated from being located in a rural area to an urban area, thereby losing their ability to increase their caps again after their initial 5-year cap-building period. Effective October 1, 2014, if a rural hospital has received a letter of accreditation for a new program and/or started training residents in the new program prior to being redesignated as urban, it can continue growing that program for the remainder of the cap-building period and receive a permanent cap adjustment for that new program. Once the cap-building period for the new program that was started while the hospital was still rural expires, the teaching hospital that has been redesignated as urban will no longer be able to receive any additional permanent cap adjustments.

In section IV.K.3.b. of the preamble of this final rule, we discuss our policy change related to a redesignated hospital's participation in a rural track program. Under existing regulations, if an urban hospital rotates residents to a separately accredited rural track program at a rural site(s) for more than one-half of the duration of the program, the urban hospital may receive an adjustment to its cap for training those FTE residents, referred to as the rural track FTE limitation. We are providing that, effective October 1, 2014, if a rural hospital participating in a rural track is in an area redesignated by OMB as urban after residents started training in the rural track and during the period that is used to calculate the urban hospital's rural track FTE limitation, the urban hospital may still receive a cap adjustment for that rural track. We also are providing that, effective October 1, 2014, if the rural hospital participating in the rural track is in an area redesignated as urban, the redesignated urban hospital can continue to be considered a rural hospital for purposes of the rural track for a transition period that would begin effective with the implementation date of the new OMB delineations and last through the end of the second residency training year following implementation of the new OMB delineations. However, during that transition period, either the rural hospital that has been redesignated as urban must reclassify as rural under § 412.103 for purposes of IME payment only, or the urban hospital must find a new geographically rural site to participate as the rural site for purposes of the rural track, in order for the urban hospital to receive payment under § 413.79(k)(1) or (k)(2) for the rural track program after the transition period ends.

We estimate that these policies discussed under IV.K.3.a. and b. of the preamble of this final rule will have a very minimal, if any, impact on Medicare expenditures. These policies will only be applied to, at the most, very few hospitals (if any at all) and will only apply once every 10 years as a result of OMB changes in labor market area delineations due to a recent Census.

In sections IV.K.5.a. and b. of the preamble of this final rule, we are making some changes to the current application process for and awarding of cap slots from closed hospitals under section 5506 of the Affordable Care Act that will be effective for hospital closures announced on or after October 1, 2014. We are providing an alternative interpretation of the statutory provision at section 5506(d) of the Affordable Care Act, which provides that the Secretary give consideration to the effect of the permanent awarding of slots under section 5506 of the Affordable Care Act to any temporary cap adjustments to a hospital received under § 413.79(h) of the regulations to ensure that there be no duplication of FTE cap slots. In this final rule, we are interpreting the statutory language at section 5506(d) in a manner that will permit us to apply the concept of ensuring no duplication of FTE resident slots on a hospital-by-hospital basis, such that if a hospital is both receiving a temporary cap adjustment under § 413.79(h) and is applying under section 5506 for permanent cap slots, it will not be able to receive a permanent cap adjustment until an equivalent number of displaced residents graduate. However, if a hospital is applying under section 5506 for permanent cap slots and did not receive a temporary cap adjustment under § 413.79(h), that hospital will not have to wait until displaced residents that are training at another hospital graduate to be awarded any permanent cap slots under section 5506. We estimate that this revised policy could result in a slight increase in Medicare expenditures in the rare event that a section 5506 cap adjustment is provided to one hospital before a temporary cap adjustment expires at another hospital. However, we are unable to estimate whether this will occur with any future hospital closures where section 5506 is applied because we do not know how many, if any, residents will be displaced. Furthermore, we believe that any temporary duplicate payment will be a rare occurrence because most hospitals that are receiving a temporary cap adjustment under § 413.79(h) will also receive a permanent cap adjustment under section 5506. In this instance, the hospital will only be able to receive the permanent cap adjustment once the temporary cap adjustment for an equivalent number of FTE residents expires, in which case there would be no duplication of FTE resident slots.

In addition, under section IV.K.5.c. of the preamble of this final rule, we are revising the ranking criteria used to award slots under section 5506. First, we are no longer allowing hospitals to apply for cap relief, which is included under current Ranking Criterion Eight. This change means that hospitals will be awarded slots under section 5506 for taking over a closed hospital's residency training program, having participated with a closed hospital in a Medicare GME affiliated group, taking over part of a closed hospital's program, expanding or starting a new geriatrics program, expanding or starting a new primary care or general surgery program, and expanding or starting a new nonprimary care or nongeneral surgery program. Second, Ranking Criterion One currently applies to hospitals that are assuming (or have assumed) an entire program from the hospital that closed. We are revising this Ranking Criterion to provide priority to a hospital whose FTE resident caps were erroneously reduced by CMS under section 5503 of the Affordable Care Act, contrary to the specific statutory exception at section 1886(h)(8)(A)(ii)(I) of the Act, and the CMS Central Office was made aware of the error prior to the posting of the FY 2015 proposed rule. We do not believe there is any cost associated with these policies. We will continue assigning all of the closed hospital's slots; only the specific hospitals awarded the slots may change.

10. Effects of Implementation of Rural Community Hospital Demonstration Program

In section IV.L. of the preamble of this final rule, we discuss our implementation of section 410A of Public Law 108-173, as amended, which requires the Secretary to conduct a demonstration that would modify reimbursement for inpatient services for up to 30 rural community hospitals. Section 410A(c)(2) requires that “[i]n conducting the demonstration program under this section, the Secretary shall ensure that the aggregate payments made by the Secretary do not exceed the amount which the Secretary would have paid if the demonstration program under this section was not implemented.” As discussed in section IV.L. of the preamble of this final rule, in the IPPS final rules for each of the previous 10 fiscal years, we have estimated the additional payments made by the program for each of the participating hospitals as a result of the demonstration. In order to achieve budget neutrality, we are adjusting the national IPPS rates by an amount sufficient to account for the added costs of this demonstration. In other words, we are applying budget neutrality across the payment system as a whole rather than across the participants of this demonstration. The language of the statutory budget neutrality requirement permits the agency to implement the budget neutrality provision in this manner. The statutory language requires that “aggregate payments made by the Secretary do not exceed the amount which the Secretary would have paid if the demonstration . . . was not implemented” but does not identify the range across which aggregate payments must be held equal.

We are adjusting the national IPPS rates according to the methodology set forth elsewhere in this final rule. The adjustment to the national IPPS rates to account for estimated demonstration cost for FY 2014 for the 7 “pre-expansion” participating hospitals that are currently participating in the demonstration and the 15 additional hospitals participating as a result of the expansion of the demonstration under the Affordable Care Act is $54,177,144. In addition, in this final rule, we are adding to the adjustment of the national IPPS rates the amount by which the actual costs of the demonstration for FY 2008 (as shown in the finalized cost reports for cost reporting periods beginning in FY 2008 for the hospitals that participated in the demonstration during FY 2008) exceed the budget neutrality offset amount that was finalized in the FY 2008 IPPS final rule ($10,389,771). Thus, the resulting total ($64,566,915) is the amount for which an adjustment to inpatient rates for FY 2015 is calculated.
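
For reference, a minimal sketch of the offset arithmetic described above (the estimated demonstration costs plus the FY 2008 excess) is as follows.

```python
# Worked arithmetic for the budget neutrality offset described above: the
# estimated demonstration costs for the participating hospitals plus the amount
# by which actual FY 2008 costs exceeded the FY 2008 offset amount.
estimated_demonstration_costs = 54_177_144
fy2008_excess = 10_389_771
total_offset = estimated_demonstration_costs + fy2008_excess
print(f"${total_offset:,}")  # $64,566,915
```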

11. Effects of Changes Related to Reclassification as Rural for CAHs

In section VI.D.2. of the preamble of this final rule, we discuss our policies relating to reclassifications of CAHs as a result of the adoption of the new OMB labor market area delineations. A facility is eligible for designation as a CAH only if it is either physically located in a rural area or has been reclassified as rural under 42 CFR 412.103. CAHs can be affected by the recent OMB labor market area delineations because facilities that are currently participating as CAHs that were previously located in rural areas may now be located in urban areas as a result of the new delineations. Previously, in both the FY 2005 IPPS final rule and the FY 2010 IPPS/LTCH PPS final rule, we revised the regulations to give currently participating CAHs 2 years, from the effective date of the earlier OMB designations, to reclassify as rural facilities. However, these regulation changes were specific to a particular timeframe. As we are implementing the latest OMB labor market area delineations in this final rule, we are providing that, effective October 1, 2014, currently participating CAHs that are located in an area that has been redesignated from rural to urban under the new delineations will again be treated as rural for 2 years from the date the new OMB delineations are implemented. An affected CAH will have 2 years from the date the redesignation becomes effective to reclassify as rural and thereby retain its CAH status. If a CAH fails to reclassify within those 2 years, it can no longer participate in Medicare as a CAH. However, unlike in previous years when the regulation changes were specific to a particular timeframe, the change that we are making to the regulations is not specific to a particular timeframe but will also apply to future OMB labor market area delineations. We estimate that this policy will have little or no impact on Medicare expenditures because we expect that virtually all of the affected CAHs will be granted rural status by the State in which they are located and, therefore, will be able to apply for reclassification as rural under § 412.103 in order to retain their CAH status.

12. Effects of Revision of the Requirements for Physician Certification of CAH Inpatient Services

In section VI.D.3. of the preamble of this final rule, we discuss the statutory requirement for physician certification of CAH inpatient services. For inpatient CAH services to be payable under Medicare Part A, section 1814(a)(8) of the Act requires that a physician certify that the individual may reasonably be expected to be discharged or transferred to a hospital within 96 hours after admission to the CAH. These statutory requirements are addressed in the regulations at 42 CFR 424.15. In order to provide CAHs with additional flexibility in meeting certification requirements, we are amending the regulation text at § 424.11(d)(5) to remove the phrase “or critical access hospital inpatient”. In addition, we are revising the regulations at § 424.15(b) to read as follows: “Certification begins with the order for inpatient admission. All certification requirements must be completed, signed, and documented in the medical record no later than 1 day before the date on which the claim for the inpatient CAH service is submitted.” We do not believe there is any significant impact on Medicare expenditures associated with these changes because we are simply providing CAHs with additional flexibility in meeting the statutory requirement for physician certification of CAH inpatient services. The underlying statutory requirement itself is unchanged.

13. Effects of Changes Relating to Technical Correction to Administrative Appeals by Providers and Judicial Review

In section VIII. of the preamble to this final rule, we discuss the technical correction to the regulations to eliminate provider dissatisfaction as a requirement for PRRB jurisdiction over appeals based on untimely contractor determinations as well as the change in terminology in Part 405 and Part 413 from “intermediary” or “fiscal intermediary” to “contractor”. There is no impact to the provider resulting from these provisions.

I. Effects of Update to the Reasonable Compensation Equivalent (RCE) Limits for Compensation for Physician Services Provided in Providers

In section VI.B. of the preamble of this final rule, we discuss our finalized policy to update and revise the methodology used to calculate the reasonable compensation equivalent (RCE) limits for compensation for physician services provided in providers, in accordance with our regulations at 42 CFR 415.70(f)(2). For CY 2015, we estimate that 59 cancer and children's hospitals and 46 IPPS teaching hospitals will be subject to the RCE limits. We estimate the costs associated with the updated RCE limits for CY 2015 to be approximately $40 million. We do not expect this RCE limit update to impact a significant number of small, rural entities; therefore, a full impact analysis is not required.

J. Effects of Changes in the Capital IPPS

1. General Considerations

For the impact analysis presented below, we used data from the March 2014 update of the FY 2013 MedPAR file and the March 2014 update of the Provider-Specific File (PSF) that is used for payment purposes. Although the analyses of the changes to the capital prospective payment system do not incorporate cost data, we used the March 2014 update of the most recently available hospital cost report data (FYs 2011 and 2012) to categorize hospitals. Our analysis has several qualifications. We use the best data available and make assumptions about case-mix and beneficiary enrollment as described below.

Due to the interdependent nature of the IPPS, it is very difficult to precisely quantify the impact associated with each change. In addition, we draw upon various sources for the data used to categorize hospitals in the tables. In some cases (for instance, the number of beds), there is a fair degree of variation in the data from different sources. We have attempted to construct these variables with the best available sources overall. However, it is possible that some individual hospitals are placed in the wrong category.

Using cases from the March 2014 update of the FY 2013 MedPAR file, we simulated payments under the capital IPPS for FY 2014 and FY 2015 for a comparison of total payments per case. Any short-term, acute care hospitals not paid under the general IPPS (for example, Indian Health Service hospitals and hospitals in Maryland) are excluded from the simulations.

The methodology for determining a capital IPPS payment is set forth at § 412.312. The basic methodology for calculating capital IPPS payments in FY 2015 is as follows:

(Standard Federal Rate) × (DRG weight) × (GAF) × (COLA for hospitals located in Alaska and Hawaii) × (1 + DSH Adjustment Factor + IME adjustment factor, if applicable).

In addition to the other adjustments, hospitals may also receive outlier payments for those cases that qualify under the threshold established for each fiscal year. We modeled payments for each hospital by multiplying the capital Federal rate by the GAF and the hospital's case-mix. We then added estimated payments for indirect medical education, disproportionate share, and outliers, if applicable. For purposes of this impact analysis, the model includes the following assumptions:

  • We estimate that the Medicare case-mix index will increase by 0.5 percent in both FYs 2014 and 2015.
  • We estimate that Medicare discharges will be approximately 11.6 million in FY 2014 and 11.7 million in FY 2015.
  • The capital Federal rate was updated beginning in FY 1996 by an analytical framework that considers changes in the prices associated with capital-related costs and adjustments to account for forecast error, changes in the case-mix index, allowable changes in intensity, and other factors. As discussed in section III.A.1.a. of the Addendum to this final rule, the update is 1.5 percent for FY 2015.
  • In addition to the FY 2015 update factor, the FY 2015 capital Federal rate was calculated based on a GAF/DRG budget neutrality adjustment factor of 0.9986 and an outlier adjustment factor of 0.9373. As discussed in section VI.C. of the preamble of this final rule, we are not making an additional MS-DRG documentation and coding adjustment to the capital IPPS Federal rates for FY 2015.
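
For illustration only, the capital payment formula shown above can be expressed as follows; all input values are hypothetical, and outlier payments for qualifying cases are added separately.

```python
# Illustrative implementation of the capital IPPS payment formula shown above:
# (standard Federal rate) x (DRG weight) x (GAF) x (COLA)
#   x (1 + DSH adjustment factor + IME adjustment factor).
# All input values below are hypothetical.

def capital_ipps_payment(standard_federal_rate, drg_weight, gaf,
                         cola=1.0, dsh_factor=0.0, ime_factor=0.0):
    return (standard_federal_rate * drg_weight * gaf * cola
            * (1 + dsh_factor + ime_factor))

print(round(capital_ipps_payment(standard_federal_rate=430.00, drg_weight=1.5,
                                 gaf=1.02, dsh_factor=0.05, ime_factor=0.03), 2))
# 710.53
```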

2. Results

We used the actuarial model described above to estimate the potential impact of our changes for FY 2015 on total capital payments per case, using a universe of 3,396 hospitals. As described above, the individual hospital payment parameters are taken from the best available data, including the March 2014 update of the FY 2013 MedPAR file, the March 2014 update to the PSF, and the most recent cost report data from the March 2014 update of HCRIS. In Table III, we present a comparison of estimated total payments per case for FY 2014 and estimated total payments per case for FY 2015 based on the FY 2015 payment policies. Column 2 shows estimates of payments per case under our model for FY 2014. Column 3 shows estimates of payments per case under our model for FY 2015. Column 4 shows the total percentage change in payments from FY 2014 to FY 2015. The change represented in Column 4 includes the 1.5 percent update to the capital Federal rate and other changes in the adjustments to the capital Federal rate. The comparisons are provided by: (1) Geographic location; (2) region; and (3) payment classification.

The simulation results show that, on average, capital payments per case in FY 2015 are expected to increase as compared to capital payments per case in FY 2014. This expected increase is due primarily to the approximately 1.2 percent increase in the capital Federal rate for FY 2015 as compared to the FY 2014 capital Federal rate. (For a discussion of the determination of the capital Federal rate, we refer readers to section III.A. of the Addendum to this final rule.) Overall, across all hospitals, the changes to the GAFs are expected to have no net effect on capital payments. However, regionally, the effects of the changes to the GAFs on capital payments are consistent with the projected changes in payments due to changes in the wage index (and policies affecting the wage index) as shown in Table I in section I.G. of this Appendix.

Overall, there is an increase in capital payments per case due to the effects of changes to the MS-DRG reclassifications and recalibrations, with more of this increase expected for urban hospitals. However, this increase is offset by projected changes in outlier payments for both urban and rural hospitals. Rural areas are expected to experience an offset to the projected increase in capital payments per case due to the effects of changes to the GAFs.

The net impact of these changes is an estimated 1.5 percent change in capital payments per case from FY 2014 to FY 2015 for all hospitals (as shown below in Table III).

The geographic comparison shows that, on average, all hospitals are expected to experience an increase in capital IPPS payments per case in FY 2015 as compared to FY 2014. As we stated above, these expected increases are primarily due to the increase in the capital Federal rate. Capital IPPS payments per case for hospitals in “large urban areas” are expected to have an estimated increase of 1.7 percent, while hospitals in rural areas, on average, are expected to experience a 1.0 percent increase in capital payments per case from FY 2014 to FY 2015. Capital IPPS payments per case for “other urban hospitals” are estimated to increase 1.4 percent. The primary factor contributing to the difference in the projected increase in capital IPPS payments per case for urban hospitals as compared to rural hospitals is the increase in capital payments to urban hospitals due to changes to the MS-DRG relative weights and the effect of changes in the GAFs. The increase in capital payments due to changes to the MS-DRG relative weights is slightly lower for rural hospitals than it is for urban hospitals. In addition, rural hospitals are expected to experience a slight decrease in capital payments due to the effect of changes in the GAFs, while urban hospitals are expected to experience a slight increase in capital payments due to the effect of changes in the GAFs.

The comparisons by region show that the estimated increases in capital payments per case from FY 2014 to FY 2015 in urban areas range from a 2.4 percent increase for the Pacific urban region to a 0.9 percent increase for the West South Central urban region. For rural regions, the Pacific rural region is expected to experience the largest increase in capital IPPS payments per case of 2.4 percent, while the Mountain rural region is projected to have the smallest increase in capital payments per case of 0.5 percent, compared to FY 2014 payments per case. Unlike most other urban and rural regions where changes in the GAFs either contribute to a projected decrease in capital payments or only a small increase in capital payments, the changes in the GAFs are a primary contributor to the expected increase in capital IPPS payments per case for the Pacific urban and rural regions. A larger than average decrease in capital payments per case for the Mountain rural area due to the change in outliers offsets the projected increases to that area's capital payments per case in FY 2015 compared to FY 2014.

Hospitals of all types of ownership (that is, voluntary hospitals, government hospitals, and proprietary hospitals) are estimated to experience an increase in capital payments per case from FY 2014 to FY 2015. The increase in capital payments for voluntary hospitals is estimated at 1.6 percent, and for proprietary and government hospitals the increase is estimated to be 1.4 percent.

Section 1886(d)(10) of the Act established the MGCRB. Hospitals may apply for reclassification for purposes of the wage index for FY 2015. Reclassification for wage index purposes also affects the GAFs because that factor is constructed from the hospital wage index. To present the effects of the hospitals being reclassified as of the publication of this final rule for FY 2015, we show the average capital payments per case for reclassified hospitals for FY 2015. Urban reclassified hospitals are expected to experience an increase in capital payments of 2.1 percent, whereas for urban nonreclassified hospitals, the expected increase is 1.4 percent. The estimated percentage increase for rural reclassified hospitals is 1.0 percent, and for rural nonreclassified hospitals, the estimated percentage increase is 0.7 percent. Other reclassified hospitals (that is, hospitals reclassified under section 1886(d)(8)(B) of the Act) are expected to experience the largest increase (2.2 percent) in capital payments from FY 2014 to FY 2015.

Table III—Comparison of Total Payments per Case

[FY 2014 payments compared to FY 2015 payments; average payments per case in dollars; change in percent]

| Hospital category | Number of hospitals | Average FY 2014 payments/case ($) | Average FY 2015 payments/case ($) | Change (percent) |
| --- | --- | --- | --- | --- |
| By Geographic Location: | | | | |
| All hospitals | 3,396 | 856 | 869 | 1.5 |
| Large urban areas (populations over 1 million) | 1,401 | 944 | 960 | 1.7 |
| Other urban areas (populations of 1 million or fewer) | 1,148 | 824 | 835 | 1.4 |
| Rural areas | 847 | 583 | 588 | 1.0 |
| Urban hospitals | 2,549 | 890 | 903 | 1.6 |
| 0-99 beds | 666 | 733 | 739 | 0.9 |
| 100-199 beds | 787 | 772 | 783 | 1.4 |
| 200-299 beds | 455 | 812 | 826 | 1.7 |
| 300-499 beds | 429 | 908 | 922 | 1.6 |
| 500 or more beds | 212 | 1,066 | 1,082 | 1.6 |
| Rural hospitals | 847 | 583 | 588 | 1.0 |
| 0-49 beds | 328 | 474 | 479 | 1.1 |
| 50-99 beds | 305 | 542 | 546 | 0.7 |
| 100-149 beds | 125 | 582 | 588 | 1.0 |
| 150-199 beds | 50 | 636 | 643 | 1.0 |
| 200 or more beds | 39 | 709 | 717 | 1.1 |
| By Region: | | | | |
| Urban by Region | 2,549 | 890 | 903 | 1.6 |
| New England | 120 | 984 | 1,001 | 1.7 |
| Middle Atlantic | 324 | 958 | 978 | 2.0 |
| South Atlantic | 407 | 802 | 812 | 1.3 |
| East North Central | 397 | 856 | 868 | 1.4 |
| East South Central | 153 | 764 | 772 | 1.0 |
| West North Central | 162 | 880 | 892 | 1.3 |
| West South Central | 387 | 823 | 830 | 0.9 |
| Mountain | 162 | 907 | 918 | 1.2 |
| Pacific | 385 | 1,120 | 1,148 | 2.4 |
| Puerto Rico | 52 | 408 | 412 | 1.1 |
| Rural by Region | 847 | 583 | 588 | 1.0 |
| New England | 22 | 812 | 823 | 1.4 |
| Middle Atlantic | 57 | 566 | 575 | 1.6 |
| South Atlantic | 132 | 555 | 559 | 0.7 |
| East North Central | 116 | 607 | 613 | 1.0 |
| East South Central | 165 | 534 | 538 | 0.9 |
| West North Central | 102 | 619 | 624 | 0.8 |
| West South Central | 168 | 515 | 518 | 0.7 |
| Mountain | 61 | 653 | 657 | 0.5 |
| Pacific | 24 | 749 | 767 | 2.4 |
| [There are no rural hospitals in Puerto Rico] | | | | |
| By Payment Classification: | | | | |
| All hospitals | 3,396 | 856 | 869 | 1.5 |
| Large urban areas (populations over 1 million) | 1,413 | 943 | 959 | 1.7 |
| Other urban areas (populations of 1 million or fewer) | 1,150 | 823 | 835 | 1.4 |
| Rural areas | 833 | 594 | 599 | 0.8 |
| Teaching Status: | | | | |
| Non-teaching | 2,357 | 728 | 738 | 1.4 |
| Fewer than 100 Residents | 795 | 837 | 850 | 1.5 |
| 100 or more Residents | 244 | 1,210 | 1,231 | 1.7 |
| Urban DSH: | | | | |
| 100 or more beds | 1,588 | 911 | 925 | 1.6 |
| Less than 100 beds | 383 | 649 | 656 | 1.0 |
| Rural DSH: | | | | |
| Sole Community (SCH/EACH) | 373 | 530 | 535 | 1.0 |
| Referral Center (RRC/EACH) | 212 | 656 | 661 | 0.8 |
| Other Rural: | | | | |
| 100 or more beds | 24 | 552 | 552 | 0.0 |
| Less than 100 beds | 137 | 465 | 469 | 0.8 |
| Urban teaching and DSH: | | | | |
| Both teaching and DSH | 842 | 990 | 1,005 | 1.6 |
| Teaching and no DSH | 133 | 891 | 907 | 1.8 |
| No teaching and DSH | 1,129 | 762 | 774 | 1.6 |
| No teaching and no DSH | 459 | 788 | 799 | 1.4 |
| Rural Hospital Types: | | | | |
| Non special status hospitals | 2,575 | 890 | 904 | 1.5 |
| RRC/EACH | 193 | 717 | 730 | 1.8 |
| SCH/EACH | 325 | 652 | 659 | 1.1 |
| SCH, RRC and EACH | 124 | 711 | 720 | 1.3 |
| Hospitals Reclassified by the Medicare Geographic Classification Review Board: | | | | |
| FY2015 Reclassifications: | | | | |
| All Urban Reclassified | 450 | 886 | 904 | 2.1 |
| All Urban Non-Reclassified | 2,054 | 893 | 906 | 1.4 |
| All Rural Reclassified | 269 | 621 | 628 | 1.0 |
| All Rural Non-Reclassified | 514 | 533 | 536 | 0.7 |
| Other Reclassified Hospitals (Section 1886(d)(8)(B)) | 59 | 581 | 594 | 2.2 |
| Type of Ownership: | | | | |
| Voluntary | 1,935 | 868 | 882 | 1.6 |
| Proprietary | 892 | 776 | 787 | 1.4 |
| Government | 542 | 895 | 908 | 1.4 |
| Medicare Utilization as a Percent of Inpatient Days: | | | | |
| 0-25 | 501 | 1,023 | 1,038 | 1.5 |
| 25-50 | 2,081 | 871 | 884 | 1.5 |
| 50-65 | 601 | 717 | 728 | 1.5 |
| Over 65 | 93 | 648 | 654 | 1.0 |

K. Effects of Payment Rate Changes and Policy Changes Under the LTCH PPS

1. Introduction and General Considerations

In section VII. of the preamble of this final rule and section V. of the Addendum to this final rule, we set forth the annual update to the payment rates for the LTCH PPS for FY 2015. In the preamble of this final rule, we specify the statutory authority for the provisions that are presented, identify those policies, and present rationales for our decisions as well as alternatives that were considered. In this section of Appendix A to this final rule, we discuss the impact of the changes to the payment rate, factors, and other payment rate policies related to the LTCH PPS that are presented in the preamble of this final rule in terms of their estimated fiscal impact on the Medicare budget and on LTCHs.

Currently, there are 422 LTCHs included in this impact analysis: 80 nonprofit (voluntary ownership control) LTCHs, 330 proprietary LTCHs, and 12 LTCHs that are government-owned and operated. (We note that, although there are currently approximately 430 LTCHs, for purposes of this impact analysis, we excluded the data of all-inclusive rate providers and of LTCHs that are paid in accordance with demonstration projects, consistent with the development of the FY 2015 MS-LTC-DRG relative weights (discussed in section VII.B.3.c. of the preamble of this final rule).) In the impact analysis, we used the payment rate, factors, and policies presented in this final rule to estimate the change in payments for FY 2015, including: the 2.2 percent annual update for LTCHs that submit quality data in accordance with section 1886(m)(5)(C) of the Act, which is based on the full estimated increase in the LTCH PPS market basket and the reductions required by sections 1886(m)(3) and (m)(4) of the Act; the final year of the phase-in of a one-time prospective adjustment factor of 0.98734 (approximately −1.3 percent); the update to the MS-LTC-DRG classifications and relative weights; the update to the wage index values (including the implementation of the new OMB delineations) and the labor-related share; and the best available claims and CCR data. (As discussed in section VII.C. of the preamble of this final rule, in accordance with section 1886(m)(5)(C) of the Act, for LTCHs that fail to submit quality data, the annual update to the LTCH PPS standard Federal rate is reduced by 2.0 percentage points in FY 2015.)

The standard Federal rate for FY 2014 is $40,607.31 for LTCHs that submit quality data in accordance with the requirements of section 1886(m)(5)(C) of the Act. For FY 2015, we are establishing a standard Federal rate of $41,043.71 for LTCHs that submit quality data in accordance with the requirements of section 1886(m)(5)(C) of the Act. This rate reflects the 2.2 percent annual update to the standard Federal rate; the area wage budget neutrality factor of 1.0016703, which ensures that the changes in the wage index (including the implementation of the new OMB delineations) and the labor-related share do not influence aggregate payments; and the final year of the phase-in of a one-time prospective adjustment factor of 0.98734. For LTCHs that fail to submit data for the LTCHQR Program, in accordance with section 1886(m)(5)(C) of the Act, we are establishing a standard Federal rate of $40,240.51. This reduced standard Federal rate reflects the updates described above as well as the required 2.0 percentage point reduction to the annual update for failure to submit data to the LTCHQR Program. We note that the factors described above to determine the FY 2015 standard Federal rate are applied to the FY 2014 standard Federal rate set forth under § 412.523(c)(3)(ix)(A) (that is, $40,607.31).
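For illustration only, the following short sketch (our own restatement, not CMS rate-setting code; the variable names are ours) shows how the FY 2015 standard Federal rates cited above follow from the FY 2014 rate and the factors described in this section:

```python
# Illustrative restatement of the FY 2015 standard Federal rate arithmetic described above.
fy2014_rate = 40_607.31      # FY 2014 standard Federal rate ($)
annual_update = 1.022        # 2.2 percent annual update (quality reporters)
reduced_update = 1.002       # 0.2 percent update after the 2.0 percentage point reduction
wage_bn = 1.0016703          # area wage level budget neutrality factor
one_time_adj = 0.98734       # final year of the one-time prospective adjustment (about -1.3 percent)

full_rate = fy2014_rate * annual_update * wage_bn * one_time_adj      # about $41,043.71
reduced_rate = fy2014_rate * reduced_update * wage_bn * one_time_adj  # about $40,240.51
print(round(full_rate, 2), round(reduced_rate, 2))
```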

Based on the best available data for the 422 LTCHs in our database, we estimate that the annual update to the standard Federal rate for FY 2015, the update to the MS-LTC-DRG classifications and relative weights for FY 2015 (discussed in section VII.B. of the preamble to this final rule), and the changes to the area wage adjustment for FY 2015 (discussed in section V.B. of the Addendum to this final rule), in addition to an estimated increase in HCO payments, will result in an increase in estimated payments from FY 2014 of approximately $62 million. Based on the 422 LTCHs in our database, we estimate that FY 2015 LTCH PPS payments would be approximately $5.614 billion, as compared to estimated FY 2014 LTCH PPS payments of approximately $5.552 billion. Because the combined distributional effects and estimated changes to Medicare program payments exceed $100 million, this final rule is considered a major economic rule, as defined in this section. We note that the approximately $62 million projected increase in estimated aggregate LTCH PPS payments from FY 2014 to FY 2015 does not reflect changes in LTCH admissions or case-mix intensity, which also will affect overall payment changes. In addition, it does not reflect the estimated change in aggregate LTCH PPS payments due to the projected impact of certain other LTCH PPS policy changes, which are discussed below in section I.K.3.b. of this Appendix.

The projected 1.1 percent increase in estimated payments per discharge from FY 2014 to FY 2015 is attributable to several factors, including the 2.2 percent annual update to the standard Federal rate (or 0.2 percent annual update for LTCHs that failed to submit data under the requirements of the LTCHQR Program), the one-time prospective adjustment factor for FY 2015 of 0.98734 (approximately −1.3 percent), and projected increases in estimated HCO payments. The net effect of the 2.2 percent annual update and the approximately −1.3 percent one-time prospective adjustment factor is approximately 0.9 percent (that is, 2.2 percent − 1.3 percent = 0.9 percent). However, Table IV (Column 6) shows that the estimated change attributable solely to the annual update to the standard Federal rate (2.2 percent for LTCHs that submit quality data under the requirements of the LTCHQR Program and 0.2 percent for LTCHs that failed to submit such data), including the final year of the phase-in of the one-time prospective adjustment factor for FY 2015 (approximately −1.3 percent), is projected to result in an increase of 0.8 percent in payments per discharge from FY 2014 to FY 2015, on average, for all LTCHs. This is because the estimated increase of 0.8 percent shown in Column 6 of Table IV also includes estimated payments for SSO cases that are paid using special methodologies that are not affected by the annual update to the standard Federal rate. Therefore, for all hospital categories, the projected increase in payments based on the standard Federal rate is slightly less than the net effect of the 2.2 percent annual update and the approximately −1.3 percent one-time prospective adjustment factor (or 0.9 percent) for FY 2015. Because we are applying an area wage level budget neutrality factor to the standard Federal rate, the annual update to the wage data (including the implementation of the new OMB delineations) and the labor-related share do not affect the increase in aggregate payments.

As discussed in section V.B. of the Addendum to this final rule, we are updating the wage index values for FY 2015 based on the most recent available data and the adoption of the new OMB labor market area delineations. Under our adoption of the new OMB delineations, we are establishing and applying a transitional blended wage index for FY 2015 for LTCHs that will have a lower wage index value under those delineations, as discussed in section VII.D.2. of the preamble of this final rule. Therefore, for such LTCHs, this impact analysis reflects a blended wage index that is calculated as a 50/50 blend of the wage index under the current CBSA designations and the wage index under the new OMB delineations. In addition, we are slightly lowering the labor-related share under the LTCH PPS for FY 2015 from 62.537 percent to 62.306 percent, based on the most recent available data on the relative importance of the labor-related share of operating and capital costs under the FY 2009-based LTCH-specific market basket. We also are applying an area wage level budget neutrality factor of 1.0016703, which increases the standard Federal rate by approximately 0.17 percent. Therefore, the changes to the wage data, including the adoption of the new OMB delineations, and the labor-related share do not result in a change in estimated aggregate LTCH PPS payments.

Table IV below shows the impact of the payment rate and policy changes on LTCH PPS payments for FY 2015 presented in this final rule by comparing estimated FY 2014 payments to estimated FY 2015 payments. The projected increase in payments from FY 2014 to FY 2015 of 1.1 percent is attributable to the impacts of the change to the standard Federal rate (0.9 percent in Column 6), the estimated slight increase in payments for HCO cases (0.1 percent), and an estimated increase in payments for SSO cases (0.2 percent). We estimate that total HCO payments will increase slightly from FY 2014 to FY 2015 in order to ensure that estimated HCO payments will be 8 percent of the total estimated LTCH PPS payments in FY 2015. An analysis of the most recent available LTCH PPS claims data (that is, FY 2013 claims data from the March 2014 update of the MedPAR file) indicates that the FY 2014 HCO threshold of $13,314 (as established in the FY 2014 IPPS/LTCH PPS final rule) may result in HCO payments in FY 2014 that are slightly below the estimated 8 percent. Specifically, we currently estimate that HCO payments will be approximately 7.9 percent of the estimated total LTCH PPS payments in FY 2014. We estimate that the impact of the slight increase in HCO payments will result in approximately a 0.1 percent increase in estimated payments from FY 2014 to FY 2015, on average, for all LTCHs. Furthermore, in calculating the estimated HCO payments for FYs 2014 and 2015, we increased estimated costs by the applicable market basket percentage increase as projected by our actuaries. This increase in estimated costs also results in a projected increase in SSO payments of approximately 0.2 percent relative to FY 2014. The net result of these projected changes in HCO and SSO payments in FY 2015 is an estimated change in aggregate payments of 0.3 percent. We note that estimated payments for all SSO cases comprise approximately 12 percent of the estimated total LTCH PPS payments, and estimated payments for HCO cases comprise approximately 8 percent of the estimated total FY 2015 LTCH PPS payments. Payments for HCO cases are based on 80 percent of the estimated cost of the case above the HCO threshold, while the majority of the payments for SSO cases (approximately 60 percent) are based on the estimated cost of the case.
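As a simplified illustration of the HCO mechanics described above (a sketch under our own assumptions, not the regulatory text; the function and parameter names are ours, and in practice the case-level threshold is generally the otherwise applicable LTCH PPS payment plus the fixed-loss amount):

```python
# Simplified sketch of the HCO payment mechanics described above.
# Costs are estimated from billed charges and the hospital's cost-to-charge
# ratio (CCR), inflated to the payment year; an HCO payment equals 80 percent
# of the estimated cost above the applicable case-level HCO threshold.
def estimated_cost(covered_charges, ccr, inflation_factor=1.0):
    return covered_charges * ccr * inflation_factor

def hco_payment(case_cost, case_threshold, marginal_share=0.80):
    # No HCO payment is made unless the estimated cost exceeds the threshold.
    return marginal_share * max(case_cost - case_threshold, 0.0)
```

For example, under this sketch a case whose estimated cost exceeds its threshold by $5,000 would generate an HCO payment of $4,000.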

In addition to the projected increase in LTCH PPS payments per discharge of approximately $62 million (1.1 percent) from FY 2014 to FY 2015, as shown in Table IV below, we also estimate that the net effect of the projected impact of certain other LTCH PPS policy changes (that is, the reinstatement of the moratorium on the full implementation of the “25-percent policy” payment adjustment; the reinstatement of the moratorium on the development of new LTCHs and LTCH satellite facilities and additional LTCH beds; the revocation of the on-site discharges and readmissions policy; and the payment adjustment for “subclause (II)” LTCHs) will result in a $116 million increase in aggregate LTCH PPS payments in FY 2015. The individual impacts of these policy changes are discussed in greater detail below in section I.K.3.b. of this Appendix.

As we discuss in detail throughout this final rule, based on the most recent available data, we believe that the provisions of this final rule relating to the LTCH PPS will result in an increase in estimated aggregate LTCH PPS payments and that the resulting LTCH PPS payment amounts will result in appropriate Medicare payments.

2. Impact on Rural Hospitals

For purposes of section 1102(b) of the Act, we define a small rural hospital as a hospital that is located outside of an urban area and has fewer than 100 beds. As shown in Table IV, we are projecting a 1.2 percent increase in estimated payments per discharge for FY 2015 as compared to FY 2014 for rural LTCHs that will result from the changes presented in this final rule, as well as the effect of estimated changes to HCO and SSO payments. This estimated impact is based on the data for the 22 rural LTCHs in our database (out of 422 LTCHs) for which complete data were available.

The estimated increase in LTCH PPS payments from FY 2014 to FY 2015 for rural LTCHs (1.2 percent) is slightly greater than the national average increase (1.1 percent). The estimated increase in LTCH PPS payments from FY 2014 to FY 2015 for rural LTCHs is primarily due to the increase to the standard Federal rate.

3. Anticipated Effects of LTCH PPS Payment Rate Changes and Policy Changes

a. Budgetary Impact

Section 123(a)(1) of the BBRA requires that the PPS developed for LTCHs “maintain budget neutrality.” We believe that the statute's mandate for budget neutrality applies only to the first year of the implementation of the LTCH PPS (that is, FY 2003). Therefore, in calculating the FY 2003 standard Federal rate under § 412.523(d)(2), we set total estimated payments for FY 2003 under the LTCH PPS so that estimated aggregate payments under the LTCH PPS were estimated to equal the amount that would have been paid if the LTCH PPS had not been implemented.

As discussed above in section I.K.1. of this Appendix, we project an increase in aggregate LTCH PPS payments per discharge in FY 2015 relative to FY 2014 of approximately $62 million based on the 422 LTCHs in our database. In addition, as discussed below in section I.K.3.b. of this Appendix, we also estimate that the net effect of the projected impact of certain other LTCH PPS policy changes will result in a $116 million increase in aggregate LTCH PPS payments in FY 2015.

b. Impact of Certain LTCH PPS Policy Changes

(1) Reinstatement of the Moratorium on the Full Implementation of the “25-Percent Policy” Payment Adjustment (§ 412.534 and § 412.536) and Reinstatement of the Moratorium on the Development of New LTCHs and LTCH Satellites and Additional LTCH beds (§ 412.23(e) and §§ 412.23(e)(6) and (7))

Section 1206(b) of Public Law 113-67 provides for the retroactive reinstatement and extension, for an additional 4 years, of the moratorium on the full implementation of the 25-percent threshold payment adjustment (referred to as the “25-percent policy” payment adjustment) established under section 114(c) of the MMSEA, as amended by section 4302(a) of the ARRA and sections 3106(c) and 10312(a) of the Affordable Care Act. As discussed in section VII.E. of the preamble of this final rule, we are reinstating this payment adjustment retroactively for LTCH cost reporting periods beginning on or after July 1, 2013 or October 1, 2013, as applicable under the regulations at § 412.534 and § 412.536.

Section 1206(b)(2) of Public Law 113-67, as amended by section 112(b) of the Protecting Access to Medicare Act of 2014 (Pub. L. 113-93), provides for moratoria on the establishment of new LTCHs and LTCH satellite facilities and on bed increases in LTCHs effective for the period beginning April 1, 2014, and ending September 30, 2017. This statutory provision also provides specific exceptions to the moratorium on the establishment of new LTCHs and LTCH satellites. We are implementing this policy under the regulations at § 412.23(e) and §§ 412.23(e)(6) and (7), respectively. For additional details, refer to section VII.G. of the preamble of this final rule.

Our Office of the Actuary projects that the reinstatement of the “25-percent policy” payment adjustment will result in approximately a $120 million increase in aggregate LTCH PPS payments in FY 2015. In addition, our Office of the Actuary projects that the portion of the moratoria on the establishment of new LTCHs and LTCH satellite facilities and on additional LTCH beds that will occur during FY 2015 will result in approximately a $30 million reduction in aggregate LTCH PPS payments in FY 2015. Therefore, we project that our implementation of both of these statutory provisions will result in approximately a $90 million net increase in aggregate LTCH PPS payments in FY 2015.

(2) Revocation of On-Site Discharges and Readmissions Policy (§ 412.532)

As discussed in section VII.F. of the preamble of this final rule, we are removing the discharge and readmission requirement specified in the regulations under § 412.532 (referred to as the “5-percent payment threshold”). Under the “5-percent payment threshold” policy, if an LTCH (or an LTCH satellite facility) directly readmits more than 5 percent of its total Medicare inpatients discharged from an “on-site facility” (for example, a co-located acute care hospital, IRF, or SNF or, in the case of an LTCH satellite facility, a facility co-located with the satellite facility), all such discharges to the co-located “on-site facility” and the readmissions to the LTCH are treated as one discharge for that cost reporting period, and, as such, one LTCH PPS payment is made on the basis of each patient's initial principal diagnosis. We estimate that the discontinuation of the “5-percent payment threshold” policy will result in an increase of approximately $20 million in aggregate LTCH PPS payments in FY 2015. (We note, as also discussed in section VII.F. of the preamble of this final rule, that after consideration of public comments, we are not finalizing the proposed revision to the fixed-day thresholds under the greater than 3-day interruption of stay policy under § 412.531.)

(3) Payment Adjustment for “Subclause (II)” LTCHs (§ 412.526)

Section 1206(d) of Public Law 113-67 requires the Secretary to evaluate payments and regulations governing “hospitals which are classified under subclause (II) of subsection (d)(1)(B)(iv)”. In addition, based on the result of such evaluations, the statute authorizes the Secretary to adjust the payment rates for this type of hospital and to adjust regulations governing a subclause (II) LTCH that otherwise apply to subclause (I) LTCHs. As discussed in section VII.H. of the preamble of this final rule, under new § 412.526, we are applying a payment adjustment under the LTCH PPS to a subclause (II) LTCH beginning in FY 2015 that will result in payments to this type of LTCH resembling those under the reasonable cost TEFRA payment system model. Our Office of the Actuary projects that the payment adjustment for “subclause (II)” LTCHs will increase aggregate LTCH PPS payments in FY 2015 by approximately $6 million.

c. Impact on Providers

The basic methodology for determining a per discharge LTCH PPS payment is set forth under § 412.515 through § 412.536. In addition to the basic MS-LTC-DRG payment (the standard Federal rate multiplied by the MS-LTC-DRG relative weight), we make adjustments for differences in area wage levels, a COLA for LTCHs located in Alaska and Hawaii, and SSOs. Furthermore, LTCHs may also receive HCO payments for those cases that qualify based on the threshold established each year.
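The per discharge payment structure described above can be sketched as follows (an illustrative simplification with our own function and parameter names, not the payment specifications themselves; SSO and HCO cases are then handled under their own methodologies):

```python
# Illustrative sketch of an unadjusted LTCH PPS per discharge payment:
# the standard Federal rate is adjusted for area wage levels (and a COLA
# for LTCHs located in Alaska and Hawaii) and multiplied by the
# MS-LTC-DRG relative weight.
def ms_ltc_drg_payment(standard_federal_rate, relative_weight,
                       labor_share, wage_index, nonlabor_share, cola=1.0):
    adjusted_rate = standard_federal_rate * (labor_share * wage_index
                                             + nonlabor_share * cola)
    return adjusted_rate * relative_weight
```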

To understand the impact of the changes to the LTCH PPS payments presented in this final rule on different categories of LTCHs for FY 2015, it is necessary to estimate payments per discharge for FY 2014 using the rates, factors, and policies established in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50753 through 50760 and 51002), including the FY 2014 GROUPER (Version 31.0) and relative weights. It is also necessary to estimate the payments per discharge that will be made under the LTCH PPS rates, factors, and GROUPER (Version 32.0) for FY 2015 (as discussed in section VII. of the preamble of this final rule and section V. of the Addendum to this final rule). These estimates of FY 2014 and FY 2015 LTCH PPS payments are based on the best available LTCH claims data and other factors, such as the application of inflation factors to estimate costs for SSO and HCO cases in each year. We also evaluated the change from estimated FY 2014 payments to estimated FY 2015 payments (on a per discharge basis) for each category of LTCHs. We are establishing a standard Federal rate for FY 2015 of $41,043.71 (for LTCHs that submit quality data under the requirements of the LTCHQR Program), which includes the 2.2 percent annual update, the area wage budget neutrality factor of 1.0016703, and a one-time prospective adjustment to the standard Federal rate for FY 2015 of 0.98734 (approximately −1.3 percent). For LTCHs that fail to submit data under the requirements of the LTCHQR Program, we are establishing a standard Federal rate for FY 2015 of $40,240.51, which includes a 2.0 percentage point reduction applied to the annual update under the requirements of section 1886(m)(5)(C) of the Act in addition to the other adjustments noted above.

Hospital groups were based on characteristics provided in the OSCAR data, FY 2010 through FY 2012 cost report data in HCRIS, and PSF data. Hospital groups included the following:

  • Location: large urban/other urban/rural.
  • Participation date.
  • Ownership control.
  • Census region.
  • Bed size.

To estimate the impacts of the payment rates and policy changes among the various categories of existing providers, we used LTCH cases from the FY 2013 MedPAR file to estimate payments for FY 2014 and to estimate payments for FY 2015 for 422 LTCHs. We believe that the discharges based on the FY 2013 MedPAR data for the 422 LTCHs in our database, which includes 330 proprietary LTCHs, provide sufficient representation of the MS-LTC-DRGs containing the diagnoses most commonly treated in LTCHs.

d. Calculation of Prospective Payments

For purposes of this impact analysis, to estimate per discharge payments under the LTCH PPS, we simulated payments on a case-by-case basis using LTCH claims from the FY 2013 MedPAR files. For modeling estimated LTCH PPS payments for FY 2014, we used the FY 2014 standard Federal rates (that is, $40,607.31 for LTCHs that submit quality data under the requirements of the LTCHQR Program and $39,808.74 for LTCHs that failed to submit quality data under the requirements of the LTCHQR Program) that were used to make payments for LTCH discharges occurring on or after October 1, 2013, through September 30, 2014.

For modeling estimated LTCH PPS payments for FY 2015, we used the FY 2015 standard Federal rate of $41,043.71 (for LTCHs that submit quality data under the requirements of the LTCHQR Program), which includes a one-time prospective adjustment of 0.98734 for FY 2015 for the final year of the 3-year phase-in. For LTCHs that we project to have failed to submit the requisite quality data for FY 2015 under the LTCHQR Program, we used the FY 2015 standard Federal rate of $40,240.51, which reflects the 2.0 percentage point reduction required by section 1886(m)(5)(C) of the Act. The FY 2015 standard Federal rates also include the application of an area wage level budget neutrality factor of 1.0016703 (as discussed in section V.B.5. of the Addendum to this final rule). Furthermore, in modeling estimated LTCH PPS payments for both FY 2014 and FY 2015 in this impact analysis, we applied the FY 2014 and the FY 2015 adjustments for area wage levels and the COLA for LTCHs located in Alaska and Hawaii. Specifically, we adjusted for differences in area wage levels in determining estimated FY 2014 payments using the current LTCH PPS labor-related share of 62.537 percent (78 FR 50995 through 50996) and the wage index values established in Tables 12A and 12B listed in the Addendum to the FY 2014 IPPS/LTCH PPS final rule (which are available via the Internet on the CMS Web site). We also applied the FY 2014 COLA factors shown in the table in section V.C. of the Addendum to that final rule (78 FR 50997 through 50998) to adjust the FY 2014 nonlabor-related share (37.463 percent) for LTCHs located in Alaska and Hawaii. Similarly, we adjusted for differences in area wage levels in determining the estimated FY 2015 payments using the FY 2015 LTCH PPS labor-related share of 62.306 percent and the FY 2015 wage index values, including the 50/50 blended wage index, determined from the wage index values presented in Tables 12A through 12D listed in section VI. of the Addendum to this final rule (and available via the Internet). We also applied the FY 2015 COLA factors shown in the table in section V.C. of the Addendum to this final rule to the FY 2015 nonlabor-related share (37.694 percent) for LTCHs located in Alaska and Hawaii.
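For reference, the FY 2015 area wage inputs described above can be restated in a short sketch (our own illustration; the blend applies only to LTCHs whose wage index value would be lower under the new OMB delineations):

```python
# FY 2015 area wage inputs cited above (illustrative restatement).
FY2015_LABOR_SHARE = 0.62306       # labor-related share (down from 0.62537 in FY 2014)
FY2015_NONLABOR_SHARE = 0.37694    # nonlabor-related share (the COLA applies to this portion)

def transitional_wage_index(current_cbsa_index, new_omb_index):
    # 50/50 blend used for FY 2015 under the transitional wage index policy
    return 0.5 * current_cbsa_index + 0.5 * new_omb_index
```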

As discussed above, our impact analysis reflects an estimated change in payments for SSO cases, as well as an estimated increase in payments for HCO cases (as described in section V.D. of the Addendum to this final rule). In modeling payments for SSO and HCO cases in FY 2015, we applied an inflation factor of 5.0 percent (determined by OACT) to estimate the costs of each case using the charges reported on the claims in the FY 2013 MedPAR files and the best available CCRs from the March 2014 update of the PSF. Furthermore, in modeling estimated LTCH PPS payments for FY 2015 in this impact analysis, we used the FY 2015 fixed-loss amount of $14,972 (as discussed in section V.D. of the Addendum to this final rule).

These impacts reflect the estimated “losses” or “gains” among the various classifications of LTCHs from FY 2014 to FY 2015 based on the payment rates and policy changes presented in this final rule. Table IV illustrates the estimated aggregate impact of the LTCH PPS among various classifications of LTCHs.

  • The first column, LTCH Classification, identifies the type of LTCH.
  • The second column lists the number of LTCHs of each classification type.
  • The third column identifies the number of LTCH cases.
  • The fourth column shows the estimated payment per discharge for FY 2014 (as described above).
  • The fifth column shows the estimated payment per discharge for FY 2015 (as described above).
  • The sixth column shows the percentage change in estimated payments per discharge from FY 2014 to FY 2015 due to the annual update to the standard Federal rate (as discussed in section V.A.2. of the Addendum to this final rule, including the 2.0 percentage point reduction to the update to the standard Federal rate for LTCHs that fail to submit data to the LTCHQR Program) and the final year of the phase-in of a one-time prospective adjustment factor for FY 2015.
  • The seventh column shows the percentage change in estimated payments per discharge from FY 2014 to FY 2015 for changes to the area wage level adjustment (that is, the wage indexes, including the implementation of the new OMB delineations, and the labor-related share), including the application of an area wage level budget neutrality factor (as discussed in section V.B. of the Addendum to this final rule). This column reflects the wage index calculated as a 50/50 blend of the wage index under the current CBSA designations and the wage index under the new OMB delineations under our transitional wage index policy for the implementation of the new OMB delineations.
  • The eighth column shows the percentage change in estimated payments per discharge from FY 2014 (Column 4) to FY 2015 (Column 5) for all changes (and includes the effect of estimated changes to HCO and SSO payments).

Table IV—Impact of Payment Rate and Policy Changes to LTCH PPS Payments for FY 2015

[Estimated FY 2014 payments compared to estimated FY 2015 payments]

| LTCH classification | Number of LTCHs | Number of LTCH PPS cases | Average FY 2014 LTCH PPS payment per case ($) | Average FY 2015 LTCH PPS payment per case ($) | Percent change in estimated payments per discharge from FY 2014 to FY 2015 for the annual update to the Federal rate | Percent change in estimated payments per discharge from FY 2014 to FY 2015 for changes to the area wage level adjustment with budget neutrality | Percent change in payments per discharge from FY 2014 to FY 2015 for all changes |
| --- | --- | --- | --- | --- | --- | --- | --- |
| (1) | (2) | (3) | (4) | (5) | (6) | (7) | (8) |
| ALL PROVIDERS | 422 | 138,281 | 40,149 | 40,600 | 0.8 | 0.0 | 1.1 |
| BY LOCATION: | | | | | | | |
| RURAL | 22 | 5,695 | 35,361 | 35,770 | 0.8 | −0.1 | 1.2 |
| URBAN | 400 | 132,586 | 40,355 | 40,808 | 0.8 | 0.0 | 1.1 |
| LARGE | 200 | 76,559 | 42,561 | 43,060 | 0.8 | 0.1 | 1.2 |
| OTHER | 200 | 56,027 | 37,341 | 37,730 | 0.8 | −0.1 | 1.0 |
| BY PARTICIPATION DATE: | | | | | | | |
| BEFORE OCT. 1983 | 16 | 5,209 | 37,151 | 38,039 | 0.8 | 0.9 | 2.4 |
| OCT. 1983-SEPT. 1993 | 44 | 16,841 | 43,306 | 43,778 | 0.8 | −0.1 | 1.1 |
| OCT. 1993-SEPT. 2002 | 181 | 62,870 | 39,354 | 39,754 | 0.8 | −0.1 | 1.0 |
| OCTOBER 2002 and AFTER | 181 | 53,361 | 40,383 | 40,845 | 0.8 | 0.0 | 1.1 |
| BY OWNERSHIP TYPE: | | | | | | | |
| VOLUNTARY | 80 | 18,696 | 41,099 | 41,674 | 0.8 | 0.2 | 1.4 |
| PROPRIETARY | 330 | 117,767 | 39,916 | 40,350 | 0.8 | 0.0 | 1.1 |
| GOVERNMENT | 12 | 1,818 | 45,491 | 45,750 | 0.8 | −0.4 | 0.6 |
| BY REGION: | | | | | | | |
| NEW ENGLAND | 14 | 6,959 | 36,468 | 37,339 | 0.8 | 1.0 | 2.4 |
| MIDDLE ATLANTIC | 29 | 8,545 | 42,861 | 43,626 | 0.8 | 0.9 | 1.8 |
| SOUTH ATLANTIC | 61 | 18,609 | 42,491 | 42,848 | 0.8 | −0.2 | 0.8 |
| EAST NORTH CENTRAL | 70 | 20,160 | 41,699 | 42,165 | 0.8 | 0.2 | 1.1 |
| EAST SOUTH CENTRAL | 31 | 8,962 | 39,380 | 39,745 | 0.8 | −0.4 | 0.9 |
| WEST NORTH CENTRAL | 26 | 6,473 | 39,500 | 39,986 | 0.8 | 0.1 | 1.2 |
| WEST SOUTH CENTRAL | 134 | 48,290 | 35,668 | 35,968 | 0.8 | −0.4 | 0.8 |
| MOUNTAIN | 32 | 6,809 | 43,154 | 43,692 | 0.8 | 0.1 | 1.2 |
| PACIFIC | 25 | 13,474 | 50,143 | 50,825 | 0.8 | 0.2 | 1.4 |
| BY BED SIZE: | | | | | | | |
| BEDS: 0-24 | 24 | 2,591 | 35,097 | 35,370 | 0.9 | −0.3 | 0.8 |
| BEDS: 25-49 | 200 | 47,301 | 39,156 | 39,565 | 0.8 | −0.1 | 1.0 |
| BEDS: 50-74 | 117 | 37,621 | 40,747 | 41,258 | 0.8 | 0.1 | 1.3 |
| BEDS: 75-124 | 45 | 22,107 | 41,907 | 42,416 | 0.8 | 0.2 | 1.2 |
| BEDS: 125-199 | 22 | 15,387 | 39,065 | 39,492 | 0.8 | −0.1 | 1.1 |
| BEDS: 200 + | 14 | 13,274 | 41,312 | 41,708 | 0.8 | −0.2 | 1.0 |

Column 5: Estimated FY 2015 LTCH PPS payments based on the payment rate and factor changes presented in the preamble of and the Addendum to this final rule.
Column 6: Percent change in estimated payments per discharge from FY 2014 to FY 2015 for the annual update to the standard Federal rate and the one-time prospective adjustment factor for FY 2015 as discussed in section V.A.2. of the Addendum to this final rule.
Column 7: Percent change in estimated payments per discharge from FY 2014 to FY 2015 for changes to the area wage level adjustment under § 412.525(c) (as discussed in section V.B. of the Addendum to this final rule).
Column 8: Percent change in estimated payments per discharge from FY 2014 LTCH PPS (shown in Column 4) to FY 2015 LTCH PPS (shown in Column 5), including all of the changes to the rates and factors presented in the preamble of and the Addendum to this final rule. Note, this column, which shows the percent change in estimated payments per discharge for all changes, does not equal the sum of the percent changes in estimated payments per discharge for the annual update to the standard Federal rate (Column 6) and the changes to the area wage level adjustment with budget neutrality (Column 7) due to the effect of estimated changes in both estimated payments to SSO cases that are paid based on estimated costs and aggregate HCO payments (as discussed in this impact analysis), as well as other interactive effects that cannot be isolated.

e. Results

Based on the most recent available data for 422 LTCHs, we have prepared the following summary of the impact (as shown above in Table IV) of the LTCH PPS payment rate and policy changes presented in this final rule. The impact analysis in Table IV shows that estimated payments per discharge are expected to increase 1.1 percent, on average, for all LTCHs from FY 2014 to FY 2015 as a result of the payment rate and policy changes presented in this final rule, including an estimated slight increase in HCO payments. This estimated 1.1 percent increase in LTCH PPS payments per discharge from FY 2014 to FY 2015 for all LTCHs (as shown in Table IV) was determined by comparing estimated FY 2015 LTCH PPS payments (using the payment rates and factors discussed in this final rule) to estimated FY 2014 LTCH PPS payments (as described in section I.K.3.d. of this Appendix).

We are establishing a standard Federal rate of $41,043.71 (or a standard Federal rate of $40,240.51 for LTCHs that failed to submit data under the requirements of the LTCHQR Program) for FY 2015. Specifically, we are updating the standard Federal rate for FY 2015 by 2.2 percent, which is based on the latest estimate of the LTCH PPS market basket increase (2.9 percent), the reduction of 0.5 percentage point for the MFP adjustment, and the 0.2 percentage point reduction consistent with sections 1886(m)(3) and (m)(4) of the Act. For LTCHs that fail to submit quality data under the requirements of the LTCHQR Program, as required by section 1886(m)(5)(C) of the Act, a 2.0 percentage point reduction is applied to the annual update to the standard Federal rate. In addition, we are applying a one-time prospective adjustment factor for FY 2015 of 0.98734 (approximately -1.3 percent) to the standard Federal rate for the final year of the 3-year phase-in.
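The composition of the FY 2015 annual update described above can be restated arithmetically (percentage points; our own restatement for illustration):

```python
# Composition of the FY 2015 annual update to the standard Federal rate.
market_basket = 2.9        # latest estimate of the LTCH PPS market basket increase
mfp_adjustment = 0.5       # multifactor productivity (MFP) reduction
statutory_reduction = 0.2  # reduction under sections 1886(m)(3) and (m)(4) of the Act

annual_update = market_basket - mfp_adjustment - statutory_reduction  # 2.2
nonreporter_update = annual_update - 2.0                              # 0.2 for LTCHs not reporting quality data
```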

We noted earlier in this section that, for most categories of LTCHs, as shown in Table IV (Column 6), the payment increase due to the 2.2 percent annual update to the standard Federal rate and the application of a one-time prospective adjustment for FY 2015 of approximately −1.3 percent for the final year of the 3-year phase-in is projected to result in approximately a 0.8 percent increase in estimated payments per discharge for all LTCHs from FY 2014 to FY 2015.

In addition, our estimate of the changes in payments due to the update to the standard Federal rate also reflects estimated payments for SSO cases that are paid using special methodologies that are not affected by the update to the standard Federal rate. For these reasons, we estimate that payments may increase by less than 0.9 percent for certain hospital categories due to the annual update to the standard Federal rate and the application of the final phase of the one-time prospective adjustment for FY 2015.

(1) Location

Based on the most recent available data, the vast majority of LTCHs are located in urban areas. Only approximately 5 percent of the LTCHs are identified as being located in a rural area, and approximately 4 percent of all LTCH cases are treated in these rural hospitals. The impact analysis presented in Table IV shows that the average percent increase in estimated payments per discharge from FY 2014 to FY 2015 for all hospitals is 1.1 percent for all changes. For rural LTCHs, the percent change for all changes is estimated to be a 1.2 percent increase, while for urban LTCHs, we estimate the increase will be 1.1 percent. Large urban LTCHs are projected to experience an increase of 1.2 percent in estimated payments per discharge from FY 2014 to FY 2015, while other urban LTCHs are projected to experience an increase of 1.0 percent in estimated payments per discharge from FY 2014 to FY 2015, as shown in Table IV.

(2) Participation Date

LTCHs are grouped by participation date into four categories: (1) before October 1983; (2) between October 1983 and September 1993; (3) between October 1993 and September 2002; and (4) October 2002 and after. Based on the most recent available data, the category of LTCHs with the largest percentage of LTCH cases (approximately 45 percent) consists of hospitals that began participating in the Medicare program between October 1993 and September 2002, and these LTCHs are projected to experience a 1.0 percent increase in estimated payments per discharge from FY 2014 to FY 2015, as shown in Table IV.

Approximately 4 percent of LTCHs began participating in the Medicare program before October 1983, and these LTCHs are projected to experience a higher than average percent increase (2.4 percent) in estimated payments per discharge from FY 2014 to FY 2015, as shown in Table IV. Approximately 10 percent of LTCHs began participating in the Medicare program between October 1983 and September 1993. These LTCHs are projected to experience a 1.1 percent increase in estimated payments from FY 2014 to FY 2015. LTCHs that began participating in the Medicare program after October 1, 2002, which treat approximately 39 percent of all LTCH cases, are projected to experience a 1.1 percent increase in estimated payments from FY 2014 to FY 2015.

(3) Ownership Control

LTCHs are grouped into three categories based on ownership control type: voluntary, proprietary, and government. Based on the most recent available data, approximately 19 percent of LTCHs are identified as voluntary (Table IV). The majority (nearly 78 percent) of LTCHs are identified as proprietary, while government-owned and operated LTCHs represent about 3 percent of LTCHs. Based on ownership type, voluntary LTCHs are expected to experience an above average increase in payments of 1.4 percent, and proprietary LTCHs are expected to experience an increase of 1.1 percent in payments, while government-owned and operated LTCHs are expected to experience an increase in payments of 0.6 percent from FY 2014 to FY 2015, which is less than the national average.

(4) Census Region

Estimated payments per discharge for FY 2015 are projected to increase for LTCHs located in all regions in comparison to FY 2014. Of the 9 census regions, we project that the increase in estimated payments per discharge will have the largest positive impact on LTCHs in the New England and Middle Atlantic regions (2.4 percent and 1.8 percent, respectively, as shown in Table IV). The estimated percent increase in payments per discharge from FY 2014 to FY 2015 for those regions is largely attributable to the changes in the area wage level adjustment.

In contrast, LTCHs located in the South Atlantic and West South Central regions are projected to experience the smallest increase in estimated payments per discharge from FY 2014 to FY 2015. The lower than national average estimated increase in payments of 0.8 percent is primarily due to estimated decreases in payments associated with the changes to the area wage level adjustment.

(5) Bed Size

LTCHs are grouped into six categories based on bed size: 0-24 beds; 25-49 beds; 50-74 beds; 75-124 beds; 125-199 beds; and 200 or more beds. Most bed size categories are projected to receive either a slightly higher or slightly lower than average increase in estimated payments per discharge from FY 2014 to FY 2015. We project that small LTCHs (0-24 beds) will experience a 0.8 percent increase in payments, which is less than the national average, mostly due to decreases in the area wage level adjustment, while large LTCHs (200 or more beds) will experience a 1.0 percent increase in payments. LTCHs with between 75 and 124 beds are expected to experience an above average increase in payments per discharge from FY 2014 to FY 2015 (1.2 percent).

4. Effect on the Medicare Program

As noted previously, we project that the provisions of this final rule will result in an increase in estimated aggregate LTCH PPS payments in FY 2015 relative to FY 2014 of approximately $62 million (or approximately 1.1 percent) for the 422 LTCHs in our database.

5. Effect on Medicare Beneficiaries

Under the LTCH PPS, hospitals receive payment based on the average resources consumed by patients for each diagnosis. We do not expect any changes in the quality of care or access to services for Medicare beneficiaries under the LTCH PPS, but we continue to expect that paying prospectively for LTCH services will enhance the efficiency of the Medicare program.

L. Effects of Requirements for the Hospital Inpatient Quality Reporting (IQR) Program

In section IX.A. of the preamble of this final rule, we discuss our requirements for hospitals to report quality data under the Hospital IQR Program in order to receive the full annual percentage increase for the FY 2017 payment determination. We are removing a total of 19 measures from the Hospital IQR Program for the FY 2017 payment determination and subsequent years, which begins with the CY 2015 reporting period. Five of these measures are: (1) AMI-1 Aspirin at arrival (NQF #0132); (2) AMI-3 ACEI/ARB for left ventricular systolic dysfunction (NQF #0137); (3) AMI-5 Beta-blocker prescribed at discharge (NQF #0160); (4) SCIP INF-6 Appropriate Hair Removal; and (5) Participation in a systematic database for cardiac surgery (NQF #0113). Of these five measures, the first four are currently suspended. The fifth measure was recommended by the MAP for removal because it is “topped-out.” We believe that an additional 14 chart-abstracted measures are “topped out” based on the previously adopted criteria, and we are removing them from the measure set for the FY 2017 payment determination and subsequent years. However, we are retaining the electronic clinical quality measure versions of 10 of these chart-abstracted measures for Hospital IQR Program reporting, as discussed in section IX.A.7.f. of the preamble of this final rule.

We also are adding one chart-abstracted measure for the FY 2017 payment determination and subsequent years in this final rule: Severe sepsis and septic shock: management bundle (NQF #0500).

We are incorporating refinements, for the FY 2017 payment determination and subsequent years, to several measures that were previously adopted in the Hospital IQR Program. These refinements have arisen either out of the NQF endorsement maintenance process or during our internal efforts to harmonize measure approaches. The measure refinements include the following: (1) refining the planned readmission algorithm for all seven readmission measures included in the Hospital IQR Program; (2) modifying the hip/knee readmission and complication measure cohorts to exclude index admissions with a secondary fracture diagnosis; and (3) modifying the hip/knee complication measure so that conditions coded as “present on admission” (POA) during the index admission are not counted as complications. We do not anticipate any hospital burden associated with these revisions, as each is based on claims submitted by hospitals for payment purposes.

Information is not available to determine the precise number of hospitals that would not meet the requirements to receive the full annual percentage increase for the FY 2017 payment determination. Historically, an average of 100 hospitals that participate in the Hospital IQR Program do not receive the full annual percentage increase in any fiscal year. We anticipate that because of the new requirements we are finalizing for reporting for the FY 2017 payment determination, the number of hospitals not receiving the full annual percentage increase may be higher than average. The highest number of hospitals failing to meet program requirements was approximately 200 after the introduction of new NHSN reporting requirements. If the number of hospitals failing does increase because of new requirements, we anticipate that over the long run, this number will decline as hospitals gain more experience with these requirements.

In the FY 2014 IPPS/LTCH PPS final rule, we estimated that the burden for the FY 2016 payment determination was 1,775 hours annually per hospital and 5.86 million hours across all 3,300 hospitals participating in the Hospital IQR Program (78 FR 50956). However, we have re-estimated the total number of hours associated with the requirements finalized for the FY 2016 payment determination to be 1,309 hours per hospital or a total of 4.3 million hours for all hospitals using more recent information from the clinical data warehouse than was available in August 2013.

As discussed in section XIII.B.6. of the preamble of this final rule, we estimate that our proposals for the adoption and removal of measures will result in an overall reduction in the total burden for hospitals for the FY 2017 payment determination for reporting chart-abstracted and structural measures, completing forms, reviewing reports, and submitting validation templates of 160 hours per hospital or 0.5 million hours across all hospitals compared to the total burden for participating hospitals in the Hospital IQR Program for the FY 2016 payment determination. The numbers included in our finalized policy more accurately reflect the burden associated with our program than the estimates provided in our proposal. As a result, the total burden for approximately 3,300 hospitals for the FY 2017 payment determination will be 1,149 hours per hospital or 3.8 million hours across all hospitals. This burden estimate includes both the newly finalized measures and the measures we are continuing. The burden estimates in this final rule are the estimates for which we are requesting OMB approval.
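The burden arithmetic above can be restated as follows (rounded figures as given in this section; our own restatement for illustration):

```python
# Hospital IQR Program burden arithmetic for the FY 2017 payment determination.
fy2016_hours_per_hospital = 1_309   # re-estimated FY 2016 burden per hospital
net_reduction_per_hospital = 160    # estimated reduction from measure adoption and removal
hospitals = 3_300                   # approximate number of participating hospitals

fy2017_hours_per_hospital = fy2016_hours_per_hospital - net_reduction_per_hospital  # 1,149 hours
total_hours = fy2017_hours_per_hospital * hospitals   # 3,791,700 hours, roughly 3.8 million
```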

The table below describes the hospital burden associated with the Hospital IQR Program requirements.

Burden Impact of Hospital IQR Program Requirements for FY 2017

| Hospital IQR Program requirement | Number of hospitals impacted | Burden per hospital for previously finalized requirements | Burden per hospital for all requirements as finalized (continuing, removed, added) | Net change in burden per hospital |
| --- | --- | --- | --- | --- |
| Chart-abstracted and structural measures, forms | 3,300 | 1,291 hours | 1,131 hours | −160 hours |
| Review reports for claims-based measures | 3,300 | 4 hours | 4 hours | 0 |
| Reporting of voluntary electronic clinical quality measures in place of chart-abstracted measures | Unknown* | −385 hours | −425 hours | −40 hours |
| Validation templates | Up to 600** | 72 hours | 72 hours | 0 |
| Electronic clinical quality measure validation test | Up to 100** | 0 | 16 hours | 16 hours |
| Validation charts photocopying | Up to 600 | $8,640 | $8,496 | −$144 |
* This number is unknown at the time this table was prepared because final submission deadlines have not passed. Because the burden associated with participation is negative, we assumed this number to be 0 in summary calculations included in the narrative.
** Maximum numbers were used in summary calculations included in the narrative.

We estimate that the total burden associated with the voluntary electronic clinical quality measure reporting option will be similar to the burden outlined for hospitals in the Medicare EHR Incentive Program Stage 2 final rule (77 FR 53968 through 54162). In this final rule, we are finalizing a policy allowing hospitals to submit data for a maximum of 16 measures that can be used to satisfy partial requirements for both programs. We estimate that each hospital that participates in the voluntary electronic clinical quality measure reporting option could realize a reduction in burden of up to approximately 425 hours by submitting data for all 12 required chart-abstracted measures that are also electronically specified.

M. Effects of Requirements for the PPS-Exempt Cancer Hospital Quality Reporting (PCHQR) Program for FY 2017

In section IX.B. of the preamble of this final rule, we discuss our policies for the quality data reporting program for PPS-exempt cancer hospitals (PCHs), which we refer to as the PPS-Exempt Cancer Hospital Quality Reporting (PCHQR) Program. The PCHQR Program is authorized under section 1866(k) of the Act, which was added by section 3005 of the Affordable Care Act. In this final rule, we are requiring that PCHs submit data on one additional measure beginning with the FY 2017 program, which will increase the total number of measures in the FY 2017 PCHQR measure set to 19 measures. We also are updating the specifications for the five previously finalized clinical process/oncology care measures to require PCHs to report all-patient data for each of these measures, and we are adopting a new sampling methodology that PCHs can use to report these measures, as well as the newly finalized EBRT for bone metastases measure. We also are providing PCHs with two reporting options for reporting the clinical process/oncology care, SCIP, and clinical process/cancer-specific treatment measures.

The impact of the new requirements for the PCHQR Program is expected to be minimal overall because some PCHs are already submitting previously adopted quality measure data to CMS. As a result, these PCHs are familiar with our IT infrastructure and programmatic operations. In addition to fostering transparency and facilitating public reporting, we believe our requirements uphold our goals of improving quality of care and achieving better health outcomes, which we believe outweigh the associated burden.

One expected effect of the PCHQR Program is to keep the public informed of the quality of care provided by PCHs. We will publicly display quality measure data collected under the PCHQR Program as required under the Act. These data will be displayed on the Hospital Compare Web site. The goals of making these data available to the public in a user-friendly and relevant format include, but are not limited to: (1) allowing the public to compare PCHs in order to make informed health care decisions regarding care setting; and (2) providing information about current trends in health care. Furthermore, PCHs can use their own health care quality data for many purposes, such as in risk management programs, healthcare-associated infection prevention programs, and research and development activities, among others.

N. Effects of Requirements for the Long-Term Care Hospital Quality Reporting (LTCHQR) Program for FY 2015 Through FY 2018

In section IX.C. of the preamble of this final rule, we discuss the implementation of section 1886(m)(5) of the Act, which was added by section 3004(a) of the Affordable Care Act. Section 1886(m)(5) of the Act provides that, for rate year 2014 and each subsequent year, any LTCH that does not submit data to the Secretary in accordance with section 1886(m)(5)(C) of the Act shall receive a 2-percentage point reduction to the annual update to the standard Federal rate for discharges for the hospital during the applicable fiscal year. In the FY 2012 IPPS/LTCH PPS final rule (76 FR 51839 through 51840), we estimated that only a few LTCHs would not receive the full annual percentage increase in any fiscal year as a result of failure to submit data under the LTCHQR Program. Information is not available to determine the precise number of LTCHs that would not meet the requirements to receive the full annual percentage increase for the FY 2016 payment determination. At the time that this analysis was prepared, 8 of the 442 active Medicare-certified LTCHs did not receive the full annual percentage increase for the FY 2014 payment determination. We believe that a majority of LTCHs will continue to collect and submit data for the FY 2015 payment determination and subsequent years because they will continue to view the LTCHQR Program as an important step in improving the quality of care patients receive in the LTCHs. We believe that the burden associated with the LTCHQR Program is the time and effort associated with data collection. There are approximately 442 LTCHs currently reporting quality data to CMS.

In this final rule, we are retaining seven previously finalized measures, revising two previously finalized measures, and are finalizing three additional quality measures for inclusion in the LTCHQR Program. In section IX.C.7. of the preamble of this final rule, we are finalizing three new quality measures for inclusion in the LTCHQR Program affecting the FY 2018 payment determination and subsequent years: (1) Percent of Long-Term Care Hospital Patients with an Admission and Discharge Functional Assessment and a Care Plan That Addresses Function; (2) Functional Outcome Measure: Change in Mobility among Long-Term Care Hospital Patients Requiring Ventilator Support; and (3) National Healthcare Safety Network (NHSN) Ventilator-Associated Event (VAE) Outcome Measure.

Six of the previously adopted and newly finalized measures will be collected via the NHSN. In section IX.C.7.b. of the preamble of this final rule, we are finalizing our proposal to collect the NHSN VAE Outcome Measure. Normally, we would only discuss the burden associated with those measures that were proposed or finalized in any given rule. However, because we have access to information that now indicates our previous calculations for the CAUTI, CLABSI, MRSA, and CDI measures were incorrect (we estimated in the FY 2014 IPPS/LTCH PPS final rule (78 FR 50959 through 50964) that LTCHs would submit six infection events per month for each of these measures), we offer below the recalculation of the associated burden. Based on submissions to the NHSN, we now estimate that each LTCH will make approximately 7 NHSN submissions per month: 1 MRSA event; 1 CDI event; 2 CLABSI events; and 3 CAUTI events (84 events per LTCH annually). This equates to a total of approximately 37,128 submissions of events to the NHSN from all LTCHs per year (including CAUTI, CLABSI, MRSA, and CDI). The CDC estimated the public reporting burden of the collection of information for each measure to include the time for reviewing instructions, searching existing data sources, gathering and maintaining the data needed, and completing and reviewing the collection of information. MRSA and CDI events are estimated to require an average of 15 minutes per response (10 minutes of clinical (RN) time and 5 minutes of clerical (Medical Record or Healthcare Information Technician) time). CAUTI events are estimated to require an average of 29 minutes per response, and CLABSI events are estimated to require an average of 32 minutes per response. In addition, each LTCH must also complete a Patient Safety Monthly Reporting Plan, estimated at 35 minutes per Plan, and a Denominator for Specialty Care Area, which is estimated at 5 hours per month. Based on these estimates, we expect that each LTCH will expend 8.6 hours per month (103.2 hours annually), or 45,614.4 hours annually for all LTCHs reporting to the NHSN.
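The recalculated NHSN burden per LTCH described above can be restated as follows (a restatement of the figures in this paragraph, not a new estimate):

```python
# Recalculated NHSN reporting burden per LTCH, restated from the figures above.
minutes_per_event = {"MRSA": 15, "CDI": 15, "CLABSI": 32, "CAUTI": 29}
events_per_month = {"MRSA": 1, "CDI": 1, "CLABSI": 2, "CAUTI": 3}

event_minutes = sum(minutes_per_event[m] * events_per_month[m]
                    for m in minutes_per_event)          # 181 minutes per month
plan_minutes = 35                                        # Patient Safety Monthly Reporting Plan
denominator_minutes = 5 * 60                             # Denominator for Specialty Care Area

monthly_hours = (event_minutes + plan_minutes + denominator_minutes) / 60  # 8.6 hours
annual_hours_per_ltch = monthly_hours * 12               # 103.2 hours
annual_hours_all_ltchs = annual_hours_per_ltch * 442     # 45,614.4 hours
```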

In addition, each LTCH must submit the Influenza Vaccination Coverage among Healthcare Personnel measure (NQF #0431), which the CDC estimates will take 10 minutes annually per LTCH, or an additional 73.66 hours for all LTCHs annually. In total, the burden we have recalculated for all previously finalized measures (including CAUTI, CLABSI, MRSA, CDI, HCP, the Patient Safety Monthly Reporting Plan, and the Denominator for Specialty Care Area) will equal 103.4 hours annually per LTCH, or approximately 45,702.8 hours for all LTCHs annually.

For the newly finalized VAE measure, which will also be reported by LTCHs through the CDC's NHSN, the CDC estimates that each LTCH will submit 1 VAE per month, which will require approximately 22 minutes of clinical time per response. This equates to 22 minutes per LTCH monthly, 4.4 hours per LTCH annually, and 1,944.8 hours for all LTCHs annually. According to the U.S. Bureau of Labor Statistics, the mean hourly wage for a registered nurse (RN) is $33.13; the mean hourly wage for a medical records and health information technician is $16.81. However, in order to account for overhead and fringe benefits, we have doubled the mean hourly wage, making it $66.26 for an RN and $33.62 for a Medical Record or Health Information Technician. We estimate the annual cost for each LTCH for the previously finalized measures for which we have recalculated burden (including CAUTI, CLABSI, MRSA, CDI, HCP, the Patient Safety Monthly Reporting Plan, and the Denominator for Specialty Care Area) to be $6,770.10, and the total yearly cost to all LTCHs for the submission of data to the NHSN to be $2,992,384.20. We estimate that the total cost for the newly finalized VAE measure will be $291.54 per LTCH annually, or $128,860.68 for all LTCHs annually.
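
As an illustrative aid, the VAE cost figures above follow directly from the stated assumptions, and the sketch below reproduces them. The $6,770.10 figure for the previously finalized measures depends on the RN/clerical split of the CAUTI and CLABSI response times, which is not broken out in this paragraph, so it is not re-derived here.

```python
# Illustrative sketch of the VAE cost estimate described above.
# Wage inputs are the BLS mean hourly wages cited in this section, doubled
# to account for overhead and fringe benefits.

LTCHS = 442

rn_wage = 33.13 * 2    # $66.26 loaded hourly rate for an RN
tech_wage = 16.81 * 2  # $33.62 loaded hourly rate for a health information technician

vae_hours_per_ltch = (1 * 22 / 60) * 12                      # 1 VAE/month at 22 minutes -> 4.4 hours per year
vae_cost_per_ltch = round(vae_hours_per_ltch * rn_wage, 2)   # costed at the loaded RN rate
vae_cost_all_ltchs = round(vae_cost_per_ltch * LTCHS, 2)

print(round(vae_hours_per_ltch, 1))  # 4.4 hours per LTCH annually
print(vae_cost_per_ltch)             # $291.54 per LTCH annually
print(vae_cost_all_ltchs)            # $128,860.68 for all LTCHs annually
```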

According to the U.S. Bureau of Labor Statistics, the mean hourly wage for a Registered Nurse is $31.48. See: http://www.bls.gov/ooh/healthcare/registered-nurses.htm. Fringe benefits are calculated at a rate of 36.25 percent in accordance with OMB Circular A-76, Attachment C, Table C.1. After adding the fringe benefits, the total hourly cost for an RN is $42.89.

The All-Cause Unplanned Readmission Measure for 30 Days Post-Discharge from Long-Term Care Hospitals is a Medicare claims-based measure; because claims-based measures can be calculated based on data that are already reported to the Medicare program for payment purposes, we believe there will be no additional impact.

The remaining five measures will be collected utilizing the LTCH CARE Data Set. Under the burden estimates associated with OMB control number 0938-1163, each LTCH has an estimated data collection burden of 243.24 hours, or $6,755.84, associated with collection of the LTCH CARE Data Set, which includes the following three measures: Percent of Residents or Patients with Pressure Ulcers That Are New or Worsened (NQF #0678); Percent of Residents or Patients Who Were Assessed and Appropriately Given the Seasonal Influenza Vaccine (NQF #0680); and the Application of Percent of Residents Experiencing One or More Falls with Major Injury (Long Stay) (NQF #0674).

We are also finalizing our proposal to use the LTCH CARE Data Set to report the two additional newly finalized measures—Functional Outcome Measure: Change in Mobility among Long-Term Care Hospital Patients Requiring Ventilator Support; and Percent of Long-Term Care Hospital Patients with an Admission and Discharge Functional Assessment and a Care Plan That Addresses Function—for the FY 2018 payment determination and subsequent years. In addition, the LTCH CARE Data Set will continue to be used to report the previously finalized measures listed above. We estimate that the additional data elements for the two newly finalized measures will take 13.5 minutes of nursing/clinical staff time to report for the admission assessment and 13 minutes of nursing/clinical staff time to report for the discharge assessment, for a total of 26.5 minutes per discharge. In accordance with OMB control number 0920-0666, we estimate 202,050 discharges from all LTCHs annually, each with an additional burden of 26.5 minutes. This equates to 89,238.75 total hours, or 201.9 hours per LTCH. We believe this work will be completed by RN staff. As previously noted, per the U.S. Bureau of Labor Statistics, the mean hourly wage for a registered nurse (RN) is $33.13. However, in order to account for overhead and fringe benefits, we have doubled the mean hourly wage, making it $66.26 for an RN. The total cost related to the two newly finalized functional status measures referenced above is estimated at $13,377.89 per LTCH annually, or $5,913,027.38 for all LTCHs annually.
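
As an illustrative aid, the burden figures above can be reproduced from the stated per-assessment times, discharge count, and loaded RN rate. The sketch below is not part of the rule's methodology; per-LTCH hours are rounded to one decimal before costing, which matches the figures in the text.

```python
# Illustrative sketch of the burden estimate for the two newly finalized
# functional status measures reported through the LTCH CARE Data Set.

LTCHS = 442
DISCHARGES = 202_050               # estimated annual LTCH discharges
minutes_per_discharge = 13.5 + 13  # admission + discharge assessment items
rn_wage = 33.13 * 2                # $66.26 loaded hourly rate for an RN

total_hours = DISCHARGES * minutes_per_discharge / 60     # 89,238.75 hours for all LTCHs
hours_per_ltch = round(total_hours / LTCHS, 1)            # 201.9 hours per LTCH
cost_per_ltch = round(hours_per_ltch * rn_wage, 2)        # $13,377.89 per LTCH annually
cost_all_ltchs = round(cost_per_ltch * LTCHS, 2)          # $5,913,027.38 for all LTCHs annually

print(total_hours, hours_per_ltch, cost_per_ltch, cost_all_ltchs)
```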


As discussed in section IX.C.7.a.1 of the preamble of this final rule, in response to several public comments expressing concern that the proposed functional status measures are excessively burdensome and that some of the data items used to collect data for the measures had “low response rates” during demonstration testing, we have decided to reduce the number of LTCH CARE Data Set data items required for the measure Percent of Long-Term Care Hospital Patients with an Admission and Discharge Functional Assessment and a Care Plan That Addresses Function. We have reduced the number of data items for this quality measure from the originally proposed 45 to 35. We estimate that this reduction lowers the annual cost per LTCH from the originally estimated $13,377.89 to $10,348.82, and lowers the annual cost for all LTCHs from the originally estimated $5,913,027.38 to $4,574,178.44. This equates to a reduction of $3,029.07 per LTCH annually, and $1,338,851.38 for all LTCHs annually.

Lastly, as discussed in section IX.C.11. of the preamble of this final rule, in response to public comments, we are not finalizing our proposal to validate the accuracy of LTCH data at this time.

In summary, the total cost for all previously finalized HAI and vaccination measures (CAUTI, CLABSI, MRSA, CDI, HCP, the Patient Safety Monthly Reporting Plan, and the Denominator for Specialty Care Area) reported through the CDC's NHSN, which we have recalculated based on new information regarding the number of infection events reported by LTCHs per month, is $6,770.10 per LTCH annually, or $2,992,384.20 for all LTCHs annually. The total cost for the three newly finalized measures in this final rule (Functional Outcome Measure: Change in Mobility among Long-Term Care Hospital Patients Requiring Ventilator Support; Percent of Long-Term Care Hospital Patients with an Admission and Discharge Functional Assessment and a Care Plan That Addresses Function; and the NHSN Ventilator-Associated Event (VAE) Outcome Measure) is $10,640.36 per LTCH annually, or $4,703,039.12 for all LTCHs annually.

Comment: Several commenters expressed concern over the burden associated with collecting data for the two functional status measures we proposed.

Response: For a full discussion of the public comments, our responses, and our associated analysis of the reduction in required data items for the measure Percent of Long-Term Care Hospital Patients with an Admission and Discharge Functional Assessment and a Care Plan That Addresses Function, we refer readers to the comment and response portion of section IX.C.7.a.1 of the preamble of this final rule. As we discuss above, as a result of our response, we have reduced our estimate of the burden for these measures, as finalized, by $1,338,851.38.

Comment: One commenter expressed concern over the rate at which the burden of the LTCHQR Program is growing from year to year, noting that there has been a 300 percent increase in burden each year and that hospitals cannot endure such increases. The commenter further noted that the total cost of the LTCHQR Program to all LTCHs, with the inclusion of the three additional finalized measures in this rule, is close to $12 million, while the initially estimated cost for the LTCHQR Program in the FY 2012 IPPS/LTCH PPS final rule was $750,000.

Response: We believe that the commenter's reference to $750,000 is a reference to our estimate in section IX.B.6. of the FY 2012 IPPS/LTCH PPS final rule of the costs of submitting the CAUTI and CLABSI data to the NHSN (76 FR 51780 through 51781). Our estimate of the effects of the LTCHQR Program in that final rule (76 FR 51839) was $1,128,440.

While the commenter is correct that the estimates in the FY 2015 IPPS/LTCH PPS proposed rule, as well as in this final rule, equal approximately $12 million, we would like to take this opportunity to explain the increase. Our original estimate in the FY 2012 IPPS/LTCH PPS final rule was based on projected costs for the program, as we had no data related to the rate of submission of our proposed measures.

In subsequent years, as we added measures to the LTCHQR Program and obtained a better understanding of the rate at which LTCHs would submit HAI data to the NHSN, we calculated and recalculated these costs in order to provide a more accurate representation of the program costs. As we have done in past rules, based on new information from the CDC, we have again recalculated the program costs related to previously finalized quality measures and required data submission. The estimates contained within this final rule result from actual CDC data regarding the rate of submission of all quality measures submitted via the CDC's NHSN, as well as from OMB-approved burden estimates for each of these measures. In addition, we accounted for actual burden, such as the Patient Safety Monthly Reporting Plan and the Denominator for Specialty Care Area, which together added an additional 64.2 hours per year per provider, or 28,248 hours for all LTCHs. Finally, in the FY 2015 IPPS/LTCH PPS proposed rule, as well as in this final rule, we accounted for overhead and fringe benefits, which effectively doubled many of our earlier cost estimates. Our inclusion of these costs (overhead and fringe benefits), which we had not included in the past, is a substantial factor associated with the increase in burden.

We believe that this cost estimate cannot be compared to the cost estimates in the FY 2014 and previous IPPS/LTCH PPS final rules without recognition of the factors discussed above. However, we are mindful of the burden of LTCHQR Program requirements, and we have attempted to balance the need for a robust LTCHQR Program against this burden. For example, we have authorized sampling for certain measure reporting. In addition, as discussed in section IX.C.7.a.1 of the preamble of this final rule, in response to commenters' specific concerns regarding burden, we are not adopting the reporting of several proposed new items in the LTCH CARE Tool that overlap other items we are retaining, had high “Activity Did Not Occur” rates, and can be removed from the quality measure without affecting the measure substantively. As noted above, these modifications reduce our burden estimate by $1,338,851.38.

O. Effects of Requirements Regarding the Electronic Health Record (EHR) Incentive Program and the Hospital IQR Program

In section IX.D. of the preamble of this final rule, we discuss requirements for the EHR Incentive Program. We are aligning the Medicare EHR Incentive Program reporting and submission timelines for clinical quality measures for eligible hospitals and CAHs with the Hospital IQR Program's reporting and submission timelines. However, we are not finalizing our proposal to require quarterly submission of electronic clinical quality measure data.

We have determined that the electronic submission of aggregate-level data using QRDA-III will not be feasible in 2015 for eligible hospitals and CAHs under the Medicare EHR Incentive Program. We are finalizing our proposal to continue, for FY 2015, the policy we adopted for FY 2014 for eligible hospitals and CAHs submitting electronic clinical quality measures under the Medicare EHR Incentive Program. For FY 2015, eligible hospitals and CAHs will be able to electronically submit using a method similar to the 2012 and 2013 EHR Incentive Program electronic reporting pilot for eligible hospitals and CAHs, which used QRDA-I (patient-level data). Eligible hospitals and CAHs that are beyond their first year of meaningful use may continue to report aggregate electronic clinical quality measure results through attestation. We also are clarifying our policy on zero denominators and the case threshold exemption for clinical quality measures.

We do not believe that our newly finalized policies to align the Medicare EHR Incentive Program reporting and submission timelines for clinical quality measures with the Hospital IQR Program's reporting and submission timelines, and to allow eligible hospitals and CAHs to electronically submit clinical quality measure data using QRDA-I (patient-level data) under the Medicare EHR Incentive Program, will have a significant impact.

P. Effects of Revision of Regulations Governing Use and Release of Medicare Advantage Risk Adjustment Data

Under section X. of the preamble of this final rule, we are revising the existing regulations at § 422.310(f) to broaden the specified uses of Medicare Advantage (MA) risk adjustment data in order to strengthen program management and increase transparency in the MA program and to specify the conditions for release of risk adjustment data to entities outside of CMS. We are revising the regulations to specify four additional purposes for which CMS may use or release risk adjustment data submitted by MA organizations: (1) To conduct evaluations and other analysis to support the Medicare program (including demonstrations) and to support public health initiatives and other health care-related research; (2) for activities to support the administration of the Medicare program; (3) for activities conducted to support program integrity; and (4) for purposes authorized by other applicable laws. In addition, the existing regulations do not specify conditions for release by CMS of risk adjustment data submitted by MA organizations. Therefore, we are adding regulatory language to address CMS' release of such data to non-CMS entities.

We have determined that the regulatory amendments do not impose any mandatory costs on entities that may choose, under this newly finalized policy, to request data files from CMS for their research analyses or for the other purposes specified above. Requesting data from CMS is at the discretion of the requester. Therefore, we have determined that these provisions will not have any economically significant effects. We also have determined that the regulatory amendments will not impose a burden on entities requesting data files.

Q. Effects of Changes to Enforcement Provisions for Organ Transplant Centers

Under section XI. of the preamble of this final rule, we are finalizing our proposals to expand and clarify the current organ transplant regulation as it relates to a transplant program's ability to request approval for participation in Medicare based on mitigating factors, the timelines for such review, and potential Systems Improvement Agreements that may allow a transplant program to improve outcomes and avert Medicare termination when outcomes have not met CMS requirements. Our finalized policies also will allow for consideration of factors such as innovative practices in the field of organ transplantation and, as part of the mitigating factors review, consideration of a transplant program's outcomes calculated using Bayesian methodology for patient death and graft failure.

These finalized policies will not have a significant effect on the Medicare and Medicaid programs because they will allow organ transplant programs to continue to participate in Medicare if approved based on mitigating factors or during the time established in a Systems Improvement Agreement. There is an added benefit to patients who receive transplants, and to the Medicare program, when a transplant program improves patient and graft survival through completion of a Systems Improvement Agreement. However, sufficient data are not currently available to quantify the added benefit of Systems Improvement Agreements or innovative practices. Therefore, we project that the cost impact of these policies on the Medicare and Medicaid programs will be negligible.

Historical data reflect that, between the date the transplant regulation was codified in 2007 and August 2013, CMS rendered a final determination for 129 organ transplant programs that applied for Medicare approval based on mitigating factors. Of the 129 transplant programs, 20 terminated Medicare participation. An additional 33 transplant programs averted Medicare termination through successful completion of a Systems Improvement Agreement and the resulting substantial improvement in patient and graft survival. The remaining programs were approved based on mitigating factors such as improved outcomes (without needing a Systems Improvement Agreement) or special circumstances, or came into compliance with CMS requirements during the mitigating factors review period. We estimate the cost associated with the application for mitigating factors at $10,000. This is based on the salary for the transplant administrator who prepares the documents for the application during the 30-day timeframe allotted. The cost does not represent any increase from what is anticipated in the existing transplant regulation related to mitigating factors. For transplant programs that enter into a Systems Improvement Agreement, the estimated cost to the transplant program is $200,000 to $250,000, based on reports from programs that have completed such Agreements in the past. Both a mitigating factors review and completion of a Systems Improvement Agreement are voluntary acts on the part of a hospital that maintains a transplant program. Since the 2007 effective date of the CMS regulation, only one hospital has elected not to request a mitigating factors review after being cited by CMS for a condition-level deficiency for patient outcomes or clinical experience, and few hospitals have declined a CMS offer to complete a Systems Improvement Agreement. Therefore, we conclude that the costs involved in these activities are much lower for the hospital than those of other alternatives, such as filing an appeal and incurring the associated legal costs.

Our finalized policies will not have a significant impact on a substantial number of small businesses or other small entities. Nor will they have a significant impact on small rural hospitals.

II. Alternatives Considered

This final rule contains a range of policies. It also provides descriptions of the statutory provisions that are addressed, identifies the finalized policies, and presents rationales for our decisions and, where relevant, alternatives that were considered.

III. Overall Conclusion

1. Acute Care Hospitals

Table I of section I.G. of this Appendix demonstrates the estimated distributional impact of the IPPS budget neutrality requirements for the MS-DRG and wage index changes, and for the wage index reclassifications under the MGCRB. Table I also shows an overall decrease of 0.6 percent in operating payments. As discussed in section I.G. of this Appendix, we estimate that operating payments will decrease by approximately $654 million in FY 2015 relative to FY 2014. However, when we account for the impact of the changes in Medicare DSH payments and the impact of the new additional payments based on uncompensated care in accordance with section 3133 of the Affordable Care Act, based on estimates provided by the CMS Office of the Actuary, consistent with our policy discussed in section IV.F. of the preamble of this final rule, we estimate that operating payments will decrease by approximately $457 million relative to FY 2014. In addition, we estimate a savings of $27 million associated with the HAC policies in FY 2015, which is an additional $1 million in savings as compared to FY 2014. We estimate that the implementation of the HAC Reduction Program, under section 3008 of the Affordable Care Act, will reduce payments by $369 million in FY 2015. We estimate that the expiration of the expansion of low-volume hospital payments for discharges beginning on April 1, 2015, under the Protecting Access to Medicare Act of 2014 (Pub. L. 113-93), will result in a decrease in payments of approximately $152 million relative to FY 2014. We estimate that the new technology add-on payments for FY 2015 will increase spending by approximately $91 million. Finally, we estimate that the policies related to validation, including submission of and payment for secure electronic versions of medical information for validation for the FY 2017 payment determination and subsequent years, as described in the ICRs for the Hospital IQR Program in section XII.B.6. of the preamble of this final rule, will result in no change in payments for FY 2015. These estimates, combined with our estimated decrease in FY 2015 operating payments of $457 million, result in an estimated decrease of approximately $888 million for FY 2015. We estimate that hospitals will experience a 1.5 percent increase in capital payments per case, as shown in Table III of section I.I. of this Appendix. We project that there will be a $132 million increase in capital payments in FY 2015 compared to FY 2014. The cumulative operating and capital payment changes will result in a net decrease of approximately $756 million in payments to IPPS providers. The discussions presented in the previous pages, in combination with the rest of this final rule, constitute a regulatory impact analysis.
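
As an illustrative arithmetic check only, the individual estimates in the preceding paragraph combine as follows. The sketch below simply sums the figures stated above and is not an independent actuarial estimate.

```python
# Illustrative summation of the FY 2015 payment effects described above
# (amounts in millions of dollars relative to FY 2014).

operating_change = -457       # operating payments, including DSH/uncompensated care effects
hac_incremental = -1          # additional HAC savings compared to FY 2014
hac_reduction_program = -369  # HAC Reduction Program under section 3008 of the Affordable Care Act
low_volume_expiration = -152  # expiration of the expanded low-volume hospital payments
new_technology = 91           # new technology add-on payments

operating_total = (operating_change + hac_incremental + hac_reduction_program
                   + low_volume_expiration + new_technology)
capital_change = 132          # estimated increase in capital payments

print(operating_total)                   # -888 (decrease of approximately $888 million)
print(operating_total + capital_change)  # -756 (net decrease of approximately $756 million)
```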

2. LTCHs

Overall, LTCHs are projected to experience an increase in estimated payments per discharge in FY 2015. In the impact analysis, we are using the rates, factors, and policies presented in this final rule, including updated wage index values and relative weights, and the best available claims and CCR data to estimate the change in payments under the LTCH PPS for FY 2015. Accordingly, based on the best available data for the 422 LTCHs in our database, we estimate that FY 2015 LTCH PPS payments will increase by approximately $62 million relative to FY 2014 as a result of the payment rates and factors presented in this final rule. In addition, we estimate that the net effect of certain other LTCH PPS policy changes (that is, the reinstatement of the moratorium on the full implementation of the “25 percent threshold” payment adjustment as discussed in section VII.E. of the preamble of this final rule; the reinstatement of the moratorium on the development of new LTCHs and LTCH satellite facilities and additional LTCH beds as discussed in section VII.G. of the preamble of this final rule; the revocation of the onsite discharges and readmissions policy as discussed in section VII.F. of the preamble of this final rule; and the payment adjustment for “subclause (II)” LTCHs as discussed in section VII.H. of the preamble of this final rule) will be an increase in LTCH PPS payments of approximately $116 million. Taken together, the payment rates and factors presented in this final rule under the LTCH PPS and the estimated payment impacts of these other LTCH PPS policy changes will result in a net increase of $178 million in payments to LTCH providers. Additionally, costs to LTCHs associated with the completion of the data for the LTCHQR Program are increasing by $4.7 million for FY 2015.
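
As a brief illustrative check, the two LTCH PPS components stated above sum to the projected net change; the sketch below only restates the figures given in this paragraph.

```python
# Illustrative summation of the FY 2015 LTCH PPS payment effects described above
# (amounts in millions of dollars relative to FY 2014).

rates_and_factors = 62      # effect of the FY 2015 payment rates and factors
other_policy_changes = 116  # net effect of the other LTCH PPS policy changes listed above

print(rates_and_factors + other_policy_changes)  # 178 (net increase of $178 million)
```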

IV. Accounting Statements and Tables

A. Acute Care Hospitals

As required by OMB Circular A-4 (available at http://www.whitehouse.gov/omb/circulars/a004/a-4.pdf), in Table V below, we have prepared an accounting statement showing the classification of the expenditures associated with the provisions of this final rule as they relate to acute care hospitals. This table provides our best estimate of the change in Medicare payments to providers as a result of the changes to the IPPS presented in this final rule. All expenditures are classified as transfers to Medicare providers.

The savings to the Federal Government associated with the policies in this final rule are estimated at $756 million.

Table V—Accounting Statement: Classification of Estimated Expenditures Under the IPPS From FY 2014 to FY 2015

Category | Transfers
Annualized Monetized Transfers | −$756 million.
From Whom to Whom | Federal Government to IPPS Medicare Providers.

B. LTCHs

As discussed in section I.L. of this Appendix, the impact analysis of the payment rates and factors presented in this final rule under the LTCH PPS, in conjunction with the estimated payment impacts of certain other LTCH PPS policy changes (that is, the reinstatement of the moratorium on the full implementation of the “25-percent threshold” payment adjustment; the reinstatement of the moratorium on the development of new LTCHs and LTCH satellite facilities and increases in the number of LTCH beds; the revocation of the onsite discharges and readmissions policy; and the payment adjustment for “subclause (II)” LTCHs), is projected to result in an increase in estimated aggregate LTCH PPS payments in FY 2015 relative to FY 2014 of approximately $178 million based on the data for the 422 LTCHs in our database that are subject to payment under the LTCH PPS. Therefore, as required by OMB Circular A-4 (available at http://www.whitehouse.gov/omb/circulars/a004/a-4.pdf), in Table VI below, we have prepared an accounting statement showing the classification of the expenditures associated with the provisions of this final rule as they relate to the changes to the LTCH PPS. Table VI provides our best estimate of the estimated increase in Medicare payments under the LTCH PPS as a result of the payment rates and factors and other provisions presented in this final rule based on the data for the 422 LTCHs in our database. All expenditures are classified as transfers to Medicare providers (that is, LTCHs). Lastly, we estimate that the costs to LTCHs associated with the completion of the data for the LTCHQR Program will increase by $4.7 million relative to FY 2014.

The cost to the Federal Government associated with the policies for LTCHs in this final rule is estimated at $178 million.

Table VI—Accounting Statement: Classification of Estimated Expenditures From the FY 2014 LTCH PPS to the FY 2015 LTCH PPS

Category | Transfers
Annualized Monetized Transfers | $178 million.
From Whom to Whom | Federal Government to LTCH Medicare Providers.
Category | Costs
Annualized Monetized Costs for LTCHs to Submit Quality Data | $4.7 million.

V. Regulatory Flexibility Act (RFA) Analysis

The RFA requires agencies to analyze options for regulatory relief of small entities. For purposes of the RFA, small entities include small businesses, nonprofit organizations, and small government jurisdictions. We estimate that most hospitals and most other providers and suppliers are small entities as that term is used in the RFA, either because they are nonprofit organizations or because they meet the SBA definition of a small business (having revenues of less than $7.0 million to $35.5 million in any 1 year). (For details on the latest standards for health care providers, we refer readers to page 36 of the Table of Small Business Size Standards for NAICS 622 found on the SBA Web site at: http://www.sba.gov/sites/default/files/files/Size_Standards_Table.pdf.)

For purposes of the RFA, all hospitals and other providers and suppliers are considered to be small entities. Individuals and States are not included in the definition of a small entity. We believe that the provisions of this final rule relating to acute care hospitals would have a significant impact on small entities, as explained in this Appendix. Because we lack data on individual hospital receipts, we cannot determine the number of small proprietary LTCHs. Therefore, we are assuming that all LTCHs are considered small entities for the purpose of the analysis in section I.L. of this Appendix. MACs are not considered to be small entities. Because we acknowledge that many of the affected entities are small entities, the analysis discussed throughout the preamble of this final rule constitutes our regulatory flexibility analysis. In the FY 2015 IPPS/LTCH PPS proposed rule, we solicited public comments on our estimates and analysis of the impact of our proposals on those small entities. Any public comments that we received and our responses are presented throughout this final rule.

VI. Impact on Small Rural Hospitals

Section 1102(b) of the Social Security Act requires us to prepare a regulatory impact analysis for any proposed or final rule that may have a significant impact on the operations of a substantial number of small rural hospitals. This analysis must conform to the provisions of section 603 of the RFA. With the exception of hospitals located in certain New England counties, for purposes of section 1102(b) of the Act, we define a small rural hospital as a hospital that is located outside of an urban area and has fewer than 100 beds. Section 601(g) of the Social Security Amendments of 1983 (Pub. L. 98-21) designated hospitals in certain New England counties as belonging to the adjacent urban area. Thus, for purposes of the IPPS and the LTCH PPS, we continue to classify these hospitals as urban hospitals. (We refer readers to Table I in section I.G. of this Appendix for the quantitative effects of the policy changes under the IPPS for operating costs.)

VII. Unfunded Mandates Reform Act Analysis

Section 202 of the Unfunded Mandates Reform Act of 1995 (Pub. L. 104-4) also requires that agencies assess anticipated costs and benefits before issuing any rule whose mandates require spending in any 1 year of $100 million in 1995 dollars, updated annually for inflation. In 2014, that threshold level is approximately $141 million. This final rule will not mandate any requirements for State, local, or tribal governments, nor will it affect private sector costs.

VIII. Executive Order 12866

In accordance with the provisions of Executive Order 12866, this final rule was reviewed by the Office of Management and Budget.

Appendix B: Recommendation of Update Factors for Operating Cost Rates of Payment for Inpatient Hospital Services

I. Background

Section 1886(e)(4)(A) of the Act requires that the Secretary, taking into consideration the recommendations of MedPAC, recommend update factors for inpatient hospital services for each fiscal year that take into account the amounts necessary for the efficient and effective delivery of medically appropriate and necessary care of high quality. Under section 1886(e)(5) of the Act, we are required to publish update factors recommended by the Secretary in the proposed and final IPPS rules, respectively. Accordingly, this Appendix provides the recommendations for the update factors for the IPPS national standardized amount, the Puerto Rico-specific standardized amount, the hospital-specific rate for SCHs and MDHs, and the rate-of-increase limits for certain hospitals excluded from the IPPS, as well as LTCHs. In prior years, we have made a recommendation in the IPPS proposed rule and final rule for the update factors for the payment rates for IRFs and IPFs. However, for FY 2015, we plan to include the Secretary's recommendation for the update factors for IRFs and IPFs in separate Federal Register documents at the time that we announce the annual updates for IRFs and IPFs. We also discuss our response to MedPAC's recommended update factors for inpatient hospital services.

II. Inpatient Hospital Update for FY 2015

A. FY 2015 Inpatient Hospital Update

As discussed in section IV.B. of the preamble to this final rule, for FY 2015, consistent with section 1886(b)(3)(B) of the Act, as amended by sections 3401(a) and 10319(a) of the Affordable Care Act, we are setting the applicable percentage increase by applying the following adjustments in the following sequence. Specifically, the applicable percentage increase under the IPPS is equal to the rate-of-increase in the hospital market basket for IPPS hospitals in all areas, subject to a reduction of one-quarter of the applicable percentage increase (prior to the application of other statutory adjustments; also referred to as the market basket update or rate-of-increase (with no adjustments)) for hospitals that fail to submit quality information under rules established by the Secretary in accordance with section 1886(b)(3)(B)(viii) of the Act and a 33 1/3 percent reduction to three-fourths of the applicable percentage increase (prior to the application of other statutory adjustments; also referred to as the market basket update or rate-of-increase (with no adjustments)) for hospitals not considered to be meaningful electronic health record (EHR) users in accordance with section 1886(b)(3)(B)(ix) of the Act, and then subject to an adjustment based on changes in economy-wide productivity (the multifactor productivity (MFP) adjustment), and an additional reduction of 0.2 percentage point as required by section 1886(b)(3)(B)(xii) of the Act. Sections 1886(b)(3)(B)(xi) and (b)(3)(B)(xii) of the Act, as added by section 3401(a) of the Affordable Care Act, state that application of the MFP adjustment and the additional FY 2015 adjustment of 0.2 percentage point may result in the applicable percentage increase being less than zero.

In the FY 2015 IPPS/LTCH PPS proposed rule, based on the most recent data available at that time, in accordance with section 1886(b)(3)(B) of the Act, we proposed to establish the FY 2015 market basket update used to determine the applicable percentage increase for the IPPS based on IHS Global Insight, Inc.'s (IGI's) first quarter 2014 forecast of the FY 2010-based IPPS market basket rate-of-increase with historical data through fourth quarter 2013, which was estimated to be 2.7 percent. Based on the most recent data available for this FY 2015 final rule, in accordance with section 1886(b)(3)(B) of the Act, we are establishing the FY 2015 market basket update used to determine the applicable percentage increase for the IPPS based on IHS Global Insight, Inc.'s (IGI's) second quarter 2014 forecast of the FY 2010-based IPPS market basket rate-of-increase, which is estimated to be 2.9 percent.

In accordance with section 1886(b)(3)(B) of the Act, as amended by section 3401(a) of the Affordable Care Act, in section IV.B.1. of the preamble of the FY 2015 IPPS/LTCH PPS proposed rule (79 FR 28087), we proposed a multifactor productivity (MFP) adjustment (the 10-year moving average of MFP for the period ending FY 2015) of 0.4 percent. Therefore, based on IGI's first quarter 2014 forecast of the FY 2010-based IPPS market basket, depending on whether a hospital submits quality data under the rules established in accordance with section 1886(b)(3)(B)(viii) of the Act (hereafter referred to as a hospital that submits quality data) and is a meaningful EHR user under section 1886(b)(3)(B)(ix) of the Act (hereafter referred to as a hospital that is a meaningful EHR user), we presented in the proposed rule four possible applicable percentage increases that could be applied to the standardized amount. Based on the most recent data available for this FY 2015 IPPS/LTCH PPS final rule, in accordance with section 1886(b)(3)(B) of the Act, as amended by section 3401(a) of the Affordable Care Act, in section IV.B.1. of the preamble of this final rule, we are establishing an MFP adjustment (the 10-year moving average of MFP for the period ending FY 2015) of 0.5 percent.

In accordance with section 1886(b)(3)(B) of the Act, as amended by section 3401(a) of the Affordable Care Act, as discussed in section IV.B.1. of the preamble of this final rule, we are establishing the applicable percentage increases for the FY 2015 updates based on IGI's second quarter 2014 forecast of the FY 2010-based IPPS market basket, depending on whether a hospital submits quality data under the rules established in accordance with section 1886(b)(3)(B)(viii) of the Act and is a meaningful EHR user under section 1886(b)(3)(B)(ix) of the Act, as outlined in the table below.

FY 2015 | Hospital submitted quality data and is a meaningful EHR user | Hospital submitted quality data and is NOT a meaningful EHR user | Hospital did NOT submit quality data and is a meaningful EHR user | Hospital did NOT submit quality data and is NOT a meaningful EHR user
Market Basket Rate-of-Increase | 2.9 | 2.9 | 2.9 | 2.9
Adjustment for Failure to Submit Quality Data under Section 1886(b)(3)(B)(viii) of the Act | 0.0 | 0.0 | −0.725 | −0.725
Adjustment for Failure to be a Meaningful EHR User under Section 1886(b)(3)(B)(ix) of the Act | 0.0 | −0.725 | 0.0 | −0.725
MFP Adjustment under Section 1886(b)(3)(B)(xi) of the Act | −0.5 | −0.5 | −0.5 | −0.5
Statutory Adjustment under Section 1886(b)(3)(B)(xii) of the Act | −0.2 | −0.2 | −0.2 | −0.2
Applicable Percentage Increase Applied to Standardized Amount | 2.2 | 1.475 | 1.475 | 0.75
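
As an illustrative arithmetic check, the four applicable percentage increases in the table can be reproduced from the adjustments described above. The sketch below simply combines the stated figures and is not part of the rule's methodology.

```python
# Illustrative derivation of the four FY 2015 applicable percentage increases
# shown in the table above (all values in percent).

market_basket = 2.9                                 # FY 2015 IPPS market basket update
quality_reduction = market_basket / 4               # one-quarter reduction for not submitting quality data = 0.725
ehr_reduction = (1 / 3) * (3 / 4) * market_basket   # 33 1/3 percent of three-fourths = 0.725
mfp = 0.5                                           # multifactor productivity adjustment
statutory = 0.2                                     # additional reduction under section 1886(b)(3)(B)(xii)

def update(submitted_quality, meaningful_ehr_user):
    pct = market_basket - mfp - statutory
    if not submitted_quality:
        pct -= quality_reduction
    if not meaningful_ehr_user:
        pct -= ehr_reduction
    return round(pct, 3)

print(update(True, True))    # 2.2
print(update(True, False))   # 1.475
print(update(False, True))   # 1.475
print(update(False, False))  # 0.75
```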

B. Update for SCHs and MDHs for FY 2015

Section 1886(b)(3)(B)(iv) of the Act provides that the FY 2015 applicable percentage increase in the hospital-specific rate for SCHs and MDHs equals the applicable percentage increase set forth in section 1886(b)(3)(B)(i) of the Act (that is, the same update factor as for all other hospitals subject to the IPPS).

As discussed in section IV.G. of the preamble of this final rule, section 1106 of the Pathway for SGR Reform Act of 2013 (Pub. L. 113-67), enacted on December 26, 2013, extended the MDH program from the end of FY 2013 through the first half of FY 2014 (that is, for discharges occurring before April 1, 2014). Subsequently, section 106 of the Protecting Access to Medicare Act of 2014 (Pub. L. 113-93), enacted on April 1, 2014, further extended the MDH program through the first half of FY 2015 (that is, for discharges occurring before April 1, 2015). Prior to the enactment of Public Law 113-67, the MDH program was to be in effect through the end of FY 2013 only. The MDH program expires for discharges beginning on April 1, 2015, under current law. Accordingly, the update of the hospital-specific rates for FY 2015 for MDHs will apply in determining payments for FY 2015 discharges occurring before April 1, 2015.

As mentioned above, the update to the hospital-specific rate for SCHs and MDHs is subject to section 1886(b)(3)(B)(i) of the Act, as amended by sections 3401(a) and 10319(a) of the Affordable Care Act. Accordingly, depending on whether a hospital submits quality data and is a meaningful EHR user, we are establishing the same four applicable percentage increases in the table above for the hospital-specific rate applicable to SCHs and MDHs.

C. FY 2015 Puerto Rico Hospital Update

Section 401(c) of Public Law 108-173 amended section 1886(d)(9)(C)(i) of the Act and states that, for discharges occurring in a fiscal year (beginning with FY 2004), the Secretary shall compute an average standardized amount for hospitals located in any area of Puerto Rico that is equal to the average standardized amount computed under subclause (I) for FY 2003 for hospitals in a large urban area (or, beginning with FY 2005, for all hospitals in the previous fiscal year) increased by the applicable percentage increase under subsection (b)(3)(B) for the fiscal year involved. Therefore, the update to the Puerto Rico-specific operating standardized amount is subject to the applicable percentage increase set forth in section 1886(b)(3)(B)(i) of the Act as amended by sections 3401(a) and 10319(a) of the Affordable Care Act (that is, the same update factor as for all other hospitals subject to the IPPS). Accordingly, we are making an applicable percentage increase to the Puerto Rico-specific standardized amount of 2.2 percent.

D. Update for Hospitals Excluded From the IPPS for FY 2015

Section 1886(b)(3)(B)(ii) of the Act is used for purposes of determining the percentage increase in the rate-of-increase limits for children's hospitals, cancer hospitals, and hospitals located outside the 50 States, the District of Columbia, and Puerto Rico (that is, short-term acute care hospitals located in the U.S. Virgin Islands, Guam, the Northern Mariana Islands, and American Samoa). Section 1886(b)(3)(B)(ii) of the Act sets the percentage increase in the rate-of-increase limits equal to the market basket percentage increase. In accordance with § 403.752(a) of the regulations, RNHCIs are paid under the provisions of § 413.40, which also use section 1886(b)(3)(B)(ii) of the Act to update the percentage increase in the rate-of-increase limits.

Currently, children's hospitals, PPS-excluded cancer hospitals, RNHCIs, and short-term acute care hospitals located in the U.S. Virgin Islands, Guam, the Northern Mariana Islands, and American Samoa are among the remaining types of hospitals still paid under the reasonable cost methodology, subject to the rate-of-increase limits. We are applying the FY 2015 percentage increase in the IPPS operating market basket to the target amount for children's hospitals, PPS-excluded cancer hospitals, RNHCIs, and short-term acute care hospitals located in the U.S. Virgin Islands, Guam, the Northern Mariana Islands, and American Samoa. For this final rule, the current estimate of the FY 2015 IPPS operating market basket percentage increase is 2.9 percent.

E. Update for LTCHs for FY 2015

Section 123 of Public Law 106-113, as amended by section 307(b) of Public Law 106-554 (and codified at section 1886(m)(1) of the Act), provides the statutory authority for updating payment rates under the LTCH PPS.

As discussed in section V.A. of the Addendum to this final rule, we are establishing an update to the LTCH PPS standard Federal rate for FY 2015 based on the full LTCH PPS market basket increase estimate (for this final rule, estimated to be 2.9 percent), subject to an adjustment based on changes in economy-wide productivity and an additional reduction required by sections 1886(m)(3)(A)(ii) and (m)(4)(E) of the Act. In accordance with the LTCHQR Program under section 1886(m)(5) of the Act, we are reducing the annual update to the LTCH PPS standard Federal rate by 2.0 percentage points for the failure of an LTCH to submit the required quality data. The MFP adjustment described in section 1886(b)(3)(B)(xi)(II) of the Act is currently estimated to be 0.5 percent for FY 2015. In addition, section 1886(m)(3)(A)(ii) of the Act requires that any annual update for FY 2015 be reduced by the “other adjustment” at section 1886(m)(4)(E) of the Act, which is 0.2 percentage point. Therefore, based on IGI's second quarter 2014 forecast of the FY 2015 LTCH PPS market basket increase, we are establishing an annual update to the LTCH PPS standard Federal rate of 2.2 percent (that is, the current FY 2015 estimate of the market basket rate-of-increase of 2.9 percent less an adjustment of 0.5 percentage point for MFP and less 0.2 percentage point). Accordingly, we are applying an update factor of 1.022 in determining the LTCH PPS standard Federal rate for FY 2015. For LTCHs that fail to submit quality data for FY 2015, we are applying an annual update to the LTCH PPS standard Federal rate of 0.2 percent (that is, the final annual update for FY 2015 of 2.2 percent less 2.0 percentage points for failure to submit the required quality data in accordance with section 1886(m)(5)(C) of the Act and our rules) by applying an update factor of 1.002 in determining the LTCH PPS standard Federal rate for FY 2015. Furthermore, we are making an adjustment for the final year of the 3-year phase-in of the one-time prospective adjustment to the standard Federal rate under § 412.523(d)(3) by applying a factor of 0.98734 (or approximately −1.3 percent) in FY 2015, consistent with current law.
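
As an illustrative arithmetic check only, the LTCH PPS update factors above follow from the market basket estimate and the statutory adjustments stated in this paragraph. The one-time adjustment factor is simply restated, not derived, in the sketch below.

```python
# Illustrative derivation of the FY 2015 annual update to the LTCH PPS
# standard Federal rate described above (percentages in percent).

market_basket = 2.9     # FY 2015 LTCH PPS market basket estimate
mfp = 0.5               # multifactor productivity adjustment
other_adjustment = 0.2  # section 1886(m)(4)(E) adjustment
quality_penalty = 2.0   # LTCHQR Program reduction for failure to submit quality data

annual_update = market_basket - mfp - other_adjustment  # 2.2 percent
update_factor = 1 + annual_update / 100                 # 1.022
nonreporting_update = annual_update - quality_penalty   # 0.2 percent
nonreporting_factor = 1 + nonreporting_update / 100     # 1.002

one_time_adjustment = 0.98734  # final year of the 3-year phase-in under § 412.523(d)(3)

print(round(update_factor, 3), round(nonreporting_factor, 3), one_time_adjustment)
```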

III. Secretary's Recommendations

MedPAC is recommending an inpatient hospital update equal to 3.25 percent for FY 2015. MedPAC's rationale for this update recommendation is described in more detail below. As mentioned above, section 1886(e)(4)(A) of the Act requires that the Secretary, taking into consideration the recommendations of MedPAC, recommend update factors for inpatient hospital services for each fiscal year that take into account the amounts necessary for the efficient and effective delivery of medically appropriate and necessary care of high quality. Consistent with current law, depending on whether a hospital submits quality data and is a meaningful EHR user, we are recommending the four applicable percentage increases to the standardized amount listed in the table under section II. of this Appendix B. We are recommending that the same applicable percentage increases apply to SCHs and MDHs. For the Puerto Rico-specific standardized amount, we are recommending an update of 2.2 percent.

In addition to making a recommendation for IPPS hospitals, in accordance with section 1886(e)(4)(A) of the Act, we are recommending update factors for certain other types of hospitals excluded from the IPPS. Consistent with our policies for these facilities, we are recommending an update to the target amounts for children's hospitals, cancer hospitals, RNHCIs, and short-term acute care hospitals located in the U.S. Virgin Islands, Guam, the Northern Mariana Islands, and American Samoa of 2.9 percent.

For FY 2015, consistent with policy set forth in section VII. of the preamble of this final rule, we are recommending an update of 2.2 percent (that is, the current FY 2015 estimate of the LTCH PPS market basket rate-of-increase of 2.9 percent less an adjustment of 0.5 percentage point for MFP and less 0.2 percentage point) to the LTCH PPS standard Federal rate.

IV. MedPAC Recommendation for Assessing Payment Adequacy and Updating Payments in Traditional Medicare

In its March 2014 Report to Congress, MedPAC assessed the adequacy of current payments and costs, and the relationship between payments and an appropriate cost base. MedPAC recommended an update to the hospital inpatient rates equal to 3.25 percent, concurrent with changes to the outpatient prospective payment system and with initiating changes to the LTCH PPS. We refer the reader to the March 2014 MedPAC report, which is available for download at www.medpac.gov, for a complete discussion of this recommendation. MedPAC expects Medicare margins to remain low in 2014. At the same time, MedPAC's analysis finds that efficient hospitals have been able to maintain positive Medicare margins while maintaining a relatively high quality of care.

Response: With regard to MedPAC's recommendation of an update to the hospital inpatient rates equal to 3.25 percent, for FY 2015, as discussed above, sections 3401(a) and 10319(a) of the Affordable Care Act amended section 1886(b)(3)(B) of the Act. Section 1886(b)(3)(B) of the Act, as amended by these sections, sets the requirements for the FY 2015 applicable percentage increase. Therefore, we are establishing an applicable percentage increase for FY 2015 of 2.2 percent, provided the hospital submits quality data and is a meaningful EHR user, consistent with these statutory requirements.

We note that, because the operating and capital prospective payment systems remain separate, we are continuing to use separate updates for operating and capital payments. The update to the capital rate is discussed in section III. of the Addendum to this final rule.


[FR Doc. 2014-18545 Filed 8-4-14; 4:15 pm]

BILLING CODE 4120-01-P