

Guidebook of Policies and Procedures for PRMR and MSR

Comment Status
Closed

Comments

Submitted by Anonymous (not verified) on Mon, 07/03/2023 - 10:59


Thank you for sharing this with the clinical community.

Name or Organization
Mir Shuttari, MD

Submitted by Anonymous (not verified) on Wed, 07/05/2023 - 12:54


Thank you for the opportunity to comment on the Guidebook of Policies and Procedures for PRMR and MSR. As I read through the Guidebook and the materials from last week's webinar, it is unclear to me where Ambulatory Care is represented. The ASC Quality Collaboration works with Ambulatory Surgery Center management companies and affiliated healthcare organizations to promote quality and patient safety in the Ambulatory Surgery Center setting. In the past, ASC representation has fallen under the Hospital Workgroup. The interests of ASCs and other ambulatory care settings are best represented by organizations and individuals who work in those settings. We encourage awareness of the differences between ambulatory care and hospital settings and are open to contributing to that effort.

Name or Organization
ASC Quality Collaboration

Submitted by Anonymous (not verified) on Mon, 07/10/2023 - 15:35


Thank you for holding the PRMR-MSR webinar today. I appreciated much of the clarifying information, as well as the Novel Hybrid Delphi and Nominal Group approach; it is a great way to manage the process.

I have a few questions and would appreciate your replies.

1) I was a little confused at one point about whether individual patients and/or caregivers can nominate themselves to PRMR/MSR, or whether they have to be part of a patient organization.

2) Is an honorarium available for patients/caregivers participating as individuals and/or representing a patient organization (as advocates or as advisors)?

3) Regarding conflicts of interest (COI): as I understand it, we will not know our COI for any specific measure group or individual measure until we know which measures we are looking at, i.e., not until after we are nominated (previous research study involvement, engagement in development of the specific measure, authorship, etc.).

Thank you!

Janice Tufte

www.janicetufte.com

 

 

Name or Organization
Janice Tufte

Submitted by Anonymous (not verified) on Wed, 07/12/2023 - 14:56


Hello,

Thank you for the opportunity to comment on the policies and procedures for PRMR and MSR. I appreciate the thoughtful redesign to improve efficiency and increase public engagement.

I would like to suggest that PQM explicitly add regional quality collaboratives to the list of "interested parties" in the PRMR/MSR processes. Regional quality collaboratives have significant knowledge and expertise related to quality measurement, as well as the practical challenges related to data collection, validation, and analysis. In addition, the multi-stakeholder nature of regional quality collaboratives gives them a unique understanding of the full range of stakeholder perspectives on the selection and use of quality measures.

Thank you!

Julie Sonier

President and CEO, MN Community Measurement 

Name or Organization
Julie Sonier, MN Community Measurement

Submitted by Anonymous (not verified) on Mon, 07/17/2023 - 12:10


Which aspect(s) of the guidebook are you commenting on?
Processes (E&M, PRMR, MSR)

The American Medical Association (AMA) appreciates the opportunity to comment on the Partnership for Quality Measurement (PQM) Guidebook of Policies and Procedures for Pre-Rulemaking Measure Review (PRMR) and Measure Set Review (MSR). We support PQM's efforts to ensure that the processes emphasize consensus and input not only from the committees but also from key stakeholders and the public, as well as the revisions to the evaluation criteria. We agree with the increase in the consensus threshold to 75% and with the discussion and voting quorum thresholds. We ask that the PQM consider the following comments in an effort to further improve the process.

Regarding the Requirements row in Table 1 on page 5, it is important to note that while the MACRA statute encourages CMS to select quality measures for the Merit-Based Incentive Payment System (MIPS) that are endorsed by a consensus-based entity, CMS is not required to include only measures set forth or endorsed by NQF or other consensus-building entities. We request that the report also reflect the appropriate language per the MACRA statute: CMS may select, for the MIPS program, any quality measures it deems appropriate, as long as the measures are evidence-based.

The AMA is concerned with the omission of a public comment period on the Recommendation Groups' final PRMR recommendations to the Centers for Medicare &amp; Medicaid Services (CMS) and asks that the PQM add a brief comment period on the Recommendation Group's deliberations prior to submitting the final recommendations to CMS. Given the revised process with the Advisory and Recommendation groups and the higher consensus threshold, conducting a comment period during which external stakeholders can provide input on the recommendations will enable the PQM to evaluate whether the Recommendation Group's composition sufficiently reflected the viewpoints of the key stakeholder groups. It will also allow one final opportunity for groups to change their support (or lack thereof) based on any additional materials or discussions and to provide additional context to the PQM and CMS on whether the process operated as expected.

While we understand that the timeframe for the PRMR must meet statutory requirements and cannot be modified, we request that the PQM reconsider the proposed timeline for the MSR. Based on Figure 7 on page 21, it appears that the measures proposed for removal will be released for comment at the same time that the Inpatient Prospective Payment System (IPPS) and other proposed rules are typically posted. While the public has a second opportunity to comment, that period is scheduled for July and August, the same months in which the Physician Fee Schedule and other proposed rules are made available. The timelines released for the Fall and Spring cycles of the revised endorsement process also overlap with these months, creating a significant burden for external stakeholders. We are extremely concerned that these overlapping comment periods will lead to reduced public input on one or more of these activities and urge the PQM to change the timing of the MSR and endorsement comment periods to avoid the months when proposed rules are also released.

Lastly, while it is our hope that this revised process is successful, as with any redesigned process there will be a need to evaluate and refine the structure, criteria, and/or steps over time. We request that the PQM commit to an initial evaluation after the first or second year of implementation and to ongoing re-evaluations of the process. These evaluations should be comprehensive, including whether the Novel Hybrid Delphi and Nominal Group (NHDNG) technique and the structure of the Advisory Group and/or Recommendation Group successfully achieve the desired goal of consensus-driven recommendations. Examples of what could be examined are whether the composition of the Recommendation Groups for PRMR and MSR is representative of the key stakeholders to whom the measures are most relevant and whether the final recommendations reflect the collective input from public comments in addition to the committees. Some of these questions may be more relevant to track over time, since membership of the Recommendation Group will change every year and consistency in the recommendations will be critical to ensure buy-in to the process. In addition, these evaluations should include opportunities for the public to provide reactions and recommendations on changes.

Thank you for the opportunity to comment. 

Name or Organization
American Medical Association

Submitted by Anonymous (not verified) on Wed, 07/19/2023 - 17:15


Which aspect(s) of the guidebook are you commenting on?
General

The guidebook is clear and concise on the makeup of the membership, committees, meeting quorums, and voting quorums. A glossary of terms would be helpful, particularly for patients/recipients of care, caregivers, and patient advocates serving on the committees. 

Name or Organization
Tracey Brasel

Submitted by Anonymous (not verified) on Fri, 07/21/2023 - 09:47


Which aspect(s) of the guidebook are you commenting on?
General

Thank you for the opportunity to comment on the Guidebook of Policies and Procedures for Pre-Rulemaking Measure Review (PRMR) and Measure Set Review (MSR). We appreciate the time and effort that Battelle has put into streamlining the steps and improving transparency throughout the processes. We also appreciate the ability for all stakeholders to participate in both processes. 

We would like to raise a few questions and highlight one area of caution. First, regarding selection of members for the various Committees, are there specific qualifications that Battelle is looking for in Committee members beyond the stakeholder groups listed? How will potential Committee members be evaluated? Similarly, we appreciate the commitment to including ad hoc subject matter experts when needed, but it remains unclear where these subject matter experts will be sourced from and how it will be determined that subject matter expertise is required.

Next, in terms of timelines, we respectfully request that Battelle consider adding a second public comment period after the PRMR Recommendation group votes have occurred. This would allow the public the chance to react to the Recommendation group discussions and highlight any areas of concern. Similarly, we suggest that, if possible, the second comment period for MSR be extended to 30 days. Given the volume of information Battelle will gather and post in the report, additional time may be needed for comments.

Additionally, can you clarify whether measure stewards will have access to Preliminary Assessments (PAs) prior to them being shared with the PRMR or MSR Committees? We appreciate that Battelle will discuss measures on the Measures Under Consideration (MUC) list with developers as needed, but it may be beneficial to share PAs with developers as well so they can be prepared for questions at the PRMR and MSR meetings. Similarly, will the PA be part of the packet of information related to each measure that will be publicly released?

Our final question relates to voting procedures. While we see the benefit of increasing the number of members on each Committee and applaud Battelle for employing a rigorous Novel Hybrid Delphi and Nominal Group technique, is there a possibility of no consensus at the Recommendation group meetings? If that is possible, will iterative voting proceed until consensus is reached or would it be possible for the final decision to be no consensus? 

Lastly, we appreciate the additional guidance regarding PRMR and MSR criteria. However, with respect to the PRMR criterion “Time to value realization”, we suggest that it may be too soon to evaluate measures based on whether they are built on standardized data elements or definitions identified in United States Core Data for Interoperability (USCDI) version 3 or USCDI+. The initial set of data elements for USCDI+ was only introduced in May of 2023, and only version 1 of USCDI is required until at least January 1, 2025. While we understand the motivation to align quality measures and standardize definitions, we suggest allowing additional time for these standards to develop and be deployed before evaluating measures on them.

Name or Organization
College of American Pathologists

Submitted by Anonymous (not verified) on Fri, 07/21/2023 - 16:37


Which aspect(s) of the guidebook are you commenting on?
General

On behalf of the over 84,000 members of the American College of Surgeons (ACS), we appreciate the opportunity to submit comments on the Partnership for Quality Measurement's (PQM) Guidebook of Policies and Procedures for Pre-Rulemaking Measure Review (PRMR) and Measure Set Review (MSR). The ACS is a scientific and educational association of surgeons founded in 1913 to improve the quality of care for the surgical patient by setting high standards for surgical education and practice. Since before the Centers for Medicare &amp; Medicaid Services' (CMS) measure enterprise existed, the ACS has been developing quality standards through its quality and verification programs. Among the most recognized of the ACS programs are the Trauma Center Verification Program, the Commission on Cancer (CoC), and the Metabolic and Bariatric Surgery Verification Program. We have also been active in developing quality measures for CMS quality programs and have participated in the pre-rulemaking process numerous times since the inception of the Measure Applications Partnership (MAP). Given our experience as leaders in quality of care and as a measure developer, we believe we can provide valuable insight on the PQM's policies and procedures.

Over the last two years, the ACS developed and submitted a quality measure for consideration in the Hospital Inpatient Quality Reporting (IQR) Program and participated in the PRMR process supported by the National Quality Forum (NQF). Based on our expertise in measuring care for a patient's condition across the care journey, we believe the application for measure submission and the evaluation criteria for new measures lack flexibility for non-traditional measures, such as programmatic quality measures. Programmatic quality measures are a specific type of clinically focused composite metric that (1) align clinically relevant structures, processes, and outcomes; (2) target condition- and/or population-specific care; (3) apply to multiple quality domains; (4) address improvement across the continuum of care; and (5) are informative to and actionable for clinicians and patients.

The ACS submitted a programmatic measure in the 2022 measure review cycle that incorporated the essential elements of a quality program for inpatient geriatric care delivery. The measure requires that a healthcare facility attest to having in place the structural elements and key processes proven to support the delivery of high-quality care for older adults, including mechanisms to gather data and measure outcomes to inform improvement cycles. A programmatic measure supports team-based care delivery across the patient's care journey and can better support patients seeking care by assuring the public that the care team has a program in place to care for their needs. However, the application did not allow this new type of measure to be recognized, so we had to submit it as a structural measure, which is not an adequate way to describe the multiple dimensions of programmatic measures. The measure incorporates structural, process, and outcome components and demonstrates operational improvement; it is not simply a structural measure. Specifically, the data required as part of the measure submission application are not flexible enough to accommodate a programmatic or crosscutting measure. Instead, the data requirements are confined to the traditional requirements of typical structural, outcome, and process measures, leaving those developing creative solutions to quality measurement with few or no options to share testing results, evidence, and other relevant data. The lack of flexibility leaves little room for understanding or explaining other measurement mechanisms, leads to confusion as the measure continues through the review process, and stifles innovation.

The guidebook incorporates consideration for the patient’s journey as one of the measure evaluation criteria to be used for the PRMR. The ACS agrees that acknowledgement of the patient journey is a critical criterion for measure evaluation, but upon review of the current measure inventory and the PRMR process, most measures are not designed to look across the patient journey. We appreciate that the PQM is highlighting the importance of incorporating the patient journey in their measure evaluation process and ask that they also revise the measure submission process to align with this priority. Doing so will allow for more productive and informed conversations around measures when they move through the PRMR process. 

We suggest that the PQM build their measure evaluation process around the following criteria: 

  • What impact does the measure have on the patient? Does the measure guide a patient seeking care to a provider or healthcare facility that best meets their needs? 
  • Does the measure support care teams to form around patients and provide them with the information they need to identify gaps and drive improvement? 
  • Does the measure give providers the data needed to make informed referrals to other providers who are best suited for their patients’ specific needs? 
  • Do these measures provide a payer with the information they need to inform the customers they serve?
     

The guidebook discusses measure evaluation criteria that focus on what is being done to the patient and the outcome. From the ACS perspective, measures should also be evaluated on how well they support the patient's ability to navigate care. For example, does the measure help the patient seek care that aligns with their values, near where they live or can travel, and at a price they can afford? CMS should consider whether quality measures provide payers with the information they need to inform their customers about where to seek safe, affordable, and effective care and to find areas where they can incentivize care teams to improve. Measures should then be evaluated on whether and how they support care teams in identifying gaps or areas for improvement in the care they deliver, to help drive improvement cycles. Striving for patient-centered quality measurement mechanisms that inform patients, ensuring that hospitals have the proper processes and structures in place to form care delivery around the patient, and driving improvements in care should be the highest priorities when evaluating quality measures.

As CMS moves into a new partnership on measurement and quality in healthcare, it is important to recognize that the traditional measurement system needs to be reexamined. A wholesale change of the national quality model used by the payer community would entail a loss of time and require enormous change efforts. Instead, building a new model within the current framework is possible if we acknowledge the limits of the current approach and consider more meaningful and action-oriented approaches, such as fitting programmatic measures into hospital measure sets and sharing accountability for a patient among all involved parties.

The ACS appreciates the opportunity to provide feedback on this guidebook and looks forward to continuing dialogue with Battelle and CMS on these important issues. If you have any questions about our comments, please contact Jill Sage, Chief of Quality Affairs,  at [email protected].

Name or Organization
American College of Surgeons

Submitted by Anonymous (not verified) on Mon, 07/24/2023 - 10:22


Which aspect(s) of the guidebook are you commenting on?
General
Committee structure

Reading the guidebook gives the impression that "Advisory Group" and "Recommendations Group" are treated as titles and therefore should be capitalized consistently throughout the document.

Section 1.1.3 PRMR and MSR Highlights (Page 4)

Recommended wording:

The PRMR and MSR processes are implemented to foster collaboration and balance the input of various interested parties, resulting in well-informed recommendations regarding measures to be included in or removed from a specific CMS reporting program. PRMR's responsibility is to assess the appropriateness of the specific intended use of the measures included on the MUC list, each of which is targeted for a given program and population. In contrast, MSR conducts a voluntary review of the relative strengths and weaknesses of CMS's current measure portfolio and of how the removal of an individual measure would reduce redundancy or create a measurement gap. The PRMR and MSR processes recommend selection or removal to address national health care priorities, fill critical measurement gaps, and increase alignment of measures across programs.

Section 2.1 Overview (page 7)

Recommended wording:

The Consensus-Based Entity (CBE, currently Battelle) convenes interested parties into committees to participate in PRMR and MSR. There are three PRMR committees, grouped by care setting (hospital, clinician, and post-acute care/long-term care), and a select group of members from each of these committees will be tapped to participate in the MSR committee, which spans care settings and populations.

Section 2.3.4 Advisory and Recommendation Groups (Starting on Page 11)

Recommended wording:

To ensure representation from the population of interested parties, Battelle develops a roster for each of the setting-specific PRMR committees based on the categories previously outlined in Section 2.2.1. These roster categories comprise individual and organizational seats for a total of 180 members. Of those 180 members, 60 are recruited to each of the three setting-specific committees. Of those 60 members, 35 to 45 are appointed to the Advisory Group and 18 to 20 are appointed to the Recommendations Group, aligned to the specific setting.

The Advisory and Recommendations Groups are mutually exclusive. The participants in the Recommendations Group are randomly appointed on an annual rotating basis from the committee roster of eligible nominees. This ensures equal representation and allows every committee member the opportunity to provide feedback through participation in both groups during their three-year rotation.

EXAMPLE: The goal is to have a total of 7 clinicians, including primary care providers and specialists; 2 of the 7 will be randomly assigned to the Recommendations Group, and the other 5 clinicians will serve on the Advisory Group. If for any reason an appointed member of the Recommendations Group is unable to participate, there will still be enough eligible nominees in the category pools from which to draw additional members.

A committee member will serve on the Advisory or Recommendations Group for an entire measure review cycle. For the following review cycle, assuming their term is still active, another member will be randomly selected for the Recommendations Group. It is possible that someone serving on the Advisory Group in the previous cycle may serve on the Recommendations Group in the next cycle.

 

(I would eliminate Section 2.4, Interested Parties Involved in MSR, as much of the information is redundant, and fold it into Section 2.3.4.)

The MSR Recommendations Group is larger than the PRMR Recommendations Group, as it comprises 20 to 25 members selected from each of the three PRMR committees. While the PRMR committee members invited to serve on the MSR Recommendations Group are not aligned to a setting-specific structure as in PRMR, these members are identified based on representation criteria, ensuring a holistic perspective. Members of this committee follow a three-year term similar to the appointment of the PRMR committees.

 

Section 2.3.5 Term of Appointment

It states, "In the 2023-2024 cycle…" and goes on to discuss how members will be randomly assigned, etc. The wording needs to clearly reflect whether this process applies only to the 2023-2024 cycle and what the process will be moving forward.

The wording regarding organizations replacing committee members gives the impression that Battelle does not vet the member the organization puts forward to replace the outgoing member.

 

Section 3.1 Overview

Recommended wording:

The PRMR and MSR evaluation processes entail iterative reviews of the measures. The review process is a combination of Battelle-led assessments (Staff Assessments) and input from committee members. Both evaluations use a multi-step process meant to increase engagement of all members and to structure facilitation using standard criteria and practices. However, there are some differences between these processes.

  • PRMR uses a modified NHDNG technique to build consensus among committee members, leveraging experienced and trained facilitators. 
  • The MSR process is less structured, allowing for a more holistic review involving qualitative assessment of portfolios of measures across programs, and is guided by interested parties' input. Figure 5 presents an overview of these processes.

 

Section 3.3 PRMR Process (under #5 Recommendations Group Meetings)

Recommended wording:

In mid- to late January, the Recommendation Group meets to discuss issues and concerns raised during the public comment period and feedback from the Advisory Group. Feedback from the Advisory Group is shared at least two weeks prior to the meeting to help the Recommendation Group prioritize its discussions on areas where consensus regarding the measure(s) is lacking, based on the results of the pre-evaluation independent ratings. This is determined by the aggregated first-round ratings from both groups. Battelle shares these first-round results with the Recommendation Group for review prior to the meeting.

To increase efficiency, similar measures are also discussed together in the Recommendations Group(?), and the members then vote on the measures individually. Once votes are tabulated for the grouped measures, the next set of grouped measures is discussed and voted on. More detail on consensus and the voting process is provided in Chapter 4. This iterative and graduated process of measure review improves efficiency and provides a meaningful approach for making final recommendations. Recommendations Group meetings are facilitated by Battelle staff according to the compiled comments and ratings from the Advisory and Recommendations Groups to ensure discussions remain productive, within scope, and inclusive of all voices. Battelle staff facilitate meetings, establish meeting ground rules and goals, course correct as needed, and ensure decisions are reached.

Using a consensus threshold of 75%, Battelle’s trained facilitators evaluate and communicate whether consensus was achieved, and dissenting views are noted in meeting summaries. This structured approach allows for efficient information exchange among committee members, which is particularly important when each member offers unique points of view. 

(Is Battelle only documenting dissenting views in the meeting summaries?)

Section 3.4 MSR Process 

(# 1 Review of Cascade of Meaningful Measures (CoMM) Priorities)

Recommended wording:

The Cascade of Meaningful Measures (CoMM) is a tool to help prioritize existing health care quality measures, to align or reduce the number of measures, and identify gaps where new measures may need to be developed. 

(# 3 Staff Assessments)

Recommended wording:

For each MSR cycle, Battelle synthesizes information to guide the process. These assessments include:

a. Preliminary Assessment (PA): Battelle conducts a PA of measures including the following:

(#4 Recommendations Group Meetings)

Recommended wording:

The MSR Recommendation Group prioritizes discussion of the measures with the least agreement, based on comments received during both public comment periods. Battelle's trained facilitators use established ground rules and goals for these Recommendation Group meetings, conduct course corrections as needed, and ensure decisions are reached. Meeting goals and rules are shared at least three weeks prior to the meetings. Battelle summarizes the discussion from the meeting, including all dissenting views, and submits recommendations to CMS.

Section 4.2 Establishing Consensus 

Recommended wording:

Battelle uses the NHDNG multi-step process, an iterative consensus-building approach, to achieve a minimum of 75% agreement among voting members rather than relying on a simple majority vote. Voting members are those appointed to the setting-specific Recommendation Groups. Consistent with our goal of adding rigor to all aspects of the consensus development process, Battelle will rely on an evidence-based consensus index to determine whether consensus has been reached in committee votes. This index, analogous to inter-rater reliability statistics, accounts for the degree of consensus (or lack thereof) in committee votes. This approach is advantageous, as it takes into consideration the different sizes of the voting groups and different ratings across groups. Based on this approach, consensus is defined as 75% or higher agreement among members.

Table 5 describes the consensus achievement process for final recommendations.
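For illustration only, the minimal Python sketch below shows the kind of 75% agreement check described above. It assumes simple percent agreement and a hypothetical helper function; the consensus index referenced above is described only as analogous to inter-rater reliability statistics and is not specified here.

    # Hypothetical helper: checks whether the most common vote reaches a 75%
    # share of all votes cast. This is simple percent agreement, not the
    # (unspecified) inter-rater-reliability-style consensus index.
    from collections import Counter

    def consensus_reached(votes, threshold=0.75):
        """Return (reached, top_category, share) for a list of categorical votes."""
        if not votes:
            return False, None, 0.0
        counts = Counter(votes)
        top_category, top_count = counts.most_common(1)[0]
        share = top_count / len(votes)
        return share >= threshold, top_category, share

    # Example: an 18-member Recommendations Group in which 14 members vote "recommend".
    votes = ["recommend"] * 14 + ["recommend with conditions"] * 3 + ["do not recommend"]
    print(consensus_reached(votes))  # (True, 'recommend', 0.777...)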

Name or Organization
Suellen Shea

Submitted by Anonymous (not verified) on Wed, 07/26/2023 - 11:30


Which aspect(s) of the guidebook are you commenting on?
General

On behalf of the more than 9,000 physiatrists of the American Academy of Physical Medicine and Rehabilitation (AAPM&R), we appreciate the opportunity to submit comments in response to the Guidebook of Policies and Procedures for Pre-Rulemaking Measure Review.

 AAPM&R is the national medical specialty organization representing physicians who are specialists in physical medicine and rehabilitation (PM&R). PM&R physicians, also known as physiatrists, treat a wide variety of medical conditions affecting the brain, spinal cord, nerves, bones, joints, ligaments, muscles, and tendons. PM&R physicians evaluate and treat injuries, illnesses, and disability and are experts in designing comprehensive, patient-centered treatment plans. Physiatrists utilize cutting-edge as well as time-tested treatments to maximize function and quality of life.

AAPM&R has concerns regarding the removal of an additional public comment period prior to the submission of recommendations to CMS in the PRMR lifecycle. Allowing key stakeholders additional time for input is crucial when it comes to measure development and inclusion. Minor changes to measures can have a major impact.

We are also still unclear on the committee appointment process and hope that the PQM can be more transparent in both the process and the selection of committee members as time goes on. This is where we feel the previous contractor really struggled and lacked transparency. Having a diverse group of stakeholders, including physiatrists, who focus on the whole person, quality of life, and attaining reasonable functional goals and outcomes, should be a priority for all committees.

Generally, AAPM&R is in favor of the comments submitted by both the American Medical Association and the American College of Surgeons.

Thank you for your consideration of these comments. If you have any questions or would like more information, please contact Beth Radtke, Director of Quality and Research, at (847) 737-6088 or [email protected].

Name or Organization
The American Academy of Physical Medicine and Rehabilitation

Submitted by Anonymous (not verified) on Wed, 07/26/2023 - 22:30


Which aspect(s) of the guidebook are you commenting on?
General

Very comprehensive; clearly outlines committee selection, composition, etc.

Name or Organization
Bandera Healthcare, Inc. Ensign affiliate

Submitted by Anonymous (not verified) on Thu, 07/27/2023 - 16:39


Which aspect(s) of the guidebook are you commenting on?
General
Committee structure
Processes (E&M, PRMR, MSR)
Quorum and voting
Evaluation rubric

The American Medical Rehabilitation Providers Association (AMRPA) appreciates the opportunity to submit comments on the PQM Guidebook of Policies and Procedures for PRMR and MSR. AMRPA is the national trade association representing more than 700 freestanding inpatient rehabilitation facilities and rehabilitation units of acute-care general hospitals (IRFs).[1] The vast majority of our members are Medicare-participating providers. In 2021, IRFs served 335,000 Medicare Fee-for-Service (FFS) beneficiaries with more than 379,000 IRF stays among 1,181 IRFs.[2] AMRPA has always looked to be a partner to regulating agencies and other key quality stakeholders in promoting meaningful and effective quality reporting in the IRF program, and we look forward to continuing this type of partnership with Battelle and the PQM moving forward.

AMRPA recognizes the importance of a consensus-based entity (CBE) and the processes “to inform the selection and removal of health care quality and efficiency measures, respectively, for use in the Department of Health and Human Services (HHS) Centers for Medicare & Medicaid Services (CMS) Medicare quality programs”. AMRPA believes that the PQM PRMR and MSR  processes are essential and must include the careful consideration of quality measures to ensure that they distinguish high-quality care in and among IRFs and other post-acute care providers.  AMRPA is hopeful that the new PQM PRMR and MSR processes will improve upon some of the issues of the National Quality Forum (NQF) Measure Applications Partnership (MAP) that have been experienced over the past few years and have impacted IRFs and their patients.

While AMRPA supports the PQM PRMR and MSR concepts, our review of the Guidebook has identified a few concerns related to committee structure, consensus agreement threshold, voting procedures and consideration of non-endorsed measures.  We note that many of these recommendations complement the separate comments we provided on the PQM E&M Guidebook and urge Battelle to incorporate these refinements across both documents. We offer our recommendations in the attached document.

[1] Inpatient rehabilitation facilities (IRFs) – both freestanding and units located within acute-care hospitals – are fully licensed hospitals that must meet Medicare Hospital Conditions of Participation (COPs) and provide hospital-level care to high acuity patients.  IRFs’ physician-led care, competencies, equipment and infection control protocols are just some of the features that distinguish the hospital-level care provided by IRFs from most other PAC providers.

[2] Medicare Payment Advisory Commission (MedPAC), March 2023 Report to the Congress: Medicare Payment Policy, Chapter 9, pages 263 and 266.

Name or Organization
American Medical Rehabilitation Providers Association (AMRPA)

Submitted by Anonymous (not verified) on Fri, 07/28/2023 - 08:44


Which aspect(s) of the guidebook are you commenting on?
General
Processes (E&M, PRMR, MSR)

I appreciate Battelle's commitment to promoting more transparent and engaged measure set review, endorsement, and maintenance processes. However, I am writing to respectfully request an extension of the Partnership for Quality Measurement (PQM) committee nomination deadline from July 30, 2023, to August 30, 2023.

First, I believe the selection of a Sunday, July 30, effectively moves the deadline to the preceding Friday, inadvertently shortening the nomination window. I also contend that the recently proposed and final rules, including but not limited to the CY 2024 Physician Fee Schedule Proposed Rule, the CY 2024 Hospital Outpatient Prospective Payment System and Ambulatory Surgical Center Proposed Rule, the FY 2024 Inpatient Rehabilitation Facility Final Rule, and the FY 2024 Inpatient Psychiatric Hospital Final Rule, may adversely impact the ability of certain qualified individuals to make an informed nomination submission due to competing priorities associated with the evaluation of these rules.

Due to the above considerations, I believe an extension through August 30, 2023, is appropriate because it will ensure adequate time for more active and informed participation in the nomination process. Thank you in advance for your consideration.

Name or Organization
Brian Hart

Submitted by Anonymous (not verified) on Fri, 07/28/2023 - 13:59


Which aspect(s) of the guidebook are you commenting on?
General
Committee structure
Processes (E&M, PRMR, MSR)

The American Geriatrics Society (AGS) applauds the Partnership for Quality Measurement’s (PQM) commitment and efforts to engage with stakeholders on the new processes for consensus development and strategic planning for health care quality measurement. While the new processes generally seem reasonable, the AGS urges the inclusion of geriatrics expertise as these policies and procedures are further refined. It is also critically important to ensure that the Pre-Rulemaking Measure Review (PRMR) and Measure Set Review (MSR) committees include geriatrics expertise. Geriatrics health professionals provide care for older adults, usually over the age of 65, and see the oldest and sickest patients. Their expertise in caring for older people with medical complexity or serious illness, leading interprofessional collaboration, implementing knowledge of long-term care across settings and sites, and treating older people as whole persons would be an essential skill set for the quality measurement process. Given the heterogeneity of older people, geriatrics health professionals are crucial to ensure that quality measures and the related processes meaningfully consider the unique healthcare needs of this growing population as well as the geriatrics specialty. 

Name or Organization
American Geriatrics Society

Submitted by Anonymous (not verified) on Fri, 07/28/2023 - 14:27


Which aspect(s) of the guidebook are you commenting on?
General
Committee structure
Submitting measures to Battelle (E&M)
Processes (E&M, PRMR, MSR)
Quorum and voting
Appeals (E&M)
Evaluation rubric

Acumen appreciates the opportunity to comment on the proposed Guidebook of Policies and Procedures for Pre-Rulemaking Measure Review (PRMR) and Measure Set Review (MSR). We applaud Battelle's effort in making the consensus-based evaluation of measures more rigorous and streamlined. In that spirit, Acumen offers 14 comments from the measure developer perspective, rooted in our experience as a developer of many quality and cost measures currently in use across Centers for Medicare &amp; Medicaid Services (CMS) programs.

1. Overlapping Scopes of E&M and PRMR

From the measure developer perspective, the existence of two separate but overlapping processes to evaluate measures is extremely burdensome. In an ideal world, there would be only one process that holistically evaluates measures. While we understand that there are regulatory and contractual requirements for having two processes, we recommend the following to harmonize the two processes and to reduce the burden on committee members:

  • Measures that already received a recommendation from PRMR or MSR should be automatically endorsed because the meaningfulness (PRMR) or impact (MSR) criterion completely overlaps with all E&M criteria and there will likely be significant overlap in committee membership.
  • Measures that already received endorsement should only be evaluated on criteria other than meaningfulness (PRMR) or impact (MSR) because both of these completely overlap with all E&M criteria and there will likely be significant overlap in committee membership.

2. Conflicting Information of the Endorsement Decision Categories

There are several points that are in direct conflict with each other:

  • Table 5 states that "More than 75%" is required for consensus in any decision category. However, the paragraphs in section 4.2 state "minimum of 75%" and "75% or higher," which is in direct conflict with the statement in Table 5.
  • Table 5 states that 25%-75% of votes for "do not recommend" means "No consensus," which is also in direct conflict with the statements of "minimum of 75%" and "75% or higher" in section 4.2.
  • It is not immediately clear which decision category "No consensus" maps to, since there are officially only three decision categories (Endorsed, Endorsed with Conditions, and Not Endorsed/Endorsement Removed).

We recommend rewriting section 4.2 to exhaustively map all permutations of possible votes to their respective final decisions.
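To illustrate the boundary case at stake, the short Python sketch below assumes a hypothetical 20-member group in which exactly 15 members (75%) vote to recommend; the two readings quoted above give opposite results, which is the kind of ambiguity an exhaustive mapping would need to resolve.

    # Hypothetical boundary case: 15 of 20 members (exactly 75%) vote "recommend".
    # Under "minimum of 75%" / "75% or higher" (section 4.2 text) this counts as
    # consensus; under "more than 75%" (Table 5) it does not.
    members, recommend_votes = 20, 15
    share = recommend_votes / members    # 0.75

    meets_section_4_2 = share >= 0.75    # True
    meets_table_5 = share > 0.75         # False

    print(f"share={share:.2f}, '75% or higher': {meets_section_4_2}, 'more than 75%': {meets_table_5}")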

3. Threshold for Consensus of 75%

We appreciate Battelle’s effort in raising the standard for consensus. However, this proposed voting threshold is a significant deviation from the predecessor’s, which was set at 60%. We see three issues: 

  1. We understand that the main justification is rooted in a measure of consensus (E&M guidebook page 42), which is set at 95% and distinct from the voting threshold. However, there is no justification provided for using the threshold of 95% for the measure of consensus and, by extension, the rationale for a voting threshold of 75% is also lacking. 
  2. Some decision categories overlap in concept, so differences between them may not constitute disagreement. For example, "Endorsed" and "Endorsed with Conditions" are closer in agreement than "Endorsed with Conditions" and "Not Endorsed." In other words, the proposal (presented in E&M guidebook appendix F) fails to consider the relative distance between categories, which can lead to an inaccurate measure of consensus and, by extension, an incorrect choice of voting threshold.
  3. This proposal constitutes a major scientific methodological choice that we believe is best suited for the Scientific Method Panel (SMP) to consider. The consideration of relative distance or agreement between decision categories outlined above is an example of the need for the SMP’s input. 

Until the SMP has had a chance to review the proposed scientific rationale for these choices and public comments on the SMP's decision have been received, we recommend keeping the existing voting threshold of 60%, which was painstakingly discussed and approved by the predecessor.

4. Insufficient Guidance for Reviewers in Determining if a Criterion is Met

Unlike the E&M guidebook, this guidebook lacks guidance for reviewers to determine whether a submission meets or does not meet a particular evaluation criterion (Appendix B). This presents two major problems: (1) measure developers currently do not have enough information on how to best prepare submissions or develop measures that meet the evaluation criteria, and (2) it creates the possibility for reviewers to apply the evaluation criteria inconsistently, undermining transparency in the decision-making process. Currently, readers cannot meaningfully provide comments without these details being available. We recommend opening another public comment period after more details regarding meeting and not meeting the evaluation criteria are released.

5. Insufficient Guidance for Reviewers in Combining Ratings from Individual Criteria

We understand that Tables 3 and 4 imply that "Evidence is complete and adequate" means "Recommend," "Evidence is incomplete or inadequate but there is a plausible path forward" means "Recommend with conditions," and "Evidence is either incomplete or inadequate and there is no plausible path forward" means "Do not recommend." However, a reviewer may choose a different rating for each criterion, and it is unclear how a reviewer should reconcile and weigh their own ratings across different criteria to arrive at the final overall rating. This creates the possibility for a reviewer to be inconsistent even across their own reviews of different measures, as well as inconsistency across reviewers. We recommend developing a flowchart that exhaustively considers all criteria and possible ratings in determining a final overall rating.

6. Lack of Specifications on Reasonable Conditions for Recommendation

While the decision of “Recommend with conditions” may be appropriate in some cases, it creates another possibility for reviewers to be inconsistent in either choosing a condition or making decisions across similar measures. To improve consistency and transparency, we recommend developing a limited set of reasonable endorsement conditions with detailed guidance that reviewers can choose from. 

7. PRMR Criteria – Appropriateness of Scale is Redundant and Under-Developed 

Conceptually, the appropriateness of scale criterion touches on many aspects that overlap with other criteria. We highlight some examples below, assuming that the importance sub-criterion within meaningfulness is similar to that in the E&M process, since this guidebook does not provide sufficient guidance to determine whether a criterion is met:

  • Evidence related to performance gap or equity gap is already captured in importance criterion (E&M guidebook pages 31-32). 
  • Evidence related to quality improvement is already considered in the importance sub-criterion within meaningfulness and time to value realization (page 29).
  • Evidence related to the target population is already considered in the use for the target population and entities of the program under consideration within meaningfulness (page 29).
  • Evidence of meaningfulness to patient/care givers is already considered in the evidence that patients or caregivers value the measured outcome (E&M guidebook page 31). 

We recommend removing appropriateness of scale because it is redundant in the presence of meaningfulness and time to value realization.

8. PRMR Criteria – Time to Value Realization is in Direct Conflict with Meaningfulness

Assuming the importance sub-criterion within meaningfulness is similar to that in the E&M process, the time to value realization criterion does not require evidence of a net benefit (page 29) but the importance within meaningfulness criterion does (E&M guidebook pages 31-32). Specifically, the time to value realization criterion allows for the measure to lead to better evidence in the future, as well as maturing over time to reduce uncertainty about how measured entities may respond to the measurement. In other words, this criterion acknowledges the fact that a measure does not have to have a causal relationship with a behavioral change by measured entities in order to be implemented in a program. However, if the importance sub-criterion within meaningfulness is similar to that in the E&M process, some measures may pass on time to value realization but fail the importance sub-criterion within meaningfulness. We recommend acknowledging that measurement can be important even if it may not lead to a quality improvement and removing the requirement for a measure to lead to a quality improvement. 

9. Insufficient Information on the Interaction between MSR and Legally Mandated Measures 

Table 1 (Summary of PRMR and MSR scope and approach) mentions that the goal of MSR is to build consensus around removal recommendations across the entire CMS measure portfolio. We have the following question related to MSR and legally mandated measures:

  1. How does Battelle handle a situation in which the MSR Recommendation Group recommends a measure for removal that is mandated by program statute (e.g., IMPACT Act measures)? 

We recommend that Battelle open another public comment period after more details regarding MSR and legally mandated measures are released. 

10. MSR - Lack of Consideration for the Burden for Developers and Unique Policy Context 

Section 3.4 MSR Process mentions that, in facilitating the MSR cycle, Battelle will reach out to measure stewards and developers to request any prior or updated testing data for measures that are under consideration for removal discussions. We recommend the following:

  • When a measure is chosen for review in an MSR cycle, Battelle should immediately notify the measure stewards/developers.
  • Battelle should seek public comments from measure developers when specifying the number of days that measure stewards/developers will have to provide this information to Battelle staff.
  • Battelle must consider, share with the MSR Recommendation Group, and accept reasonable explanations from measure stewards/developers regarding why a measure should not be discussed for program removal in the MSR.

11. Public Comments

We have two questions related to public comments: 

  1. How does Battelle plan to share public comment feedback with measure stewards and developers?
  2. Will all public comments be available to view, per measure, on the PQM website, or does Battelle plan to send a public comment summary workbook to measure developers?

We recommend that Battelle open another public comment period after more details regarding the process for sharing public comments are released.

12. Quorum

We appreciate that Battelle will confirm that committee members are present and that quorum is met at the beginning of committee meetings, and that a back-up meeting will be held if attendance is less than 60%. We have the following question related to quorum:

  1. Will Battelle check quorum throughout the meetings, and what would happen in the event that quorum is lost during a measure discussion?

We recommend that Battelle open another public comment period after more details regarding quorum are released.

13. Lack of Dialogue with Measure Developers during the Review Process

We would appreciate additional details regarding the measure stewards' and developers' role throughout the 6-month E&M cycle. The guidebook mentions that measure stewards/developers will have a chance to answer questions and provide measure insight during listening sessions. For this process to be successful and transparent, we recommend setting aside time for measure stewards and developers to answer questions, provide context, and defend measures during the evaluation meetings.

14. Preliminary Assessment Developer/Steward Review Timeline

Section 3.3 PRMR Process mentions that Battelle staff develop a Preliminary Assessment to assess the measure’s importance, reliability, validity, feasibility, and usability. We have the following questions regarding measure stewards/developers review of Preliminary Assessments:

  1. Will measure stewards/developers have a chance to review the Preliminary Assessment for factual accuracy before it is delivered to the PRMR committees? 
  2. If measure stewards/developers have a chance to review the Preliminary Assessment, how many days will measure stewards/developers have to conduct the factual review?

We recommend that Battelle open another public comment period after more details regarding the Preliminary Assessment developer/steward review timeline are released.

Thank you for your consideration of these comments. We look forward to reviewing additional details released by Battelle, and welcome an opportunity to provide further comments on the PRMR and MSR processes. Please feel free to reach out to [email protected] for any questions. 

 

Name or Organization
Acumen LLC

Submitted by Anonymous (not verified) on Fri, 07/28/2023 - 15:16


Which aspect(s) of the guidebook are you commenting on?
General
Committee structure
Processes (E&M, PRMR, MSR)
Quorum and voting

The opportunity to submit comments on the new procedures for the Pre-Rulemaking Measure Review (PRMR) and Measure Set Review (MSR) guidebook is appreciated. It is encouraging to see how quickly Battelle is moving into action with modifications to the process. 
 

Committee Structure and Nominations
For PRMR, Battelle plans to dissolve the Coordinating Committee and integrate the Rural and Health Equity committees into the three main PRMR committees with up to 180 members. While larger committees can provide more diverse perspectives, they can also slow down decision-making.  Dividing each committee into a Recommendations Group and Advisory Group will help tremendously to keep meetings focused, as only the Recommendations Group (18-20 members) will meet virtually to discuss areas of disagreement. 

There is support for incorporating the Health Equity and Rural Health groups into the committees. However, there are concerns about committee member burnout. The time commitment is described as participation in calls and providing timely responses to requests for feedback.  There is also reference to reviewing meeting materials prior to each scheduled meeting.  As there will be fewer committees to divide the measures under consideration (MUC) list, there may be too many measures for the committees to review. For example, the 2022 MUC list had 21 measures just for the Merit-based Incentive Payment System (MIPS) alone.  The staff assessments will help to pull out relevant items from submission materials, but there are still numerous pages of measure submission materials the members should review to adequately assess measures and make recommendations.

Recruiting subject matter experts (SMEs) to the PRMR committees on a rotating basis, depending on the expertise needed for specific measure submissions, is supported. Battelle plans to recruit SMEs as soon as a measure is approved for the MUC list. However, there are concerns about the timeline for the nomination of SMEs. According to the timelines outlined in statute and on page 6, the MUC list is released in December. If a nomination process for SMEs in the areas of the submitted measures is started at that point, this may not allow either the Advisory or Recommendations Groups sufficient time to consider the measures in light of the SMEs' feedback.

Battelle plans to invite select PRMR committee members to also participate in MSR activities. What are the criteria for selecting the members to participate in MSR? This process should be as transparent as possible.

Additionally, the current nomination form can be improved. Ideally, an individual or organization could select which of the two groups, the Advisory Group or the Recommendations Group, they are submitting a nomination for. The Recommendations Group is selected from the overall committee roster of eligible nominees. The former Measure Applications Partnership (MAP) process was organization-based, and it would be helpful to know if that is a preference for this process. It is encouraging to see that all key stakeholders are involved, but those being measured (e.g., clinicians, hospitals) should be heavily represented in both groups.
 

Recommendations Group Meetings
To increase efficiency, similar measures will be discussed as a group, and Recommendations Group members then vote on the discussed measures individually. Battelle is using the Novel Hybrid Delphi and Nominal Group (NHDNG) technique to find areas of disagreement, which will be addressed in the meetings. This should expedite meetings. However, it is unclear whether Recommendations Group members will have the opportunity during the meeting to discuss areas where there is agreement. While 80% of the members may agree on a measure, the other 20% might want to explain why they disagree with a particular domain. Additionally, public comments may be submitted that should be addressed within areas of agreement.
 

Voting Procedures
The NHDNG technique will reduce the subjectivity of the process. There is support for the higher thresholds for agreement and quorum among the Recommendations Group compared with the previous consensus-based entity (CBE) process.
 

Public Comment and Listening Sessions
There is support for the increased number of public comment opportunities, as well as the transparency of public comments. With the improved public comment process, public comments are available to the public as soon as Battelle staff approve the comment for posting. The time frame of 21 days for public comment is sufficient.

The new listening sessions are a great addition to the measure review process. It is a good opportunity for committee members to ask questions on measures prior to submitting their ratings. Allowing measure developers to clarify items on their submissions is valuable, as developers were previously not allowed to provide feedback during the meetings.  

Name or Organization
Karen Campos

Submitted by Anonymous (not verified) on Fri, 07/28/2023 - 16:48


Which aspect(s) of the guidebook are you commenting on?
General
Processes (E&M, PRMR, MSR)
Quorum and voting
Evaluation rubric

Dear Battelle Team, 

On behalf of Health Services Advisory Group, Inc. (HSAG), we appreciate the opportunity to review and comment on the Guidebook of Policies and Procedures for Pre-Rulemaking Measure Review (PRMR) and Measure Set Review (MSR). We are supportive of Battelle’s efforts to refine the process and respectfully submit the following overarching comments for consideration:

  • We encourage Battelle to align the proposed PRMR and MSR criteria with the Consensus-Based Entity (CBE) Endorsement & Maintenance (E&M) criteria to the extent possible and where there is overlap conceptually.
  • Overall, for both PRMR and MSR processes, we consider the underlying criteria within a given category heterogeneous and note overlap of evaluation concepts across the criteria. We recommend refining the definitions to help committee members apply the criteria consistently across measures and programs. 
  • Currently, there is no mechanism for developers to submit updated measure information through the Centers for Medicare & Medicaid Services (CMS) Measures Under Consideration Entry/Review Information Tool (MERIT) for measures in use in a CMS program. We encourage Battelle to consider expanding MERIT to allow developers/stewards to submit updated measure information at specified time points in the measure lifecycle. Using one system to submit measure information in a standardized format would reduce burden on developers and enhance efficiency.
  • We support the use of the Novel Hybrid Delphi and Nominal Group (NHDNG) technique and believe that this technique will increase engagement of members and structure facilitation by using standard criteria and practices. However, we are concerned that it may be challenging to achieve an 80% voting quorum with 60 members.

 Thank you for the opportunity to comment.

Name or Organization
Health Services Advisory Group (HSAG)

Submitted by Anonymous (not verified) on Fri, 07/28/2023 - 17:12

Permalink

Which aspect(s) of the guidebook are you commenting on?
Committee structure
Processes (E&M, PRMR, MSR)

NCQA would like to comment on the proposed E&M and PRMR committees. We’re concerned that the E&M committees and setting-specific PRMR committees (i.e., the Hospital Committee, Clinician Committee, and PAC/LTC Committee) are not comprehensive enough to capture the expansive health care landscape. Additionally, we’re concerned that the proposed number of aggregate members (~60) slated for each committee is too large. With groups this large, it will be difficult to have confidence in the appropriate expertise and experience of committee members.

Name or Organization
NCQA

Submitted by Anonymous (not verified) on Fri, 07/28/2023 - 19:24

Permalink

Which aspect(s) of the guidebook are you commenting on?
General
Committee structure
Quorum and voting

Dear Battelle: 

  

On behalf of America's Essential Hospitals, thank you for the opportunity to comment on the Guidebook of Policies and Procedures for Pre-Rulemaking Measure Review (PRMR) and Measure Set Review (MSR). America’s Essential Hospitals is the leading champion for hospitals and health systems dedicated to high-quality care for all. Our more than 300 member hospitals fill a vital role in their communities and provide a disproportionate share of the nation’s uncompensated care. Three-quarters of their patients are uninsured or covered by Medicare or Medicaid, and more than half of patients seen at essential hospitals are people of color. Essential hospitals provide state-of-the-art, patient-centered care while operating on margins less than half that of other hospitals—3.2 percent on average compared with 7.7 percent for all hospitals nationwide.i These narrow operating margins result in minimal reserves and low cash on hand—circumstances exacerbated by financial pressures related to COVID-19. As essential hospitals rebound from the pandemic, they face new challenges, such as rising workforce costs and shortages, rising supply costs, and supply shortages. 

  

As outlined below, we have significant concerns that several specific policies in the Guidebook violate OMB (Office of Management and Budget) Circular A-119 principles on balance, openness, and transparency. On July 10, Battelle hosted an informational session to discuss Guidebook policies and procedures, including timelines related to PRMR and MSR, measure selection, and removal criteria. As detailed below, there are some discrepancies between the Guidebook and webinar content. 

  

Restrictions on Association Involvement in Measure Development Process 

 

The measure development process involves both a hospital advisory group and a recommendation group. Only the recommendation group votes on final measure recommendations to the Centers for Medicare & Medicaid Services (CMS). The advisory group provides input that guides the recommendation group but does not vote. Chapter 2, Table 1 of the Guidebook indicates that associations can participate only in the advisory group and not the recommendation group for PRMR and MSR.  

 

Limiting voting rights to a subset of stakeholders contradicts OMB A-119’s direction to make concerted efforts to involve all affected parties. This restriction skews the balance of perspectives on final measure recommendations. OMB A-119’s guidance is clear that consensus groups should not be dominated by any single interest. Further, a slide presented during the July 10 PRMR-MSR webinar showed associations in both the advisory and recommendation groups. This conflicting information about association involvement in final voting should be clarified. 

 
We urge Battelle to revise the structure of the recommendation group to confirm that associations can participate in the final vote on measure recommendations. This will ensure the measure development process is balanced and open to input from all stakeholders, as required by OMB A-119. 

 

Concerns about Committee Term Limits 

 

The Guidebook outlines a policy in which committee terms will last for three years. However, in the first cycle, members randomly will be assigned one-, two-, or three-year terms to establish a rolling membership.  

 

The Guidebook notes that members will rotate between the advisory and recommendation groups as needed during their term. It is unclear how this process will occur, who will decide on reassignments, and why. In addition, rotating members between the advisory and recommendation groups is concerning, as it could result in key stakeholders losing voting privileges in an arbitrary fashion.  

 

We also are concerned about the randomized term length assignments suggested for the first year of the committees. This process likely will be highly disruptive, as it would prevent key stakeholders from providing expertise and advice on new measures that the field would need to adhere to should CMS adopt them in federal rulemaking. We urge Battelle to consider appointing knowledgeable hospital association representatives, inclusive of all hospital types, for the default maximum term of three years. This would let us provide expert advice and guidance to the committee throughout our appointment. 

 

Limited Public Comment Opportunities  

 

Chapter 3, Figure 6 of the Guidebook indicates the PRMR process allows only one public comment period before a quality measure is formally recommended to CMS. Not allowing comment on final recommendations is a significant concern because it contravenes OMB A-119, which requires consensus bodies to make draft standards available for public review and input. The directive promotes multiple opportunities for public discourse, ensuring that final recommendations are well reviewed and incorporate diverse perspectives. Offering a single chance for feedback reduces the likelihood of achieving comprehensive consensus and restricts the ability of hospital leaders to provide practical insights on measures before they are proposed to CMS.  

 

A single comment period also complicates the pursuit of health care equity. Hospitals serve heterogeneous populations, making their feedback crucial in understanding the potential effect of proposed measures on diverse groups. Without ample opportunity for this feedback, achieving equity becomes difficult. Additionally, economic implications are significant. With programs like the Hospital Value-Based Purchasing Program and the Hospital Readmissions Reduction Program tying Medicare reimbursements to hospital performance, there is a risk of imposing financial penalties on hospitals that may find the new measures unfeasible or inappropriate. The inability to provide sufficient feedback could lead to substantial financial losses and negatively affect patient care quality and accessibility.  

 

Therefore, providing more than one public comment period serves the best interest of all parties involved. We urge Battelle to revise the PRMR process to allow a formal comment period for finalized measures it plans to submit to CMS. This would help ensure that final recommendations are well informed, supported by evidence, and reflective of the diverse perspectives and needs of stakeholders. 

 

Including Patients and Caregivers on Committees 

 

In Chapter 2 of the Guidebook, Battelle states patients and caregivers can participate on committees but does not explicitly state that they must have quality measure development experience. During the July 10 webinar, it was stated broadly that patients who have interacted with the health care system would be eligible for these advisory committees. We caution that allowing committee members with limited experience does not meet OMB A-119’s balance requirements. All committee members must have relevant expertise to contribute meaningfully to complex measure decisions. We urge Battelle to ensure that all committee members possess the necessary and relevant expertise in quality measure development. While it is commendable that patients and caregivers can participate on committees, it is crucial to explicitly require prior experience in this field to ensure that complex measure decisions are made with a well-rounded perspective. 

 

In conclusion, we believe that the proposed measure development process can be strengthened by mitigating the above-mentioned issues to ensure hospitals are fairly evaluated under future quality measures. We appreciate your attention to these matters and look forward to continued collaboration and dialogue. 

  

If you have questions, please contact me at 202.585.0127 or [email protected]

  

Thank you, 

Erin O’Malley  

Senior Director of Policy 

America’s Essential Hospitals 

  

Name or Organization
America's Essential Hospitals

Submitted by Anonymous (not verified) on Sat, 07/29/2023 - 18:28

Permalink

Which aspect(s) of the guidebook are you commenting on?
General
Committee structure
Processes (E&M, PRMR, MSR)

Please see the comment letter attached.  We look forward to a continuing dialogue and collaboration with Battelle’s PQM regarding measures that affect the ability of occupational therapy practitioners to provide quality care to people, populations, and communities.

Name or Organization
American Occupational Therapy Association (AOTA)

Submitted by Anonymous (not verified) on Sat, 07/29/2023 - 21:21

Permalink

Which aspect(s) of the guidebook are you commenting on?
General
Committee structure
Submitting measures to Battelle (E&M)
Processes (E&M, PRMR, MSR)

The American Society of Anesthesiologists (ASA) appreciates the opportunity to comment on the Partnership for Quality Measurement’s Guidebook of Policies and Procedures for Pre-Rulemaking Measure Review (PRMR) and Measure Set Review (MSR). We appreciate the level of transparency and outreach that Battelle has established since earning the opportunity to lead the review and selection of quality and efficiency measures under consideration for use by the US Department of Health and Human Services (HHS). ASA supports many of the proposed changes within this document, including the expansion of project rosters and the criteria used to review measures. 

 

Pre-Rulemaking Measure Review

 

We recognize Battelle must navigate a tight schedule each time HHS releases its Measures Under Consideration list. Within a two-month window, Battelle must receive public comments, hold Advisory and Recommendation Group meetings, and submit a report to HHS. Battelle has judiciously streamlined the measure review groups to hospital-based measures, clinician measures, and measures focused on post-acute care. This process is more understandable and cohesive than previous measure review committee structures.

 

ASA also supports both Advisory and Recommendations Group structures. In particular, the expansion of participants in the Advisory Group to 35-45 members and inclusion of 18-20 members within the Recommendations Group will allow additional opportunities for anesthesiologists to participate. For the Recommendation Group, we ask Battelle to ensure an adequate number of “providers (and facilities)” within the group. In previous review cycles, a measure might be endorsed for use without considering the burden or impact the measure would have on individual specialties or physician groups. Including proper representation from anesthesiologists, as well as measure stewards in general, will generate additional buy-in and legitimacy in the review process. 

 

Measure Set Review 

 

Although we recognize the desire of HHS and the Centers for Medicare & Medicaid Services (CMS) to reduce quality measures in federal payment programs, we nonetheless are concerned that the measure removal process has not been transparent or effective. Battelle has proposed a workable process that could lead to the desired outcome. Previous criteria used by CMS and the National Quality Forum favored quality improvement and identification of performance gaps as the key considerations for maintaining a measure, rather than the measure’s effect on patient safety or outcomes. Not every measure will have a direct pathway for achieving quality improvement year over year, especially measures focused on patient safety, where high performance rates lead to better quality and patient outcomes at reduced costs. We support Battelle’s emphasis that measures must be important, reliable, valid, feasible, and usable within specific programs to maintain their endorsement and use. We request that Battelle protect against the removal of measures based upon an arbitrary assessment of topped-out status. 

 

Appendix B. Supplemental Guidance on Applying PRMR and MSR Criteria

 

PRMR Criteria—Meaningfulness: As part of the review criteria, Battelle will assess whether a measure will “address a high-impact clinician and/or policy area.” We request further information on how Battelle will define such clinicians or policy areas. In addition to the “high-impact clinician,” ASA encourages Battelle to think more broadly of all specialties a measure may impact. For instance, anesthesiologists may work as pain medicine physicians and in critical care settings. In previous reviews, several quality measures related to pain medicine were removed or scrutinized without much concern for how such measures would affect anesthesiologists. Inclusion of specialists will allow for a greater understanding of cross-cutting areas of patient care. 

 

PRMR Criteria—Time to Value Realization: ASA requests additional information related to the “Time to Value Realization” criteria. We are concerned that an overemphasis on digitization or e-specifying measures may present barriers for specialties and our registries. PRMR criteria should ensure a glide path toward digitization that enables the endorsement of measures using current criteria for non-digitized measures. We agree that measure reviewers should have the opportunity to assess a measure based on its capacity to be transitioned into a digital format. In this way, our registry and other registries may benefit from having additional time to identify high-use measures for digitization. 

 

MSR Criteria—Impact: ASA recommends Battelle solicit and consider comments from a measure steward when determining a measure’s impact on patient care and outcomes. As currently written, it appears Battelle and its committees may make a recommendation based on endorsement status rather than on why a measure may or may not have received endorsement. For ASA, we weigh endorsement upon several factors including, but not limited to, potential longevity in a payment program, the time and cost needed to complete endorsement, and feedback from CMS. Understanding the measure steward perspective will allow Advisory and Recommendation Groups to make balanced recommendations. We support criteria for assessing health equity in most, if not all, quality measures. Assessing health equity is relatively new to measure development and maintenance. Removing measures from a program prior to a full review of their effect on promoting health equity would, in most cases, be premature.

 

MSR Criteria—Patient Journey: As recently discussed in national medical journals and in CMS blog posts, we are increasingly concerned that assessing a measure based upon topped-out status or performance gaps alone undermines the significance that a measure may have for patient safety. Removing patient safety measures without consideration of their downstream outcomes (such as prophylactic antibiotic administration with regard to surgical site infections) can impact patient outcomes, experience, and cost of care. 

_____

 

MATTHEW T. POPOVICH
Chief Quality Officer
American Society of Anesthesiologists

Name or Organization
American Society of Anesthesiologists

Submitted by Anonymous (not verified) on Sun, 07/30/2023 - 15:46

Permalink

Which aspect(s) of the guidebook are you commenting on?
General
Committee structure
Processes (E&M, PRMR, MSR)
Quorum and voting

The Joint Commission appreciates the opportunity to comment on the Partnership for Quality Measurement (PQM) Guidebook of Policies and Procedures for Pre-Rulemaking Measure Review (PRMR) and Measure Set Review (MSR) June 2023.

Founded in 1951, The Joint Commission seeks to continuously improve health care for the public in collaboration with other stakeholders, by evaluating health care organizations (HCOs) and inspiring them to excel in providing safe and effective care of the highest quality and value. An independent, not-for-profit organization with a global presence, The Joint Commission has programs that accredit or certify more than 22,000 HCOs and programs in the United States. The Joint Commission evaluates across the continuum of care, including most of the nation’s hospitals. Although accreditation is voluntary, a variety of federal and state government regulatory bodies, including CMS, recognize The Joint Commission’s decisions and findings for Medicare or licensure purposes.

The Joint Commission appreciates the efforts to streamline the process and provide a comprehensive explanation of a complex process in a concise document. Overall, The Joint Commission supports the revised process, the Novel Hybrid Delphi and Nominal Group (NHDNG) technique, described in the Guidebook. We agree with the proposal to increase the number of members reviewing measures and to collect pre-evaluation independent ratings. This change will facilitate consensus and permit an unbiased and stronger evaluation. When committee members are held responsible for reviewing and providing feedback in advance of meetings, all voices can more easily be heard. The Joint Commission appreciates the desire to focus meeting time on areas of disagreement and agrees that this can support consensus building.

The Joint Commission suggests that the PQM consider ways to address alignment across the committees. Without a coordinating committee, the ability to see across settings and coordinate similar measures may be lost. Previously the coordinating committee had a unique ability to address related and competing measures and prevent different committees from reaching different final decisions on the same measure. We suggest the PQM staff monitor feedback on measures submitted that are reviewed by multiple committees and create a process to share discussion and voting on measures that are presented across programs. 

The Joint Commission seeks clarification regarding whether the PRMR members selected to participate in the MSR committee will continue to serve on both the PRMR and the MSR or will serve only on the MSR. We support establishing an MSR committee composed of a subset of active PRMR members. From our experience as an organizational member of the MAP Coordinating Committee, we recognize that measure removal recommendations are often informed by measures under consideration, such as when a measure recommended for addition is more proximal to an outcome than a current program measure. Additionally, we anticipate that common membership would support thorough consideration of measurement gaps in programs. We also support the proposal that the scope of MSR span care settings and populations; as previously mentioned, the PRMR and MSR can enable cross-program consideration of measurement impacting the care continuum. The approach is also efficient, as demonstrated by past MAP Coordinating Committee measure removal review meetings, which took this approach in 2021 and 2022. 

As a point of clarification, we do not see Home Care VBP under the "Post Acute Care Settings" list and ask whether this program is also included. 

The Joint Commission appreciates that date ranges have been established so that interested parties, measure developers, and other stakeholders can plan for meetings and public comment periods. We encourage PQM to adhere to the posted timelines, to confirm exact dates and times as early as possible, and to provide as much notice as possible when date changes are necessary. 

The Joint Commission is concerned that the first through third weeks of December are not an ideal time to address public comments, since this period competes with year-end activities. We suggest moving this public comment period to earlier in the fall. Additionally, a 15–21-day window for accepting public comments is short. When comments are submitted on behalf of a large organization, coordination is required to capture and respond with the organization’s full breadth of expertise. Patients and providers may also have difficulty with short public comment periods. The Joint Commission recommends allowing at least 30 days for public comments. 

The Joint Commission has concerns with the voting procedure for the Recommendations Group meeting under which, when voting quorum is not achieved, members not in attendance for the group discussion can submit their votes offline after the meeting. This is problematic because members who are not present are not able to express their opinions to the group or ask questions, and their opinions are not informed by the discussions of the full group. We additionally have concerns that 48 hours or 2 business days would not be sufficient time to gather votes. 

The Joint Commission appreciates the transparency in the process, specifically that both staff preliminary assessments and public comments received will be made available on the PQM website. 

The Joint Commission recommends clarifying what an organizational seat means, how it is defined, and the criteria that will be applied in the selection of an organizational committee member versus an individual member. 

Regarding the listening session prior to the close of public comment, the Guidebook states that CMS, PQM staff, and measure developers/stewards will address questions before the public submits comments and before committee members submit their ratings and explanations. The Joint Commission recommends PQM clarify whether this is the only time measure stewards would have the opportunity to speak on behalf of a measure. 

Thank you for this opportunity to review and provide comments. The Joint Commission is pleased to answer any questions you may have regarding our comments. Please do not hesitate to contact Michelle Dardis, Director of Quality Measurement at (630) 792-5915 or [email protected].

Name or Organization
The Joint Commission

Submitted by Anonymous (not verified) on Sun, 07/30/2023 - 19:56

Permalink

Which aspect(s) of the guidebook are you commenting on?
General
Committee structure

July 30, 2023

Dear PQM team,

We at the American Urological Association appreciate the opportunity to offer comment on your new Pre-Rulemaking Measure Review (PRMR) and Measure Set Review (MSR) processes.

Overall, we support the changes that you have made to the process. We are particularly intrigued by the move to the Novel Hybrid Delphi and Nominal Group (NHDNG) methodology for achieving consensus on program recommendations. We also like the idea of integrating the previous coordinating and advisory committees with the setting-specific committees, as this should allow more time for public comment and more impactful engagement from those who can provide rural health and/or equity perspectives.   

However, we do have a few questions/concerns, as follows:

  • The distinction between individual and organizational Committee membership (and the need for both) is not clear, and several questions remain. For example, will organizational members have different roles or responsibilities compared with individual members? What criteria are you using to select organizational members versus individual members? Which roster categories are organizational? Would someone be considered for an organizational slot but not for an individual slot, or vice versa? If so, how should someone decide whether to nominate as an individual or as a representative of their organization? 
  • It is not clear in the guidebook whether (or when) staff preliminary assessments will be made available to the public, nor whether public comments will be available in real time during the comment period.    
  • It is not clear whether “interim” ratings by the Committee will be made available to the public and, if so, whether this will happen during the process or afterward (i.e., to fully document the process and the recommendations).
  • We understand that measure developers and stewards may be asked to provide supplemental information to aid in the evaluation process.  While this seems necessary, we are concerned that such requests may be labor intensive, and the time frame may be quite short.  We encourage PQM to ensure adequate time for response, especially given that the PRMR timeframe may be impacted by the holidays. 

Please feel free to reach out if you have any questions.

Sincerely,

Karen Johnson, PhD

Director, Quality and Measurement

American Urological Association

Name or Organization
American Urological Association