
Oncology: Medical and Radiation – Plan of Care for Pain

CBE ID: 0383
Endorsement Status
1.1 New or Maintenance: Previous Endorsement Cycle
Is Under Review: No
Next Maintenance Cycle: Advanced Illness and Post-Acute Care Fall 2028
1.3 Measure Description

This measure assesses the percentage of visits for patients, regardless of age, with a diagnosis of cancer currently receiving chemotherapy or radiation therapy who report having pain and have a documented plan of care to address that pain. The measure is to be submitted at each denominator-eligible visit occurring during the performance period for patients with a diagnosis of cancer in whom pain is present. The time period for data collection is intended to be 12 consecutive months.

 

THERE ARE TWO SUBMISSION CRITERIA FOR THIS MEASURE:

1) All visits for patients, regardless of age, with a diagnosis of cancer currently receiving chemotherapy who report having pain

 

OR

 

2) All visits for patients, regardless of age, with a diagnosis of cancer currently receiving radiation therapy who report having pain

 

This measure comprises two populations but is intended to result in one reporting rate. It is a proportion measure, and better quality is associated with a higher score.

        • 1.14 Numerator

          Submission Criteria 1

          Patient visits that included a documented plan of care to address pain

           

          Submission Criteria 2

          Patient visits that included a documented plan of care to address pain

           

          Numerator Instructions: A documented plan of care may include: use of opioids, nonopioid analgesics, psychological support, patient and/or family education, referral to a pain clinic, or reassessment of pain at an appropriate time interval.

          1.14a Numerator Details

          Time period for data collection: At each visit within the measurement period

           

          Numerator Instructions: A documented plan of care may include: use of opioids, nonopioid analgesics, psychological support, patient and/or family education, referral to a pain clinic, or reassessment of pain at an appropriate time interval.

           

          The measure has two submission criteria to capture 1) visits for patients currently receiving chemotherapy who report having pain and 2) visits for patients currently receiving radiation therapy who report having pain. 

           

          For the Submission Criteria 1 and Submission Criteria 2 numerators, report the following quality data code to submit the numerator for patient visits in which plan of care to address pain is documented:

           

          0521F: Plan of care to address pain documented

        • 1.15 Denominator

          Submission Criteria 1

          All visits for patients, regardless of age, with a diagnosis of cancer currently receiving chemotherapy who report having pain

           

          Submission Criteria 2

          All visits for patients, regardless of age, with a diagnosis of cancer currently receiving radiation therapy who report having pain

           

          DENOMINATOR NOTE: For the reporting purposes of this measure, in instances where CPT code 77427 is reported, the billing date, which may or may not be the same date as the face-to-face or telehealth encounter with the physician, should be used to pull the appropriate patient population into the denominator.  It is expected, though, that the numerator criteria would be performed at the time of the actual face-to-face or telehealth encounter during the series of treatments. 

          1.15a Denominator Details

          Time period for data collection: 12 consecutive months

           

The measure has two submission criteria to capture 1) visits for patients currently receiving chemotherapy who report having pain and 2) visits for patients currently receiving radiation therapy who report having pain.

           

          Submission Criteria 1 Denominator: Visits for patients with a diagnosis of cancer currently receiving chemotherapy who report having pain

           

          All eligible instances when pain severity quantified; pain present (1125F) is submitted in the numerator for Measure #0384

          AND

          Diagnosis for cancer (ICD-10-CM) - Due to character limitation, please see codes in the attached Excel file.

          AND

          Patient encounter during the performance period (CPT) – to be used to evaluate remaining denominator criteria and for numerator evaluation: 99202, 99203, 99204, 99205, 99212, 99213, 99214, 99215

          AND

          Patient procedure during the performance period: 51720, 96401, 96402, 96405, 96406, 96409, 96411, 96413, 96415, 96416, 96417, 96420, 96422, 96423, 96425, 96440, 96446, 96450, 96521, 96522, 96523, 96542, 96549

           

          Submission Criteria 2 Denominator: Visits for patients with a diagnosis of cancer currently receiving radiation therapy who report having pain

          DENOMINATOR NOTE: For the reporting purposes of this measure, in instances where CPT code 77427 is reported, the billing date, which may or may not be the same date as the face-to-face  or telehealth encounter with the physician, should be used to pull the appropriate patient population into the denominator.  It is expected, though, that the numerator criteria would be performed at the time of the actual face-to-face or telehealth encounter during the series of treatments.

           

          All eligible instances when pain severity quantified; pain present (1125F) is submitted in the numerator for Measure #0384

          AND

          Diagnosis for cancer (ICD-10-CM) - Due to character limitation, please see codes in the attached Excel file.

          AND

          Patient procedure during the performance period (CPT) – Procedure codes: 77427, 77431, 77432, 77435
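To illustrate how the coding criteria above combine at the visit level, the following is a minimal Python sketch and is not part of the measure specification. The field names (has_cancer_diagnosis, cpt_codes, qdc_codes) are hypothetical, and the cancer-diagnosis check is reduced to a flag because the full ICD-10-CM code list lives in the attached Excel file; the CPT and quality data code sets are taken from the lists above.

```python
# Hypothetical visit-level sketch of the denominator/numerator logic described above.
# Field names are assumptions; the ICD-10-CM cancer code lookup is abbreviated to a flag.

ENCOUNTER_CPT = {"99202", "99203", "99204", "99205", "99212", "99213", "99214", "99215"}
CHEMO_PROCEDURE_CPT = {"51720", "96401", "96402", "96405", "96406", "96409", "96411",
                       "96413", "96415", "96416", "96417", "96420", "96422", "96423",
                       "96425", "96440", "96446", "96450", "96521", "96522", "96523",
                       "96542", "96549"}
RADIATION_PROCEDURE_CPT = {"77427", "77431", "77432", "77435"}
PAIN_PRESENT_QDC = "1125F"   # pain severity quantified; pain present (Measure #0384 numerator)
PLAN_OF_CARE_QDC = "0521F"   # plan of care to address pain documented


def classify_visit(visit):
    """Return (in_denominator, in_numerator, submission_criterion) for one visit dict."""
    has_cancer_dx = visit["has_cancer_diagnosis"]   # full ICD-10-CM list is in the attached file
    has_pain = PAIN_PRESENT_QDC in visit["qdc_codes"]
    cpt = set(visit["cpt_codes"])

    chemo = (has_cancer_dx and has_pain
             and bool(cpt & ENCOUNTER_CPT) and bool(cpt & CHEMO_PROCEDURE_CPT))
    radiation = has_cancer_dx and has_pain and bool(cpt & RADIATION_PROCEDURE_CPT)

    if not (chemo or radiation):
        return False, False, None
    criterion = 1 if chemo else 2
    return True, PLAN_OF_CARE_QDC in visit["qdc_codes"], criterion
```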

        • 1.15b Denominator Exclusions

          None

          1.15c Denominator Exclusions Details

          None

• OLD 1.12 MAT output not attached: Attached
  1.13 Attach Data Dictionary
  1.13a Data dictionary not attached: Yes
  1.16 Type of Score
  1.17 Measure Score Interpretation: Better quality = Higher score
          1.18 Calculation of Measure Score

          PY 2023 measure flow diagram is attached to this submission. 

           

This measure comprises two submission criteria but is intended to result in one reporting rate. The reporting rate is the aggregate of Submission Criteria 1 and Submission Criteria 2, resulting in a single performance rate. For the purposes of this measure, the single performance rate can be calculated as follows (a brief computational sketch follows the calculation algorithms below):

          Performance Rate = (Numerator 1 + Numerator 2)/ (Denominator 1 + Denominator 2)

           

          Calculation algorithm for Submission Criteria 1: Visits for patients with a diagnosis of cancer currently receiving chemotherapy who report having pain

          1. Find the patient visits that qualify for the denominator (i.e., the specific group of patient visits for inclusion in a specific performance measure based on defined criteria). 

          2. From the patient visits within the denominator, find the visits that meet the numerator criteria (i.e., the group of patient visits in the denominator for whom a process or outcome of care occurs). Validate that the number of patient visits in the numerator is less than or equal to the number of patient visits in the denominator.

           

          If the visit does not meet the numerator, this case represents a quality failure.

           

          Calculation algorithm for Submission Criteria 2: Visits for patients with a diagnosis of cancer currently receiving radiation therapy who report having pain

          1. Find the patient visits that qualify for the denominator (i.e., the specific group of patient visits for inclusion in a specific performance measure based on defined criteria). 

2. From the patient visits within the denominator, find the visits that meet the numerator criteria (i.e., the group of patient visits in the denominator for whom a process or outcome of care occurs). Validate that the number of patient visits in the numerator is less than or equal to the number of patient visits in the denominator.

           

          If the visit does not meet the numerator, this case represents a quality failure.
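A minimal computational sketch of the aggregation described above, assuming the per-criterion counts have already been tallied (the function and variable names are illustrative):

```python
def performance_rate(numerator_1, denominator_1, numerator_2, denominator_2):
    """Aggregate the two submission criteria into the single reporting rate."""
    # Validation step from the algorithms above: numerator counts may not exceed denominators.
    assert numerator_1 <= denominator_1 and numerator_2 <= denominator_2
    total_denominator = denominator_1 + denominator_2
    if total_denominator == 0:
        return None  # no denominator-eligible visits in the performance period
    return (numerator_1 + numerator_2) / total_denominator


# Example: 80 of 100 chemotherapy visits and 45 of 50 radiation-therapy visits with a
# documented plan of care give a single rate of (80 + 45) / (100 + 50) = 0.833.
```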

          1.18a Attach measure score calculation diagram, if applicable
          1.19 Measure Stratification Details

          We encourage the results of this measure to be stratified by race, ethnicity, administrative sex, and payer.

          1.26 Minimum Sample Size

          It is recommended to adhere to the standard CMS guideline, which stipulates a minimum of 20 denominator counts to calculate the measure. In addition, it is advisable to incorporate data from patients with diverse attributes for optimal results.
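As an illustration only, the sketch below combines the encouraged stratification with the recommended 20-count minimum, computing one rate per stratum and suppressing small strata; the record layout and field names are assumptions, not part of the measure specification.

```python
from collections import defaultdict

MIN_DENOMINATOR = 20  # minimum denominator count recommended above


def stratified_rates(visit_records, stratum_key):
    """Performance rate per stratum (e.g., race, ethnicity, administrative sex, or payer).

    visit_records: iterable of dicts with boolean 'in_denominator'/'in_numerator' flags
    plus the stratification field named by stratum_key. Strata whose denominator falls
    below MIN_DENOMINATOR are suppressed (reported as None).
    """
    counts = defaultdict(lambda: [0, 0])  # stratum value -> [numerator, denominator]
    for record in visit_records:
        if record["in_denominator"]:
            counts[record[stratum_key]][1] += 1
            counts[record[stratum_key]][0] += int(record["in_numerator"])
    return {stratum: (num / den if den >= MIN_DENOMINATOR else None)
            for stratum, (num, den) in counts.items()}
```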

• Most Recent Endorsement Activity: Advanced Illness and Post-Acute Care Fall 2023
  Initial Endorsement
  Last Updated
        • Measure Developer Secondary Point Of Contact

          Caitlin Drumheller
          American Society of Clinical Oncology
          2318 Mill Road
          Suite 800
          Alexandria, VA 22314
          United States

          • 2.1 Attach Logic Model
            2.2 Evidence of Measure Importance

Cancer is the second leading cause of death in the US (1), with an estimated incidence of over 1.9 million new cases in 2023. (2) Pain is one of the most common and debilitating symptoms reported amongst cancer patients; in fact, ICD-11 contains a new classification for chronic cancer-related pain, defining it as chronic pain caused by the primary cancer itself, its metastases, or its treatment. A systematic review found that 55 percent of patients undergoing anticancer treatment reported pain, (3) and chemotherapy and radiation specifically are associated with several distinct pain syndromes. (4) Each year, over a million cancer patients in the US receive chemotherapy or radiation. (5) Severe pain increases the risk of anxiety and depression, (4) and a recent study showed that cancer patients who reported pain had worse employment and financial outcomes; the greater the pain, the worse the outcomes. (6) Cancer patients have also reported that pain interferes with their mood, work, relationships with other people, sleep, and overall enjoyment of life. (7)

             

Assessing pain and developing a plan of care (i.e., pain management) are critical for symptom control and the cancer patient’s overall quality of life; this is an essential part of the oncologic management of a cancer patient (see below for specific clinical guideline recommendations). (8) However, many oncology patients report insufficient pain control. (9) A retrospective chart review analysis found 84 percent adherence to the documentation of pain intensity and 43 percent adherence to pain re-assessment within an hour of medication administration. (10) An observational study found that over half of its cancer patients had a negative pain management index score, indicating that the prescribed pain treatments were not commensurate with the pain intensity reported by the patient. (11) Disparities exist as well; for example, a recent study evaluated opioid prescription fills and potency among cancer patients near the end of life between 2007 and 2019. The study found that while all patients had a steady decline in opioid access, Black and Hispanic patients were less likely to receive opioids than White patients (Black, -4.3 percentage points, 95% CI; Hispanic, -3.6 percentage points, 95% CI) and received lower daily doses (Black, -10.5 MMED, 95% CI; Hispanic, -9.1 MMED, 95% CI). (12)

             

Although there have been some improvements, subpar pain management amongst cancer patients persists. The intent of the paired measures "Percentage of patient visits, regardless of patient age, with a diagnosis of cancer currently receiving chemotherapy or radiation therapy in which pain intensity is quantified" and "Percentage of visits for patients, regardless of age, with a diagnosis of cancer currently receiving chemotherapy or radiation therapy who report having pain with a documented plan of care to address pain" is to improve pain management, thereby improving the function and quality of life of the cancer patient.

             

            Specific clinical practice guideline recommendations that support this measure are: (8) 

            1. Perform pain reassessment at specified intervals to ensure that analgesic therapy is providing maximum benefit with minimal adverse effects, and that the treatment plan is followed.
            2. General principles of cancer pain management
              • Optimize pain management therapies to improve function and meet patient's goals of care.
              • Select the most appropriate analgesic regimen based on the pain diagnosis, comorbid conditions, safety, potential drug interactions, estimated trajectory of pain, medication availability, and expense/financial toxicity
              • Analgesic regimen may include an opioid, acetaminophen, nonsteroidal anti-inflammatory drugs, and/or adjuvant analgesics.
              • Provide psychosocial support
              • Provide patient and family/caregiver education 
              • Optimize integrative interventions and multidisciplinary care 
            3. Ongoing care & goals of treatment
              1. Have regular follow-up schedule to monitor pain therapy outcomes
              2. Monitor for the use of analgesics as prescribed, especially in patients with risk factors for or history of substance misuse/diversion or cognitive dysfunction
              3. Provide written follow-up pain plan, including prescribed medications 
              4. Routinely reevaluate pain at each contact and as needed to meet patient-specific goals for comfort, function, and safety
              5. Instruct the patient on the importance of
                1. Following documented pain plan 
                2. Scheduling and keeping outpatient appointments 
                3. Contacting clinician if pain worsens or adverse effects are inadequately controlled, including availability of after-hours assistance to facilitate titration of analgesic
            4. Pain intensity rating
              1. Pain intensity rating scales can be used as part of universal screening and comprehensive pain assessment. At minimum, patients should be asked about “current” pain, as well as "worst" pain, “average” pain, and "least" pain in the past 24 hours. 
              2. For comprehensive assessment, also include "worst pain in past week," "pain at rest," and "pain with movement." 
            5. Comprehensive Pain Assessment
              1. The goal of comprehensive pain assessment is to find the cause of the pain and identify optimal therapies. Individualized pain treatment is based on the etiology and characteristics of pain, pain trajectory, the patient's clinical condition, and patient-centered goals of care.
            6. Psychosocial Support
              1. Describe the mutually agreed upon plan of care to be taken and when results can be expected.

             

            All recommendations are Category 2A - Based upon lower-level evidence, there is uniform NCCN consensus that the intervention is appropriate.

            References:

            1. Centers for Disease Control and Prevention. (2023, January 18). Leading Causes of Death. National Center for Health Statistics. https://www.cdc.gov/nchs/fastats/leading-causes-of-death.htm
            2. National Cancer Institute. (2018). Cancer of Any Site - Cancer Stat Facts. Surveillance, Epidemiology, and End Results Program. https://seer.cancer.gov/statfacts/html/all.html 
3. Van den Beuken-van Everdingen, M. H., Hochstenbach, L. M., Joosten, E. A., Tjan-Heijnen, V. C., & Janssen, D. J. (2016). Update on Prevalence of Pain in Patients With Cancer: Systematic Review and Meta-Analysis. Journal of Pain and Symptom Management, 51(6), 1070–1090.e9. https://doi.org/10.1016/j.jpainsymman.2015.12.340
            4. National Cancer Institute. (2019, March 6). Cancer Pain (PDQ®)–Patient Version. https://www.cancer.gov/about-cancer/treatment/side-effects/pain/pain-pdq 
            5. Centers for Disease Control and Prevention. (2022, November 2). Information for Health Care Providers on Infections During Chemotherapy. https://www.cdc.gov/cancer/preventinfections/index.htm 
6. Halpern, M. T., de Moor, J. S., & Yabroff, K. R. (2022). Impact of Pain on Employment and Financial Outcomes Among Cancer Survivors. Journal of Clinical Oncology: Official Journal of the American Society of Clinical Oncology, 40(1), 24–31. https://doi.org/10.1200/JCO.20.03746
            7. Moryl, N., Dave, V., Glare, P., Bokhari, A., Malhotra, V. T., Gulati, A., Hung, J., Puttanniah, V., Griffo, Y., Tickoo, R., Wiesenthal, A., Horn, S. D., & Inturrisi, C. E. (2018). Patient-Reported Outcomes and Opioid Use by Outpatient Cancer Patients. The Journal of Pain, 19(3), 278–290. https://doi.org/10.1016/j.jpain.2017.11.001
            8. National Comprehensive Cancer Network® (NCCN). (July 31, 2023). NCCN Clinical Practice Guidelines in Oncology. Adult Cancer Pain Version 2.2023. http://www.nccn.org
9. Dela Pena, J. C., Marshall, V. D., & Smith, M. A. (2022). Impact of NCCN Guideline Adherence in Adult Cancer Pain on Length of Stay. Journal of Pain & Palliative Care Pharmacotherapy, 36(2), 95–102. https://doi.org/10.1080/15360288.2022.2066746
10. El Rahi, C., Murillo, J. R., & Zaghloul, H. (2017, September). Pain Assessment Practices in Patients with Cancer Admitted to the Oncology Floor. J Hematol Oncol Pharm, 7(3), 109–113. https://jhoponline.com/issue-archive/2017-issues/jhop-september-2017-vol-7-no-3/17246-pain-assessment-practices-in-patients-with-cancer-admitted-to-the-oncology-floor
11. Thronæs, M., Balstad, T. R., Brunelli, C., Løhre, E. T., Klepstad, P., Vagnildhaug, O. M., Kaasa, S., Knudsen, A. K., & Solheim, T. S. (2020). Pain management index (PMI)-does it reflect cancer patients' wish for focus on pain? Supportive Care in Cancer: Official Journal of the Multinational Association of Supportive Care in Cancer, 28(4), 1675–1684. https://doi.org/10.1007/s00520-019-04981-
12. Enzinger, A. C., Ghosh, K., Keating, N. L., Cutler, D. M., Clark, C. R., Florez, N., Landrum, M. B., & Wright, A. A. (2023). Racial and Ethnic Disparities in Opioid Access and Urine Drug Screening Among Older Patients With Poor-Prognosis Cancer Near the End of Life. Journal of Clinical Oncology: Official Journal of the American Society of Clinical Oncology, 41(14), 2511–2522. https://doi.org/10.1200/JCO.22.01413
          • 2.6 Meaningfulness to Target Population

A 2022 study evaluated patient and caregiver perspectives on cancer-related quality measures to inform priorities for health system implementation. Measure concepts related to pain management plans and improvement in pain were nominated as part of the top five concepts. The study notes that the patient and caregiver panel placed considerable emphasis on the importance of routine pain screening, management, and follow-up. (1)

             

            References:

             

1. O'Hanlon, C. E., Giannitrapani, K. F., Lindvall, C., Gamboa, R. C., Canning, M., Asch, S. M., Garrido, M. M., ImPACS Patient and Caregiver Panel, Walling, A. M., & Lorenz, K. A. (2022). Patient and Caregiver Prioritization of Palliative and End-of-Life Cancer Care Quality Measures. Journal of General Internal Medicine, 37(6), 1429–1435. https://doi.org/10.1007/s11606-021-07041-8
          • 2.4 Performance Gap

            See logic model attachment

Table 1. Performance Scores by Decile (Performance Gap): mean performance score, number of entities, and number of persons/encounters/episodes overall, by decile, and at the minimum and maximum; see logic model attachment.
            • 3.1 Feasibility Assessment

              Not applicable during the Fall 2023 cycle.

              3.3 Feasibility Informed Final Measure

              Feedback from EHRs, cancer registries, and oncology practices provides compelling evidence that this measure is easy to implement and presents minimal feasibility challenges. The necessary data elements required for the denominator (active cancer diagnosis, office visit, chemotherapy administration and/or radiation treatment) can be found within structured fields and are recorded using commonly accepted coding standards. The same applies to the numerator data element, which requires documentation of the pain assessment result.

               

              The measure's data capture can be seamlessly integrated into existing physician workflows and data collection tools without requiring any significant modifications. Numerous healthcare practices have already set up their workflows to gather this information, highlighting its easy adoption. This is evident from the considerable number of practices that report this measure to the Centers for Medicare and Medicaid Services (CMS) via the Merit-based Incentive Payment System (MIPS) program.

               

              This measure has been widely adopted and proven to be effective. It has been implemented without any issues or feasibility concerns. Therefore, no adjustments to the measure specifications are needed.

            • 3.4a Fees, Licensing, or Other Requirements

As the world’s leading professional organization for physicians and others engaged in clinical cancer research and cancer patient care, American Society of Clinical Oncology, Inc. (“Society”) and its affiliates[1] publish and present a wide range of oncologist-approved cancer information, educational and practice tools, and other content. The ASCO trademarks, including without limitation ASCO®, American Society of Clinical Oncology®, JCO®, Journal of Clinical Oncology®, Cancer.Net™, QOPI®, QOPI Certification Program™, CancerLinQ®, CancerLinQ Discovery®, and Conquer Cancer®, are among the most highly respected trademarks in the fields of cancer research, oncology education, patient information, and quality care. This outstanding reputation is due in large part to the contributions of ASCO members and volunteers. Any goodwill or commercial benefit from the use of ASCO content and trademarks will therefore accrue to the Society and its respective affiliates and further their tax-exempt charitable missions. Any use of ASCO content and trademarks that may depreciate their reputation and value will be prohibited.

               

ASCO does not charge a licensing fee to not-for-profit hospitals, healthcare systems, or practices to use the measure for quality improvement, research, or reporting to federal programs. ASCO encourages all of these not-for-profit users to obtain a license to use the measure so that ASCO can:

              • Keep users informed about measure updates and/or changes
              • Learn from measure users about any implementation challenges to inform future measure updates and/or changes
              • Track measure utilization (outside of federal reporting programs) and performance rates

               

ASCO has adopted the Council of Medical Specialty Societies’ Code for Interactions with Companies (https://cmss.org/wp-content/uploads/2016/02/CMSS-Code-for-Interactions-with-Companies-Approved-Revised-Version-4.13.15-with-Annotations.pdf), which provides guidance on interactions with for-profit entities that develop, produce, market, or distribute drugs, devices, services, or therapies used to diagnose, treat, monitor, manage, and alleviate health conditions. The Society’s Board of Directors has set Licensing Standards of American Society of Clinical Oncology (https://old-prod.asco.org/sites/new-www.asco.org/files/content-files/about-asco/pdf/ASCO-Licensing-Standards-Society-and-affiliates.pdf) to guide all licensing arrangements.

               

In addition, ASCO has adopted the Council of Medical Specialty Societies’ Policy on Antitrust Compliance (https://cmss.org/wp-content/uploads/2015/09/Antitrust-policy.pdf), which provides guidance on compliance with all laws applicable to its programs and activities, specifically including federal and state antitrust laws, including guidance to not discuss, communicate, or make announcements about fixing prices, allocating customers or markets, or unreasonably restraining trade.

               

              Contact Us:

• If you have questions about the ASCO Licensing Standards or would like to pursue a licensing opportunity, please contact ASCO’s Division of Licensing, Rights & Permissions at [email protected].
              • Individual authors and others seeking one‐time or limited permissions should contact [email protected]. ASCO members seeking to use an ASCO trademark in connection with a grant, award, or quality initiative should contact the administrator of that particular program.

[1] Unless otherwise specified, the term “ASCO” in these Licensing Standards refers collectively to American Society of Clinical Oncology, Inc., the ASCO Association, Conquer Cancer Foundation of the American Society of Clinical Oncology, CancerLinQ LLC, QOPI Certification Program, LLC, and all other affiliates of the American Society of Clinical Oncology, Inc.

              3.4 Proprietary Information
              Proprietary measure or components with fees
              • 4.1.3 Characteristics of Measured Entities

The clinicians and practices included in the reliability analysis represented all 49 states of the continental United States and ranged from very small single proprietorships to large academic institutions, according to the information they provided to CMS. For the validity analysis, McKesson’s Practice Insights QCDR randomly selected 10 community-based practices across the United States.

                4.1.1 Data Used for Testing

                Six datasets provided by CMS' MIPS program and publicly reported were used to test the measure's reliability:

                1. A data set of 207 individual clinicians who reported on the measure in the calendar year 2019 with 21,383 qualifying patients.
                2. A data set of 107 practices that reported on the measure in the calendar year 2019 with 34,056 qualifying patients.
                3. A data set of 229 individual clinicians who reported on the measure in the calendar year 2020 with 51,074 qualifying patients.
                4. A data set of 118 practices that reported on the measure in the calendar year 2020 with 144,509 qualifying patients.
                5. A data set of 366 individual clinicians who reported on the measure in the calendar year 2021 with 77,165 qualifying patients.
                6. A data set of 138 practices that reported on the measure in the calendar year 2021 with qualifying patient encounters.

                 

The data source used to test the measure’s validity is 2022 patient data from the McKesson Practice Insights QCDR. McKesson’s Practice Insights QCDR is an oncology-specific reporting and analytics platform that supports a variety of practice value-based care initiatives. The web-based reporting system is fully integrated with the oncology-specific iKnowMed Generation 2 technology, leveraging the clinical data contained within the EHR system and enabling the automated calculation of quality measures and analytics to support improved patient care. Through Practice Insights QCDR, which provides continuous data monitoring and feedback, practices can go beyond simply participating in quality programs toward the goal of optimized patient care and reduced costs. Practice Insights not only supports successful participation in the MIPS program, but it also serves as a powerful reporting platform for practices pursuing other value-based care initiatives and alternative payment models (APMs), including the Enhancing Oncology Model (EOM).

                 

For the purpose of conducting validity testing, 10 community-based oncology practices were randomly selected from the full list of Practice Insights QCDR participants, representing 3% of all 2022 MIPS program participants. From these, a randomized sample of 50 patients per practice, for a total of 500 patients, was selected for full medical record chart audits.

                4.1.4 Characteristics of Units of the Eligible Population

CMS neither captured nor provided any patient-level socio-demographic variables, and therefore no patient demographic data are available. McKesson's Practice Insights QCDR masked patients' demographic data to protect privacy during medical chart audits and did not provide patient demographics.

                4.1.2 Differences in Data

                To conduct data element testing with greater granularity, we acquired an additional data set from the McKesson Practice Insights QCDR as the CMS-provided MIPS individual clinician and practice performance data sets were not detailed enough. The CMS-provided data sets were utilized for accountable entity-level testing, while the Practice Insights QCDR-provided data set was used to carry out encounter/patient-level testing.

              • 4.2.1 Level(s) of Reliability Testing Conducted
                4.2.2 Method(s) of Reliability Testing

An assessment of the measure's reliability was performed using signal-to-noise analysis, a method that quantifies how much of the variation in measure scores reflects true differences between accountable entities (signal) rather than random measurement error (noise). The signal-to-noise ratio is determined by calculating the ratio of between-unit variance to total variance. This analysis provides valuable insight into the measure's reliability and its ability to produce consistent results.
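The following is a simplified, method-of-moments sketch of a signal-to-noise reliability calculation of the kind described above; it illustrates the general approach for a proportion measure and is not the developer's exact implementation.

```python
import statistics


def signal_to_noise_reliability(entities):
    """entities: list of (numerator_count, denominator_count), one pair per clinician or practice.

    Returns one reliability estimate per entity: between-entity ("signal") variance divided by
    between-entity plus within-entity ("noise") variance, where the within-entity sampling
    variance of a proportion is approximated by p * (1 - p) / n.
    """
    rates = [num / den for num, den in entities]
    within_vars = [p * (1 - p) / den for p, (_, den) in zip(rates, entities)]
    # The observed variance of the rates overstates true between-entity variance, so the
    # average sampling noise is subtracted (floored at zero).
    between_var = max(statistics.variance(rates) - statistics.mean(within_vars), 0.0)
    return [between_var / (between_var + wv) if (between_var + wv) > 0 else 1.0
            for wv in within_vars]
```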

                4.2.3 Reliability Testing Results

Across an average of 267 individual clinicians and 121 practices per year over 3 calendar years, the reliability of the measure scores ranged from 0.804 to 1.000, with an average reliability score of 0.987.

                 

                Overall, 100% of clinicians and practices had measure scores with reliabilities of 0.70 or higher, a commonly accepted reliability threshold (Adams 2010). The reliability values were consistently close to the ideal, indicating that the clinician performance rates were highly reliable, and any measurement error was minimal.

                 

                Adams, J. L., Mehrotra, A., Thomas, J. W., & McGlynn, E. A. (2010). Physician cost profiling—reliability and risk of misclassification. New England Journal of Medicine, 362(11), 1014-1021.

Table 2. Accountable Entity–Level Reliability Testing Results by Denominator-Target Population Size: reliability, mean performance score, and number of entities overall and by decile; see logic model attachment.
                4.2.4 Interpretation of Reliability Results

                Based on the available data, it is evident that individual clinicians and practices, even those with a minimal sample size, display reliability coefficients that exceed 0.80. This result indicates that the measure is highly reliable, both at individual clinician and practice levels. Therefore, the performance scores provide a true reflection of the quality of care.

              • 4.3.3 Method(s) of Validity Testing

                For the purpose of checking the validity of the data elements in this measure, a random sample of 500 patients from 10 different test sites was selected. Both a measure abstractor and an automated algorithm were used to score patients on each data element of the measure. The agreement between the two scoring methods was evaluated using the Kappa statistic. Denominator and numerator data elements were assessed for all 500 patients. Since this measure does not have any denominator exclusion or exception data element, these data elements were not tested.
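A minimal sketch of the agreement statistic described above for one binary data element, assuming parallel lists of abstractor and algorithm determinations per patient; this shows the standard Cohen's Kappa formula rather than the developer's specific tooling.

```python
def cohens_kappa(abstractor, algorithm):
    """Cohen's Kappa for two binary raters (True/False determinations per patient)."""
    assert len(abstractor) == len(algorithm) and abstractor
    n = len(abstractor)
    observed = sum(a == b for a, b in zip(abstractor, algorithm)) / n
    p_abs, p_alg = sum(abstractor) / n, sum(algorithm) / n
    # Expected agreement if the two raters were statistically independent.
    expected = p_abs * p_alg + (1 - p_abs) * (1 - p_alg)
    if expected == 1.0:
        return 1.0  # both raters constant and identical; agreement is trivially perfect
    return (observed - expected) / (1 - expected)
```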

                4.3.4 Validity Testing Results

Measure Data Element | Measure Component | Kappa Estimate | Standard Error | 95% Confidence Limits
Denominator | Cancer Diagnosis That's Active | 1.0000 | 0.0000 | 1.0000 to 1.0000
Denominator | Office Visit | 1.0000 | 0.0000 | 1.0000 to 1.0000
Denominator | Chemotherapy Administration | 0.9509 | 0.0218 | 0.9081 to 0.9937
Denominator | Radiation Treatment Management | 0.9081 | 0.0914 | 0.7289 to 1.0000
Numerator | Plan of Care to Address Pain Documented | 1.0000 | 0.0000 | 1.0000 to 1.0000
                 

                4.3.5 Interpretation of Validity Results

                The calculated Kappa coefficient was 0.96 (with a 95% confidence interval of 0.91 to 1.00) for the denominator data element and 1.00 (with a 95% confidence interval of 1.00 to 1.00) for the numerator data element.

                 

                The Kappa coefficients were interpreted using the benchmarks for Cohen's Kappa established by Landis and Koch in 1977, which are widely recognized in the field of psychometrics:

                • 0.8 to 1.0 – almost perfect agreement;
                • 0.6 to 0.8 – substantial agreement;
                • 0.4 to 0.6 – moderate agreement;
                • 0.2 to 0.4 – fair agreement;
                • Zero to 0.2 – slight agreement; and
                • Zero or lower – poor agreement.
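Expressed programmatically, the benchmark mapping above is simply the following (an illustrative helper, not part of the measure specification):

```python
def landis_koch_category(kappa):
    """Map a Kappa estimate to the Landis and Koch (1977) agreement benchmark."""
    if kappa <= 0:
        return "poor agreement"
    if kappa <= 0.2:
        return "slight agreement"
    if kappa <= 0.4:
        return "fair agreement"
    if kappa <= 0.6:
        return "moderate agreement"
    if kappa <= 0.8:
        return "substantial agreement"
    return "almost perfect agreement"
```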

                 

                The evaluation benchmarks suggest that the measure accurately distinguishes between good and poor quality, with nearly perfect validity for both the measure's denominator and numerator.

                 

                Landis, J. R., & Koch, G. G. (1977). The measurement of observer agreement for categorical data. Biometrics, 159-174.

              • 4.4.1 Methods used to address risk factors
                4.4.1b If an outcome or resource use measure is not risk adjusted or stratified

                N/A

Risk adjustment approach: Off
Conceptual model for risk adjustment: Off
                • 5.1 Contributions Towards Advancing Health Equity

                  SEE MEASURE RATIONALE

                  • 6.1.4 Program Details
                    Merit-based Incentive Payment System (MIPS) reporting program, Center for Medicare and Medicaid Services (CMS)., https://qpp.cms.gov/mips/explore-measures, MIPS encourages improvement in clinical practice and supporting advances in technology that allow for easy exchange of information., MIPS eligible providers may earn performance-based payment adjustments for the services provided to Medicare patients in the USA., Level of measurement and setting: Clinician/Group Level; Registry Data Source; Outpatient Services/Ambulatory Care Setting Purpose: MIPS takes a compr
                    Enhancing Oncology Model, Center for Medicare and Medicaid Services (CMS). This measure is listed as EOM-4., https://www.cms.gov/priorities/innovation/innovation-models/enhancing-oncology-model., Under EOM, participating oncology practices will take on financial and performance accountability for episodes of care surrounding systemic chemothera, There are 44 practices and three payers participating, nationwide., Level of measurement and setting: Oncology practices; the measure source is EOM participant reported and measure is reported in aggregate across all p
                    Practice Insights by McKesson in Collaboration with The US Oncology Network – QCDR., https://www.mckesson.com/Specialty/Oncology-Clinical-Management-Technology/, Practice Insights is a performance analytics tool that helps analyze data generated throughout the patient journey., Represents over 10,000 oncology physicians, nurses, clinicians, and cancer care specialists nationwide., Level of measurement and setting: Oncology practices.  Purpose: Practice Insights by McKesson in Collaboration with The US Oncology Network – QCDR. Pr
                    ASCO Certified: Patient-Centered Cancer Care Standards, https://practice.asco.org/quality-improvement/quality-programs/asco-certified, The new program certifies oncology group practices and health systems that meet a single set of comprehensive, evidence-based oncology medical home st, ASCO Certified was informed by a pilot of 12 practice groups and health systems across 95 service sites and 500 oncologists. The cohort comprised a va, Oncology group practices and health systems. 
                  • 6.2.1 Actions of Measured Entities to Improve Performance

Providers are evaluated on whether a plan of care is documented at every patient visit among patients who reported pain under measure 0384/e. For the measured entity to improve performance on this measure, they should follow the NCCN practice guidelines cited above. These include, but are not limited to, performing pain reassessment at specified intervals to ensure that analgesic therapy is providing maximum benefit with minimal adverse effects and that the treatment plan is followed.

                     

                    ASCO has not received feedback that the measure negatively impacts the provider’s workflow.

                    6.2.2 Feedback on Measure Performance

                    ASCO’s measure development team allows for feedback and measure inquiries from implementers and reporters via email ([email protected]).  In addition, we receive questions and feedback from the CMS Helpdesk. ASCO has not received feedback on this measure through those avenues. 

                    6.2.3 Consideration of Measure Feedback

                    N/A

                    6.2.4 Progress on Improvement

                    While improvement was demonstrated for measures 0384/e, observed performance rates from the MIPS-Quality program for this measure, 0383, indicate a concerning decline in quality and suggest that there is ample opportunity for improvement in both individual clinician and practice performance. It is important to note that participants are allowed to self-select measures; as a result, performance rates may not be nationally representative.

                    6.2.5 Unexpected Findings

Per the NQF Cancer CDP Report from September 2020, the panel determined there is consensus of expert opinion that the benefits of what is being measured (documented plan of care to address pain) outweigh any potential harm. An unforeseen benefit is that practices are improving their electronic infrastructure to accurately capture this documentation. The panel overall agreed that the measure was feasible.

                    • Submitted by Amanda on Mon, 01/08/2024 - 14:34


                      Importance

                      Importance Rating
                      Importance

                      Strengths:

• The developer cites evidence regarding the incidence of over 1.9 million cancer cases in 2023 and the prevalence of pain among cancer patients during treatment. There is a logic model linking the process, in which providers document a plan of care for cancer patients undergoing chemotherapy or radiation, to optimized pain management therapies, which lead to improved function by way of symptom control and pain management, thereby improving the quality of life of the cancer patient.
                      • The developer also cites a systematic review which found that 55% of patients undergoing anticancer treatment reported pain. The impact of severe pain on mental health, employment and financial outcomes, relationships and overall quality of life is highlighted by the developer.
                      • The developer cites evidence regarding reports of insufficient pain management adherence rate to pain documentation (84%) and reassessment (43%) within an hour of medication administration, discrepancies in prescribed pain treatments and reported pain intensity among cancer patients.
                      • The developer cites clinical practice guideline recommendations for effective oncologic management of a cancer patient. This guideline recommends:
                        • Consistently reassessing pain to ensure effective treatment with minimal side effects.
                        • Tailoring analgesic therapy to consider diagnosis, safety, and patient goals.
                        • Maintaining regular follow-ups, monitoring medication use and adjusting pain plan as needed to meet specific goals.
                        • Using pain intensity rating scale to comprehensively assess pain and also identify the cause of the pain.
                        • Clearly communicating and agreeing upon a care plan to manage expectations and outcomes.
                      • The developer cites disparities in opioid access and dosage among different racial groups, noting that Black and Hispanic patients were less likely to receive opioids than White patients (Black, -4.3 percentage points, 95% CI; Hispanic, -3.6 percentage points, 95% CI) and received lower daily doses (Black, -10.5 MMED, 95% CI; Hispanic, -9.1 MMED, 95% CI).
                      • Additionally, the developer highlights the decline in performance rates from MIPS-Quality program data reflecting calendar years 2019-2021:
Individual Clinician Performance - mean performance rate was 0.86 in 2019, 0.79 in 2020, and 0.69 in 2021
Practice Performance - mean performance rate was 0.83 in 2019, 0.69 in 2020, and 0.70 in 2021

                       

                      Limitations:

• Although there was no direct patient input on the meaningfulness of the measure, the developer cites a 2022 study reporting that the study's patient and caregiver panel placed emphasis on the importance of routine pain screening, management, and follow-up.

                       

                      Rationale:

                      There is a business case supported by credible evidence depicting a link between health care processes to desired outcomes for cancer patients. Actions providers can take to reach the desired outcome are outlined. Additionally, a gap in care remains that warrants this measure.

                      Feasibility Acceptance

                      Feasibility Rating
                      Feasibility Acceptance

                      Strengths:

                      • Data elements required for the numerator and denominator can be found within structured fields and are recorded using commonly accepted coding standards. The developer notes that the measure's data capture can be seamlessly integrated into existing physician workflows and data collection tools without requiring any significant modifications.
                      • There are no fees to use this measure, however, the developer encourages all not-for-profit users to obtain a license to use the measure. Guidance on interactions with for-profit entities is provided.

                       

                      Limitations:

                      None

                       

                      Rationale:

                      The necessary data elements required for the numerator and denominator can be found within structured fields and are recorded using commonly accepted coding standards. There are no fees for not-for-profit hospitals, healthcare systems, or practices to use the measure. Guidance on interactions with for-profit entities is provided.

                      Scientific Acceptability

                      Scientific Acceptability Reliability Rating
                      Scientific Acceptability Reliability

                      Strengths:

                      • The measure is well-defined and precisely specified.
                      • Across all years analyzed and individual clinician and practice levels, the reliability scores ranged from 0.804 to 1.000 with an overall average of 0.987. Within year and accountable entity level, the average reliability ranged from 0.964 to 0.998 and the vast majority of facilities had reliability greater than 0.9.
                      • Across all years analyzed and individual clinician and practice levels, hundreds of accountable entities and tens of thousands of patient encounters were included in the reliability analysis.
                      • The data were retrieved from 2021-2023 performance reports and reflect calendar years 2019-2021.

                       

                      Limitations:

• The calculation algorithms for Criteria 1 and 2 are very generic and lack details specific to this particular measure.

                       

                      Rationale:

                      Measure score reliability testing (accountable entity level reliability) performed. All practice levels have a reliability which exceeds the accepted threshold of 0.6. Sample size for each year and accountable entity level analyzed is sufficient.

                      Scientific Acceptability Validity Rating
                      Scientific Acceptability Validity

                      Strengths:

                      • The developer tested the validity of the data elements (both numerator and denominator) using a random sample of 500 patient encounters across 10 test sites. The developer scored encounters on each data element using both a measure abstractor and an automated algorithm and then evaluated agreement between the two scoring methods using the Kappa statistic.
                      • Results: 
                        Kappa coefficient for the denominator data element was 0.96 (with a 95% confidence interval of 0.91 to 1.00), indicating almost 100% accuracy.
                        Kappa coefficient for the numerator data element was 1.00 (with a 95% confidence interval of 1.00 to 1.00), indicating 100% accuracy.
                      • There are no denominator or numerator exclusions for this measure.

                       

                      Limitations:

                      None

                       

                      Rationale:

                      • The developer tested the validity of the data elements (both numerator and denominator) using a random sample of 500 patient encounters across 10 test sites. The developer scored encounters on each data element using both a measure abstractor and an automated algorithm and then evaluated agreement between the two scoring methods using the Kappa statistic.
                      • Results: 
                        Kappa coefficient for the denominator data element was 0.96 (with a 95% confidence interval of 0.91 to 1.00), indicating almost 100% accuracy.
                        Kappa coefficient for the numerator data element was 1.00 (with a 95% confidence interval of 1.00 to 1.00), indicating 100% accuracy.
                      • There are no denominator or numerator exclusions for this measure.

                      Equity

                      Equity Rating
                      Equity

                      Strengths:

                      N/A

                       

                      Limitations:

                      Developer used this section to refer to measure rationale; no information is provided in rationale (or elsewhere) demonstrating this submission addresses equity as intended.

                       

                      Rationale:

                      Developer did not address this optional criterion.

                      Use and Usability

                      Use and Usability Rating
                      Use and Usability

                      Strengths:

                      • Measure currently in use in MIPS (eligible entities can receive performance-based incentives) and the Enhancing Oncology Model (EOM-4; practices take on financial and performance accountability for episodes of care).
                      • Other tools for QI are Practice Insights by McKesson, a performance analytics tool used by subscribing providers, and the Patient-Centered Cancer Care Standards ASCO Certification.
                      • Developers assert that providers wishing to improve on the measure should follow the NCCN practice guidelines.
• Developers note a decline in performance rates, which they interpret as a clear ongoing need for the measure; no information about performance improvement in EOM-4 is provided (it appears not to be available yet).
                      • Providers can send feedback via the CMS Helpdesk or via email to ASCO. The developer reports no feedback has been received.
• No unexpected findings are reported, and developers cited the NQF Cancer CDP Report (Sept 2020), which indicated that the panel overall agreed that the benefits outweigh potential harms.

                       

                      Limitations:

• The developer's explanation for the lack of improvement in the measure is that participants in MIPS can self-select measures, so improvement rates may not be nationally representative.

                       

                      Rationale:

                      • The measure is in use in two federal programs, and tools for QI include participation in a McKesson analytics platform (Practice Insights) and an ASCO-sponsored certification program. No significant feedback or unexpected findings are reported.
                      • Developer notes that performance rates have declined (time frame not provided) but does not provide sufficient explanation for the decline.

                      Summary

                      N/A

                    • Submitted by Andrew on Wed, 01/10/2024 - 11:21


                      Importance

                      Importance Rating
                      Importance

                      Pain is a difficult qualifier in medicine - it's subjective. 

                       

                      And clouded by the "5th vital sign" fiasco of recent decades. 

                       

                      Cancer pain is recognized as real and these patients likely suffer from the flux away from opiates in the setting of the current recognized crisis.

                       

                       

                      Copy from staff notes I also found helpful:

• The developer cites evidence regarding the incidence of over 1.9 million cancer cases in 2023 and the prevalence of pain among cancer patients during treatment. There is a logic model linking the process, in which providers document a plan of care for cancer patients undergoing chemotherapy or radiation, to optimized pain management therapies, which lead to improved function by way of symptom control and pain management, thereby improving the quality of life of the cancer patient.
                      • Additionally, the developer highlights the decline in performance rates from MIPS-Quality program data reflecting calendar years 2019-2021:
Individual Clinician Performance - mean performance rate was 0.86 in 2019, 0.79 in 2020, and 0.69 in 2021
Practice Performance - mean performance rate was 0.83 in 2019, 0.69 in 2020, and 0.70 in 2021

                      Feasibility Acceptance

                      Feasibility Rating
                      Feasibility Acceptance

                      Access to the data is available and planned. This is a quantifiable study with data at hand. 

                       

                      Copy from staff notes I also found helpful:

                      The necessary data elements required for the numerator and denominator can be found within structured fields and are recorded using commonly accepted coding standards. There are no fees for not-for-profit hospitals, healthcare systems, or practices to use the measure. Guidance on interactions with for-profit entities is provided.

                      Scientific Acceptability

                      Scientific Acceptability Reliability Rating
                      Scientific Acceptability Reliability

                      Well-defined.

                       

                      Copy from staff notes I also found helpful:

Note a possible typo in the above assessment (reliability which exceeds the accepted threshold of 0.6).

                       

                      • The measure is well-defined and precisely specified.
                      • Across all years analyzed and individual clinician and practice levels, the reliability scores ranged from 0.804 to 1.000 with an overall average of 0.987. Within year and accountable entity level, the average reliability ranged from 0.964 to 0.998 and the vast majority of facilities had reliability greater than 0.9.
                      • Across all years analyzed and individual clinician and practice levels, hundreds of accountable entities and tens of thousands of patient encounters were included in the reliability analysis.
                      • The data were retrieved from 2021-2023 performance reports and reflect calendar years 2019-2021.
                      Scientific Acceptability Validity Rating
                      Scientific Acceptability Validity

Adequate power, with 95% confidence intervals reported.

                       

                      • The developer tested the validity of the data elements (both numerator and denominator) using a random sample of 500 patient encounters across 10 test sites. The developer scored encounters on each data element using both a measure abstractor and an automated algorithm and then evaluated the agreement between the two scoring methods using the Kappa statistic.
                      • Results: 
                        Kappa coefficient for the denominator data element was 0.96 (with a 95% confidence interval of 0.91 to 1.00), indicating almost 100% accuracy.
                        Kappa coefficient for the numerator data element was 1.00 (with a 95% confidence interval of 1.00 to 1.00), indicating 100% accuracy.
                      • There are no denominator or numerator exclusions for this measure.

                      Equity

                      Equity Rating
                      Equity

Is it not the case that the measure will spotlight these disparities and facilitate addressing them?

                       

                      Copy from staff notes I also found helpful:

                      The developer cites disparities in opioid access and dosage among different racial groups, noting that Black and Hispanic patients were less likely to receive opioids than White patients (Black, -4.3 percentage points, 95% CI; Hispanic, -3.6 percentage points, 95% CI) and received lower daily doses (Black, -10.5 MMED, 95% CI; Hispanic, -9.1 MMED, 95% CI).

                      Use and Usability

                      Use and Usability Rating
                      Use and Usability

The use is evident; the usability is in question, as feedback and performance improvement data are not provided - not used, or not available yet? A simple edit explaining the performance rate decline and how this data will address that issue would be of value here.

                       

                      Copy from staff notes I also found helpful:

                      • The measure is in use in two federal programs, and tools for QI include participation in a McKesson analytics platform (Practice Insights) and an ASCO-sponsored certification program. No significant feedback or unexpected findings are reported.
                      • Developers note a decline in performance rates, which they interpret as a clear ongoing need for the measure; no information about performance improvement in OEM-4 is provided (looks like not available yet).
                      • Developer notes that performance rates have declined (time frame not provided) but does not provide sufficient explanation for the decline.
                      • Additionally, the developer highlights the decline in performance rates from MIPS-Quality program data reflecting calendar years 2019-2021:
Individual clinician performance: mean performance rate of 0.86 in 2019, 0.79 in 2020, and 0.69 in 2021.
  Practice performance: mean performance rate of 0.83 in 2019, 0.69 in 2020, and 0.70 in 2021.

                      Summary

This measure will benefit from this review process, which should help elucidate the needed factors of use, equity, and benefit.

                      Submitted by Yaakov Liss on Mon, 01/15/2024 - 16:56


                      Importance

                      Importance Rating
                      Importance

                      While having a treatment plan in place for cancer-related pain is an admirable goal, the current definition within the measure is quite broad: "use of opioids, nonopioid analgesics, psychological support, patient and/or family education, referral to a pain clinic, or reassessment of pain at an appropriate time interval."

                       

                      As such, it is not clear to me that having any plan in place to treat the patient's pain, even if it is medically the wrong plan, is any better than not having a plan.  

                      Feasibility Acceptance

                      Feasibility Rating
                      Feasibility Acceptance

How will it be determined that a documented care plan is in place? How good is the technology at determining that such a care plan exists?

                      Scientific Acceptability

                      Scientific Acceptability Reliability Rating
                      Scientific Acceptability Reliability

                      See above.  I don't understand how it is being determined that the numerator is being met.

                      Scientific Acceptability Validity Rating
                      Scientific Acceptability Validity

                      See above.

                      Equity

                      Equity Rating
                      Equity

                      N/A

                      Use and Usability

                      Use and Usability Rating
                      Use and Usability

                      See above.  Presumably this measure already exists and is performing okay but I would like to understand it better.

                      Summary

                      See above

                      Submitted by Stephen Weed on Wed, 01/17/2024 - 20:31


                      Importance

                      Importance Rating
                      Importance

Some of the comments have mentioned the opiate epidemic, suggesting that there might be a reduction in valid pain medications provided to cancer patients. The NCCN Panel recommendations address history of opioid abuse, psychological issues, and related factors; in doing so, they do not ignore the clinical possibilities and realities. This measure has no effect on the administration of the plan, and it is not appropriate to criticize it for what it cannot do.
                       

My bigger concern is that the measure only requires that a plan is in place at the time the data were collected. I understand that PQM lacks the ability to verify that a treatment plan is appropriate; doing so would be seen in a poor light by the participating providers. However, I would like to see this measure find a way to verify that treatment plans are implemented. This could be accomplished by finding a way to indicate whether a treatment plan was reviewed. While this might be difficult to tie to individual patients, it might be possible to tabulate and track the total number of pain plan reviews.


There has been an increase in reporting entities over the three years for which the program has results, which is encouraging. The scores fluctuated somewhat, so I am giving the most recent results the most focus.

In the 2021 report, 51 of the 138 entities had a Pearson score below 0.64. The patient encounters for these lowest-scoring providers comprise 53,103 patients, and the providers who had a rating of 0 represented 14,461 patients. So, on the face of it, a great many patients included in the measure's results for 2021 appeared not to have a pain management plan. This is very disconcerting, period.
                       

                      Feasibility Acceptance

                      Feasibility Rating
                      Feasibility Acceptance

The measure as implemented shows an increase in the data collected. I am not knowledgeable enough about the systems in place to comment on them.
                      My comments on importance make my recommendation Not Met but Addressable for feasibility too. The measure lacks enough rigor to evaluate whether the pain plans represent an important step in treatment. I would appreciate discussion of whether and how data can be collected to make this measure more meaningful. 

                      Scientific Acceptability

                      Scientific Acceptability Reliability Rating
                      Scientific Acceptability Reliability

                      I have no concerns about the reliability. 

                      Scientific Acceptability Validity Rating
                      Scientific Acceptability Validity

                      I have no concerns about the validity. 

                      Equity

                      Equity Rating
                      Equity

                      Not explored in this measure.

                      Use and Usability

                      Use and Usability Rating
                      Use and Usability

ASCO has provided a summary of the resources available to providers for pain management, including some designed for particular types of cancer. Since the measure does not ask for plan specifics or a review of plans, it is difficult to know how a facility would benefit from the measure's report. On a patient level, the ability of a patient to engage in understanding and advocating for care will vary greatly. Even if patients knew that, in aggregate, a provider does not create formal pain management plans, they would likely still be proactive in asking for pain management. So I doubt that this measure does much for the patient either.

                       

                      Summary

                      My general impression is that the intent of the measure is good but it adds little to CMS' oversight on this subject. It also does little to affect care.

                      Submitted by Erin Crum on Fri, 01/19/2024 - 14:02


                      Importance

                      Importance Rating
                      Importance

Monitoring pain intensity is undoubtedly valuable to patient care. However, simply asking a patient about their pain intensity without requiring the clinician to develop a plan to address elevated pain is inadequate. This is evidenced by data provided by the measure steward, as well as Oncology Care Model data showing that although practices tend to perform well on measures associated with collecting a pain score, avoidable pain continues to be one of the most prevalent reasons for hospital ED visits. Performance benchmarks indicate high performance for practices and individual clinicians asking about pain levels, but we know that pain is a persistent, unmanaged issue for a large percentage of patients with cancer. Given the current state of inadequate pain management and high performance on existing measures, perhaps a more relevant quality measure would be: 1) a combined quality measure to assess both pain intensity and plan of care for pain, or 2) a patient-reported outcome measure indicating pain improvement within a certain follow-up period.

                      Feasibility Acceptance

                      Feasibility Rating
                      Feasibility Acceptance

Agree that all measure data elements can be documented in discrete fields within most EHRs. Furthermore, both the eCQM (0384e) and MIPS CQM (registry, 0384) versions of this measure have been fully implemented for the Oncology Care Model and Enhancing Oncology Care Model, indicating that EHRs have been able to accommodate the registry version of the measure specification in addition to the eCQM. This sets a precedent that the pain intensity and pain care plan measures could be combined to create a single, more comprehensive measure.

                      Scientific Acceptability

                      Scientific Acceptability Reliability Rating
                      Scientific Acceptability Reliability

                      Data submitted supports the measure’s feasibility, validity and reliability.  Sample size is statistically valid and data element-level testing is robust.

                      Scientific Acceptability Validity Rating
                      Scientific Acceptability Validity

                      Data submitted supports the measure’s feasibility, validity and reliability.  Sample size is statistically valid and data element-level testing is robust.

                      Equity

                      Equity Rating
                      Equity

                      Not addressed at this time, but not required.

                      Use and Usability

                      Use and Usability Rating
                      Use and Usability

                      Users would benefit from clarification around the definition of pain.  For example, it is unclear if all pain should be documented or specifically cancer-related pain.  Working with oncologists, there has been much discussion around whether they should be addressing unrelated pain, such as chronic back pain, a recent broken bone, or nasal sinus pain from a cold.  Clearer guidance within the measure specification would resolve this.

                      In addition to this, in the measure’s current state, there is not clear guidance on situations where a patient is under the care of a medical oncologist and radiation oncologist simultaneously.  For example, if a patient sees both the medical oncologist and radiation oncologist on the same visit day, should both physicians document a pain scale and subsequent plan of care for pain?  Should the patient be in the denominator twice for that same day?  Moreover, how does this impact the patient’s experience of care?

                      Summary

The data and additional content provided by the measure steward support re-endorsement. However, this measure has been available for decades, stemming back to the Meaningful Use and Physician Quality Reporting System days. Asking patients about pain has become standard of care, but effectively managing pain occurs less frequently. Current 2023 CMS benchmarks for both versions of this measure specification are high and considered “topped out”: eCQM (93%) and MIPS CQM (85%). What is more relevant to measure is comprehensive pain assessment and management. Therefore, the ideal version of this measure would combine MIPS 143 (#0384/0384e) and MIPS 144 (0383), assessing the percentage of cancer patients on treatment who have had their pain assessed and, if pain is present, whether they have a plan of care in place with the care team. This is essentially what CMMI has done for the Enhancing Oncology Model. It would raise the bar, creating greater opportunity for performance improvement, and ensure that action is taken when there is a positive pain score. In addition, it would ensure that the same patient population is being addressed across these two activities (pain score and plan). In its current state, there is no full view of all patients eligible to be screened, with the numerator = (no pain + pain with plan).

Combining the pain intensity and pain care plan into one measure is feasible. Both the eCQM and MIPS CQM (registry) versions of this measure have been fully implemented for the Oncology Care Model and Enhancing Oncology Care Model, indicating that EHRs have been able to accommodate the registry version of the measure specification in addition to the eCQM. Furthermore, there is precedent in other similar measures: Depression Screening and Plan of Care, Tobacco Screening and Plan, BMI Screening and Plan, and Alcohol Use Screening and Plan, to name a few. These measures all require screening the full eligible patient population and, if the screen is positive, documenting a plan. A brief illustrative sketch of how such a combined rate would be calculated follows.
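As a quick illustration of the combined rate this comment proposes (numerator = visits with no pain plus visits with pain and a documented plan, over all eligible treatment visits), the following Python sketch uses hypothetical field names and visit data; it is not drawn from any measure specification.

visits = [
    {"pain_assessed": True,  "pain_present": False, "plan_documented": False},
    {"pain_assessed": True,  "pain_present": True,  "plan_documented": True},
    {"pain_assessed": True,  "pain_present": True,  "plan_documented": False},
    {"pain_assessed": False, "pain_present": None,  "plan_documented": False},
]

eligible = len(visits)  # every treatment visit is in the denominator
met = sum(
    v["pain_assessed"] and (not v["pain_present"] or v["plan_documented"])
    for v in visits
)
print(f"Combined performance rate: {met}/{eligible} = {met / eligible:.2f}")  # 2/4 = 0.50

Under such a combined construction, a visit with no pain assessment or with unaddressed pain counts against performance, which is the "raise the bar" effect the commenter describes.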

                       

                       

                      Submitted by Raina Josberger on Fri, 01/19/2024 - 15:12


                      Importance

                      Importance Rating
                      Importance

                      Given incidence of cancer, this is an important measure. 

                      Feasibility Acceptance

                      Feasibility Rating
                      Feasibility Acceptance

                      yes, use standard data elements

                      Scientific Acceptability

                      Scientific Acceptability Reliability Rating
                      Scientific Acceptability Reliability

                      reliable measure

                      Scientific Acceptability Validity Rating
                      Scientific Acceptability Validity

                      passed validity testing

                      Equity

                      Equity Rating
                      Equity

                      not addressed

                      Use and Usability

                      Use and Usability Rating
                      Use and Usability

                      in two programs, but declining rates

                      Summary

                      N/A

                      Submitted by Karie Fugate on Sat, 01/20/2024 - 14:43


                      Importance

                      Importance Rating
                      Importance

                      As a patient/caregiver this quality measure is very important as it discusses encounters with cancer patients receiving chemotherapy or radiation who report having pain with a documented plan of care to address this pain. I realize this measure only addresses a plan of care being developed and monitored, not that the plan adequately addresses the pain the patient is having. 

                       

                      From the developer:

                      “However, many oncology patients report insufficient pain control. (9) A retrospective chart review analysis found an 84 percent adherence to the documentation of pain intensity and 43 percent adherence to pain re-assessment within an hour of medication administration. (10) An observational study found that over half of its cancer patients had a negative pain management index score, indicating that the prescribed pain treatments were not commensurate with the pain intensity reported by the patient.”

                      Feasibility Acceptance

                      Feasibility Rating
                      Feasibility Acceptance

                      The developer notes that “the measure's data capture can be seamlessly integrated into existing physician workflows and data collection tools without requiring any significant modifications." 

                       

                      "Numerous healthcare practices have already set up their workflows to gather this information, highlighting its easy adoption. This is evident from the considerable number of practices that report this measure to the Centers for Medicare and Medicaid Services (CMS) via the Merit-based Incentive Payment System (MIPS) program.”

                      Scientific Acceptability

                      Scientific Acceptability Reliability Rating
                      Scientific Acceptability Reliability

                      Overall, 100% of clinicians and practices had measure scores with reliabilities of 0.70 or higher, a commonly accepted reliability threshold (Adams 2010). The reliability values were consistently close to the ideal, indicating that the clinician performance rates were highly reliable, and any measurement error was minimal.
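For readers less familiar with the signal-to-noise reliability summarized above, the following is a minimal, hypothetical Python sketch of how entity-level reliability can be approximated for a proportion measure, in the spirit of Adams (2010); the entity counts, variance estimates, and simplifications are illustrative assumptions, not the developer's actual testing method.

import statistics

def snr_reliability(numerators, denominators):
    """Per-entity reliability = between-entity variance / (between-entity + within-entity variance)."""
    rates = [x / n for x, n in zip(numerators, denominators)]
    var_between = statistics.pvariance(rates)  # signal: spread of performance across entities
    results = []
    for p, n in zip(rates, denominators):
        var_within = p * (1 - p) / n           # noise: binomial sampling error for this entity
        total = var_between + var_within
        results.append(var_between / total if total else 1.0)
    return results

# Hypothetical entities: (visits with a documented pain plan, eligible pain visits)
numerators = [86, 72, 95, 40]
denominators = [100, 90, 110, 60]
print([round(r, 2) for r in snr_reliability(numerators, denominators)])

On this toy data, reliability is higher for entities with more encounters, which mirrors the pattern reviewers note elsewhere in these comments (smaller providers tend to show lower reliability).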

                      Scientific Acceptability Validity Rating
                      Scientific Acceptability Validity

                      The evaluation benchmarks suggest that the measure accurately distinguishes between good and poor quality, with nearly perfect validity for both the measure's denominator and numerator.

                      Equity

                      Equity Rating
                      Equity

The developer notes the following but did not address how this quality-of-care gap will be addressed by this measure. As a patient/caregiver, I would also ask whether pain management rules (implemented due to the opioid crisis) impacted the rates at which long-acting opioids were prescribed, and I would ask for more data on the actual numbers (by race, age, and gender) of the patients with cancer discussed in this measure. I was unable to access reference (11), as it is behind a paid subscription.

                       

                      “Disparities exist as well, for example, a recent study evaluated opioid prescription fills and potency among cancer patients near end of life between 2007-2019. The study found that while all patients had a steady decline in opioid access, Black and Hispanic patients were less likely to receive opioids than White patients (Black, -4.3 percentage points, 95% CI; Hispanic, -3.6 percentage points, 95% CI) and received lower daily doses (Black, -10.5 MMED, 95% CI; Hispanic, -9.1 MMED, 95% CI).”

                       

                      Also noted (from the developer) is “CMS did not capture nor provide any patient-level socio-demographic variables and therefore no patient demographic data is available. McKesson's Practice Insights QCDR masked patients' demographic data to protect privacy during medical chart audits and did not provide patient demographics.”

                      Use and Usability

                      Use and Usability Rating
                      Use and Usability

Per the developer, there is a “considerable number of practices that report this measure to the Centers for Medicare and Medicaid Services (CMS) via the Merit-based Incentive Payment System (MIPS) program.”

                       

                      “The MIPS-Quality program data were retrieved from 2021-2023 performance reports and reflect calendar years 2019-2021. The observed performance rates indicate a concerning decline in quality and suggest that there is ample opportunity for improvement in both individual clinician and practice performance.”

                       

                      More data should be provided on the decline in quality and how improvements will be implemented.

                      Summary

                      N/A

                      Submitted by Gerri Lamb on Sat, 01/20/2024 - 17:17


                      Importance

                      Importance Rating
                      Importance

                      The same information is used to support the importance of this measure as  Measure 384 (companion measure). Citations are provided to support the large incidence of cancer, the importance and impact of pain control, and the lack of congruence between reported pain level and treatment.  This measure, like 384, is incorporated in practice guidelines. Clearly, this measure has face validity.  However, there are no citations supporting a connection between having a plan of care and adequate pain control. Since this measure has been in use for a number of years and according to the measure developers, has been selected by a large number of practices for MIPS and other quality programs, it seems appropriate to provide literature and data that support the association between having a plan of care and the outcomes in the logic model. 

                      Feasibility Acceptance

                      Feasibility Rating
                      Feasibility Acceptance

                      The measure developers note that "feedback from EHRs, cancer registries, and oncology practices provide compelling evidence that the measure is easy to implement."  Examples from the sources of feedback would be appropriate to include. 

                      Scientific Acceptability

                      Scientific Acceptability Reliability Rating
                      Scientific Acceptability Reliability

                      The measure developers state that all data elements - for both numerator and denominator - exist in structured fields. Sources of data for the numerator are not specified here - presumably data about opioid and non-opioid prescriptions would be trackable. How are other acceptable indicators of a plan of care like psych support, education and reassessment measured? 

                       

                      Reliability of this measure is evaluated using the same data sets reported for Measure 384. Signal to noise ratios are all within adequate limits. 

                      Scientific Acceptability Validity Rating
                      Scientific Acceptability Validity

The method used for validity assessment compares record abstraction with automated algorithms using the kappa statistic. Given the length of time this measure has been in use and the number of practices choosing to report it, are other measures of concurrent and construct validity available?

                      Equity

                      Equity Rating
                      Equity

                      General information about disparities is provided in the importance section. No specific data related to equity and disparities are provided. 

                      Use and Usability

                      Use and Usability Rating
                      Use and Usability

                      The measure is in use in MIPS and other quality programs. There is some indication that the measure has topped out and may not be as helpful to guide practice changes. 

                      Summary

The same information was used to support the importance and feasibility of this measure as for Measure 384 (the companion measure). The main question is about the availability of data to go beyond the general statements here and demonstrate a consistent association between completion of the measure and the outcomes noted in the logic model. Similarly, it would be important to have data supporting the construct validity of the measure, i.e., that it results in improved pain management.

                      Submitted by Emily Martin on Sun, 01/21/2024 - 19:46


                      Importance

                      Importance Rating
                      Importance

                      The authors outline the importance of standardizing documentation of the pain management plan for patients undergoing chemotherapy and/or radiation therapy who report pain. 

                      Feasibility Acceptance

                      Feasibility Rating
                      Feasibility Acceptance

                      Assessment of pain and documentation of management plan can be readily incorporated into workflow and can be measured through structured data elements. 

                      Scientific Acceptability

                      Scientific Acceptability Reliability Rating
                      Scientific Acceptability Reliability

                      The measure has high reliability scores. 

                      Scientific Acceptability Validity Rating
                      Scientific Acceptability Validity

                      The measure has outstanding validity scores. 

                      Equity

                      Equity Rating
                      Equity

                      The authors did not make a point of addressing equity but this is optional. 

                      Use and Usability

                      Use and Usability Rating
                      Use and Usability

                      The authors note that this measure is already in use for EOM and MIPS programs. There has not been improvement to date but it is unclear why. 

                      Summary

                      This measure meets criteria in most domains. In those in which it doesn't, the changes can be readily addressed. 

                      Submitted by Sarah Thirlwell on Sun, 01/21/2024 - 22:26


                      Importance

                      Importance Rating
                      Importance

Measure developers address the importance of this measure for the sub-groups of oncology patients who receive radiation therapy and those who receive a chemotherapy administration procedure. Since the endorsement of this measure in 2017, an increasing number of oncology patients receive other forms of treatment that are not addressed by the developers. Cancer patients receiving other treatment modalities also experience pain, and the developers could consider expanding this measure to include these other sub-groups of oncology patients as additional populations in the reported rate of this measure.

                      Feasibility Acceptance

                      Feasibility Rating
                      Feasibility Acceptance

                      Measure developers have not updated measure specifications for the numerator to reflect the stated NCCN clinical practice guideline recommendations for an appropriate plan of care for oncology patients experiencing pain.  This could lead to differences in coding for the numerator that do not reflect differences in quality of care.

                      Scientific Acceptability

                      Scientific Acceptability Reliability Rating
                      Scientific Acceptability Reliability

                      Measure developers do not address how numerator elements coded to reflect a documented plan of care for pain were tested for reliability.

                      Scientific Acceptability Validity Rating
                      Scientific Acceptability Validity

                      Measure developers do not address how numerator elements coded to reflect a documented plan of care for pain were tested for validity.

                      Equity

                      Equity Rating
                      Equity

Measure developers indicated that differences could exist and that care settings are encouraged to track additional data that could reflect differences in health equity, but these data have not been included in the measure specifications, and analyses based on those data were not reported.

                       

                      Use and Usability

                      Use and Usability Rating
                      Use and Usability

                      Measure developers indicate the use and usability of this measure nationally and internationally as an indicator of quality of oncology care.  No evidence is reported regarding improvements over time of a documented plan of care for oncology patients experiencing pain.

                      Summary

                      Given availability and access to repeat pain scores according to results for Measures 0383/e, it would be of value for measure developers to develop a new outcome measure to quantify a patient's pain score in the visit after completion of the documented plan of care.

                      Submitted by Nicole Keane on Mon, 01/22/2024 - 14:38


                      Importance

                      Importance Rating
                      Importance

Instead of two process measures (0383 and 0384), combine them into one measure: pain assessed and a plan of care in place. This would be less burdensome for sites.

                      Feasibility Acceptance

                      Feasibility Rating
                      Feasibility Acceptance

                      The necessary data elements required can be found within structured fields and are recorded using ICD-10.

                      Scientific Acceptability

                      Scientific Acceptability Reliability Rating
                      Scientific Acceptability Reliability

Current performance data were used. The sample size for each year and accountable entity level analyzed is sufficient.

                      Scientific Acceptability Validity Rating
                      Scientific Acceptability Validity

Kappa coefficient threshold met for data element validity.

                      Equity

                      Equity Rating
                      Equity

                      No information provided.

                      Use and Usability

                      Use and Usability Rating
                      Use and Usability

The measure is currently in use in the CMS Merit-based Incentive Payment System (MIPS). Providers can send feedback via the CMS Helpdesk or via email to ASCO.

                      Summary

Instead of two process measures (0383 and 0384), combine them into one measure: pain assessed and a plan of care in place. This would be less burdensome for sites.

                      Submitted by Brigette DeMarzo on Mon, 01/22/2024 - 15:03


                      Importance

                      Importance Rating
                      Importance

                      Agree with PQM staff comments

                      Feasibility Acceptance

                      Feasibility Rating
                      Feasibility Acceptance

                      Agree with PQM staff comments

                      Scientific Acceptability

                      Scientific Acceptability Reliability Rating
                      Scientific Acceptability Reliability

                      Agree with PQM staff comments

                      Scientific Acceptability Validity Rating
                      Scientific Acceptability Validity

                      Agree with PQM staff comments

                      Equity

                      Equity Rating
                      Equity

                      Agree with PQM staff comments; could not find additional referenced information

                      Use and Usability

                      Use and Usability Rating
                      Use and Usability

                      Agree with PQM staff comments

                      Summary

                      Agree with PQM staff comments

                      Submitted by Dima Raskolnikov on Mon, 01/22/2024 - 17:41


                      Importance

                      Importance Rating
                      Importance

                      What is the data in support of the position that having a documented plan in place leads to better patient outcomes? 

                      Feasibility Acceptance

                      Feasibility Rating
                      Feasibility Acceptance

                      I find the statement "Feedback from EHRs, cancer registries, and oncology practices provides compelling evidence that this measure is easy to implement and presents minimal feasibility challenges" difficult to believe. 

                       

                      Clinical plans are very difficult to identify from patient records, and the only place to meaningfully do so would be in the narrative note which is not a structured field by definition. The fact that this data is used by other federal programs is not reassuring to me, unless it can be demonstrated that patient outcomes are improved by collecting this data. 

                      Scientific Acceptability

                      Scientific Acceptability Reliability Rating
                      Scientific Acceptability Reliability

                      not acceptable if there is low confidence in the numerator (i.e., # of plans in place)

                      Scientific Acceptability Validity Rating
                      Scientific Acceptability Validity

                      see above

                      Equity

                      Equity Rating
                      Equity

                      not described

                      Use and Usability

                      Use and Usability Rating
                      Use and Usability

                      Would want more detail re: this staff point: 

                      "Developer notes that performance rates have declined (time frame not provided) but does not provide sufficient explanation for the decline."

                      Summary

                      n/a

                      Submitted by Morris Hamilton on Mon, 01/22/2024 - 20:59


                      Importance

                      Importance Rating
                      Importance

                      The information provided adequately summarizes the reason this measure is important and identifies that gaps in the target population currently exist. Because the cohorts change considerably over the years 2019, 2020, and 2021, comparisons of means and distributions across years are not possible.

                       

                      I also agree with a comment from another member that the definition of a plan is too broad. Currently a documented plan is defined as "A documented plan of care may include: use of opioids, nonopioid analgesics, psychological support, patient and/or family education, referral to a pain clinic, or reassessment of pain at an appropriate time interval." Is there any relevant medical evidence to narrow this definition further?

                      Feasibility Acceptance

                      Feasibility Rating
                      Feasibility Acceptance

All necessary information has been provided, and usage of the measure has grown.

                      Scientific Acceptability

                      Scientific Acceptability Reliability Rating
                      Scientific Acceptability Reliability

At 0.8 and above, estimated entity-level reliability exceeds standards. Unsurprisingly, reliability appears to weaken for providers with fewer patients. Encounter-level reliability is provided in the validity section.

                       

                      The number of patient encounters for data set #6 was omitted.

                      Scientific Acceptability Validity Rating
                      Scientific Acceptability Validity

                      With very high kappa values, encounter-level validity is satisfied. The elements of the measure appear accurately measured.

                       

                      Entity-level validity is not provided. As a maintenance measure that has been in existence for several years, the submission should also include measures of concurrent validity. How correlated is this measure to other measures related to patient quality for pain or cancer? Are the correlations reasonable?

                      Equity

                      Equity Rating
                      Equity

                      The authors indicate that demographic data are not available at the patient-level; however, they do not acknowledge that geographic data of the providers may be available. A comparison of measure performance by Area Deprivation Index may be feasible and may elucidate some information about the relationship between measure performance and equity. Though this domain is optional, I encourage the developers to investigate further.

                      Use and Usability

                      Use and Usability Rating
                      Use and Usability

While the developers provide good evidence to suggest that providers can improve, they provide no evidence of improvement, which suggests that the measure may be of limited usability.

                       

                      Without a stable cohort to compare across years, the claim that there is a decline in performance is spurious. If the authors can limit their presentation of performance to a stable cohort and a decline still exists, then the authors should explain why a decline would occur. The period was 2019-2021. Did the PHE play a role?

                      Summary

                      Overall, the measure is well defined, important, feasible, and reliable. It is currently in use in several federal programs. The developers should provide additional analyses to improve their submission. At this time, entity-level validity and usability cannot be adequately evaluated. The developers may also consider using geographic data for providers to investigate equity relationships further.

                      Submitted by Heather Thompson on Mon, 01/22/2024 - 22:11


                      Importance

                      Importance Rating
                      Importance

The literature review, including patient-identified areas of concern related to cancer treatments, supports this measure's importance.

                      Feasibility Acceptance

                      Feasibility Rating
                      Feasibility Acceptance

Data points are easily built into the electronic medical record and existing workflow processes.

                      Scientific Acceptability

                      Scientific Acceptability Reliability Rating
                      Scientific Acceptability Reliability

                      High reliability.

                      Scientific Acceptability Validity Rating
                      Scientific Acceptability Validity

                      High validity.

                      Equity

                      Equity Rating
                      Equity

                      Opportunities exist for cross referencing additional patient demographics and characteristics in the electronic medical record with pain management care planning outcomes to identify areas of equity opportunity.

                      Use and Usability

                      Use and Usability Rating
                      Use and Usability

                      Data/outcomes are actionable and address effective pain assessment as well as effective pain management care planning.

                      Summary

The measure is valuable in identifying opportunities for improvement in key processes necessary for effective pain management: quantifying pain and developing effective care plans.

                      Submitted by Carol Siebert on Mon, 01/22/2024 - 23:33


                      Importance

                      Importance Rating
                      Importance

                      Concur with this staff analysis: 

There is a business case supported by credible evidence depicting a link between health care processes and desired outcomes for cancer patients. Actions providers can take to reach the desired outcome are outlined. Additionally, a gap in care remains that warrants this measure. Would like to have seen something about patient input on the meaningfulness of the measure. The cited 2022 study affirms the intent of the measure but is not the same as patient feedback on the measure.

                      Feasibility Acceptance

                      Feasibility Rating
                      Feasibility Acceptance

Staff analysis indicates there are no fees. However, it seems the developer responded to the question "Indicate whether your measure or any of its components are proprietary, with or without fees" with "Proprietary measure or components with fees." Licensing fees are not charged to non-profit entities.

                      Measure uses data that is easily identifiable/retrievable (standard fields, codes).

                      Scientific Acceptability

                      Scientific Acceptability Reliability Rating
                      Scientific Acceptability Reliability

Well defined and precisely specified, with recent reliability testing showing strong reliability. Agree with the staff assessment that the Calculation Algorithms for Criteria 1 and 2 are very generic and lack details specific to this particular measure; the measure flow chart was more useful.

One aspect that concerns me is that, in most EHRs, a "plan of care" can be carried over from visit to visit almost automatically. If the numerator statement "plan of care to address pain documented" is all that is needed, this can be accomplished in an EHR with two clicks. There is nothing to indicate the plan of care has been discussed with the patient, modified, etc. I realize that using the presence of the plan of care simplifies the measure, but it also makes the measure less meaningful.

                      Scientific Acceptability Validity Rating
                      Scientific Acceptability Validity

                      Strong recent validity testing.

                      Equity

                      Equity Rating
                      Equity

Developer did not really address equity. The lack of demographics in the MIPS data, the masking of demographics in the McKesson data, and the absence of further effort by the developer mean that inequities are not recognized or addressed. This is a concern given the evidence that there are inequities in pain management, including management of cancer pain:

                      J Gen Intern Med. 2019 Mar; 34(3): 435–442. doi: 10.1007/s11606-018-4785-z

                      Proc Natl Acad Sci U S A. 2016 Apr 19; 113(16): 4296–4301. doi: 10.1073/pnas.1516047113

                      Anesthesiol Clin 2023 Jun;41(2):471-488. doi: 10.1016/j.anclin.2023.03.008

                       

                      Use and Usability

                      Use and Usability Rating
                      Use and Usability

                      Used in multiple programs.

                      Developer did not adequately address recent declines in performance.

                      Summary

                      This measure is well defined and has exceptional reliability and validity. My main concerns:

                      Developer did not address recent decline in performance.

Lack of demographic data and of any attempt to address equity issues, in light of broader, documented disparities in assessing and addressing pain.

                      The simplicity of the measure may be at the expense of it being meaningful. The presence of a plan for addressing pain does not mean the plan is implemented or even tailored to the needs/priorities of the patient (person-centered). The same standardized fields that make the measure simple and easy to abstract also make it easy for such a plan to exist in the chart but be neither meaningful nor effective.