The updated Consumer Assessment of Healthcare Providers and Systems® (CAHPS) Home Health Care Survey, also referred to as “HHCAHPS,” is a 25-item instrument. It is a standardized survey instrument and data collection methodology for measuring home health patients’ perspectives on the care they receive from Medicare-certified home health agencies. The survey is administered monthly to patients who recently received or are receiving home health care from Medicare-certified home health agencies. It is offered in mail-only, phone-only, or mixed mode (mail with telephone follow-up). The survey instrument consists of 17 core questions about various aspects of the patient’s care experience, 6 demographic questions, and 2 questions determining whether a proxy respondent completed the mail survey.
The HHCAHPS Survey is part of the CAHPS family of experience-of-care surveys. English and other translations of the survey are available at https://homehealthcahps.org/. The Centers for Medicare & Medicaid Services (CMS) initiated national implementation of the HHCAHPS Survey in October 2009, with agencies participating on a voluntary basis until quality reporting requirements for the home health annual payment update (APU) took effect in the third quarter of calendar year 2010. CMS began publicly reporting HHCAHPS Survey results on Home Health Compare on the Medicare.gov website in April 2012. HHCAHPS was linked to the quality reporting requirement for the CY 2012 APU.
Care of Patients is a multi-item measure derived from the updated CAHPS® Home Health Care Survey, also referred to as “HHCAHPS.”
This composite measure is based on responses collected over a 12-month period with the oldest quarter dropping off and the newest quarter being added each public reporting period. The Care of Patients measure is composed of responses to the following survey items:
Q6. In the last 2 months of care, how often did home health staff from this agency seem to be aware of all the care or treatment you were getting at home?
Never, Sometimes, Usually, Always
Q7. In the last 2 months of care, how often did home health staff from this agency treat you with care – for example, when moving you around or changing a bandage?
Never, Sometimes, Usually, Always
Q10. In the last 2 months of care, how often did home health staff from this agency treat you with courtesy and respect?
Never, Sometimes, Usually, Always
Q11. In the last 2 months of care, how often did you feel that home health staff from the agency cared about you as a person?
Never, Sometimes, Usually, Always
Q13. In the last 2 months of care, how often have the services you received from this agency helped you take care of your health?
Never, Sometimes, Usually, Always
Measure Specs
General Information
One of the goals of the CMS National Quality Strategy is to foster engagement and to bring the voices of patients to the forefront. As part of fostering engagement, it is critical to hear the voice of individuals by obtaining feedback from them on home health agency performance and incorporating it as part of CMS’s comprehensive approach to quality. Patient-centeredness is a central goal of home health care and can be directly measured through surveys of home health patients.
AHRQ and the CAHPS grantees developed the Home Health CAHPS® Survey, a component of the Home Health Quality Reporting Program (HHQRP), to ensure that an assessment of the patient-centeredness of care would be included to monitor home health agency performance, promote quality improvement, and inform consumer decision making in the selection of a home health agency via public reporting of results. The HHCAHPS Survey is a standardized survey instrument and data collection methodology for measuring home health patients’ perspectives on their home health care in Medicare-certified home health care agencies. The survey is administered monthly to patients who recently received or are receiving home health care from Medicare-certified home health agencies.
The HHCAHPS Survey Care of Patients measure assesses patients’ experience of care, including staff awareness of the care and treatment the patient received at home, how often the staff treated the patient with care, whether the patient was treated with courtesy and respect, how often the patient felt the agency cared about them as a person, and whether the agency’s services helped the patient take care of their health.
This measure reflects patient experiences with their home health agency across a variety of domains that were identified as important to patients and stakeholders based on focus groups and cognitive interviews during the questionnaire revision development phase. This measure will be publicly reported on the Medicare.gov Care Compare tool to help agencies with quality improvement and to increase quality transparency for consumers.
Not Applicable
Numerator
The items in the HHCAHPS Care of Patients measure use a “Never/Sometimes/Usually/Always” response scale. The top-box numerator is the number of respondents who answer “Always.”
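As a minimal illustration of the top-box numerator count described above (the authoritative scoring is performed by CMS’s implementation contractor), the following Python sketch counts “Always” responses for a single item using hypothetical data.

```python
# Minimal sketch of a top-box numerator count for one HHCAHPS item.
# The response list is hypothetical; None represents a skipped item.

responses = ["Always", "Usually", "Always", "Sometimes", "Always", None]

top_box_numerator = sum(1 for r in responses if r == "Always")
answered = sum(1 for r in responses if r is not None)

print(f"Top-box numerator: {top_box_numerator} of {answered} respondents answering the item")
```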
The HHCAHPS Survey Care of Patients measure assesses patients’ experience of care, including staff awareness of the care and treatment the patient received at home, how often the staff treated the patient with care, whether the patient was treated with courtesy and respect, how often the patient felt the agency cared about them as a person, and whether the agency’s services helped the patient take care of their health.
See response in the ITS form items 1.14 and 1.6 for information about HHCAHPS Survey top-box scoring for this measure.
This composite measure is based on responses collected over a 12-month period with the oldest quarter dropping off and the newest quarter being added each public reporting period.
The Care of Patients measure is composed of responses to the following survey items:
Q6. In the last 2 months of care, how often did home health staff from this agency seem to be aware of all the care or treatment you were getting at home?
Never, Sometimes, Usually, Always
Q7. In the last 2 months of care, how often did home health staff from this agency treat you with care – for example, when moving you around or changing a bandage?
Never, Sometimes, Usually, Always
Q10. In the last 2 months of care, how often did home health staff from this agency treat you with courtesy and respect?
Never, Sometimes, Usually, Always
Q11. In the last 2 months of care, how often did you feel that home health staff from the agency cared about you as a person?
Never, Sometimes, Usually, Always
Q13. In the last 2 months of care, how often have the services you received from this agency helped you take care of your health?
Never, Sometimes, Usually, Always
Denominator
HHCAHPS Survey respondents are the adult patients who received care from a home health agency in a given month.
The denominator for the HHCAHPS Care of Patients measure is the number of respondents with completed surveys who answer at least one item within the multi-item measure.
See ITS form item 1.15 for more information. An HHCAHPS survey is defined as completed when at least 50 percent of the questions applicable to all patients are answered.
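The following Python sketch illustrates the completeness rule and the denominator logic stated above, assuming the questions applicable to all patients are Q1-Q15 and Q17 (per the measure calculation section) and using hypothetical response data.

```python
# Illustrative check of the "completed survey" rule and the measure denominator:
# a survey is complete when at least 50% of the questions applicable to all patients
# are answered, and it enters the denominator if it also answers at least one
# Care of Patients item. Question IDs and responses are hypothetical examples.

APPLICABLE_TO_ALL = [f"Q{i}" for i in range(1, 16)] + ["Q17"]   # Q1-Q15 and Q17
CARE_OF_PATIENTS_ITEMS = ["Q6", "Q7", "Q10", "Q11", "Q13"]

def is_completed(survey: dict) -> bool:
    answered = sum(1 for q in APPLICABLE_TO_ALL if survey.get(q) is not None)
    return answered / len(APPLICABLE_TO_ALL) >= 0.5

def in_measure_denominator(survey: dict) -> bool:
    return is_completed(survey) and any(
        survey.get(q) is not None for q in CARE_OF_PATIENTS_ITEMS
    )

example = {"Q1": "Yes", "Q2": "No", "Q6": "Always", "Q7": "Usually", "Q10": "Always",
           "Q11": "Always", "Q13": "Sometimes", "Q14": "Yes", "Q15": "No"}
print(in_measure_denominator(example))  # True: 9 of 16 applicable questions answered
```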
Exclusions
Cases are excluded from the measure denominator in the following situations:
• Patients under 18 years of age at any time during their stay are excluded.
• Patients who received fewer than 2 visits from home health agency personnel during a 2-month look-back period are excluded. The 2-month look-back period is defined as the 2 months prior to and including the last day in the sample month.
• Patients who have been previously selected for an HHCAHPS sample during any month in the current quarter, or during the last 5 months, are excluded.
• Patients who are currently receiving hospice, or are discharged to hospice, are excluded.
• All routine maternity patients are excluded.
• All “No publicity” status patients are excluded. [“No publicity” patients are defined as patients who do not wish to have their contact information released to anyone outside the agency.]
• Patients receiving only non-skilled care are excluded.
• Patients who reside in a state where their health condition excludes them from surveys are excluded.
• Patients who are deceased at the time the sample is drawn are excluded.
The denominator is the total number of surveys fielded in the 4-quarter period minus the total number of ineligible surveys. The total number of ineligible surveys includes sample cases assigned a final disposition code of 210—ineligible: deceased; 220—ineligible: does not meet eligible population criteria (see Section 1.15b above); 230—ineligible: language barrier; or 240—ineligible: mentally or physically incapacitated, no proxy available. No other disposition codes are excluded from the denominator.
Measure Calculation
The survey is administered monthly to patients who recently received or are receiving home health care from Medicare-certified home health agencies.
CMS calculates HHCAHPS Survey measure scores using top-box scoring for completed surveys. A survey is defined as completed when at least 50 percent of the questions applicable to all patients (Q1-Q15 and Q17) are answered. The top-box score is the percentage of respondents who give the most positive response; depending on the specific item, the most positive response is “Strongly agree,” “Always,” or “Yes.” See the “Steps_Calculate_HHCAHPS_Composites_1.18” attachment in section 7.1 for more information on the definitions of positive responses and the risk adjustment calculation. HHCAHPS Survey respondents are the adult eligible patients who received care from a home health agency in a given month. The denominator for the HHCAHPS composite measures is the number of respondents with completed surveys who answer at least one item within the multi-item measure.
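As a simplified, unadjusted sketch of the composite calculation (the authoritative steps, including patient-mix adjustment and the rolling four-quarter aggregation, are in the “Steps_Calculate_HHCAHPS_Composites” attachment), the Python below averages the item-level top-box rates across the five Care of Patients items for a small set of hypothetical completed surveys.

```python
# Illustrative, unadjusted top-box calculation for the Care of Patients composite.
# One common approach: compute each item's top-box rate among respondents answering
# that item, then average the item-level rates. Data below are hypothetical.

CARE_OF_PATIENTS_ITEMS = ["Q6", "Q7", "Q10", "Q11", "Q13"]

def item_top_box_rate(surveys, item):
    """Percent answering 'Always' among those who answered the item."""
    answered = [s[item] for s in surveys if s.get(item) is not None]
    if not answered:
        return None
    return 100.0 * sum(1 for r in answered if r == "Always") / len(answered)

def composite_top_box(surveys):
    """Average the item-level top-box rates across the composite's items."""
    rates = [item_top_box_rate(surveys, q) for q in CARE_OF_PATIENTS_ITEMS]
    rates = [r for r in rates if r is not None]
    return sum(rates) / len(rates) if rates else None

completed_surveys = [
    {"Q6": "Always", "Q7": "Always", "Q10": "Always", "Q11": "Usually", "Q13": "Always"},
    {"Q6": "Usually", "Q7": "Always", "Q10": "Always", "Q11": "Always", "Q13": None},
]
print(f"Unadjusted composite top-box score: {composite_top_box(completed_surveys):.1f}%")
```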
Cases are excluded from the measure denominator in the following situations:
• Patients under 18 years of age at any time during their stay are excluded.
• Patients who received fewer than 2 visits from home health agency personnel during a 2-month look-back period are excluded. The 2-month look-back period is defined as the 2 months prior to and including the last day in the sample month.
• Patients who have been previously selected for an HHCAHPS sample during any month in the current quarter, or during the last 5 months, are excluded.
• Patients who are currently receiving hospice, or are discharged to hospice, are excluded.
• All routine maternity patients are excluded.
• All “No publicity” status patients are excluded. [“No publicity” patients are defined as patients who do not wish to have their contact information released to anyone outside the agency.]
• Patients receiving only non-skilled care are excluded.
• Patients who reside in a state where their health condition excludes them from surveys are excluded.
• Patients who are deceased at the time the sample is drawn are excluded.
See the document entitled “Steps Calculate HHCAHPS Composites” for a discussion of the calculation used.
The measure is not stratified.
A proxy respondent is permitted if the patient is incapable of responding. A family member or friend knowledgeable about the patient’s home health care is an ideal proxy respondent. An employee of the home health agency cannot serve as a proxy.
The HHCAHPS Survey is currently administered on a monthly basis by mail only, telephone only, or mixed mode (mail with telephone follow-up) and is available in multiple languages. For the revised survey instrument, which is the instrument being submitted, a field test was conducted in English and Spanish using four modes: mail only, phone only, mail with phone follow-up, and web with mail follow-up. HHCAHPS Survey items are not imputed when responses are missing; imputation is used only on a very limited basis for respondent characteristics (e.g., age or education group) in the score adjustment process. Response rates for the revised survey, which was used to develop this measure, were 26.2% for mail only, 19.5% for phone only, 34.9% for mixed mode, and 19.1% for web with mail follow-up.
Guidance on maximizing response rates includes using the agency name and/or logo on outgoing mailing envelopes and letters, training interviewers in refusal avoidance, and standardizing the layout of the questionnaire to a two-column format. Further guidance is presented in the Protocols and Guidelines Manual, located here: https://homehealthcahps.org/Portals/0/SurveyMaterials/PandGManual.pdf
Guidance for computing response rates is located in the above manual as well. There is no minimum response rate requirement on HHCAHPS.
Response rates are calculated as the total number of completed surveys (assigned a final disposition code of 110—mail complete or 120—phone complete), divided by the total number of surveys fielded (all final disposition codes) less the total number of ineligible surveys (defined as cases with a final disposition code of 210—ineligible: deceased; 220—ineligible: does not meet eligible population criteria; 230—ineligible: language barrier; or 240—ineligible: mentally or physically incapacitated, no proxy available).
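The following Python sketch applies the response-rate formula described above; the completed and ineligible disposition codes come from the text, and code 300 is a placeholder for any other (nonresponse) final disposition.

```python
# Sketch of the response-rate calculation described above.
# Completed: 110 (mail complete), 120 (phone complete).
# Ineligible: 210 (deceased), 220 (does not meet eligible population criteria),
#             230 (language barrier), 240 (incapacitated, no proxy available).
# Code 300 below is a placeholder for any other (nonresponse) final disposition.

COMPLETED_CODES = {110, 120}
INELIGIBLE_CODES = {210, 220, 230, 240}

def response_rate(final_dispositions):
    completed = sum(1 for d in final_dispositions if d in COMPLETED_CODES)
    ineligible = sum(1 for d in final_dispositions if d in INELIGIBLE_CODES)
    fielded = len(final_dispositions)
    return completed / (fielded - ineligible)

example = [110, 120, 110, 300, 220, 240, 300, 110, 230, 300]
print(f"Response rate: {response_rate(example):.1%}")   # 4 completes / (10 fielded - 3 ineligible)
```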
No minimum sample size
Supplemental Attachment
Point of Contact
Not applicable
Elizabeth Goldstein
Baltimore, MD
United States
Sarah With
RTI International
Durham, NC
United States
Importance
Evidence
The Consumer Assessment of Healthcare Providers and Systems (CAHPS) surveys are a well-established set of measures of healthcare quality*. From December 2023 to December 2024, more than 1.2 million patients throughout the country completed the Home Health Care CAHPS (HHCAHPS) Survey. Public reporting of these survey results creates incentives for agencies to improve their quality of care, directly benefiting the patients who receive it. It is therefore important to ensure that the survey aligns with what patients believe constitutes high-quality care.
In 2016, RTI conducted a qualitative analysis and conceptual mapping of patient experiences in home health care to guide potential revisions to the HHCAHPS Survey instrument. RTI conducted a literature review of peer-reviewed publications and “grey literature” and identified 104 domains reflecting home health patient experience. We then used conceptual mind-mapping software to arrange the domains in related clusters. Findings from this analysis were published in a peer-reviewed 2018 article**.
The results from the literature review were used to inform focus group and moderator guides. RTI then conducted focus groups and telephone interviews with people receiving home health care and proxy respondents (e.g., family members) to understand home health patients’ experience and what constitutes high-quality home health care, and to assess the level of importance that home health care patients place on specific components of care. Many of the domains identified in the literature review were highlighted by participants in the focus groups, including domains on the personal characteristics of the caregiver and those relating to technical competency, availability of needed care, and amount of care/time spent with the patient. We used the conceptual mapping and focus group and interview results to develop additional survey items for testing.
In 2018, RTI conducted key informant interviews with home health agency representatives, hospital discharge planners and Technical Expert Panel (TEP) members to gain their feedback on potential new survey items based on findings from the focus groups and telephone interviews. After incorporating feedback from the TEP, we conducted cognitive interviews using the revised survey instrument with new survey items in 2019 and 2020 with home health patients and (if needed) proxies.
Data from the 2022 HHCAHPS Survey field test do not allow for direct assessment of the relationship between survey measures and structures or processes. However, given that the modified instrument-derived composite measures (Care of Patients and Communications Between Providers and Patients) are similar to current HHCAHPS composite measures, CMS anticipates that these revised measures will exhibit similar relationships to those of the existing HHCAHPS Survey composite measures. The three new single-item measures (Review Medicines, Talk About Home Safety, and Talk About Medicine Side Effects) are currently contained within an existing HHCAHPS Survey multi-item composite measure. CMS is dropping this last composite measure because the items that remained, after some were dropped from the survey, did not perform well enough together to report as a modified composite. Instead, these remaining items are being proposed to be reported as single-item measures.
Citations:
**Lines, L. M., Anderson, W. L., Blackmon, B. D., Pronier, C. R., Allen, R. W., & Kenyon, A. E. (2018). Qualitative analysis and conceptual mapping of patient experiences in home health care. Home Health Care Services Quarterly, 37(1), 25-40.
* Martino, S.C., Elliott, M.N., Cleary, P.D., Kanouse, D.E., Brown, J.A., Spritzer, K. L., & Hays, R. D., (2009). Psychometric properties of an instrument to assess Medicare beneficiaries’ prescription drug plan experiences. Health Care Financing Review, 30(3), 41-53.
National Quality Measures Clearinghouse. Home health care satisfaction: mean section score for "Managing Your Home Health Care" questions on Home Health Care Survey. South Bend (IN): Press Ganey Associates, Inc.; 2008. Available at: http://www.qualitymeasures.ahrq.gov/content.aspx?id=28154
Press Ganey Associates. Home Health and Public Reporting. 2010 (white paper). Available at: http://helpandtraining.pressganey.com/Documents_secure/HomeCare/White%20Papers/HomeHealth_andPublicReporting.pdf
Institute for Healthcare Improvement. A Vision for "What Matters to You?" Video: "What matters" in the home health setting. Accessed Jan. 27, 2016. Available at: http://www.ihi.org/Topics/WhatMatters/Pages/default.aspx
Davies E, Shaller D, Edgman-Levitan S, et al. Evaluating the use of a modified CAHPS survey to support improvements in patient-centered care: lessons from a quality improvement collaborative. Health Expect. 2008;11:160–176.
Quigley DD, Mendel PJ, Predmore ZS, et al. Use of CAHPS(®) patient experience survey data as part of a patient-centered medical home quality improvement initiative. J Healthc Leadersh. 2015;7:41–54.
Anhang Price R, Elliott MN, Zaslavsky AM, et al. Examining the role of patient experience surveys in measuring health care quality. Med Care Res Rev. 2014;71:522–554.
Elliott, M. N., Lehrman, W. G., Goldstein, E. H., Giordano, L. A., Beckett, M. K., Cohea, C. W., & Cleary, P. D. (2010). Hospital survey shows improvements in patient experience. Health affairs, 29(11), 2061-2067.
CAHPS instruments include questions about aspects of health care delivery that are important to patients, which allows for objective and meaningful comparisons between healthcare providers on domains that matter to consumers.1, 3 The Home Health Care CAHPS (HHCAHPS) Survey is designed to measure the experiences of people receiving home health care from Medicare-certified home health agencies. Instruments such as the HHCAHPS Survey promote public accountability for agencies because results are publicly reported to help consumers decide where they receive care. Consumers are able to use HHCAHPS Survey results to compare agencies and choose higher-performing ones that meet their needs on the aspects of care that are important to them.
Public reporting of results also serves as an incentive for agencies to improve their quality of care in order to achieve higher scores. Results allow agencies to target areas that require improvement and to monitor and assess the results of improvement efforts.5, 6, 8 This ultimately benefits the patients receiving care, as higher-quality patient experience is associated with better health and clinical outcomes, patient safety, and higher levels of adherence to medications and treatment processes.2, 4
Because of this, it is important to ensure that the HHCAHPS Survey aligns with what patients believe constitutes high-quality care. Through a literature review, focus groups, and telephone interviews with people receiving home health care and their family members, we found that domains on the personal characteristics of the caregiver and those relating to technical competency, availability of needed care, and amount of care/time spent with the patient are important components of care for this patient population.7 Based on these results, we developed new survey items to target these domains. The redesigned survey ensures that the data being captured continue to be meaningful for both the patients and the agencies that use them.
References:
1. Agency for Healthcare Research and Quality (AHRQ). (2013). Principles underlying CAHPS surveys. Retrieved from https://www.ahrq.gov/cahps/about-cahps/principles/index.htm
2. Anhang Price, R., Elliott, M. N., Zaslavsky, A. M., Hays, R. D., Lehrman, W. G., Rybowski, L., & Cleary, P. D. (2014). Examining the role of patient experience surveys in measuring health care quality. Medical Care Research and Review, 71(5), 522-554.
3. Bland, C., Zuckerbraun, S., Lines, L. M., Kenyon, A., Hinsdale-Shouse, M., Hendershott, A., ... & Butler, J. (2022). Challenges Facing CAHPS Surveys and Opportunities for Modernization. RTI Press.
4. Browne, K., Roseman, D., Shaller, D., & Edgman-Levitan, S. (2010). Measuring patient experience as a strategy for improving primary care. Health Affairs, 29(5), 921-925.
5. Cefalu, M., Elliott, M. N., & Hays, R. D. (2021). Adjustment of patient experience surveys for how people respond. Medical Care, 59(3), 202.
6. Elliott, M. N., Lehrman, W. G., Goldstein, E. H., Giordano, L. A., Beckett, M. K., Cohea, C. W., & Cleary, P. D. (2010). Hospital survey shows improvements in patient experience. Health Affairs, 29(11), 2061-2067.
7. Lines, L. M., Anderson, W. L., Blackmon, B. D., Pronier, C. R., Allen, R. W., & Kenyon, A. E. (2018). Qualitative analysis and conceptual mapping of patient experiences in home health care. Home Health Care Services Quarterly, 37(1), 25-40.
8. Quigley, D. D., Mendel, P. J., Predmore, Z. S., Chen, A. Y., & Hays, R. D. (2015). Use of CAHPS® patient experience survey data as part of a patient-centered medical home quality improvement initiative. Journal of Healthcare Leadership, 7, 41-54.
Measure Impact
A successful survey should be relevant to the target audience and produce meaningful results that can be used to inform decisions. For the HHCAHPS Survey, this includes a survey that accurately measures characteristics of quality home health care from the patient’s perspective and produces results that home health agencies can use to improve their care.
In 2016, RTI conducted focus groups and interviews with people receiving home health care, and their family members, to better understand what aspects of care were the most important to them. Results from the focus groups and interviews informed the development of new survey items. For example, respondents explained that one of the most important aspects of care included staff that are caring, supportive, empathetic, and considerate. Because of this, RTI developed a new survey item asking whether home health care providers cared about the patient as a person. RTI also developed a new survey item asking if home health staff provided their family or friends with information and instructions about their care because respondents viewed family member involvement in the treatment plan as important, and there was no question about this in the survey at that time. Note that these are just two examples; additional survey items were developed or revised based on results from focus groups, key informant interviews, and TEP feedback, and the survey was also shortened to reduce survey burden.
In 2018, RTI held a Technical Expert Panel (TEP) to obtain feedback on new survey items. Prior to the meeting, participants were sent a set of questions asking if they currently use HHCAHPS data and how often they use the Compare tool on Medicare.gov. Home health care agency members explained that they use the Compare tool data to compare themselves to similar agencies in their geographic region and to identify areas for improvement. Other TEP members noted that the Compare tool is useful for patients and family members and that accountable care organizations are making decisions about whether to recommend an agency based on the agency’s star ratings.
Activities conducted by RTI during the survey revision process, such as focus groups, interviews, and TEP meetings, helped to ensure that the survey is relevant to the target audience, and that survey results are meaningful to both patients and the home health care agencies that provide care.
A successful survey should be relevant to the target audience and produce meaningful results that can be used to inform decisions. For the HHCAHPS Survey, this includes a survey that accurately measures characteristics of quality home health care from the patient’s perspective and produces results that home health agencies can use to improve their care.
Between February 2019 and November 2020, we conducted 38 in-person and telephone interviews with people receiving home health care and their family members. While the main goal was to gather feedback about new and revised survey items, we also included probes to assess factors that affect people’s likelihood of responding to a survey about their experience of care and what makes them more likely to respond. When asked whether they usually answer surveys about their experience of care, a majority of participants confirmed that they do. When asked what would make them more or less likely to fill out a survey like the HHCAHPS Survey, a main theme that emerged was whether people thought the results would improve services for others, specifically whether the results could help improve the agency in the long run or help someone else receiving care. Quotes from the interviews are included below.*
“Feeling like you are helping to improve services other patients receive.”
“Whether or not I thought the survey would be useful and if it would help someone else.”
“It would be good if it helped improve the company in the long run and helped them provide better care in homes.”
Results from these interviews demonstrate that the patient population completing the HHCAHPS Survey find that surveys measuring the satisfaction with their care and their experience of care are meaningful if the results can help improve both care the agency provides and the quality of care that the patient receives.
* Allen, R. W. & Lines, L. (2022) Care Experience Surveys: What Motivates Response. Methodological Brief presented at the 77th annual meeting of the American Association of Public Opinion Research, Chicago, IL.
Performance Gap
Performance gap information is not available. We have provided the distribution of scores as requested.
See section 5.1 for a description of the 2022 field test data the results are based on.
The currently implemented instrument has the following values for the most recent public reporting period, covering quarter 4 of 2023 through quarter 3 of 2024: a mean of 88.7% and an interquartile range of 86.5% to 92.2%, across 7,069 entities and 1,013,291 patients. The field test results suggest a lower mean score, allowing more room for improvement, and a wider interquartile range, suggesting greater ability to separate entities with the new instrument-derived measure (IDM).
| Overall | Minimum | Decile_1 | Decile_2 | Decile_3 | Decile_4 | Decile_5 | Decile_6 | Decile_7 | Decile_8 | Decile_9 | Decile_10 | Maximum |
---|---|---|---|---|---|---|---|---|---|---|---|---|---
Mean Performance Score | 81.7 | 56.8 | 67.6 | 74.2 | 78.7 | 81.0 | 82.6 | 83.4 | 84.6 | 86.3 | 87.3 | 91.4 | 94.7
N of Entities | 100 | 1 | 10 | 10 | 10 | 10 | 10 | 10 | 10 | 10 | 10 | 10 | 1
N of Persons / Encounters / Episodes | 6150 | 22 | 313 | 395 | 526 | 1435 | 740 | 838 | 495 | 634 | 598 | 176 | 7
Equity
Equity
This domain is optional for Spring 2025.
Feasibility
Feasibility
As these patient-experience data are routinely collected from patients based on their home health care experiences, the structured data are not available in electronic sources outside of RTI International. Web-based data collection was tested in the 2022 HHCAHPS Survey field test, with minimal success (i.e., very few responses). Examples of barriers to this population using a web-based mode include lack of email addresses collected by home health agencies, and the nature of the target population (older age group).
Proposed revisions to the current HHCAHPS Survey instrument include shortening it and focusing on topics that are important to home health patients, as developed through focus groups, cognitive interviews, and other instrument-development activities conducted for this revised instrument. For specific changes to the current instrument, please see the "Comparison of Current and Revised HHCAHPS Survey Instruments" attachment in section 7.1. The HHCAHPS Survey is administered by independent survey vendors that are approved by CMS. These vendors are audited by CMS’s implementation contractor on a regular basis to ensure that the vendors are following HHCAHPS Survey protocols. Additionally, the survey vendors are required to conduct regular review and monitoring of their own operational systems, whether the survey is administered by mail or telephone. Data at RTI International are assessed quarterly for accuracy and missing data.
HHCAHPS Survey results for the updated survey instrument will be publicly reported once 12 months of survey data have been collected.
Home health agencies contract with independent survey vendors, trained and approved by CMS, to conduct monthly data collection and submit the data files to CMS’s implementation contractor. Procedures to analyze and develop the proposed composite and individual measures are well-established by CMS’s implementation contractor. The proposed revisions will not cause a significant burden to vendors or CMS’s implementation contractor, other than the modification of existing computer programs. No changes are being requested in terms of the data elements that home health agencies already submit to their approved HHCAHPS Survey vendor each month.
HHCAHPS Survey data submitted to the HHCAHPS Data Center (CMS’s implementation contractor) are de-identified. Case-level data are assigned a unique numeric or alphanumeric ID that cannot have any patient or home health agency identifying information. HHCAHPS Survey results are aggregated at the agency level and publicly reported at the agency-, state- and national-level.
CMS-approved survey vendors are required to protect patient confidentiality and follow data security requirements which are described in the HHCAHPS Survey Protocols and Guidelines Manual, available here: https://homehealthcahps.org/Survey-and-Protocols/Survey-Materials. Vendors are required to have all staff who work with HHCAHPS Survey data sign a Confidentiality Agreement. Vendors must limit access to confidential data only to authorized staff and only on a need-to-know basis. Vendors that offer HHCAHPS Survey summary reports to their client agencies are required to withhold demographic data, even at the aggregate level, unless there are 11 responses to each response category for that demographic survey question. This is to avoid the potential of identifying a respondent. CMS allows vendors to link responses to the patient’s name ONLY IF consent is granted via CMS’s “Consent to Share Responses” question that vendors have the option of including in the HHCAHPS Survey instrument. CMS regularly reviews these described confidentiality and data security practices with each active vendor during audits of the vendors’ operations.
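As a small illustration of the vendor reporting rule described above (withholding demographic breakdowns unless there are 11 responses to each response category), the following Python sketch uses hypothetical category counts.

```python
# Illustrative check of the demographic reporting rule described above: a vendor
# summary report withholds a demographic breakdown unless every response category
# for that question has 11 responses. Category labels and counts are hypothetical.

def can_report_demographic(category_counts: dict) -> bool:
    return all(count >= 11 for count in category_counts.values())

age_group_counts = {"18-49": 4, "50-64": 15, "65-74": 32, "75-84": 40, "85+": 21}
print(can_report_demographic(age_group_counts))  # False: the 18-49 category has fewer than 11 responses
```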
After the field test, CMS approved the inclusion of several new items and the deletion of other items from the current nationally fielded HHCAHPS Survey. The revised instrument is being submitted via this Instrument submission. Associated Instrument-derived measures have also been modified from those that are currently reported and are being submitted as part of this effort.
Proprietary Information
Scientific Acceptability
Testing Data
In national implementation, the HHCAHPS Survey can be administered in one of three modes: Mail Only, Telephone Only, or Mixed Mode (mail plus telephone follow-up). In 2022, CMS conducted a field test with 100 participating home health agencies to test the revised survey. Data were collected in English and Spanish by mail, telephone, mail with telephone follow-up, and, on a test basis, web with mail follow-up. Measures were developed using the completed field test data, which included 6,150 completed surveys.
The national HHCAHPS Survey is offered in six languages: English, Spanish, Chinese (simplified and traditional), Russian, Vietnamese, and Armenian. Once approved for national implementation, the revised survey will also be translated into these languages. Additional translations will be made as needed.
There are no fees or licensing for use of the HHCAHPS® Survey, training or oversight activities, or for accessing publicly reported HHCAHPS® Survey measure scores or star ratings on the CMS Medicare.gov website.
There were no differences in the data used for reliability, validity, exclusions, or risk adjustment. Inter-unit reliability was reported for agencies with at least 100 respondents.
The field test of a revised Home Health Care CAHPS Survey instrument was conducted from April through July 2022 with patients from 100 Medicare-certified home health agencies (HHAs) nationwide. The final instrument and associated instrument-derived measures were developed from this field test.
We used a two-stage stratified sample design. In the first stage, we selected 100 HHAs from a pool of over 400 volunteers. We selected the final sample to ensure comparability to the national distribution of agencies across several characteristics: agency ownership, facility type, and urbanicity. Additionally, we looked at agency size and geographic location.
Prior to selecting the HHA sample, we removed the smallest agencies (those with fewer than 100 patients served annually). We made this decision because smaller agencies would not have sufficient sample to field both their regular HHCAHPS monthly survey and enough left over to use for the field test. Our previous research showed that in comparing levels of agency size, the point differences for HHCAHPS items were extremely small. There was also no consistent pattern of largest and smallest scores across all our measures. This suggested strongly that we could accept volunteer HHAs without regard to their size.
The final sample closely mirrored the national population of HHAs, with the exception of small agencies. In the sample and the population, most are urban, non-institutional, and for-profit. The sample consisted of 90 medium-sized agencies (101–1,500 patients annually) and 10 large agencies (1,501+ patients annually).
In the second stage, we selected patients from participating agencies, randomly assigning sample patients to one of the four data collection modes, with a target of completing approximately 1,570 interviews for each mode. This design allowed us to measure mode effects related to nonresponse and measurement differences (e.g., because of social desirability) and also allowed for patient-mix analyses.
The table below shows the characteristics of the patients who responded to the 2022 field test.
Patient characteristics | n | Percent
Number of respondents | 6,150 | 100
Age*
18–49 | 148 | 2.4
50–64 | 572 | 9.3
65–74 | 1,667 | 27.1
75–84 | 2,165 | 35.2
85+ | 1,593 | 25.9
Gender
Male | 2,460 | 40
Female | 3,690 | 60
Overall health*
Missing | 387 | 6.3
Excellent | 363 | 5.9
Very Good | 879 | 14.3
Good | 1,876 | 30.5
Fair | 1,968 | 32
Poor | 677 | 11
Mental health*
Missing | 264 | 4.3
Excellent | 867 | 14.1
Very Good | 1,556 | 25.3
Good | 2,005 | 32.6
Fair | 1,187 | 19.3
Poor | 277 | 4.5
Live alone*
Missing | 326 | 5.3
Yes | 1,851 | 30.1
No | 3,973 | 64.6
Education*
Missing | 394 | 6.4
Less than 8th grade | 387 | 6.3
Some high school but did not graduate | 566 | 9.2
High School graduate/GED | 1,925 | 31.3
Some college or 2-year degree | 1,494 | 24.3
4-Year college graduate or more | 1,378 | 22.4
Ethnicity*
Missing | 375 | 6.1
Yes, Hispanic or Latino/a | 252 | 4.1
No, Not Hispanic or Latino/a | 5,523 | 89.8
Race*
Missing | 418 | 6.8
White | 4,717 | 76.7
Black or African American | 843 | 13.7
Asian | 68 | 1.1
Native Hawaiian or Other Pacific Islander | 6 | 0.1
American Indian or Alaska Native | 31 | 0.5
Multiple Races Selected | 68 | 1.1
Language of survey completion
English | 6,045 | 98.3
Spanish | 105 | 1.7
* Indicates statistical significance at the p<0.05 level.
Note: Not all percentages will sum to 100% because of rounding.
Reliability
We used PROC CORR in SAS version 9.4 to compute standardized Cronbach’s alphas to determine the internal consistency reliability of the HHCAHPS measures at the respondent level. Internal consistency measures how well the items on a scale correlate with each other and therefore appear to be measuring the same concept. For the Care of Patients composite, all items had 2% or less of participants with missing values. The first three items on the Communications between Providers and Patients composite (keep you informed; explain things; and listen carefully) had less than 1% of patients with missing values. Because “not applicable” responses are possible, the missing-data percentages for the remaining two items were substantially larger: get help when needed (63%) and family and friends (26%). As a sensitivity check, alphas were computed with and without the inclusion of cases with missing values. The alpha values under the two scenarios were nearly identical: Care of Patients (difference=0.000106) and Communications between Providers and Patients (difference=0.00934). The Cronbach’s alpha values including cases with missing values are presented below.
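The analysis itself was run with PROC CORR in SAS 9.4; as a rough cross-language sketch, the Python below computes a standardized Cronbach’s alpha from the average inter-item correlation using hypothetical item scores rather than the field test data.

```python
# Rough sketch of standardized Cronbach's alpha (the reported analysis used SAS PROC CORR).
# Standardized alpha = k * r_bar / (1 + (k - 1) * r_bar), where k is the number of items
# and r_bar is the mean off-diagonal inter-item correlation. Scores below are hypothetical
# 1-4 codings of the five Care of Patients items (rows = respondents, columns = items).

import numpy as np

items = np.array([
    [4, 4, 4, 4, 4],
    [3, 4, 4, 3, 4],
    [2, 3, 3, 2, 3],
    [4, 3, 4, 4, 4],
    [1, 2, 2, 1, 2],
    [3, 3, 4, 3, 3],
])

corr = np.corrcoef(items, rowvar=False)
k = corr.shape[0]
r_bar = (corr.sum() - k) / (k * (k - 1))          # mean of the off-diagonal correlations
alpha_std = k * r_bar / (1 + (k - 1) * r_bar)
print(f"Standardized Cronbach's alpha: {alpha_std:.3f}")
```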
Table 1 presents the Cronbach’s alphas for the Care of Patients composite measure overall and if each individual item was deleted from the measure, using respondent-level data. In addition, the table includes Pearson’s correlations between each item and the remaining items (i.e., the composite if the corresponding item is removed) based on respondent-level data.
Table 2 presents the Cronbach’s alphas for the Communications Between Providers and Patients composite measure overall and if each individual item was deleted from the measure, using respondent-level data. In addition, the table includes Pearson’s correlations between each item and the remaining items (i.e., the composite if the corresponding item is removed) based on respondent-level data.
Table 1. Cronbach’s Alpha Coefficients (Overall and With Removal of Each Item) and Item-Total Correlations for CAHPS Home Healthcare Survey Care of Patients Composite
Composite/Item | Item-Total Correlation | Alpha if Item Deleted
Care of Patients (Cronbach’s alpha=0.84)
In the last 2 months, how often did home health staff from this agency seem to be aware of all of the care or treatment you got at home? | 0.65 | 0.80
In the last 2 months, how often did home health staff from this agency treat you with care – for example, when moving you around or changing a bandage? | 0.52 | 0.83
In the last 2 months, how often did home health staff from this agency treat you with courtesy and respect? | 0.65 | 0.80
In the last 2 months, how often did you feel that home health staff from the agency cared about you as a person? | 0.73 | 0.77
In the last 2 months, have the services you received from this agency helped you take care of your health? | 0.63 | 0.80
Table 2. Cronbach’s Alpha Coefficients (Overall and With Removal of Each Item) and Item-Total Correlations for CAHPS Home Healthcare Survey Communications between Providers and Patients Composite
Composite/Item | Item-Total Correlation | Alpha if Item Deleted
Communications between Providers and Patients (Cronbach’s alpha=0.78)
In the last 2 months, how often did home health staff from this agency keep you informed about when they would arrive at your home? | 0.57 | 0.74
In the last 2 months, how often did home health staff from this agency explain things in a way that was easy to understand? | 0.65 | 0.71
In the last 2 months, how often did home health staff from this agency listen carefully to you? | 0.68 | 0.70
In the last 2 months, did home health staff from this agency provide your family or friends with information or instructions about your care as much as you wanted? | 0.49 | 0.76
When you contacted this agency’s office, did you get the help or advice you needed? | 0.42 | 0.79
As shown in Tables 1 and 2, the Cronbach’s alpha for both composites surpassed the 0.7 criterion for acceptable reliability for group-level comparisons (Nunnally & Bernstein, 1994): Care of Patients composite (alpha=0.84) and Communications Between Providers and Patients composite (alpha=0.78). In addition, item-total correlations exceeded 0.4 for all items.
Nunnally JC, Bernstein IH. Psychometric theory. New York: McGraw-Hill; 1994.
To assess accountable entity-level reliability, we calculated the inter-unit reliability of the Care of Patients HHCAHPS composite measure. This type of reliability represents the amount of variability in composite scores that is attributable to agency differences. These calculations were based on intraclass correlations and general linear models as implemented in SAS version 9.4. Analyses of inter-unit reliability included agencies with 100 or more respondents. Less than one percent of respondents had missing scores for this composite and were not included in these analyses.
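The inter-unit reliability analysis was implemented with intraclass correlations and general linear models in SAS 9.4; the Python sketch below shows one common ANOVA-based formulation of inter-unit reliability (the reliability of agency-level means), using simulated agency data rather than the field test data.

```python
# Hedged sketch of one common ANOVA-based inter-unit reliability (IUR) formulation:
# the share of variance in agency-level mean scores attributable to true between-agency
# differences. Agency scores below are simulated and purely illustrative.

import numpy as np

rng = np.random.default_rng(0)
agencies = [rng.normal(loc=80 + 2 * i, scale=10, size=120) for i in range(5)]  # 5 hypothetical HHAs

k = len(agencies)
n_i = np.array([len(a) for a in agencies])
N = n_i.sum()
grand_mean = np.concatenate(agencies).mean()

msb = sum(n * (a.mean() - grand_mean) ** 2 for n, a in zip(n_i, agencies)) / (k - 1)
msw = sum(((a - a.mean()) ** 2).sum() for a in agencies) / (N - k)
n0 = (N - (n_i ** 2).sum() / N) / (k - 1)           # effective per-agency sample size

sigma2_between = max((msb - msw) / n0, 0.0)
iur = sigma2_between / (sigma2_between + msw / n0)  # reliability of the agency-level mean
print(f"Inter-unit reliability estimate: {iur:.2f}")
```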
See Table 2 in the Reliability_Validity_Questions_CareOfPatients-508_5.2.2 and 5.3.4 attachment in Section 7. It includes the inter-unit reliability of the Care of Patients HHCAHPS composite measure and the items comprising the measure, using agency-level data.
| Overall | Minimum | Decile_1 | Decile_2 | Decile_3 | Decile_4 | Decile_5 | Decile_6 | Decile_7 | Decile_8 | Decile_9 | Decile_10 | Maximum |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|
Reliability | 0.47 | 0.00 | 0.00 | 0.00 | 0.13 | 0.64 | 0.31 | 0.27 | 0.00 | 0.46 | 0.81 | 0.48 | 0.82 |
Mean Performance Score | 81.6 | 90.0 | 77.3 | 86.5 | 79.7 | 77.4 | 79.6 | 81.6 | 84.5 | 82.1 | 81.5 | 81.5 | 79.6 |
N of Entities | 100 | 1 | 10 | 10 | 10 | 10 | 10 | 10 | 10 | 10 | 10 | 10 | 1 |
N of Persons / Encounters / Episodes | 6150 | 2 | 65 | 122 | 151 | 209 | 268 | 320 | 447 | 647 | 1170 | 2751 | 676 |
With an inter-unit reliability of 0.72, the composite exceeds Nunnally and Bernstein’s (1994) criterion for acceptable reliability for making group-level comparisons. Based on Koo and Li’s (2016) classifications for intraclass correlations, the composite falls at the high end of the moderate reliability range (0.50 to 0.75).
Koo, T. K., & Li, M. Y. (2016). A Guideline of Selecting and Reporting Intraclass Correlation Coefficients for Reliability Research. Journal of Chiropractic Medicine, 15(2), 155-163.
Nunnally JC, Bernstein IH. Psychometric theory. New York: McGraw-Hill; 1994.
Validity
Structural validity of the HHCAHPS composites was determined by conducting confirmatory factor analyses of the items comprising the two multi-item measures (Care of Patients and Communications between Providers and Patients), using Mplus version 8. Discriminant validity was evaluated by computing correlations of each item with the sum score for its own multi-item measure and with the other multi-item measure. Correlations between an item and its own composite are item-total correlations, which correlate an item with the remaining items after removing the item of interest from the measure. To demonstrate good discriminant validity, items should be more highly correlated with their own measure than with the other measure. Scaling success rates were then calculated as the percentage of times each item correlated with its own measure at or above the level it correlated with the other measure. Construct validity was assessed by examining correlations of the Care of Patients and Communications between Providers and Patients composites with each other and with each of the two global measures (Overall Rating of Care from the Agency and Would Recommend the Agency to Family or Friends). Polychoric correlations were computed for respondent-level analyses to account for the categorical nature of the items.
Muthén, L.K. and Muthén, B.O. (1998-2017). Mplus User’s Guide. Eighth Edition. Los Angeles, CA: Muthén & Muthén
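As a rough sketch of the discriminant-validity step described above (not the Mplus or SAS analysis itself), the Python below correlates each simulated item with the rest of its own composite and with the other composite, then computes a scaling success rate; it uses Pearson correlations and fabricated data purely for illustration, whereas the reported respondent-level analyses used polychoric correlations.

```python
# Sketch of the discriminant-validity / scaling-success check described above:
# correlate each item with (a) the sum of the remaining items in its own composite
# and (b) the other composite, then count how often (a) >= (b). Pearson correlations
# and simulated 1-4 item scores are used here purely for illustration.

import numpy as np

rng = np.random.default_rng(1)
n = 500
care = np.clip(np.round(rng.normal(3.4, 0.7, size=(n, 5))), 1, 4)   # 5 simulated Care of Patients items
comm = np.clip(np.round(0.5 * care[:, :1] + rng.normal(1.7, 0.6, size=(n, 5))), 1, 4)  # 5 simulated Communications items

def pearson(x, y):
    return np.corrcoef(x, y)[0, 1]

successes = 0
n_items = care.shape[1]
for j in range(n_items):
    rest = [i for i in range(n_items) if i != j]
    r_own = pearson(care[:, j], care[:, rest].sum(axis=1))    # item-rest correlation, own composite
    r_other = pearson(care[:, j], comm.sum(axis=1))           # correlation with the other composite
    successes += int(r_own >= r_other)

print(f"Scaling success rate for Care of Patients items: {100 * successes / n_items:.0f}%")
```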
See "ValidityTestingResultsTables-508_5.3.4" attachment in section 7.
Based on the correlations of the items with the two composites, the scaling success rate was 80%, with all but one of the items on the Care of Patients composite having a higher correlation with their own composite (Table 4). Using Cohen’s effect size criterion, a correlation of 0.50 or higher is considered a medium-sized effect. Correlations between the Care of Patients composite and the global items exceeded this criterion, with values ranging from 0.58 to 0.64 based on respondent-level data, supporting the validity of the composite (Table 5).
Using Cohen’s effect size criterion, a correlation of 0.20 is considered a small-sized effect while a correlation of 0.50 is considered a medium-sized effect. The individual items had small to medium sized correlations with the global items, suggesting acceptable validity: Q3 (r=0.39 to 0.57), Q4 (r=0.35 to 0.41), and Q5 (r=0.33 to 0.46).
- Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Hillside, NJ: Lawrence Erlbaum Associates
- Hu, L. T., & Bentler, P. M. (1999). Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Structural equation modeling: a multidisciplinary journal, 6(1), 1-55.
- MacCallum, R.C., Browne, M.W., & Sugawara, H.M. (1996). Power analysis and determination of sample size for covariance structure modeling. Psychological Methods, 1, 130-149.
- Schumacker, R. E., and Lomax, R. G. (2004). A beginner's guide to structural equation modeling, Second edition. Mahwah, NJ: Lawrence Erlbaum Associates.
Construct validity was assessed at the agency level by computing Pearson correlations of the Care of Patients composite with the Communications between Providers and Patients composite and with each of the two global measures (Overall Rating of Care from the Agency and Would Recommend the Agency to Family or Friends).
See Tables 3 and 5 in Reliability_Validity_Questions_CareOfPatients-508_5.2.2 and 5.3.4 file in section 7.
The two-factor confirmatory factor model fit well, supporting the structural validity of the composites (Table 3). The fit indices met the criteria for good fit, with CFI and TLI values exceeding 0.95 (Schumacker & Lomax, 2004; Hu & Bentler, 1999) and RMSEA less than 0.05 (MacCallum et al., 1996). Using Cohen’s effect size criterion, a correlation of 0.50 or higher is considered a medium-sized effect. Correlations between the Care of Patients composite and the global items exceeded this criterion, with values ranging from 0.58 to 0.64 at the respondent level and 0.66 to 0.67 at the agency level, supporting the validity of the composite (Table 5).
- Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Hillside, NJ: Lawrence Erlbaum Associates
- Hu, L. T., & Bentler, P. M. (1999). Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Structural equation modeling: a multidisciplinary journal, 6(1), 1-55.
- MacCallum, R.C., Browne, M.W., & Sugawara, H.M. (1996). Power analysis and determination of sample size for covariance structure modeling. Psychological Methods, 1, 130-149.
- Schumacker, R. E., and Lomax, R. G. (2004). A beginner's guide to structural equation modeling, Second edition. Mahwah, NJ: Lawrence Erlbaum Associates.
Risk Adjustment
The conceptual model was designed to first test case-mix adjustment using standard characteristics that have been found to potentially cause patients to respond differently to surveys. These include demographic measures of interest (age, gender, race/ethnicity, education), Medicare/Medicaid status, social vulnerabilities (living alone, self-reported mental health, self-reported physical health, functional status, diagnoses), source of admission, and survey characteristics (mode, proxy, language). This set resembles the set used in the initial field test for HHCAHPS prior to national implementation and the sets used in subsequent field tests and mode experiments. Ultimately, mode, race/ethnicity, and source of admission did not make it into the final set of HHCAHPS patient-mix adjusters; however, the language in which the survey was completed did. All measures were included in initial regression testing, and case-mix factors were removed based on (1) the significance of the factor and (2) the consistency of the direction of the factor, as described in 5.4.4.
See 5 attachments in Section 7: TableFacilityType_5.4.3, TableOwnership_5.4.3, TableRegion_5.4.3, TableSize_5.4.3, and TableUrbanicity_5.4.3.
OLS regressions were run using all potential risk factors for which we had data. Levels of the potential risk factors were coded as 0/1 indicators and included with HHA-level fixed effects to control for HHA-level differences not related to patient care. Based on past experience, we identified a priori decision rules for removing independent variables from successive stepwise regression runs, based on the statistical significance of the effects and the consistency of their direction (a sketch of this setup appears after the group definitions below). We split the independent variables into three groups, based on whether any level of the potential risk factor was covered by the definition of the group.
Group 1 - At least three times statistically significant at p<0.05 and a “general” magnitude of more than 2 percentage points, “general” defined as being at this magnitude more often than not.
Group 2 - At least five times statistically significant and a “general” magnitude of less than 2 percentage points.
Group 3 - Two times statistically significant and a “general” magnitude of more than 2 percentage points OR four times statistically significant and a “general” magnitude of less than 2 percentage points
If any level of a potential adjustment variable from one of the regressions fell into a group above, the entire risk factor was considered to be part of that group. Groups 1 and 2 would be included in the future model, while Group 3 would be reviewed for possible inclusion. Any risk factor not falling into a group could be removed before proceeding to the next stepwise regression sequence. For consistency, we also retained risk factors in current use in national implementation for further analysis beyond the initial exclusion.
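As referenced above, the following is a minimal Python sketch of the regression setup under stated assumptions (0/1 indicator coding plus HHA fixed effects); it is not the production SAS code, the variable names, effect sizes, and data are hypothetical, and the significance testing used in the a priori decision rules is omitted for brevity.

```python
# Hedged sketch of the stepwise-regression setup described above: an item-level top-box
# outcome regressed on 0/1 indicator-coded patient characteristics plus home health
# agency (HHA) fixed effects. All names, effects, and data are hypothetical.

import numpy as np

rng = np.random.default_rng(2)
n = 2000
agency = rng.integers(0, 50, size=n)          # 50 hypothetical HHAs
age_75plus = rng.integers(0, 2, size=n)       # example 0/1 risk-factor indicator
proxy = rng.integers(0, 2, size=n)            # example 0/1 risk-factor indicator

# Hypothetical top-box outcome (1 = "Always") with small patient-mix effects built in.
p = 0.75 - 0.05 * proxy + 0.03 * age_75plus + 0.002 * (agency - 25)
y = rng.binomial(1, np.clip(p, 0, 1))

# Design matrix: one dummy per HHA (fixed effects, no global intercept) plus risk factors.
agency_dummies = (agency[:, None] == np.arange(50)[None, :]).astype(float)
X = np.column_stack([agency_dummies, age_75plus, proxy])

beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(f"Estimated effect of age 75+:          {beta[-2]:+.3f}")
print(f"Estimated effect of proxy completion: {beta[-1]:+.3f}")
```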
The first regression model eliminated the diagnosis groups while indicating issues with several other potential risk factors. To be prudent, the second model eliminated only the majority of the large number of diagnoses, keeping the two currently used in national implementation, to ensure that their inclusion was not influencing the statistical significance of the remaining risk factors.
Results from the second regression model led to the removal of the two remaining diagnosis groups and five other potential risk factors. The third regression model confirmed that the remaining risk factors met our a priori decision rules when applied to the model.
After three stepwise regressions, the results were shared with CMS and the model was finalized to include survey mode, survey language, age, proxy status, education level, self-reported overall health, self-reported mental health, and an indicator of whether or not the patient lives alone.
The models were tested to ensure that these adjusters would affect the final results without themselves creating differences in comparisons of patient care beyond what the measures would have shown before adjustment. Regression models of the HHCAHPS top-box scores were estimated applying the general form of the equation for calculating adjusted scores, and adjusted HHCAHPS scores based on these adjusters were calculated.
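The exact adjustment equation is given in the attached documentation; as a hedged illustration of one common form of such an adjustment, the sketch below computes an agency’s adjusted score as its raw top-box score minus the sum of each coefficient times the difference between the agency’s and the reference patient-mix means. All coefficients and means are hypothetical.

```python
# Hedged sketch of one common form of a patient-mix adjustment equation (the official
# equation is in the attached "Steps_Calculate_HHCAHPS_Composites" documentation):
# adjusted = raw - sum_k beta_k * (agency_mean_k - reference_mean_k).
# Coefficients (in percentage points) and patient-mix means below are hypothetical.

coefficients    = {"proxy": -5.0, "age_75plus": 3.0, "lives_alone": -1.5}
agency_means    = {"proxy": 0.30, "age_75plus": 0.55, "lives_alone": 0.25}
reference_means = {"proxy": 0.20, "age_75plus": 0.60, "lives_alone": 0.30}

raw_score = 84.0  # agency's unadjusted composite top-box score (%)

adjustment = sum(coefficients[k] * (agency_means[k] - reference_means[k]) for k in coefficients)
adjusted_score = raw_score - adjustment
print(f"Adjusted composite score: {adjusted_score:.1f}%")
```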
To attempt to remove bias from the decision of inclusion or exclusion from the final model, we made our decisions adhere to the a priori groupings as often as possible. The a priori groupings have shown us from past experience which variables are impactful and should be included in our risk adjustment.
The survey mode variable had all of its levels fall into an inclusion group, and the telephone mode showed a substantial number of significant results with a consistently negative direction, so it was included. The proxy variable showed a large enough number of significant results to be included.
The age levels showed a large enough number of significant results to be included, with the oldest age group reaching statistical significance quite often. The age levels younger and older than the reference age both had negative coefficients, confirming that categorical age levels, rather than a linear age term, were most appropriate.
Education showed a large enough number of significant results to be included. The higher education levels were often statistically significant. While the lower education levels were not significant, we cannot include some levels of a variable and not others, so the variable merits inclusion for adjustment.
The live alone variable had the highest number of significant results of any risk factor tested and was consistent in its direction. It should be included.
Both self-reported overall and mental health variables had enough significant results to include.
Sex, the number of activities of daily living (ADL) deficits, admission source, payer type, and the 22 tested diagnosis code groupings did not show enough significance to merit inclusion.
The survey language variable did not show significance. However, the field test survey could only be funded in two languages, whereas the national survey has been and will continue to be offered in several more languages. Because this risk factor could not be properly tested against how it will be used, it was determined that it should be included in the final model out of an abundance of caution.
The final model, as noted in item 4.4.4, presents a group of risk factors that have been shown to affect the observed results in a statistically significant way. This model will be used to create a series of static (survey mode) and quarterly updated (all other risk factors) adjustments to apply to the measures in the survey, including the individual items that make up this composite measure. These adjustments transform the measure results so that results from HHAs can be compared on the basis of patient care alone.
Use & Usability
Use
Home health agencies
Usability
HHAs’ scores will increase as they target the aspects of care covered by these measures, which improves patients’ perception of their home health experience. Each HHA may have different training mechanisms or internal procedures that can help improve the overall quality of the services it delivers.
HHAs should target the domains of care covered by the instrument (e.g., show courtesy and respect, listen carefully, and explain things in a way that is easy to understand) in order to improve their scores on the proposed measures. Examples of how agencies can implement performance improvement include staff training, checklists of areas to cover during initial and follow-up appointments, and being aware of potential challenges (e.g., language barriers, physical or mental limitations that the patient may have).
No feedback obtained, as the revised instrument has not been implemented beyond the field test stage. With respect to the current instrument that is in implementation, feedback cited in 6.2.3 led to the revisions we are proposing.
The proposed multi-item measure (modified from the current Care of Patients measure used in national implementation) will potentially be one of seven HHCAHPS publicly reported measures that CMS chooses to use. It will potentially be included in the HHCAHPS star ratings methodology and the Home Health Value-Based Purchasing algorithm.
The existing measure has been reported since 2012 and is part of the Home Health Value Based Purchasing program. The HHCAHPS team does not solicit feedback on measure performance other than for targeted instrument development activities, such as that described in 6.2.3.
The revised survey reflects patient experiences with their home health agency across a variety of domains that were identified as important to patients and stakeholders based on focus groups and cognitive interviews conducted during the questionnaire revision development phase.
Qualitative research began in 2016 with literature reviews, focus groups, and telephone interviews with people receiving home health care and proxy respondents (e.g., family members) to understand home health patients’ experiences and what characteristics constitute high-quality home health care. In 2018, RTI conducted key informant interviews with home health agency representatives, hospital discharge planners and Technical Expert Panel (TEP) members to gain their feedback on potential new survey items based on findings from the focus groups and telephone interviews.
In a 2018 article*, RTI reported on a qualitative analysis and conceptual mapping of patient experiences in home health care conducted to support potential revisions to the HHCAHPS Survey instrument. RTI conducted a literature review and identified 104 domains reflecting home health patient experience. We then used conceptual mind-mapping software to arrange the domains in related clusters. The findings from the literature review were used to inform focus group and interviewer moderator guides. We conducted focus groups and interviews with patients to assess the level of importance that home health care patients place on specific components of care. Many of the domains identified in the literature review were highlighted by participants in the focus groups, including domains on the personal characteristics of the caregiver and those relating to technical competency, availability of needed care, and amount of care/time spent with the patient. We used the conceptual mapping and focus group and interview results to develop additional survey questions for testing.
The current Care of Patients measure is publicly reported on CMS’s website and used in Home Health Value-Based Purchasing. We tested a shortened version of the HHCAHPS Survey, which resulted in the removal of one item from the current multi-item measure and the inclusion of two new items.
Citation:
*Lines, L. M., Anderson, W. L., Blackmon, B. D., Pronier, C. R., Allen, R. W., & Kenyon, A. E. (2018). Qualitative analysis and conceptual mapping of patient experiences in home health care. Home Health Care Services Quarterly, 37(1), 25-40.
This measure reflects patient experiences with their home health agency across a variety of domains that were identified as important to patients and stakeholders based on focus groups and cognitive interviews during the questionnaire revision development phase. See item 6.2.2 and 6.2.3 above.
This measure has not been implemented beyond the field test stage. However, since 2021 (with the current instrument), we have seen a modest improvement in the currently implemented measure from 88 to 89.
The revised survey has not been implemented beyond the field test phase. No unexpected findings were observed during the field test. Respondents appreciated the shorter survey.
Similar to other quality measures, this measure may lead to an emphasis on the aspects of patient experience that are specifically named (e.g., when a patient is moved around or a bandage is changed) over those that are not. However, because these aspects of patient experience have been deemed important by patients, caregivers, and provider stakeholders, we see the adverse consequences of such an emphasis as minimal. The proposed measure will increase transparency of the home health patient care experience to the public.
Public Comments
Public Comment