Agency Information Collection Activities; Submission to the Office of Management and Budget (OMB) for Review and Approval; Comment Request; American Community Survey Methods Panel Tests

86 Fed. Reg. 8756 (Feb. 9, 2021)

AGENCY:

Census Bureau, Commerce.

ACTION:

Notice of information collection, request for comment.

SUMMARY:

The Department of Commerce, in accordance with the Paperwork Reduction Act (PRA) of 1995, invites the general public and other Federal agencies to comment on proposed and continuing information collections, which helps us assess the impact of our information collection requirements and minimize the public's reporting burden. The purpose of this notice is to allow for 60 days of public comment on the proposed revision of the American Community Survey Methods Panel Tests prior to the submission of the information collection request (ICR) to OMB for approval.

DATES:

To ensure consideration, comments regarding this proposed information collection must be received on or before April 12, 2021.

ADDRESSES:

Interested persons are invited to submit written comments by email to acso.pra@census.gov. Please reference American Community Survey Methods Panel Tests in the subject line of your comments. You may also submit comments, identified by Docket Number USBC-2021-0002, to the Federal e-Rulemaking Portal: http://www.regulations.gov. All comments received are part of the public record. No comments will be posted to http://www.regulations.gov for public viewing until after the comment period has closed. Comments will generally be posted without change. All Personally Identifiable Information (for example, name and address) voluntarily submitted by the commenter may be publicly accessible. Do not submit Confidential Business Information or otherwise sensitive or protected information. You may submit attachments to electronic comments in Microsoft Word, Excel, or Adobe PDF file formats.

FOR FURTHER INFORMATION CONTACT:

Requests for additional information or specific questions related to collection activities should be directed to G. Brian Wilson, U.S. Census Bureau, American Community Survey Office, 301-763-2819, George.Brian.Wilson@census.gov.

SUPPLEMENTARY INFORMATION:

I. Abstract

The American Community Survey (ACS) is an ongoing monthly survey that collects detailed housing and socioeconomic data from about 3.5 million addresses in the United States and about 36,000 addresses in Puerto Rico each year. The ACS also collects detailed socioeconomic data from about 195,000 residents living in group quarters (GQ) facilities in the United States and Puerto Rico. Resulting tabulations from this data collection are provided on a yearly basis. The ACS allows the Census Bureau to provide timely and relevant housing and socioeconomic statistics, even for low levels of geography.

An ongoing data collection effort with an annual sample of this magnitude requires that the ACS continue research, testing, and evaluations aimed at improving data quality, reducing data collection costs, and improving the ACS questionnaire content and related data collection materials. The ACS Methods Panel is a research program designed to address and respond to survey issues and needs. As part of the Decennial Census Program, the ACS also provides an opportunity to research and test elements of survey data collection that relate to the decennial census. As such, the ACS Methods Panel can serve as a testbed for the decennial census. From 2021 to 2024, the ACS Methods Panel may test ACS and decennial census methods for reducing survey cost, addressing respondent burden, and improving survey response, data quality, and survey efficiencies. The ACS Methods Panel may also address other emerging needs of the programs.

At this time, plans are in place to propose several tests related to self-response, group quarters, and nonresponse follow up data collection operations. Tests may also be conducted to explore the use of administrative records. Because the ACS Methods Panel is designed to address emerging issues, we may propose additional testing as needed. Any testing would focus on methods for reducing data collection costs, improving data quality, improving the respondent experience, revising content, or testing new questions that need to be included in the Decennial Census Program. The proposed tests are outlined below.

Self-Response Mail Messaging and Contact Strategies Testing: In response to declining response rates and increasing costs, the Census Bureau plans to study methods to increase self-response, the least expensive mode of data collection. The Census Bureau currently sends up to five mailings to a sampled address to inform the occupants that their address has been selected to participate in the ACS and to encourage them to self-respond to the ACS. The proposed tests would evaluate changes to the mailings, such as using plain language to improve communication, changing the look and feel of the materials, updating messages to motivate response, and adding or removing materials included in the mailings. Changes to the contact method, the number of contacts, and the timing of the contacts may also be tested. Multiple tests may be conducted.

Respondent Feedback Pilot Test: Currently, ACS participants who want to give feedback on their survey experience must call, email, or send letters directly to the Census Bureau. The Census Bureau is considering adding a method for participants to provide feedback at the end of the survey. Because collecting respondent feedback of this nature is new to the ACS, we conducted cognitive testing to inform recommendations on its wording and display. Additionally, we wanted to learn about participants' perceptions of having this type of item on the ACS generally, including whether they understood its purpose. Cognitive testing resulted in recommended wording for the feedback question and indicated that respondents understood its purpose (Katz, forthcoming). The Census Bureau is now ready to pilot a feedback question to establish a way to process the comments and to evaluate the types of comments received. Respondents will not be required to answer the feedback question.

Testing the Use of Administrative Data: The Census Bureau has made significant progress exploring the use of administrative data in surveys and censuses, potentially as a substitute for questions asked of respondents. Administrative data refer to data collected by government agencies and other sources for the purposes of administering programs or providing services. The Census Bureau has evaluated the availability and suitability of several different data sources for use in the ACS to replace or supplement questions pertaining to telephone service, the year a residence was built, condominium status, income, residence one year ago, and self-employment income. We are currently exploring administrative data use to replace or supplement questions pertaining to property values, property taxes, and acreage. Similarly, we plan to evaluate the availability and suitability of using administrative records in lieu of enumeration for institutional GQs (U.S. Census Bureau, 2017).

Administrative data may also be used to reduce the burden of existing questions by allowing them to be modified. For example, the ACS asks respondents to provide their total income for the past 12 months as well as income received from various sources (wages, interest, retirement income, etc.). By supplementing income data collection with administrative records, we may be able to modify the questions to ask only about the source of income rather than the amount. We are cognitively testing this change, as well as changing the reference period from the past 12 months to the previous calendar year to align with administrative records sources. As a continuation of this research, the Census Bureau proposes a field test of revised content for income and other topics, for both the housing unit questionnaire and the GQ questionnaire. Some questions may be modified while others would be removed. Multiple tests may be conducted.

Group Quarters Testing: The ACS samples about 19,000 GQ facilities each year. A GQ is a place where people live or stay in a group living arrangement that is owned or managed by an entity or organization providing housing and/or services for the residents. There are two categories of GQs: Institutional and noninstitutional. Institutional GQs include places such as correctional facilities and nursing homes. Noninstitutional GQs include college housing, military barracks, and residential treatment centers. Most interviews conducted in GQs are interviewer-administered (94 percent of interviews in institutional GQs and 75 percent in noninstitutional GQs), but some GQ respondents self-respond using a paper questionnaire. The Census Scientific Advisory Committee Working Group on Group Quarters in the ACS recommended that the Census Bureau consider making “an internet version of the ACS available to noninstitutional GQ residents, especially in college dorms, military barracks, and group homes.” Additional support for this proposal was identified in a 2016 workshop held with the National Academies of Sciences, Engineering, and Medicine Committee on National Statistics (National Academies of Sciences, Engineering, and Medicine, 2016). The Census Bureau proposes a field test of an internet ACS self-response GQ form for residents in noninstitutional GQs. We would evaluate the quality of the data received from the internet instrument compared with traditional data collection methods for GQs (paper questionnaires and interviewer-administered interviews), as well as assess operational issues with offering the internet option, including feedback from interviewers.

Content Testing: Working through the Office of Management and Budget Interagency Committee for the ACS, the Census Bureau solicited proposals from other Federal agencies to change existing questions or add new questions to the ACS. These proposals included changes to the following questions: Household roster, educational attainment, health insurance, disability, means of transportation to work, income, weeks worked, Supplemental Nutrition Assistance Program (SNAP), condominium fees, and home heating fuel. Additionally, three new questions on solar panels, electric vehicles, and sewage disposal were proposed. The objective of content testing is to determine the impact of changing question wording and response categories, as well as redefining underlying constructs, on the quality of the data collected. The Census Bureau proposes evaluating changes to current questions by comparing the revised questions to the current ACS questions. For new questions, the Census Bureau proposes comparing the performance of two versions of each new question and benchmarking the results against other well-known sources of such information. The questions would be tested using all modes of data collection. Response bias or variance may also be measured by conducting a follow-up interview with respondents. Multiple tests may be conducted.

Additional content testing may include a shift in the content collection strategy for the fifth person in the household on the paper questionnaire. To reduce respondent burden for large households that self-respond using the paper questionnaire, and to potentially increase self-response by reducing the size of the paper questionnaire, one testing proposal would no longer collect detailed data for Person 5 on the paper questionnaire (i.e., the same items collected for Persons 1 through 4) and would instead collect only basic demographic information (as is currently done for Persons 6 through 12). Detailed person information for households with five or more people would be collected through a telephone follow-up, similar to what is currently done for households with six or more people.

Internet Instrument Testing: In 2013, the ACS incorporated the use of an internet instrument to collect survey responses. The design of the instrument reflected the research and standards of survey data collection at that time. With a growing share of the population using the internet to respond to the ACS, as well as the increased use of smartphones and other electronic devices with smaller screens, an evaluation and redesign of the internet instrument is needed. Design elements will be developed and tested based on input from experts in survey methodology and web survey design. Testing may include revisions focused on improving login procedures and screen navigation, improving the user interface design, and reducing respondent burden. Multiple tests may be conducted.

Respondent Help Testing: If respondents need help completing the ACS or have questions, they can call the Telephone Questionnaire Assistance (TQA) toll-free hotline. When respondents call the TQA, they enter an Interactive Voice Response (IVR) system, which provides some basic information about the ACS and recorded answers to frequently asked questions. Callers can also request to speak directly to a Census Bureau employee. The Census Bureau is proposing potential testing of changes to the IVR system to improve its content and efficiency. Other methods of offering help to respondents may also be explored and tested, such as the use of chatbots and live online chat assistance.

Nonresponse Follow up Data Collection Testing: The Census Bureau is proposing to evaluate the use of adaptive survey design techniques for the ACS nonresponse follow up operation (typically referred to as the ACS CAPI operation). Models and rules would be developed to predict case outcomes and determine interventions for a case, such as assigning a case to a refusal specialist. The models and rules would also prioritize cases based on the likelihood of completing an interview. The adaptive approach would be evaluated by comparing results to traditional methods of case assignment and progress.
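To make the adaptive approach concrete, the following is a minimal illustrative sketch in Python of how predicted completion likelihoods and simple rules might be used to prioritize nonresponse follow up cases and route refusals to a refusal specialist. The field names, thresholds, and scoring shown here are assumptions made only for illustration; they are not the Census Bureau's production models or rules.

```python
# Hypothetical sketch of adaptive case prioritization for nonresponse follow-up.
# Thresholds, fields, and the precomputed completion likelihood are illustrative
# assumptions, not the Census Bureau's production models or rules.
from dataclasses import dataclass

@dataclass
class Case:
    case_id: str
    completion_likelihood: float  # model-predicted probability of a completed interview
    prior_refusal: bool           # whether an earlier contact attempt ended in refusal

def assign_intervention(case: Case) -> str:
    """Apply simple rules to choose an intervention for a case."""
    if case.prior_refusal:
        return "refusal_specialist"   # route refusals to a specialist interviewer
    if case.completion_likelihood < 0.2:
        return "hold"                 # deprioritize cases unlikely to complete
    return "standard_follow_up"

def prioritize(cases):
    """Order cases so those most likely to complete an interview are worked first."""
    return sorted(cases, key=lambda c: c.completion_likelihood, reverse=True)

workload = [
    Case("A1", 0.85, False),
    Case("B2", 0.10, False),
    Case("C3", 0.40, True),
]
for c in prioritize(workload):
    print(c.case_id, assign_intervention(c))
```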

II. Method of Collection

The American Community Survey is collected via the following modes: internet, paper questionnaire, telephone interview, and in-person interview (CAPI). The Census Bureau sends up to five mailings to eligible housing units to encourage self-response. Respondents may receive help through an IVR system (though responses cannot be submitted via the IVR). Respondents can also call the TQA line for help or to respond. Interviewers may visit a housing unit or sampled GQ facility to conduct an interview in person or may conduct the interview by phone. Administrative records are also used to replace, supplement, and support data collection. ACS Methods Panel Tests use all of these modes of data collection or a subset of the modes, depending on the purpose of the test. Specific modes for the tests are noted below.

Self-Response Mail Messaging and Contact Strategies Testing evaluates mailout materials that solicit self-response via the internet, paper questionnaire, and telephone. Tests will include housing units only.

The Respondent Feedback Pilot would be conducted with self-response modes of data collection. The pilot would include housing units only.

The Use of Administrative Data Test would test replacing or supplementing all or parts of the ACS with administrative data. Respondents could be presented with a new version of the ACS with some questions not asked and others modified, as compared to the production ACS. Evidence suggests that the response mode respondents choose (internet, paper questionnaire, or interviewer-administered) is correlated with different socioeconomic characteristics. Therefore, this test will include all modes of data collection. This test would include respondents in both housing units and GQs.

Group Quarters Testing would test the addition of an internet self-response GQ form for residents in noninstitutional GQs. A sample of GQ respondents will be given the option of completing the survey via self-response using an internet instrument. Other residents will continue to be interviewed by paper questionnaire or interviewer-administered modes.

Content Testing for item-level changes will be conducted as a split-sample experiment, with half of the sampled addresses receiving one version of the questions and the other half receiving a different version of the questions. All modes of ACS data collection are included in the test. Additionally, a follow-up reinterview may be conducted with all responding households to measure response bias or response variance. Comparisons will be made between the treatments to assess data quality. Changes to the person-level content collection strategy will also be conducted as a split-panel experiment involving only the paper questionnaire; comparisons will be made between treatments to assess response and data quality.
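As a rough illustration of the split-sample design described above, the short Python sketch below randomly assigns sampled addresses to one of two question versions with equal probability. The address identifiers, random seed, and group labels are assumptions made only for this example, not an actual ACS sampling procedure.

```python
# Illustrative split-sample assignment: each sampled address is randomly assigned
# to one of two treatment groups of equal size. Identifiers, seed, and labels are
# assumptions for the example, not an actual ACS sampling procedure.
import random

def assign_split_sample(addresses, seed=2021):
    """Randomly split the sampled addresses into two equal treatment groups."""
    rng = random.Random(seed)
    shuffled = list(addresses)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return {"version_A": shuffled[:half], "version_B": shuffled[half:]}

groups = assign_split_sample(f"ADDR{i:06d}" for i in range(140_000))
print(len(groups["version_A"]), len(groups["version_B"]))  # 70000 70000
```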

Internet Instrument Testing will assess modifications to the internet instrument via split-sample experiments. Only the internet mode of self-response is included in the testing.

Respondent Help Testing of modifications to respondent help may include telephone and internet modes. These tests focus on evaluating mechanisms to answer questions and provide help to respondents.

Nonresponse Follow up Data Collection Testing will focus on in-person and telephone interviews conducted by Census Bureau field representatives (FRs). As part of their interaction with respondents, FRs also encourage response online. Respondents may also mail back a paper questionnaire they received during the self-response phase of the ACS.

III. Data

OMB Control Number: 0607-0936.

Form Number(s): ACS-1, ACS-1(GQ), ACS-1(PR)SP, ACS CAPI(HU), and ACS RI(HU).

Type of Review: Regular submission, Request for a Revision of a Currently Approved Collection.

Affected Public: Individuals or households.

Estimated Number of Respondents:

Test | Estimated number of respondents
Self-Response Mail Messaging and Contact Strategies Testing | Test A: 60,000; Test B: 60,000; Test C: 60,000; Test D: 60,000; Test E: 60,000; Test F: 60,000
Respondent Feedback Pilot | 100,000
Use of Administrative Data Test | 100,000
Group Quarters Testing | 500
Content Testing | Test A: 70,000; Test B: 70,000
Content Testing Follow-up Interview | Test A: 40,000; Test B: 40,000
Internet Instrument Testing | Test A: 60,000; Test B: 60,000; Test C: 60,000; Test D: 60,000; Test E: 60,000; Test F: 60,000
Respondent Help Testing | 40,000
Nonresponse Follow up Data Collection Testing | 100,000

Estimated Time per Response:

Test | Estimated time per response (in minutes)
Self-Response Mail Messaging and Contact Strategies Testing | 40
Respondent Feedback Pilot | 42 (40 minutes for the production ACS interview and 2 minutes for the optional follow-up questions)
Use of Administrative Data Test | 40
Group Quarters Testing | 40 (including the facility interview)
Content Testing | 40
Content Testing Follow-up Interview | 40
Internet Instrument Testing | 40
Respondent Help Testing | 10
Nonresponse Follow up Data Collection Testing | 40

Estimated Total Annual Burden Hours:

Test | Estimated number of respondents | Estimated time per response (in minutes) | Total burden hours
Self-Response Mail Messaging and Contact Strategies Testing | Test A: 60,000; Test B: 60,000; Test C: 60,000; Test D: 60,000; Test E: 60,000; Test F: 60,000 | 40 | Test A: 40,000; Test B: 40,000; Test C: 40,000; Test D: 40,000; Test E: 40,000; Test F: 40,000
Respondent Feedback Pilot | 100,000 | 42 | 70,000
Use of Administrative Data Test | 100,000 | 40 | 66,667
Group Quarters Testing | 500 | 40 | 334
Content Testing | Test A: 70,000; Test B: 70,000 | 40 | Test A: 46,667; Test B: 46,667
Content Testing Follow-up Interview | Test A: 40,000; Test B: 40,000 | 40 | Test A: 26,667; Test B: 26,667
Internet Instrument Testing | Test A: 60,000; Test B: 60,000; Test C: 60,000; Test D: 60,000; Test E: 60,000; Test F: 60,000 | 40 | Test A: 40,000; Test B: 40,000; Test C: 40,000; Test D: 40,000; Test E: 40,000; Test F: 40,000
Respondent Help Testing | 40,000 | 10 | 6,667
Nonresponse Follow up Data Collection Testing | 100,000 | 40 | 66,667
Total (over 3 years) * | 1,280,500 | | 837,003
Annual | 426,834 | | 279,001
* Note: This is the maximum burden requested for these tests. Every effort is made to use the existing production sample for testing when the tests do not involve content changes.
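For reference, the burden-hour figures above follow directly from the number of respondents multiplied by the minutes per response and divided by 60, rounded up to the next whole hour. The short Python check below is only a sketch, using values taken from the tables above, and reproduces several of the rows.

```python
# Check of the burden-hour arithmetic: respondents x minutes per response / 60,
# rounded up to the next whole hour. Values are taken from the tables above.
import math

tests = {
    "Respondent Feedback Pilot": (100_000, 42),
    "Use of Administrative Data Test": (100_000, 40),
    "Group Quarters Testing": (500, 40),
    "Respondent Help Testing": (40_000, 10),
}

for name, (respondents, minutes) in tests.items():
    hours = math.ceil(respondents * minutes / 60)
    print(f"{name}: {hours:,} burden hours")
# Respondent Feedback Pilot: 70,000 burden hours
# Use of Administrative Data Test: 66,667 burden hours
# Group Quarters Testing: 334 burden hours
# Respondent Help Testing: 6,667 burden hours
```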

Estimated Total Annual Cost to Public: $0. (This is not the cost of respondents' time, but the indirect costs respondents may incur for such things as purchases of specialized software or hardware needed to report, or expenditures for accounting or records maintenance services required specifically by the collection.)

Respondent's Obligation: Mandatory.

Legal Authority: Title 13 U.S.C. Sections 141, 193, and 221.

IV. Request for Comments

We are soliciting public comments to permit the Department/Bureau to: (a) Evaluate whether the proposed information collection is necessary for the proper functions of the Department, including whether the information will have practical utility; (b) Evaluate the accuracy of our estimate of the time and cost burden for this proposed collection, including the validity of the methodology and assumptions used; (c) Evaluate ways to enhance the quality, utility, and clarity of the information to be collected; and (d) Minimize the reporting burden on those who are to respond, including the use of automated collection techniques or other forms of information technology.

Comments that you submit in response to this notice are a matter of public record. We will include, or summarize, each comment in our request to OMB to approve this ICR. Before including your address, phone number, email address, or other personal identifying information in your comment, you should be aware that your entire comment—including your personal identifying information—may be made publicly available at any time. While you may ask us in your comment to withhold your personal identifying information from public review, we cannot guarantee that we will be able to do so.

Sheleen Dumas,

Department PRA Clearance Officer, Office of the Chief Information Officer, Commerce Department.

[FR Doc. 2021-02606 Filed 2-8-21; 8:45 am]

BILLING CODE 3510-07-P