Agency Information Collection Activities; Submission for Office of Management and Budget Review; Comment Request; National Direct-to-Consumer Advertising Survey

81 Fed. Reg. 51450 (Aug. 4, 2016)

AGENCY:

Food and Drug Administration, HHS.

ACTION:

Notice.

SUMMARY:

The Food and Drug Administration (FDA) is announcing that a proposed collection of information has been submitted to the Office of Management and Budget (OMB) for review and clearance under the Paperwork Reduction Act of 1995.

DATES:

Fax written comments on the collection of information by September 6, 2016.

ADDRESSES:

To ensure that comments on the information collection are received, OMB recommends that written comments be faxed to the Office of Information and Regulatory Affairs, OMB, Attn: FDA Desk Officer, FAX: 202-395-7285, or emailed to oira_submission@omb.eop.gov. All comments should be identified with the OMB control number 0910-NEW and title "National Direct-to-Consumer Advertising Survey." Also include the FDA docket number found in brackets in the heading of this document.

FOR FURTHER INFORMATION CONTACT:

FDA PRA Staff, Office of Operations, Food and Drug Administration, Three White Flint North 10A63, 11601 Landsdown St., North Bethesda, MD 20852, PRAStaff@fda.hhs.gov.

SUPPLEMENTARY INFORMATION:

In compliance with 44 U.S.C. 3507, FDA has submitted the following proposed collection of information to OMB for review and clearance.

National Direct-to-Consumer Advertising Survey—OMB Control Number 0910-NEW

I. Background

Section 1701(a)(4) of the Public Health Service Act (42 U.S.C. 300u(a)(4)) authorizes FDA to conduct research relating to health information. Section 1003(d)(2)(C) of the Federal Food, Drug, and Cosmetic Act (the FD&C Act) (21 U.S.C. 393(d)(2)(C)) authorizes FDA to conduct research relating to drugs and other FDA-regulated products in carrying out the provisions of the FD&C Act.

FDA last surveyed patients about their experiences with and attitudes toward direct-to-consumer (DTC) advertising in 2002 (Ref. 1). Numerous changes have affected the DTC landscape since 2002, including declines in print readership, the rise of online prescription drug promotion, and self-imposed industry guidelines for DTC advertising (Ref. 2). These changes may have affected consumers' exposure to different kinds of DTC advertising and the influence of that advertising on their attitudes and behaviors. The purpose of the National Direct-to-Consumer Advertising Survey is to collect updated insights on consumer experiences with and attitudes toward DTC promotion of prescription drugs. This study will build on previous research by recruiting a wider range of respondents, weighting the data to make it nationally representative, and asking a wider range of questions about DTC promotion, including promotion in online formats.

We plan to use an address-based, mixed-mode methodology that will direct one randomly chosen member of each sampled household to complete a 20-minute online survey, with non-respondents receiving a paper questionnaire. The sample, drawn from the U.S. Postal Service Computerized Delivery Sequence File, will be representative of the U.S. population. Adults aged 18 or over will be eligible for participation. Up to five contacts will be sent to respondents by U.S. mail. The contacts will include the URL for the online survey and a unique personal identification number (PIN). This unique PIN will be used to track completed surveys without the use of personally identifying information. The contact method, based on recent recommendations (Ref. 3), includes a notification letter (Day 1), a reminder/thank-you postcard (Day 5), a second letter sent to non-responders (Day 12), a paper version of the survey mailed to non-responders (Day 19), and a reminder postcard sent to non-responders (Day 24).
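For illustration only, the sketch below (in Python; the data structures and names are assumptions made for this sketch, not part of the approved design) shows how the five-contact schedule and PIN-based tracking described above could be represented, with non-responder-only mailings filtered by whether a completed survey has been returned.

```python
from dataclasses import dataclass

# Illustrative model of the five-contact mail schedule described above.
# The days and contact types come from the notice; the class and field
# names are assumptions, not part of the survey design.

CONTACT_SCHEDULE = [
    (1,  "notification letter",         "all sampled households"),
    (5,  "reminder/thank-you postcard", "all sampled households"),
    (12, "second letter",               "non-responders only"),
    (19, "paper questionnaire",         "non-responders only"),
    (24, "reminder postcard",           "non-responders only"),
]

@dataclass
class SampledHousehold:
    pin: str                 # unique PIN printed in each mailing; no personally identifying information
    responded: bool = False  # set to True once a completed survey arrives with this PIN

def households_to_contact(day, sample):
    """Return the households that should receive the mailing scheduled for `day`."""
    for sched_day, _contact, audience in CONTACT_SCHEDULE:
        if sched_day == day:
            if audience == "non-responders only":
                return [h for h in sample if not h.responded]
            return list(sample)
    return []

# Example: a household that already responded online is skipped on Day 12.
sample = [SampledHousehold(pin=f"PIN{i:04d}") for i in range(3)]
sample[0].responded = True
print([h.pin for h in households_to_contact(12, sample)])  # ['PIN0001', 'PIN0002']
```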

Based on previous research (Refs. 4, 5, and 6), we plan to recruit using two $1 bills ($2 total per sampled respondent) mailed in advance with the initial invitation letter as a gesture to encourage response and maintain data quality. Offering a small token of value to respondents establishes a latent social contract and subsequent reciprocity (Ref. 3). In the second contact attempt, we will conduct an experiment to test whether a short statement mentioning the previously paid incentive increases survey response, thereby testing whether social exchange can be extended past the initial contact attempt. Half the sample will be provided language that reminds them they received a cash incentive in the previous letter; the remaining half will be reminded they received a letter but will not be specifically reminded about the incentive.
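As a minimal sketch of the second-contact experiment, assuming simple random assignment of sampled households to the two reminder texts (the 50/50 split is from the notice; the function and variable names are illustrative, not drawn from the study protocol):

```python
import random

def assign_reminder_condition(pins, seed=2016):
    """Split sampled households 50/50 between the two second-letter texts:
    one reminds respondents of the cash incentive already sent, the other
    only reminds them that a letter was sent."""
    rng = random.Random(seed)   # fixed seed so the assignment is reproducible
    shuffled = list(pins)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    assignment = {pin: "incentive reminder" for pin in shuffled[:half]}
    assignment.update({pin: "letter-only reminder" for pin in shuffled[half:]})
    return assignment

# Example: ten households, five per condition.
conditions = assign_reminder_condition([f"PIN{i:04d}" for i in range(10)])
print(sum(v == "incentive reminder" for v in conditions.values()))  # 5
```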

We estimate a 35 percent response rate, based on recent work on similar studies (Ref. 7). Prior to the main study, a pilot study will be conducted to test the data collection process. We estimate 35 respondents will complete the pilot study and 1,765 will complete the main study (see table 1).

The survey contains questions about respondents' knowledge of FDA's authority with respect to prescription drug advertising, their exposure to DTC advertising, their beliefs and attitudes about DTC advertising, and the influence of DTC advertising on further information search and patient-physician interactions. At the end of the survey, respondents will be randomly assigned to view one of two ads for fictional prescription drugs intended to treat high cholesterol. They will be asked questions about FDA's authority regarding specific claims within the ad. The survey will include a debriefing to inform respondents that the advertised drug was fictitious. We will also measure other potentially important characteristics such as demographics, insurance coverage, and prescription drug use. The survey is available upon request.

We will test for any differences between modes (online versus mail survey) and will account for any mode effects in our analyses. We will weight the data to account for differing probabilities of selection and for nonresponse. We will examine the frequencies for survey items and the relation between survey items and demographic and health characteristics. We also plan to compare responses between this survey and FDA's 2002 survey for repeated items.
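As a rough illustration of the weighting step, the sketch below computes base weights from selection probabilities and applies a simple weighting-class nonresponse adjustment. The notice does not specify the adjustment cells or any further calibration, so those details are assumptions.

```python
from collections import defaultdict

def weight_respondents(records):
    """records: dicts with 'selection_prob' (probability the address was sampled),
    'responded' (bool), and 'cell' (a nonresponse-adjustment class, e.g. a
    region or age grouping; the cells themselves are an assumption here)."""
    # Base (design) weight: inverse of the probability of selection.
    for r in records:
        r["base_weight"] = 1.0 / r["selection_prob"]

    # Weighting-class nonresponse adjustment: within each cell, redistribute
    # the weight of nonrespondents over the respondents in that cell.
    total_by_cell = defaultdict(float)
    resp_by_cell = defaultdict(float)
    for r in records:
        total_by_cell[r["cell"]] += r["base_weight"]
        if r["responded"]:
            resp_by_cell[r["cell"]] += r["base_weight"]

    respondents = []
    for r in records:
        if r["responded"]:
            r["final_weight"] = r["base_weight"] * total_by_cell[r["cell"]] / resp_by_cell[r["cell"]]
            respondents.append(r)
    return respondents

# Toy example: two adjustment cells with different response rates.
data = [
    {"selection_prob": 0.001, "responded": True,  "cell": "A"},
    {"selection_prob": 0.001, "responded": False, "cell": "A"},
    {"selection_prob": 0.002, "responded": True,  "cell": "B"},
]
for r in weight_respondents(data):
    print(r["cell"], round(r["final_weight"], 1))  # A 2000.0, B 500.0
```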

In the Federal Register of February 29, 2016 (81 FR 10257), FDA published a 60-day notice requesting public comment on the proposed collection of information. Nine comments were received. Five comments did not address any of the information collection topics solicited and therefore we do not discuss them in this document (four called for a ban on direct-to-consumer prescription drug advertising and one discussed FDA's response to public comments in general). No comments addressed Topic 2—accuracy of our estimate.

Topic 1—practical utility. One comment suggested that we increase the practical utility of the survey by (1) including teenagers 14 to 18 years of age and (2) skewing the survey to include a disproportionate number of Americans over 50 years of age. Another comment suggested we use a quota to ensure that limited-literacy respondents are included. One of our main goals is to survey a nationally representative sample of U.S. adults about their experiences with and attitudes toward DTC promotion of prescription drugs. Note that we have designed other studies that specifically examine adolescents' and older adults' responses to prescription drug advertising (FDA-2013-N-1151-0004, “Experimental Study of Direct-to-Consumer Promotion Directed at Adolescents”; FDA-2015-N-2163-000, “Hearing, Aging, and Direct-to-Consumer Television Advertisements”). We will measure health literacy within the survey.

One comment suggested that respondents should watch a prescription drug television ad and then answer questions about benefit and risk recall. Although this design is beyond what we can accomplish within a nationally representative survey, we have conducted studies that use this design (for examples, see http://www.fda.gov/AboutFDA/CentersOffices/OfficeofMedicalProductsandTobacco/CDER/ucm090276.htm ).

Topic 3—ways to enhance quality, clarity, utility. Four comments suggested changes to the survey to enhance its quality, clarity, and utility. Three comments suggested changing our terminology throughout the survey for clarity. As suggested, we changed “television” to “TV,” “advertisement” to “ad,” used “health care provider” throughout the survey, and specified that by Internet we mean Internet accessed by computer, phone, or tablet. We changed “small print” to “additional information.” We did not change “prescription drug” to “medicine.” Respondents in cognitive interviews understood the term “prescription drug,” and we are concerned that “medicine” is too broad. We also chose not to highlight or bold “prescription drug” as cognitive interview respondents understood the purpose of the survey and we do not want to overuse highlighting.

Also, two comments suggested deleting survey questions. Two comments questioned the utility of a series of questions about the safety and efficacy of certain products. We agree that these questions are not as central to the survey topic and have deleted them. The same comments also recommended deleting a series of questions about FDA approval of DTC promotion. These questions highlight specific claims within the ad to determine whether consumers believe that advertising in general, as well as specific claims, is approved by FDA. Therefore, we have chosen to keep these questions on the survey. One comment recommended deleting a question perceived to be too negative, whereas another comment recommended adding positive answer choices to balance the question; we chose the latter option.

In addition, four comments suggested additional topics for survey questions. In response, we added questions about whether prescription drug advertising has caused respondents to talk with their health care provider about symptoms or side effects they've experienced, or to look for information about a prescription drug they thought might be helpful for a friend or family member. We also added a question about respondents' primary language. Finally, we now ask whether respondents have seen prescription drug promotion on streaming services and whether they have looked for information on medical association Web sites.

One comment suggested adding places where consumers could see or hear advertisements (e.g., “on television at the doctor's office,” “in a pharmacy”) to a question that asks about the type of medium where they saw or heard an ad (e.g., “TV,” “print”). We chose not to take this suggestion because the question concerns medium, not location. We are also concerned about measurement error. For instance, some doctor's offices have magazines with DTC print ads, TVs playing broadcast television, or TVs playing videos. Such a question would also rely on respondents having visited a doctor's office or pharmacy in the last 3 months.

One comment suggested adding response options to a question about where consumers might obtain more information about prescription drugs. Because this question is focused on adequate provision in DTC television ads, we chose not to add any response options beyond those specific to adequate provision (i.e., branded Web site, manufacturer's toll-free number, print ad, and health care provider).

We note that the survey contains a series of questions about various new media, including social media, Web sites, and online videos. It also asks about respondents' attitudes about how benefits and risks are presented, whether they have seen information about the medical condition in TV ads, and whether they've looked for information on government Web sites. We chose not to ask whether they've looked for information on manufacturer Web sites because we don't want respondents to confuse it with the option, “a prescription drug Web site.”

Finally, three comments had suggestions for how we ask our questions. One comment recommended reducing the number of open-ended questions or eliminating them. The main survey has only two questions with an open-ended option (allowing respondents to specify another response). If pilot testing reveals potential closed-ended response options for these two questions, we will add them to the main survey. One comment suggested changing our scale for how we measure exposure to prescription drug promotion. We changed this scale from a qualitative frequency scale to a yes/no scale. Similarly, one comment asked us to consider how we measure how much of an ad respondents saw or read, because many variables may affect this. We have chosen not to change this scale but will consider this point when interpreting the data. One comment suggested that we randomize response order for the paper-based surveys. We already plan to create multiple versions of the paper-based survey to account for household sampling and viewing of the ad, so we are concerned that creating additional versions to account for response option randomization would be too complex for a survey of this scale. However, we agree that response option order is important to take into account when interpreting results.

Topic 4—ways to minimize burden. One comment suggested we conduct the survey with an online consumer survey panel to reduce time and costs and increase response rates. Although we agree that online survey panels can be an efficient way to collect data, this survey is designed to be nationally representative. Following OMB's advice, therefore, we will use the Internet as one mode of data collection but will not rely on an online survey panel for sampling ( https://www.whitehouse.gov/sites/default/files/omb/inforeg/pmc_survey_guidance_2006.pdf ).

FDA estimates the burden of this collection of information as follows:

Table 1—Estimated Annual Reporting Burden

Activity | Number of respondents | Number of responses per respondent | Total annual responses | Average burden per response (hours) | Total hours
Pilot Study
Survey invitation letter | 100 | 1 | 100 | .08 (5 min.) | 8
Reminder postcard | 100 | 1 | 100 | .03 (2 min.) | 3
Non-response letter | 82 | 1 | 82 | .08 (5 min.) | 7
Non-response questionnaire letter | 81 | 1 | 81 | .08 (5 min.) | 7
Second postcard | 60 | 1 | 60 | .03 (2 min.) | 2
Survey | 35 | 1 | 35 | .33 (20 min.) | 12
Main Study
Survey invitation letter | 5,042 | 1 | 5,042 | .08 (5 min.) | 403
Reminder postcard | 5,042 | 1 | 5,042 | .03 (2 min.) | 151
Non-response letter | 4,173 | 1 | 4,173 | .08 (5 min.) | 334
Non-response questionnaire letter | 4,073 | 1 | 4,073 | .08 (5 min.) | 326
Second postcard | 3,063 | 1 | 3,063 | .03 (2 min.) | 92
Survey | 1,765 | 1 | 1,765 | .33 (20 min.) | 582
Total | | | | | 1,927
There are no capital costs or operating and maintenance costs associated with this collection of information.
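As a quick consistency check on Table 1, using only figures quoted in this notice (a sketch, not part of the official burden calculation), the total-hours column sums to 1,927, and the 1,765 estimated main-study completes match the 35 percent response rate applied to 5,042 invitations:

```python
# Total-hours column of Table 1 (pilot study rows, then main study rows),
# copied from the table above.
pilot_hours = [8, 3, 7, 7, 2, 12]
main_hours = [403, 151, 334, 326, 92, 582]
print(sum(pilot_hours) + sum(main_hours))  # 1927, the reported total

# The estimated 35 percent response rate is consistent with the sample size:
invited, completes = 5042, 1765
print(round(invited * 0.35))               # 1765 expected main-study completes
print(round(100 * completes / invited))    # 35 (percent)
```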

II. References

The following references are on display in the Division of Dockets Management (HFA-305), Food and Drug Administration, 5630 Fishers Lane, Rm. 1061, Rockville, MD 20852 and are available for viewing by interested persons between 9 a.m. and 4 p.m., Monday through Friday; they are also available electronically at http://www.regulations.gov. FDA has verified the Web site addresses, as of the date this document publishes in the Federal Register, but Web sites are subject to change over time.

1. Aikin, K.J., J.L. Swasy, and A.C. Braman, “Patient and Physician Attitudes and Behaviors Associated With DTC Promotion of Prescription Drugs—Summary of FDA Survey Research Results,” 2004. ( http://www.fda.gov/downloads/Drugs/ScienceResearch/ResearchAreas/DrugMarketingAdvertisingandCommunicationsResearch/ucm152860.pdf ).

2. PhRMA Guiding Principles: Direct-to-Consumer Advertisements About Prescription Medicines, 2008. ( http://phrma.org/sites/default/files/pdf/phrmaguidingprinciplesdec08final.pdf ).

3. Dillman, D.A., J.D. Smyth, and L.M. Christian, Internet, Phone, Mail, and Mixed-Mode Surveys: The Tailored Design Method, 4th ed. Hoboken, NJ: John Wiley & Sons, Inc., 2014.

4. American Association for Public Opinion Research, “Address-based Sampling,” 2016. ( http://www.aapor.org/AAPOR_Main/media/MainSiteFiles/AAPOR_Report_1_7_16_CLEAN-COPY-FINAL.pdf ).

5. Millar, M.M. and D.A. Dillman, “Improving Response to Web and Mixed-Mode Surveys,” Public Opinion Quarterly, 1-21, 2011.

6. Shaw, M.J., T.J. Beebe, H.L. Jensen, and S.A. Adlis, “The Use of Monetary Incentives in a Community Survey: Impact on Response Rates, Data Quality, and Cost,” Health Services Research, 35:1339-1346, 2011.

7. Montaquila, J.M., J.M. Brick, D. Williams, K. Kim, et al., “A Study of Two-Phase Mail Survey Data Collection Methods,” Journal of Survey Statistics and Methodology, 1(1):66-87, 2013.

Dated: July 29, 2016.

Leslie Kux,

Associate Commissioner for Policy.

[FR Doc. 2016-18425 Filed 8-3-16; 8:45 am]

BILLING CODE 4164-01-P