Free Republic

The challenges of polling when fewer people are available to be polled
Pew Research Center | July 21, 2015 | Drew DeSilver and Scott Keeter

Posted on 10/25/2016 11:40:40 AM PDT by Quicksilver

Around the world, pollsters have had some high-profile flops lately. In both the U.K. and Israel, pre-election polls earlier this year predicted much tighter races than actually occurred. Last year, Scots voted against independence by a wider-than-expected margin. In the U.S., many pollsters underestimated last year’s Republican midterm wave, and some observers have suggested that polls simply aren’t appropriate tools for studying certain subjects, such as religion.

Cliff Zukin, past president of the American Association for Public Opinion Research and a Rutgers University political science professor, wrote recently that “two trends are driving the increasing unreliability of election and other polling in the United States: the growth of cellphones and the decline in people willing to answer surveys.”

Despite those challenges, social scientists, market researchers, political operatives and others still rely on polls to find out what people are thinking, feeling and doing. But with response rates low and heading lower, how can survey researchers have confidence in their findings? Scott Keeter, director of survey research at Pew Research Center, addresses this and related questions below.

Do low response rates in and of themselves make a poll unreliable?

The short answer here is “no.” The potential for what pollsters call “nonresponse bias” – the unwelcome situation in which the people we’re not reaching are somehow systematically different from the people we are reaching, thus biasing our poll results – certainly is greater when response rates are low. But the mere existence of low response rates doesn’t tell us anything about whether or not nonresponse bias exists. In fact, numerous studies, including our own, have found that the response rate in and of itself is not a good measure of survey quality, and that thus far, nonresponse bias is a manageable problem.
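
To make the idea concrete, here is a toy illustration of nonresponse bias; the group shares, approval rates and response propensities below are invented for the example, not Pew’s figures.

```python
# Hypothetical illustration of nonresponse bias (all numbers invented).
# The population is 50% group A / 50% group B; 60% of A and 40% of B approve of
# some policy, so true approval is 50%. If A answers surveys at 10% and B at only
# 5%, completed interviews over-represent A and the raw estimate drifts upward.

pop_share = {"A": 0.5, "B": 0.5}      # true population composition
approval  = {"A": 0.6, "B": 0.4}      # true approval within each group
resp_rate = {"A": 0.10, "B": 0.05}    # hypothetical response propensities

# Share of completed interviews that comes from each group
total_resp   = sum(pop_share[g] * resp_rate[g] for g in pop_share)
sample_share = {g: pop_share[g] * resp_rate[g] / total_resp for g in pop_share}

true_approval = sum(pop_share[g] * approval[g] for g in pop_share)
raw_estimate  = sum(sample_share[g] * approval[g] for g in sample_share)

print(f"true approval:       {true_approval:.1%}")   # 50.0%
print(f"unweighted estimate: {raw_estimate:.1%}")    # about 53.3%
```

Weighting the two groups back to their known population shares would remove this particular bias, which is the kind of correction discussed next.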

For example, our 2012 study of nonresponse showed that despite declining response rates, telephone surveys that include landlines and cellphones and are weighted to match the demographic composition of the population (part of standard best practices) continue to provide accurate data on most political, social and economic measures. We documented this by comparing our telephone survey results to various government statistics that are gathered with surveys that have very high response rates. We also used information from two national databases that provide information about everyone in our sample – both respondents and non-respondents – to show that there were relatively small differences between people we interviewed and those we were unable to interview.

But it’s important to note that surveys like ours do have some biases. Better-educated people tend to be more available and willing to do surveys than are those with less education. Nonwhites are somewhat underrepresented. People who are interested in politics are more likely to take surveys that have to do with politics. But most of these biases can be corrected through demographic weighting of the sort that is nearly universally used by pollsters.
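
As a minimal sketch of that kind of demographic weighting, the example below post-stratifies on a single education variable; the population benchmark, sample shares and answer rates are hypothetical, and Pew’s actual weighting balances several demographics at once.

```python
# Minimal post-stratification sketch (hypothetical shares, not Pew's figures).
# If college graduates are 40% of completed interviews but only 30% of the adult
# population, each graduate gets weight 0.30 / 0.40 = 0.75 and each non-graduate
# gets 0.70 / 0.60 ≈ 1.17, so the weighted sample matches the population.

population_share = {"college_grad": 0.30, "not_college_grad": 0.70}  # assumed benchmark
sample_share     = {"college_grad": 0.40, "not_college_grad": 0.60}  # assumed sample mix

weights = {g: population_share[g] / sample_share[g] for g in population_share}

# Apply the weights to an estimate: suppose 55% of graduates and 45% of
# non-graduates in the sample answer "yes" to some question.
yes_rate = {"college_grad": 0.55, "not_college_grad": 0.45}

unweighted = sum(sample_share[g] * yes_rate[g] for g in yes_rate)
weighted   = sum(sample_share[g] * weights[g] * yes_rate[g] for g in yes_rate)

print(f"weights: {weights}")
print(f"unweighted estimate: {unweighted:.1%}   weighted estimate: {weighted:.1%}")
```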

Are some kinds of biases harder to correct than others?

While weighting helps correct the overrepresentation of voters and the politically engaged, it does not eliminate it. This makes it especially important to have accurate ways of determining who is likely to vote in elections, a problem that all political pollsters grapple with.
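
Pollsters typically tackle the likely-voter problem with some form of screening or scoring of respondents. The sketch below is a hypothetical cutoff-style index, not Pew’s actual model; the question names, point values and cutoff are all assumptions made for illustration.

```python
# Hypothetical cutoff-style likely-voter screen (not Pew's model). Respondents
# earn points for stated intention, past voting and campaign interest; only
# those at or above a cutoff are kept when producing "likely voter" estimates.

def likely_voter_score(resp):
    """Score a respondent dict with boolean keys (hypothetical question names)."""
    return (2 * resp["plans_to_vote"]
            + 1 * resp["voted_last_election"]
            + 1 * resp["follows_campaign_closely"])

respondents = [
    {"plans_to_vote": True,  "voted_last_election": True,  "follows_campaign_closely": True},
    {"plans_to_vote": True,  "voted_last_election": False, "follows_campaign_closely": False},
    {"plans_to_vote": False, "voted_last_election": False, "follows_campaign_closely": True},
]

CUTOFF = 3  # assumed threshold; in practice it would be tuned to expected turnout
likely = [r for r in respondents if likely_voter_score(r) >= CUTOFF]
print(f"{len(likely)} of {len(respondents)} respondents classified as likely voters")
```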

The one other source of nonresponse bias that seems to persist after we apply demographic weighting is the tendency of survey participants to be significantly more engaged in civic activity than those who do not participate. People who participate in volunteer activities are more likely to agree to take part in surveys than those who don’t. This might lead us to overestimate things like the proportion of U.S. adults who contact elected officials, work with other people to solve community problems, or attend religious services on a weekly basis (though even in surveys with very high response rates, Americans report church-attendance rates that appear to substantially exceed actual attendance). Because of this, we try to be especially cautious in interpreting data about volunteer activity and related concepts. But fortunately, this characteristic of survey participants is not strongly related to most other things we study.

Survey response rates have been falling for many years. Why has this become of particular concern now?

One reason there’s greater public awareness of falling response rates is that we and other researchers have been closely tracking the decline, constantly monitoring for impact and talking publicly about the issue. Our 2012 study of nonresponse documented the downward trend; at that time, we reported that the average response rate in 2012 was 9%, a figure that’s been widely cited since. There’s also been more discussion lately because of faulty election polls in the U.S. in 2014 and in Britain and Israel this year.

It’s important to keep in mind that even if there is more public discussion about the nonresponse issue now, it’s not a new concern among survey researchers. Scholars were noting the declines in response rates 25 years ago. We conducted our first major study of the impact of survey nonresponse in 1997, when our telephone response rates were 36%.
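
For readers unfamiliar with the metric, a response rate is, in simplified form, completed interviews divided by eligible sampled numbers; the formal AAPOR definitions add adjustments for cases of unknown eligibility. The sketch below uses invented counts that roughly correspond to the 36% and 9% figures quoted here.

```python
# Simplified response-rate arithmetic with invented counts (the official AAPOR
# formulas also apportion sampled numbers whose eligibility is unknown).

def response_rate(completes, eligible_sampled):
    return completes / eligible_sampled

print(f"1997-style: {response_rate(360, 1000):.0%}")  # 36%: roughly 1 complete per 3 eligible numbers
print(f"2012-style: {response_rate(90, 1000):.0%}")   # 9%: roughly 1 complete per 11 eligible numbers
```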

Do we know why fewer people are willing to respond to surveys than in years past?

The downward trend in response rates is driven by several factors. People are harder to contact for a survey now than in the past. That’s a consequence of busier lives and greater mobility, but also technology that makes it easier for people to ignore phone calls coming from unknown telephone numbers. The rising rate of outright refusals is likely driven by growing concerns about privacy and confidentiality, as well as perceptions that surveys are burdensome.

Does Pew Research Center see the same pattern of low/declining response rates in other countries?

Yes indeed. Nonresponse to surveys is growing in many wealthy nations, and for most of the same reasons it’s increasing here in the U.S.

Are low response rates the reason, or at least a big reason, why so many pollsters around the world seem to have missed the mark recently in their pre-election polls?

It’s not at all clear that nonresponse bias is to blame for the recent troubles with election polls, though that’s one possible source of the errors. Equally important may be the methods used to determine who is a likely voter, or how to deal with voters who tell pollsters that they are undecided in the race. The British Polling Council commissioned a review of the polls in the 2015 general election, following the failure of most polls there to forecast the Conservative victory. That review has not yet been completed.

How do response rates compare between calls to a landline phone and calls to a cellphone?

We are obtaining nearly identical response rates on landline phones and cellphones. However, it takes considerably more interviewer time to get a completed interview on a cellphone than on a landline phone, because cellphone numbers have to be dialed manually to conform to federal law. In addition, many cellphones are answered by minors, who are ineligible for the vast majority of our surveys. And unlike a landline, a cellphone is treated as a personal device: we do not attempt to interview anyone other than the person who answers.

In general, how does Pew Research Center attempt to overcome the challenges posed by low response rates in its survey research?

Pew Research Center devotes considerable effort to ensuring that our surveys are representative of the general population. For individual surveys, this involves making numerous callbacks over several days in order to maximize the chances of reaching respondents and making sure that an appropriate share of our sample is interviewed on cellphones. We carefully weight our surveys to match the general population demographically.
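
Weighting “to match the general population demographically” across several characteristics at once is commonly done by raking (iterative proportional fitting). The sketch below rakes a tiny hypothetical sample to assumed sex and education targets; it shows only the mechanics, not Pew’s production procedure or benchmarks.

```python
# Minimal raking (iterative proportional fitting) sketch with hypothetical data.
# Every respondent starts at weight 1.0; weights are adjusted in turn for each
# variable until the weighted sample matches the target margins.

respondents = [
    {"sex": "F", "educ": "college"},
    {"sex": "F", "educ": "college"},
    {"sex": "F", "educ": "no_college"},
    {"sex": "M", "educ": "college"},
    {"sex": "M", "educ": "no_college"},
]

targets = {                                   # assumed population margins
    "sex":  {"F": 0.52, "M": 0.48},
    "educ": {"college": 0.30, "no_college": 0.70},
}

weights = [1.0] * len(respondents)

for _ in range(50):                           # iterate until the margins settle
    for var, shares in targets.items():
        total = sum(weights)
        for category, target_share in shares.items():
            current = sum(w for w, r in zip(weights, respondents) if r[var] == category)
            factor = target_share * total / current
            weights = [w * factor if r[var] == category else w
                       for w, r in zip(weights, respondents)]

# Check that the weighted sample now matches the targets on both margins.
total = sum(weights)
for var, shares in targets.items():
    achieved = {c: round(sum(w for w, r in zip(weights, respondents) if r[var] == c) / total, 3)
                for c in shares}
    print(var, "target:", shares, "achieved:", achieved)
```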

Perhaps most importantly, Pew Research Center’s team of methodologists is engaged in ongoing research into improving our existing survey techniques while also looking at alternative ways of measuring the attitudes and behaviors of the public. As society continues to change and technology evolves, the future of social research is likely to involve some combination of surveys and other forms of data collection that don’t involve interviews. In the meantime, we continue to apply the best survey practices we can and endeavor to be as transparent as possible about the quality of our data and how we produce them.


TOPICS: Extended News; Front Page News; News/Current Events; Politics/Elections
KEYWORDS: polls
Reference:

Pew Research Center, “Assessing the Representativeness of Public Opinion Surveys,” May 15, 2012.
http://www.people-press.org/2012/05/15/assessing-the-representativeness-of-public-opinion-surveys/

1 posted on 10/25/2016 11:40:40 AM PDT by Quicksilver

To: Quicksilver

People like me are fed up with the polls so we either don’t answer, or answer in a nonresponsive manner. FU MSM and your push pollsters.


2 posted on 10/25/2016 11:44:45 AM PDT by Don Corleone (Oil the gun, eat the cannolis, take it to the mattress.)

To: Quicksilver
The challenges of polling when fewer people are available to be polled...

In other words...

The challenges of polling when too many people understand the agenda of push polling.

3 posted on 10/25/2016 11:47:09 AM PDT by C210N

To: Quicksilver

I do not pick up phone calls from anyone I don’t know. None.
If it is important, they will leave a message.

The phone poll people can thank the Robo Call plague for finding few working pickups.


4 posted on 10/25/2016 11:47:53 AM PDT by hadaclueonce (This time I am Deplorable)

To: Quicksilver

So, in 1997 out of 1000 calls or attempts to solicit a response, there were 139 responses.

In 2012 there were less than 8 per 1000.


5 posted on 10/25/2016 11:49:20 AM PDT by seowulf (Cogito cogito, ergo cogito sum. Cogito.---Ambrose Bierce)

To: Don Corleone

I couldn’t agree more. If I take a call from a pollster I answer as an Independent and I refuse to answer the demographics questions.


6 posted on 10/25/2016 11:50:10 AM PDT by Quicksilver (Trump / Pence 2016)

To: Don Corleone

Maybe we would treat them more responsibly if they did not abuse data, make up information, use push-pull polling. Maybe. Maybe. Maybe.

How about just a “reasonable standard” of honesty and some self-policing by the polling profession.

We and they could all sleep better at night.


7 posted on 10/25/2016 11:52:53 AM PDT by Rapscallion (The Clinton cabal like Obama intends to destroy the America you used to love.)

To: hadaclueonce

“The phone poll people can thank the Robo Call plague for finding few working pickups.”

A plague that is outlawed at the Federal and the State level for most states, but of course is not enforced because it’s not vital like enforcing bogus “social justice” Presidential Executive Orders and economy-destroying and freedom-destroying regulations.


8 posted on 10/25/2016 12:10:08 PM PDT by catnipman (Cat Nipman: Vote Republican in 2012 and only be called racist one more time!)

To: seowulf

“In 2012 there were less than 8 per 1000.”

And way worse now, four years later!


9 posted on 10/25/2016 12:10:40 PM PDT by catnipman (Cat Nipman: Vote Republican in 2012 and only be called racist one more time!)

To: Quicksilver

I don’t really believe that the type of person who answers and the type who doesn’t are the same. I think older people answer their phone more. Probably a little lonely.

I seldom answer my phone anymore. The continuous interruptions by sales people finally broke me of the habit.


10 posted on 10/25/2016 12:11:05 PM PDT by FR_addict (Ryan needs to go!)

To: Quicksilver

“If you look carefully you can see that the response rate dropped from 15% in 2009 to 9% in 2012. That means the current response rate in 2016 is about 5%, which is very good news. That means the pollsters I can control can make stuff up. Yes! Winning!”

Love, Hillary

“P.S. If the trend lines hold, no one will respond in 2020. I will then be able to lead any opponent 100% to 0%, even in Texas!”


11 posted on 10/25/2016 12:13:49 PM PDT by cgbg (This space for rent--$250K)

To: catnipman
A plague that is outlawed at the Federal and the State level for most states, but of course is not enforced

One would think the ROBO callers had a paid K Street Lobbyist.

12 posted on 10/25/2016 12:16:44 PM PDT by hadaclueonce (This time I am Deplorable)

To: Quicksilver

In other words...

They’re gonna just...

Make stuff up!!!


13 posted on 10/25/2016 12:16:52 PM PDT by CincyRichieRich (To liberals, lying is like breathing.)

To: CincyRichieRich

Yes, and then claim they have math on their side!


14 posted on 10/25/2016 12:22:10 PM PDT by fortheDeclaration (Pr 14:34 Righteousness exalteth a nation:but sin is a reproach to any people)

To: CincyRichieRich

Their solution is “weighting” but the problem with weighting is that if they knew how to do it with scientific accuracy then they wouldn’t need a poll in the first place.


15 posted on 10/25/2016 12:32:38 PM PDT by azcap (Who is John Galt ? www.conservativeshirts.com)

To: fortheDeclaration

It’s been a long-standing fact that 99.4213 percent of all statistics are made up on the spot.


16 posted on 10/25/2016 12:35:22 PM PDT by vrwconspiracist (The Tax Man cometh)

To: Don Corleone

Reasons for declining participation rates:
* too many marketing efforts presented as polls, so we turn it off
* the biased polls that we can tell are biased and decide not to participate
* the pollsters who hang up on you or end the interview when you don’t answer the right way
* the polls go on for a long time because they want detailed data to make up for the smaller data set


17 posted on 10/25/2016 12:47:59 PM PDT by tbw2

To: All

I can’t even remember the last time I was polled. Maybe 5-10 years ago, I think.

I guess if I had the time to speak with someone, I’d probably answer their questions.


18 posted on 10/25/2016 1:21:03 PM PDT by MplsSteve

To: Quicksilver

Michigan - for 2012 we got several poll calls about Romney/Obama. This time around we got one local Michigan poll back in September, other than that nothing.


19 posted on 10/25/2016 1:29:41 PM PDT by MomwithHope (Missing you /johnny (JRandomFreeper). Time to Pray, Prepare, and Participate.)

To: Quicksilver

“What, you voted for Obama in 2012? We’ll make you a Hillary leaning voter!”


20 posted on 10/25/2016 1:51:52 PM PDT by CatOwner



