Shifting Demographics Across Polls

After the Iowa caucuses, Pete Buttigieg soared into second place in New Hampshire according to polls conducted before the primary. Emerson College, working with 7News in Boston, released a daily tracking poll from the day before the Iowa caucus right up to the primary. Each poll used a likely-voter screen, asking roughly 500 voters how likely they were to vote and whom they planned to vote for, and each carried a margin of error of ±4.3 percentage points. Over that eight-day span, Buttigieg’s support in the tracking poll increased 10 percentage points, from 13% to 23%, a bounce large enough to fall outside the margin of error. Before Iowa, his support (with the margin of error applied) ranged from roughly 9% to 17%; in the final tracking poll (Day 8), it ranged from roughly 19% to 27%.

Like other reputable pollsters, Emerson College makes the questions asked and the sample demographics available for review. Without casting doubt on the poll or its conclusions, there are some interesting differences in the questions and samples across the series that show how pollsters can overweight or underweight demographic groups in the responses to arrive at a “true” result.
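For readers who want to check the arithmetic, here is a quick sketch of the standard margin-of-error formula at roughly 95% confidence, using the conservative p = 0.5. It reproduces the approximately ±4.3-point figure for a sample of 500 and shows that the Day 1 and Day 8 intervals don’t overlap. This is a generic illustration, not Emerson’s exact weighting-adjusted calculation.

```python
from math import sqrt

def margin_of_error(n, p=0.5, z=1.96):
    """Simple-random-sample margin of error at ~95% confidence.
    Uses the conservative p = 0.5; real polls also adjust for weighting."""
    return z * sqrt(p * (1 - p) / n)

n = 500
moe = margin_of_error(n)            # roughly 0.044, i.e. about +/-4.4 points
print(f"Margin of error for n={n}: +/-{moe:.1%}")

day1, day8 = 0.13, 0.23             # Buttigieg's share in the Day 1 and Day 8 polls
lo1, hi1 = day1 - moe, day1 + moe   # roughly 9%-17%
lo8, hi8 = day8 - moe, day8 + moe   # roughly 19%-27%
print(f"Day 1 interval: {lo1:.0%}-{hi1:.0%}, Day 8 interval: {lo8:.0%}-{hi8:.0%}")
print("Intervals overlap:", hi1 >= lo8)  # False -> the bounce exceeds the margin of error
```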

As with most polls, Emerson College was targeting likely voters. Who counts as a likely voter? In the first poll, a little over a week before the primary, roughly 95% of respondents self-identified as “very likely” to vote in the primary. That figure rose to a full 100% in the Day 8 poll, with every respondent saying they were “very likely” to vote. The people included in the two samples are clearly different, and the passage of time also shifts opinions. In the Day 1 result, the pollsters had to account for the group of respondents who identified only as “somewhat likely” to vote.
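To make the likely-voter screen concrete, here is a minimal, hypothetical sketch of one common approach: down-weighting respondents who say they are only “somewhat likely” to vote rather than dropping them outright. The weights and sample data are invented for illustration and do not reflect Emerson’s actual methodology.

```python
# Hypothetical likely-voter screen: weight respondents by their stated
# likelihood of voting instead of discarding the "somewhat likely" group.
LIKELY_VOTER_WEIGHTS = {
    "very likely": 1.0,
    "somewhat likely": 0.5,   # counted, but at half weight
    "unlikely": 0.0,          # screened out entirely
}

def weighted_share(respondents, candidate):
    """Share of the weighted electorate supporting `candidate`.
    `respondents` is a list of dicts with 'likelihood' and 'choice' keys."""
    total = sum(LIKELY_VOTER_WEIGHTS[r["likelihood"]] for r in respondents)
    support = sum(LIKELY_VOTER_WEIGHTS[r["likelihood"]]
                  for r in respondents if r["choice"] == candidate)
    return support / total

sample = [
    {"likelihood": "very likely", "choice": "Buttigieg"},
    {"likelihood": "very likely", "choice": "Sanders"},
    {"likelihood": "somewhat likely", "choice": "Buttigieg"},
    {"likelihood": "unlikely", "choice": "Sanders"},
]
print(f"Buttigieg: {weighted_share(sample, 'Buttigieg'):.0%}")  # 1.5 / 2.5 = 60%
```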

Another interesting difference between the Day 1 and Day 8 polls is the mode of data collection. In the first poll, 59% of respondents answered via landline and 41% answered online. In the Day 8 poll, 52% answered via landline and only 10% answered online; the remaining 38% answered on a mobile phone, a segment that didn’t exist in the first poll. Is the pool of respondents who answer a poll on a mobile phone meaningfully different from the pool that answers online? Respondents who answer online or on a mobile phone tend to be younger. Again, the pollsters would have to account for this when weighting the results (though the age brackets in the Day 1 and Day 8 polls don’t appear to differ significantly).
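As a rough illustration of how that accounting works, here is a minimal post-stratification sketch with made-up age targets and sample counts (nothing here reflects Emerson’s actual demographics or weighting scheme): responses from under-represented brackets are weighted up and over-represented brackets weighted down before the topline is computed.

```python
# Hypothetical post-stratification: re-weight responses so the sample's age
# mix matches assumed population targets. All numbers are illustrative only.
population_targets = {"18-34": 0.25, "35-64": 0.50, "65+": 0.25}
sample_counts      = {"18-34": 75,   "35-64": 225,  "65+": 200}   # n = 500

n = sum(sample_counts.values())
# Weight = target share / observed share for each age bracket.
weights = {age: population_targets[age] / (sample_counts[age] / n)
           for age in sample_counts}
print(weights)  # roughly {'18-34': 1.67, '35-64': 1.11, '65+': 0.63}

# Applying the weights: a candidate polling better with younger respondents
# gains ground in the topline once those respondents are weighted up.
support_by_age = {"18-34": 0.30, "35-64": 0.20, "65+": 0.15}
raw      = sum(sample_counts[a] / n * support_by_age[a] for a in sample_counts)
weighted = sum(population_targets[a] * support_by_age[a] for a in sample_counts)
print(f"Unweighted: {raw:.1%}  Weighted: {weighted:.1%}")
```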

Our Take

Despite the changes in how the poll was collected and in respondents’ stated likelihood of voting, the overwhelming trend was that Buttigieg saw increasing support after Iowa, support that carried into the actual primary, where he finished second to Bernie Sanders. As we discuss in our class, this demonstrates the power of statistics and pollsters’ ability to control for differences in samples by over- or under-weighting responses.

Introduction to Polling

A Guide to the 2020 Election Season, our introductory polling course, is designed to teach you a framework you can use to confidently follow the polls through this election season and beyond.

View Course
