Bias detected in pre-election polls


The new US Common Core in mathematics provides that students be able to justify the conclusions made from sample surveys (“Recognize the purposes of and differences among sample surveys, experiments, and observational studies; explain how randomization relates to each”), and there’s nothing like a US presidential election to give American high school teachers lots of data points to use in their lesson plans about randomization, sampling methods, and bias.

I would recommend starting with this post by Nate Silver of the New York Times. It comes from the now-famous FiveThirtyEight (538.com) blog Mr Silver started before the 2008 presidential election; he has now successfully called the winner to within a few electoral votes in two out of two presidential elections.

In his analysis, Mr Silver writes about bias, some of which he identified using the national polling averages from the three weeks immediately preceding the Nov. 6 election.


Some possible sources of bias include the methods polling organizations used to determine whether someone was a "likely" or merely a "registered" voter. In particular, Gallup, which had the largest error between its final poll and the actual vote, asked respondents a series of screening questions and may have biased its results by failing to classify as likely voters people who ended up voting.
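A short simulation can make this concrete for students. This is a minimal sketch with made-up numbers, not Gallup's actual methodology: it assumes, purely for illustration, that a likely-voter screen wrongly drops 20 percent of one candidate's actual voters but only 10 percent of the other's, and shows how that asymmetry shifts the poll away from the real result.

```python
import random

random.seed(1)

# Hypothetical electorate of people who actually voted:
# 55% backed candidate A, 45% backed candidate B.
actual_voters = ["A"] * 55_000 + ["B"] * 45_000

# Assumed (illustrative) screen pass rates: the screen keeps only
# 80% of A's actual voters but 90% of B's.
p_pass_screen = {"A": 0.80, "B": 0.90}

screened = [v for v in actual_voters if random.random() < p_pass_screen[v]]

true_a = actual_voters.count("A") / len(actual_voters)
poll_a = screened.count("A") / len(screened)

print(f"Actual vote share for A:    {true_a:.3f}")
print(f"Likely-voter poll estimate: {poll_a:.3f}")
```

Even though every person in the simulation votes, the asymmetric screen alone pulls the poll estimate for A several points below the actual share.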

I can personally identify three problems with the polls in my case:

  • I have a telephone number for Illinois but live and vote in Maryland (I just never changed the number after I moved). If a robocall simply assumed I voted where my area code placed me on the map, the results would be biased.
  • Second, I didn’t even answer several calls that originated from a number that reverse look-up said was Gallup. I didn’t answer, because I didn’t want to pay for minutes on my cellphone so someone could conduct a poll, especially if it was a push poll.
  • Third, I did pick up one and only one phone call from a polling organization this election season. I was so frustrated with having to disregard polling calls that I lied about my vote for president.

Another possible source of bias is the methodology used to select people. Polling cannot eliminate bias entirely, so I would recommend incorporating other contributors to bias into lesson plans, such as the following:

Person doesn’t answer. When a polling organization gets no answer at a number randomly selected by its sampling algorithm, the right thing to do is not to abandon the number. Gallup didn’t abandon my number but tried calling at different times of day; for example, if I didn’t answer during a weekday, they tried calling on a weekend night. That sounds sensible, but the fact that I, and not the sampling algorithm, decided who would and wouldn’t be included in the polling results introduced bias.
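Teachers could also let students simulate this non-response bias directly. The sketch below uses invented answer rates, assuming, for illustration only, that supporters of one candidate are slightly less likely to pick up the phone; the poll then overstates the other candidate even though the dialed numbers were chosen at random.

```python
import random

random.seed(42)

# Hypothetical electorate: 52% support candidate A, 48% candidate B.
population = ["A"] * 52_000 + ["B"] * 48_000

# Assumed (illustrative) answer rates: B supporters answer less often.
p_answer = {"A": 0.10, "B": 0.07}

respondents = [v for v in population if random.random() < p_answer[v]]

true_share = population.count("A") / len(population)
polled_share = respondents.count("A") / len(respondents)

print(f"True support for A:   {true_share:.3f}")
print(f"Polled support for A: {polled_share:.3f}")
```

The random-digit dialing here is unbiased; it is the respondents' own decision to answer or not, correlated with their preference, that skews the estimate.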

Person answering refers the call to another party. Another problem people have noticed with phone polls is that sometimes a child answers. The child then selects which adult in the household responds to the call, so it’s not the randomized sampling algorithm that determines who is sampled but a child.

Cellphones are important. Many American voters—probably about 25 percent of them—don’t have landlines; they rely strictly on cellphones. This population skews younger, so polls that call only landlines tend to introduce a bias against young voters.
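This coverage bias is also easy to demonstrate in class. The numbers below are assumptions for illustration, not survey data: suppose young voters are much more likely to be cellphone-only and also lean more toward one candidate. A landline-only sample then underrepresents the young and understates that candidate.

```python
import random

random.seed(0)

# Illustrative assumptions:
#   - 35% of the electorate is "young" (under 35)
#   - young voters are 50% cellphone-only; older voters 10%
#   - young voters favor candidate A at 60%; older voters at 45%
N = 100_000
voters = []
for _ in range(N):
    young = random.random() < 0.35
    cell_only = random.random() < (0.50 if young else 0.10)
    prefers_a = random.random() < (0.60 if young else 0.45)
    voters.append((young, cell_only, prefers_a))

true_a = sum(p for _, _, p in voters) / N

# A landline-only poll never reaches cellphone-only voters.
landline = [v for v in voters if not v[1]]
landline_a = sum(p for _, _, p in landline) / len(landline)

print(f"True support for A:          {true_a:.3f}")
print(f"Landline-only poll estimate: {landline_a:.3f}")
```

Students can vary the cellphone-only rates to see the bias grow or shrink, which makes the abstract idea of coverage error tangible.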

Paul Katula (https://news.schoolsdo.org)
Paul Katula is the executive editor of the Voxitatis Research Foundation, which publishes this blog. For more information, see the About page.
