Saturday, August 21, 2004

Pollsters Suck

When it comes to polls, people generally pay attention to little besides the numbers. But it's equally important - in fact, I would say more important - to know the exact questions being asked. I just received a computerized poll asking me questions such as:

"Do you agree with the NRA and their strong support for gun owners' rights?"

"Do you believe that our taxes should be not raised and, if possible, cut?"

The first one is obviously constructed in order to distort the numbers in favor of the NRA. I would wager that most people believe that some restrictions on the free flow of firearms are appropriate, but who the hell is going to say they don't support the rights of gun owners? The use of vague idealistic terms instead of specific policy proposals tilts the question.

Also, who the heck is going to answer no to the second question (besides me when I'm messing with pollsters)? "Yes, I'd like to pay more in taxes, please." A better question would be "Would you be willing to pay higher taxes to fund (insert specific government program here)?" That would actually tell us something. A poll reporting that 80% of Americans support lower taxes is not news. Watch the pollsters turn this into support for Bush's tax cuts.

There are more subtle ways of manufacturing opinion. Take, for example, this question from the same poll:

"Do you agree that marriage between one man and one woman should be the only legally binding marriage in America?"

On the surface, this is not a biased question - and in a paper poll, it wouldn't be. But it seems to me that the use of the word "agree" in posing the question verbally is meant to create a knee-jerk reaction in most people. People don't like to disagree unless the option is given to them, so a lot of people would automatically say "yes, I agree" without giving much thought to the question. Even I hesitated, and I disagree very strongly with the statement.

In fact, there are even more subtle ways to influence poll responses. Tone of voice can be very important in the posing of a question. The previous question placed an emphasis on both "one"s rather than remaining monotone throughout. As a result, it came off as more admonition than question. Even the most innocently phrased questions - for example, "If the election were held today, would you vote for John Kerry, George Bush, other, or don't know (responses rotated)?" - can be biased if the tone of the question changes. Imagine that question if "John Kerry" were said in an excited, happy voice while "George Bush" was said in a disaffected, sad-sounding voice. It becomes less "who would you vote for" and more "do you like happy?"

Moral of the story: watch out for the poll numbers. They're not always the whole story. You should also pay attention to the questions asked and to who's conducting the poll. (Mine was conducted by something called "Helping Hand International." They don't show up in a Google search unless they're a Kentucky-based adoption agency, which I doubt. My guess is that they're a Republican-leaning political group.)
