OK, I admit polling stories are not exactly the stuff of Journalism 101 – yet survey stories are a dime a dozen.

By that measure, you would think reporters would be skilled in the fine print of survey research methodology. Unfortunately, most are not, and many rely on the sponsors of these surveys for analysis and interpretation.

Some surveys are scientifically conducted – we can rely on their findings with a high degree of confidence. Others are little more than “push polls,” nudging us toward a pre-packaged storyline.

The chasm between the two is as wide as the difference between liberals and neo-conservatives.

So as the air temperature drops in Jersey and campaigns across the state heat up, we should brace ourselves to be inundated with polling stories and the accompanying public relations spin. To help sort out the good from the bad, this column will keep track of the coverage.

The focus will be on method not content.

Case in point: recently we reported on a new statewide survey of voters’ attitudes on New Jersey’s newly enacted civil union law. The poll was commissioned by a group with the moniker Garden State Equality.

Although our story accurately portrayed the results, our readers have no way of knowing whether the findings themselves are sound. What we do know is who paid for and sponsored the poll.

Disclosure is essential in this category.

But after that, it gets murky. Gaps in the information raise questions about how the sample households were selected and weighted, as well as measurement validity.

Question ordering potentially introduced bias into the results. On three occasions, the survey tells respondents about problems with implementation of the law, then asks voters how they feel about fixing the law by giving gay couples the same right to marry as heterosexual couples. Nearly two-thirds (63%) responded they were “fine with that.” Is anyone surprised?

No doubt a survey story raises technical questions that are difficult to field and answer – but we and others in the media must ask the basics before running with a story.

The New York Times public editor Clark Hoyt invoked in his column on Sunday the old newsroom maxim: If your mother tells you she loves you, check it out.

Same with poll results.

The American Association for Public Opinion Research (AAPOR) publishes a great primer for journalists covering poll stories. It’s also a great consumer guide for anyone who wants to find out if they have just been informed or duped.