How polling is done today...thanks NBC...

billc

Here is a look at a recent poll and how the sample was gathered.

http://hotair.com/archives/2012/07/24/whos-up-for-a-new-nbcwsj-poll-based-on-a-bad-sample/

Last month’s sample was questionable. This month’s sample is terrible:
[Image: NBC/WSJ poll party-ID sample breakdown]

The 2008 national exit poll sample, taken when Hopenchange fever was at its zenith, was 39D/32R/29I, or D+7. This one, after three years of Obamanomics dreck, is somehow D+11 if you include leaners and D+12(!) if you don’t. Anyone feel like taking these results seriously?
And yet we soldier on, my friends, reminding ourselves at every step that lame content is still content. One interesting takeaway: The attacks on Bain are driving Obama’s favorables down too, even with a very Democratic sample.
 
Here is a new poll and how it really breaks down...and why the mainstream media can no longer be trusted...


http://www.powerlineblog.com/archives/2012/08/about-that-cbsnytquinippiac-poll.php

I found the internals of the poll together with a full set of the results online, which you can view here in PDF. According to the poll:
In Florida, Democrats outnumber Republicans 36 to 27 = D+9, and these people polled voted in the last election for Mr. Obama by 53 to 40 = D+13.
Actual result in 2008: 51.2% to 47.2%, Obama winning +4.
Elected Republican senator in 2010.
In Ohio, Democrats outnumber Republicans 35 to 27 = D+8, and these people polled voted in the last election for Mr. Obama by 53 to 38 = D+15.
Actual result in 2008: 50.9% to 48.4%, Obama winning +2.5.
Elected Republican senator in 2010.
In Pennsylvania, Democrats outnumber Republicans 38 to 32 = D+6, and these people polled voted in the last election for Mr. Obama by 54 to 40 = D+14.
Actual result in 2008: 54.7% to 44.3%, Obama winning +10.4.
Elected Republican senator in 2010.
(Here are the 2008 results.) It appears that this poll determined whether you are a likely voter simply by asking the person polled whether he is likely to vote and only counting the opinions of those who say they are likely to do so.
The internals show that, if people told the truth about how they voted in 2008, the pollsters managed to find a population that voted for Mr. Obama more frequently than actually occurred. It’s certainly possible that, out of political correctness, some people say they voted for Mr. Obama in 2008 when they did not, but no way is it extensive enough to explain the discrepancy.

This, Sukerkin, is why some people don't trust the mainstream media here in the states.
 
The 'media' did not complete the survey research.

You appear to be referencing work completed by two survey research professionals: one is a visible, recognized
and respected Republican, Bill McInturff; the other is Peter Hart, who frequently works for Democrats.

Mr. McInturff is a partner and vice-president in Public Opinion Strategies, a well-recognized company which works for Republican candidates.

These two individuals jointly direct survey research work that is not skewed by partisan preferences. You may identify flaws in their methodological approach (though you've not done that), but Bill McInturff is not going to countenance intentional misrepresentation or purposeful methodological bias.

"Indeed, the percentages signaling a less favorable impression about these candidates – especially at this point in the race – are greater than what the NBC/WSJ poll showed in the 2004 and 2008 presidential contests. This is not characteristic … for July,” says Republican pollster Bill McInturff, who conducted this survey with Democratic pollster Peter D. Hart. “These are numbers you usually see in October.”

Bill McInturff is a partner and co-founder of Public Opinion Strategies, a national political and public affairs survey research firm. Since its founding in 1991, the firm has completed more than six million interviews with voters and consumers in all fifty states and over a dozen foreign countries, and conducted more than 4,500 focus groups. Called by The New York Times "the leading Republican polling company," Public Opinion Strategies currently represents 19 U.S. Senators, six governors, and over 70 Members of Congress.

His prior experiences include ‘hands on’ campaign management experience at the local, congressional, and the presidential level. He also held senior positions with the Republican national party committees prior to entering the field of survey research.






 

Awaiting your reply to the previous post.

And regarding this link: there is no discussion of the methodology of this survey in the article you've linked to.

However, within that article is a link to an article that contains a link to the PDF that contains material on the actual items
and the methodology used (brief but useful).

Your original commenter at 'hot air' does not address the methodology of the survey research in any way.

As I posted previously, the methodology and survey items were constructed and implemented under the direction of two survey research professionals, one of whom regularly conducts research for Republican candidates. This cogent point is not mentioned by 'hot air' or by you.

The 'hot air' commenter takes a further step, using exit poll data with no information regarding methodology
(exit polls are frequently unreliable) to contrast with a published telephone survey. Using these two for a one-to-one comparison
is a major error, and the comparison remains useless no matter what individual or group presents them as comparable, regardless of partisan affiliation.

Where is the problem with the sample selection in the actual survey? Is there any problem with methodology that
affects outcome reliability?

If the 'hot air' commenter can identify any methodological problems, why are those not presented to Bill McInturff for his response?

His reputation for professional integrity is on the line. His company's name is in the research. I believe it is unlikely that Republican candidates would continue to retain a consultant on survey research whose methodology is unreliable and/or flawed.

with respect,
 
I heard an interview with one of the guys who runs the polls and he tried to explain why the oversampling of democrats occurred and he wasn't convincing in the least.

Here is the interview with the guy...
Peter Brown, assistant director of the Quinnipiac Polls,

http://www.hughhewitt.com/transcripts.aspx?id=1c1a7295-7ce1-47e7-8074-4ce24952aceb

PB: So what’s important to understand is what we are doing is we’re asking voters what they consider themselves when we interview them, which was in the last week.
HH: Now what I don’t understand this, so educate me on it, if Democrats only had a three point advantage in Florida in the final turnout measurement in 2008, but in your poll they have a nine point turnout advantage, why is that not a source of skepticism for people?
PB: Well, I mean, clearly there will be some people who are skeptics. This is how we’ve always done our polls. Our record is very good in terms of accuracy. Again, remember, we’re asking people what they consider themselves at the time we call them.
HH: But I don’t know how that goes to the issue, Peter, so help me. I’m not being argumentative, I really want to know. Why would guys run a poll with nine percent more Democrats than Republicans when that percentage advantage, I mean, if you’re trying to tell people how the state is going to go, I don’t think this is particularly helpful, because you’ve oversampled Democrats, right?
PB: But we didn’t set out to oversample Democrats. We did our normal, random digit dial way of calling people. And there were, these are likely voters. They had to pass a screen. Because it’s a presidential year, it’s not a particularly heavy screen.
HH: And so if, in fact, you had gotten a hundred Democrats out of a hundred respondents that answered, would you think that poll was reliable?
PB: Probably not at 100 out of 100.
HH: Okay, so if it was 75 out of 100…
PB: Well, I mean…
HH: I mean, when does it become unreliable? You know you’ve just put your foot on the slope, so I’m going to push you down it. When does it become unreliable?
PB: Like the Supreme Court and pornography, you know it when you see it.
HH: Well, a lot of us look at a nine point advantage in Florida, and we say we know that to be the polling equivalent of pornography. Why am I wrong?
PB: Because what we found when we made the actual calls is this kind of party ID.
HH: Do you expect Democrats, this is a different question, do you, Peter Brown, expect Democrats to have a nine point registration advantage when the polls close on November 6th in Florida?
PB: Well, first, you don’t mean registration.
HH: I mean, yeah, turnout.
PB: Do I think…I think it is probably unlikely.
HH: And so what value is this poll if in fact it doesn’t weight for the turnout that’s going to be approximated?
PB: Well, you’ll have to judge that. I mean, you know, our record is very good. You know, we do independent polling. We use random digit dial. We use human beings to make our calls. We call cell phones as well as land lines. We follow the protocol that is the professional standard.
HH: As we say, that might be the case, but I don’t know it’s responsive to my question. My question is, should we trust this as an accurate predictor of what will happen? You’ve already told me there…
PB: It’s an accurate predictor of what would happen if the election were today.
HH: But that’s, again, I don’t believe that, because today, Democrats wouldn’t turn out by a nine point advantage. I don’t think anyone believes today, if you held the election today, do you think Democrats would turn out nine percentage points higher than Republicans?
 
Well, you can always direct questions to the commenter at Hot Air and see how he answers your question. It doesn't change how these polls are representing actual results from the last election versus what they are putting out today. The news is running with the polls and is not pointing out the methodology or the samples. Hearing the polls in the news gives a different impression as to what might really be happening. Of course, we won't know for sure till the election, but it wouldn't be the first time polls put the Republican candidate behind, only to have him win. The Ted Cruz election on Tuesday is a great example of this.
 
Here is a look at a poll used by CBS...

http://www.americanthinker.com/2012/08/does_nytcbs_poll_mean_shenanigans_for_the_sunshine_state.html

The survey purports to show that Obama has reached the suddenly magical number of 50% in Ohio, Florida, and Pennsylvania. The authors of the report all but fail to admit that they accomplish this by weighting the polling samples toward Democrats -- and heavily weight them, at least in the case of Florida.
Buried in the Quinnipiac College report on the poll is the actual number of people it queried in telephone surveys. Of the 1,177 people polled in Florida, the numbers by party affiliation break down like this:
Republican - 359 or 30.5%
Democrat - 373 or 31.6%
Independent - 393 or 33.4%
Other/Don't know/etc. - 52 or 4.4%
These are seemingly realistic numbers, representative of what a random sampling might produce.
However, once the pollsters "weighted" the numbers, this is what they came up with:
Republican - 27%
Democrat - 36%
Independent - 32%
Other/Don't know/etc. - 5%
Republicans' responses were decreased and Democrats' responses were increased to give Democrats a 9% overall margin. Independents were also decreased slightly.
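The raw-versus-weighted shares above imply a simple per-party scaling. Here is a minimal Python sketch (illustrative only; real survey weighting also adjusts for demographics such as age, sex, and region, and Quinnipiac's exact procedure isn't published in the article) that backs out the effective weight each party group would need to move from the raw sample to the published mix:

```python
# Infer the per-party scaling implied by the raw NYT/CBS Florida sample
# versus the published (weighted) party mix. Counts and target shares
# are taken from the article above.
raw_counts = {"Republican": 359, "Democrat": 373,
              "Independent": 393, "Other": 52}
weighted_share = {"Republican": 0.27, "Democrat": 0.36,
                  "Independent": 0.32, "Other": 0.05}

total = sum(raw_counts.values())  # 1,177 respondents

# weight = target share / raw share for each party group
weights = {party: weighted_share[party] / (raw_counts[party] / total)
           for party in raw_counts}

for party in sorted(weights):
    print(f"{party:11s} effective weight: {weights[party]:.2f}")
# Democrats come out scaled to roughly 1.14x their raw numbers,
# Republicans to roughly 0.89x.
```

In other words, each Democratic respondent's answers count about 14% more than raw, and each Republican respondent's about 11% less, which is exactly the shift the article is objecting to.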
How did this affect the overall result?
This correspondent's reading of the CBS/NYT survey produced no useful breakdown of Republicans' and Democrats' voting preferences for president. It could be assumed that the vast majority of each -- 90% or more -- would vote for their party's nominee. However, you know what they say about assuming.
Instead, we went to the last credible poll of Florida voters, conducted by Democrat-leaning Public Policy Polling. The survey of 871 likely voters, conducted July 26-29, appears to be unweighted and gave Obama a 1% overall lead. It found that 84% of Florida Democrats planned to vote for Obama and 83% of Republicans plan to vote for Romney. Significantly, the PPP survey found that only 40% of independents plan to vote for Obama, and 47% for Romney.
Applying these numbers to the unweighted raw numbers from the NYT/CBS poll produces a startling result. Before the heavy thumb of weighting was applied, Obama would be well below the 50% figure. In fact, Mitt Romney might actually have a slight lead.
Following is the PPP breakdown by party and presidential preference for Florida:
              Base   Democrat   Republican   Independent/Other
Barack Obama   48%      84%         14%            40%
Mitt Romney    47%      12%         83%            47%
Undecided       5%       4%          3%            13%
Multiply the actual number of respondents -- and not the weighted percentages -- in the NYT/CBS by these percentages, and you get the following results.
44.812% for Romney.
44.246% for Obama.
It is through the magic of weighting ("Ignore that man behind the curtain!") that the pollsters were able to bump Obama into the lead in Florida, and above 50%.
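The two figures above can be reproduced from the numbers already given. A short Python sketch of the article's arithmetic (note that it evidently applies PPP's percentages only to the Republican, Democrat, and Independent counts, but divides by all 1,177 respondents, including the 52 "Other/Don't know"):

```python
# Apply PPP's party-level candidate preferences to the raw (unweighted)
# NYT/CBS Florida respondent counts, as the article does.
raw = {"Democrat": 373, "Republican": 359, "Independent": 393}
total = 1177  # all respondents, including 52 Other/Don't know

ppp = {  # candidate preference by party, from the PPP table above
    "Obama":  {"Democrat": 0.84, "Republican": 0.14, "Independent": 0.40},
    "Romney": {"Democrat": 0.12, "Republican": 0.83, "Independent": 0.47},
}

for candidate, prefs in ppp.items():
    implied_votes = sum(raw[party] * prefs[party] for party in raw)
    print(f"{candidate}: {100 * implied_votes / total:.3f}%")
# Obama: 44.246%, Romney: 44.812% -- matching the article's figures.
```

This confirms the article's numbers are internally consistent: on the raw sample, neither candidate clears 45%, and Romney is narrowly ahead.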
As Richard Baehr has pointed out elsewhere on AT, the weighting is so skewed in this poll that, when respondents were asked whom they voted for in 2008, the numbers don't come close to actual results. Its weighted results would have given Obama a 53-40 margin in 2008, as compared to the real 50.9-to-48.4 difference.


 