we give more weight to respondents from demographic groups underrepresented among survey respondents, like people without a college degree
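The weighting in that quote is standard post-stratification: compare each group's share of the respondents to its share of the target population and scale each respondent accordingly. Here's a minimal sketch with made-up numbers (these are not the Times's actual weighting cells):
```python
# A minimal sketch of the reweighting described in the quote
# (post-stratification). Groups, population targets, and sample
# shares here are invented for illustration.

# Share of each group in the target population (e.g., registered voters)
population_share = {"college_degree": 0.35, "no_college_degree": 0.65}

# Share of each group among the people who actually answered the poll
sample_share = {"college_degree": 0.55, "no_college_degree": 0.45}

# A respondent's weight is how under- or over-represented their group is
weights = {g: population_share[g] / sample_share[g] for g in population_share}

print(weights)
# {'college_degree': 0.636..., 'no_college_degree': 1.444...}
# Respondents without a degree count ~1.4x; those with one count ~0.64x.
```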
Oooooohhh
All of a sudden it makes sense
Here’s their methodology page, which, in addition to that fuckin fascinating tidbit you quoted, has some other things of note:
The New York Times/Siena College Poll is conducted by phone using live interviewers at call centers based in Florida, New York, South Carolina, Texas and Virginia. Respondents are randomly selected from a national list of registered voters, and we call voters both on landlines and cellphones.
In the end, fewer than 2 percent of the people our callers try to reach will respond. We try to keep our calls short — less than 15 minutes — because the longer the interview, the fewer people stay on the phone.
We call more people who seem unlikely to respond, like those who don’t vote in every election.
But the truth is that there’s no way to be absolutely sure that the people who respond to surveys are like demographically similar voters who don’t respond. It’s always possible that there’s some hidden variable, some extra dimension of nonresponse that we haven’t considered.
It is, indeed, always possible.
To be clear, polling theory is totally valid and an established science within statistics.
But the challenge is always with methodology, because you can never get a perfect simple random sample. And the methodology here certainly seems terrible.
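To make that hidden-variable worry concrete, here's a toy simulation (all numbers invented): if willingness to answer the phone correlates with the vote itself, rather than with anything you can weight on, the estimate stays biased no matter how many people you call.
```python
# A toy simulation of a "hidden dimension of nonresponse":
# response propensity depends on vote choice itself, which the
# pollster can't observe or weight on. All numbers are invented.
import random

random.seed(0)
N = 1_000_000                      # population of voters, split 50/50
# Suppose candidate-A voters pick up 2% of the time,
# candidate-B voters only 1.5% -- and we can't see why.
respond = {"A": 0.02, "B": 0.015}

sample = []
for _ in range(N):
    vote = random.choice("AB")
    if random.random() < respond[vote]:
        sample.append(vote)

est = sample.count("A") / len(sample)
print(f"n = {len(sample)}, estimated support for A = {est:.3f}")
# ~0.571 instead of the true 0.500, no matter how many people you call.
```
Weighting by age, education, and so on can't touch this, because the cause of the nonresponse isn't any of those variables.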
Something fucky is going on. From the page:
If the 2024 presidential election were held today, who would you vote for if the candidates were
Then it lists the usual suspects, including third parties. The only age group that goes for Biden on that question is 65 and older. Maybe so, but that doesn’t seem right.
I suspect that out of the 2% of people who answered the phone (and the smaller percentage who stayed on for the whole poll), there were some young people whose parents answered the phone and then answered all the poll questions for them, or something weird like that.
Maybe not. But in general, the closer you look at the whole methodology, the more it looks like a big pile of garbage. It’s not surprising that some of the answers coming out of it are very obviously wrong.
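And you don't even need a weird mechanism for a single crosstab to look absurd: subgroup samples are tiny. A quick back-of-envelope, with respondent counts that are my assumptions rather than the poll's:
```python
# Rough arithmetic on why one age crosstab can look absurd.
# Assumed numbers: ~1,000 total respondents, ~120 of them aged 18-29.
from math import sqrt

def moe(n, p=0.5, z=1.96):
    """95% margin of error for a simple random sample -- the best case,
    before any design effect from weighting widens it further."""
    return z * sqrt(p * (1 - p) / n)

print(f"full sample (n=1000): +/- {moe(1000):.1%}")   # +/- 3.1%
print(f"18-29 crosstab (n=120): +/- {moe(120):.1%}")  # +/- 8.9%
```
A roughly nine-point box around a subgroup estimate leaves plenty of room for answers that are obviously wrong.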