General Discussion
Should we worry about a Bradley effect?
The Bradley effect skews polls with a social desirability bias. Individuals being polled don't want to be considered racists or misogynists and may say they're voting for Harris but actually end up voting for TSF. We all thought Clinton would win in 2016. We can't be complacent even though the polls seem encouraging.
obamanut2012
(27,896 posts)

Coexist
(26,202 posts)
Women who don't want their TFG-loving mates to know they'd vote for a dill pickle before they'd vote for him. They're gonna vote for Harris on the DL. And fathers of daughters who will make the right choice on their ballots, despite not sharing that info beforehand, or maybe never.
https://www.salon.com/2024/08/14/can-my-husband-find-out-i-am-voting-for-the-big-question-touching-a-nerve-this/
"I think 'secret voting' by MAGA partners is a more widespread issue than most people think," one woman replied. Another man wrote, "As a poll worker, I have had to deal with husbands and fathers who want to join their wives or daughters in the voting booth to 'make sure they vote the right way.'"
central scrutinizer
(12,441 posts)Here in Oregon, all voting is done by mail. Im sure there are some families where the dominant partner fills out both ballots.
Irish_Dem
(59,932 posts)Researchers in the social sciences have been doing it for decades.
We can determine fake good or fake bad answers to questions and then make an adjustment
to polling/testing results.
Think. Again.
(19,315 posts)I thought they stopped at just writing the questions so that they encourage certain responses. Or timing their calls when certain respondents would answer certain ways. Or focusing their calling maps to areas that would mostly respond certain ways. Or repeat calling certain previously polled numbers they know will respond certain ways. Or cutting off their calling when the responses reach their desired outcome. Or any number of other tricks to assure the poll will satisfy, and therefore be purchased by, the specific media source they intend to sell the poll results to, or have already "contracted" for desired results.
Irish_Dem
(59,932 posts)We have ways to account for people lying on tests and questionnaires.
And then have a way to account for that lying in the statistical analysis.
It is similar to margin of error statistics.
You can measure the amount of error in your results and account for that in the analysis.
People always talk about measurement error, it is not considered cheating at all.
It is called the scientific method and research design.
Well trained and ethical scientists want the truth and reality.
And can measure how many people are lying to them
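The correction Irish_Dem describes can be sketched numerically. This is a minimal illustration, not any pollster's actual method: assume a validation study has estimated how often respondents misreport their vote in each direction (the rates and shares below are invented), and invert that misreporting to recover the true support share.

```python
# Hypothetical sketch: correcting a poll estimate for social desirability bias.
# Assumes a validation study estimated misreporting rates; all numbers invented.

def adjust_for_misreporting(reported_share, p_false_yes, p_false_no):
    """Correct a reported support share given estimated misreporting rates.

    reported_share : fraction who *say* they support the candidate
    p_false_yes    : P(says yes | actually no)   -- overreporting rate
    p_false_no     : P(says no  | actually yes)  -- underreporting rate
    """
    # reported = true*(1 - p_false_no) + (1 - true)*p_false_yes
    # solving that equation for the true share:
    return (reported_share - p_false_yes) / (1 - p_false_no - p_false_yes)

# A reported 52% with 3% overreporting and 5% underreporting implies ~53.3% true support.
print(round(adjust_for_misreporting(0.52, 0.03, 0.05), 3))
```

The point is only that once you can estimate the lying rates, the adjustment itself is simple algebra applied in the analysis, not manipulation of the raw responses.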
Think. Again.
(19,315 posts)

Sympthsical
(10,411 posts)You're saying "manipulating results was openly discussed" as if it was a conspiracy to achieve a specific result.
That is not what the poster is explaining. The poster is trying to tell you that statistical methods involve mathematical models that strive to find accurate measurements. In polling, it's voter intent. But everything under the sun is measured in this way.
In some elections, including 2016, pollsters couldn't get an accurate read on how many Trump voters were out there. Trump voters kept getting undercounted. There are a variety of reasons for this. Social stigma, hostility to media or the pollsters, etc.
So they realized their models lacked accuracy. As a result, they went back and looked at what in the model and method could be tweaked to give results closer to what the reality turned out to be.
You want whatever model you build to reflect empirical evidence. Just like if you make a model about how stars work, then you see a star that isn't operating the way your model says it should, you go back and look at the model. See what changes you can make to account for why yours isn't working. Or maybe there's something about the star you missed that's causing it to behave that way.
It's basic science. No conspiracy. No nefariousness.
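The kind of tweak Sympthsical describes is often post-stratification weighting: if the raw sample under-reaches a group (after 2016, famously, non-college voters), you reweight respondents so each group counts at its true population share. A toy sketch, with an invented sample and made-up population shares:

```python
# Hypothetical post-stratification sketch: reweight a raw sample so a group the
# poll under-reached counts at its true population share. All data invented.
from collections import Counter

sample = [
    # (group, candidate_choice)
    ("college", "A"), ("college", "A"), ("college", "B"),
    ("college", "A"), ("college", "B"), ("college", "A"),
    ("non_college", "B"), ("non_college", "B"), ("non_college", "A"),
    ("non_college", "B"),
]

# Assumed true population shares (census-style figures, invented here):
population_share = {"college": 0.40, "non_college": 0.60}

counts = Counter(group for group, _ in sample)
n = len(sample)
# Weight = population share / sample share, so under-sampled groups count more.
weight = {g: population_share[g] / (counts[g] / n) for g in counts}

# Weighted support for candidate A:
support_a = (sum(weight[g] for g, c in sample if c == "A")
             / sum(weight[g] for g, _ in sample))
print(round(support_a, 3))
```

Unweighted, candidate A gets 50% of this sample; after weighting the under-sampled non-college group up to its assumed 60% population share, A's estimate drops to about 41.7%. That is the sense in which the model is adjusted to reflect empirical evidence rather than to force a desired answer.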
Think. Again.
(19,315 posts)BeyondGeography
(40,077 posts)Fiendish Thingy
(18,864 posts)Simon Rosenberg reports the recent massive Harvard youth survey shows that pollsters may be severely undercounting young Harris voters.
Most polls have a young-voter subgroup of 200 or fewer, with a correspondingly large MOE that renders any reported results meaningless.
The Harvard survey had a sample of thousands of young voters, with an expected low MOE.
WarGamer
(15,786 posts)Those refusing to answer polls or hanging up are by a vast majority Trump voters.
central scrutinizer
(12,441 posts)

Wanderlust988
(590 posts)I hadn't heard of this team in over a decade now. I think America is a different place now.
H2O Man
(75,845 posts)
People place value on polls. I don't. I think we should all be doing our best now, no matter what any poll says.