How to Dedupe Republicans

Seriously? You think the media is to blame?

I believe they’re not blameless. I’m not saying that it’s every media outlet, but when media companies are getting revenue from clicks . . .

No. Actual voter fraud did not become controversial. Stop-the-steal fake voter fraud did. A lie became controversial in the minds of Republicans because conservatives, from Trump to elected representatives to conservative religious voices to Fox News celebrities to alt-right internet voices, have repeated it over and over again and continue to repeat it to this day. There is no evidence contradicting Biden’s 81.3 million votes and 306 electoral votes versus Trump’s 74.2 million votes and 232 electoral votes, yet the lie persists. The undermining of a belief in fair elections persists. The shitting on American democracy persists. All this because some privileged shit stains didn’t get what they tried to steal. Yes, tried to steal. There is evidence of widespread voter suppression efforts from the local to the national level.

I’m not sure which media outlets or which particular articles you are referring to. Can you provide some examples and describe how they should have acted differently?

I assume the social media companies and the algorithms that tend to push more extreme views to the top of the pile.

How to Dedupe Republicans?

Done and done.

Kind of interesting to read about. The word “deprogramming” doesn’t imply kidnapping or whatever. It just has those associations because that’s what shrinks/parents did when the term was invented in the ’70s.

Don’t have time to do a full listing . . . as much as I’d like to . . . but I’ll definitely provide one that illustrates what I’ve seen that relates to the bolded.

In many interviews, the media has treated Trump very differently than his election rivals. One that comes to mind is an interview by ABC (I think it was a 20/20 special).

While you had two different interviewers, I would expect similar treatment of their guests in terms of pursuit of questioning. In the interview with Trump, he would give an actual answer to a question posed and the interviewer would follow up, digging for more details (some might say in a pushy way, but that isn’t my point here). When interviewing Biden and Harris (separately), each would give an outright non-answer to a question with zero follow-up.

So it seems that there was actually Trump-baiting in order to generate the headline “Trump up and leaves an interview during filming”.

What makes it worse is that while Harris was given a “normal” interview that appeared to showcase her personality, only one question from Pence’s interview was aired: “Why did Trump leave suddenly?”

I probably should point out that my stance actually comes from having training in the area of polling and poll-based research. At the heart of drawing conclusions from such research is taking care about how generalization is done. And this is where my skepticism pops in.

Blindly accepting “statistics” because it’s statistics and the procedures are properly followed will lead to some of the problems we’re seeing in the presentation of “statistics” in popular media. However, if there are underlying issues with the generalization of the results, then it is very appropriate to “question the statistics”.

For the case at hand, I question the generalization of a survey that is most likely conducted with volunteer responses, with the results applied to a segment that usually declines participation in such polls. In addition, I wonder if there was any work done on how different people would “interpret” the questions presented.

Having done survey construction in my graduate program, and having gotten good feedback from my peers about the solidity of the questions/statements presented, I followed up with a handful of other people (especially those who were not college educated themselves) and asked them how they interpreted the questions/statements, and I got a pretty different view of what they saw.

And FWIW, I’m not questioning your assertion of people believing something; my question is about the quantification of the number of people who believe something. But that questioning originates with my knowledge of how these things are often put together and administered; for example, how many of these polls are going out to these rural areas and seeking input? My guess is that most are done close to the more urban areas, with respondents from the more rural areas extrapolated as fully representative of those areas, and it is this aspect that sends up my red flags.
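To make that concern concrete, here’s a toy simulation (every number in it is invented purely for illustration) of what happens to a naive poll estimate when rural folks respond at a lower rate and also hold different views:

```python
import random

random.seed(1)

# Invented toy population: 30% rural, 70% urban, with different
# rates of holding some belief and different survey response rates.
POP = 100_000
people = []
for _ in range(POP):
    rural = random.random() < 0.30
    believes = random.random() < (0.60 if rural else 0.40)
    responds = random.random() < (0.05 if rural else 0.15)  # rural folks respond less
    people.append((rural, believes, responds))

true_rate = sum(b for _, b, _ in people) / POP
respondents = [b for _, b, r in people if r]
naive_rate = sum(respondents) / len(respondents)

print(f"true rate of belief: {true_rate:.3f}")   # about 0.46
print(f"naive poll estimate: {naive_rate:.3f}")  # closer to 0.42, skewed urban
```

With these made-up numbers the naive estimate lands a few points below the truth, which is exactly the kind of gap that demographic weighting is supposed to close, or not, if the weighting is done poorly.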

I appreciate this full response, particularly given the adversarial nature of my post.

Blindly accepting “statistics” because it’s statistics and the procedures are properly followed will lead to some of the problems we’re seeing in the presentation of “statistics” in popular media. However, if there are underlying issues with the generalization of the results, then it is very appropriate to “question the statistics”.

It’s a bit confusing because we’re talking about both the media and the surveys. I believe in very carefully choosing my media based on how well it interprets data. If a media outlet presents flawed data or flawed interpretation (e.g., Fox), I stop trusting it. OTOH, I need data to know anything about the world, so I do rely on other sources that present and properly interpret data (e.g., the Economist). Otherwise I’m just counting on gut, which is also very biased. I appreciate that you have a common-sense response here, but you have to allow for the fact that you could be wrong.

In this case though, we don’t need to talk about the media at all. Surveys, methodology, and data are always published online. It doesn’t matter what Fox or Vox or CNN or NYT or NPR says, because the survey itself, with data and methodology, is always one click away. The only thing we have to assume is that the statisticians aren’t completely lying.

In this case we have many different polls. At the very least, that suggests that if one of them has flawed questions and another has valid questions, then we should see different results. But here they all have about the same results. They all say that a majority of Republicans believe in widespread/a lot of/election-changing voter fraud. They all say that very few Biden voters think the same. Most also track over time, back to before the election and to previous elections, demonstrating not just the belief but the change in belief.

https://www.monmouth.edu/polling-institute/documents/monmouthpoll_us_111820.pdf/
4. Overall, how confident are you that the 2020 election was conducted fairly and accurately?
6. Do you believe Joe Biden won this election fair and square, or do you believe that he only won it due to voter fraud?

9. Do you think that Joe Biden’s victory in the 2020 presidential election is legitimate or not legitimate?
10. Do you believe there was widespread voter fraud in the 2020 presidential election, or not?

How much voter fraud do you think occurred in this election?

https://www.politico.com/f/?id=00000175-d6fb-d1da-a775-deffac670000
Table POL6_3NET: You mentioned that you don’t believe the 2020 presidential election was a free and fair election. Why, specifically, do you think the election was not free and fair? Please select all that apply.
Mail-in voting led to widespread voter fraud

There’s a handful more (like the Fox one above, or the Pew/Gallup) but these should get you started…

Also, I don’t quite know what you mean by “going out to rural areas”. These days polling is done with a large random sample over the internet or by phone (or robocalls). Most pollsters use demographic weighting methods to get back to a representative sample. In all cases, they also publish the unweighted samples.

Not all polls are equally valid, of course. And of course participation is a huge problem. And we know that, for example, uneducated people are much less likely to pick up the phone. But we know that polls work (and that some polls work better than others), again largely thanks to 538 doing an immense amount of analysis after every election. You yourself pointed to the recent US presidential elections, and we know that the polls worked fine in those elections. They also work well in Congress/Senate races and other elections around the globe.

I guess you could argue that these opinion-polls are fundamentally different than voter-polls, but I would need to be convinced why voter-polls are off by 2% on average while these are somehow off by 30% or 40% or whatever.

Do you not see the inherent bias introduced by this method? Does everyone in the population participate in these types of data gathering?

How do you “demographically weight” your survey data to account for those who do not participate in this style of data gathering?

FTR, I do not respond to robocalls. I do not participate in most internet surveys (the exception being when someone I know has asked for participation). And I don’t see myself as some sort of outlier when it comes to the rural population. (Yes, I do have a rather extensive network of agricultural contacts across the “fly-over” states.)

I might consider participating in a phone survey if there is a live person who is addressing me from the get-go.

I stopped answering my phone unless I know the number ever since the dentist’s office I stopped going to called me and I picked up and had an incredibly awkward conversation. Never again! :judge:

Worst case they leave a VM and I call back. BETTER STILL the eye doctor sent me a text with the deets.

Yup, I generally do not answer the phone unless the caller ID states that it is someone I know (or if I know someone’s phone number by memory). If it’s important enough, they’ll leave a message. If it’s someone I know, they should call or text my cell phone and their contact info is already on that phone.

My phone has a different area code from where I live, and anyone from that area code that I expect to call me is programmed in. So if I get an unknown call from there, it’s almost certainly a call I shouldn’t bother answering. If it’s from the area code I live in, it’s probably the dentist’s office confirming an appointment or something, so I’ll generally answer. If it’s from some other area code, it’s hit or miss.

No one under the age of 35 answers their phone.

unknown ID = hang up.

known ID = hang up too.

How do you “demographically weight” your survey data to account for those who do not participate in this style of data gathering?

Basic example:
Old ladies like to answer the phone, young men do not. So you end up with too many old ladies in your survey.
But we know how many old ladies there are in the population. And we know how many young men there are.
Weight each old lady’s answer by her group’s population share divided by its sample share.
Weight each young man’s answer the same way, then take the weighted average.

That’s your adjusted answer. Is it perfect? No it is not.
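Here’s that same arithmetic as a runnable sketch (the group names, counts, and opinion rates are all made up for illustration; real pollsters weight on several demographics at once, often via raking):

```python
# Toy post-stratification example. Every number is invented.

# Known population shares (e.g., from census data).
population_share = {"old_ladies": 0.20, "young_men": 0.20, "everyone_else": 0.60}

# Who actually answered the phone: old ladies are way overrepresented.
sample_counts = {"old_ladies": 400, "young_men": 50, "everyone_else": 550}
n = sum(sample_counts.values())

# Share of each group that answered "yes" to some question.
yes_rate = {"old_ladies": 0.70, "young_men": 0.30, "everyone_else": 0.50}

# Unweighted estimate: just the raw sample average.
unweighted = sum(sample_counts[g] * yes_rate[g] for g in sample_counts) / n

# Each respondent's weight = population share / sample share,
# i.e., the "population divided by your sample" step above.
weight = {g: population_share[g] / (sample_counts[g] / n) for g in sample_counts}
weighted = sum(sample_counts[g] * weight[g] * yes_rate[g] for g in sample_counts) / n

print(f"unweighted: {unweighted:.3f}")  # 0.570, skewed toward the old ladies
print(f"weighted:   {weighted:.3f}")    # 0.500, matches the true population mix
```

Note the weighted figure only recovers the true population rate here because, within each group, the people who answered hold the same opinions as the people who didn’t, which is precisely the assumption being argued about.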

FTR, I do not respond to robocalls. I do not participate in most internet surveys (the exception being when someone I know has asked for participation). And I don’t see myself as some sort of outlier when it comes to the rural population. (Yes, I do have a rather extensive network of agricultural contacts across the “fly-over” states.)

I find it weird that you highlight your ruralness here. It seems obvious to me that the large majority of people, anywhere, do not answer their phone or take random surveys. Do you really think you have a very different mindset than urbanites about things like that?

To answer your big question: yes, of course there is inherent bias in these methods. Demographics aside, people who answer the phone or take surveys are bound to have somewhat different opinions than those who don’t. Above you talk about correlations with education/rural, but actually it’s more problematic when the response bias is less correlated with demographics, because then demographic weightings don’t work as well.
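A quick way to see why (invented numbers again): if, within a single demographic cell, the people who pick up differ in opinion from the people who don’t, no demographic weight can fix it, because the weight only rescales what was actually measured.

```python
# One demographic cell, say rural men 30-44. All numbers invented.
answerer_rate = 0.35       # belief rate among those who respond
non_answerer_rate = 0.55   # belief rate among those who never respond
response_rate = 0.10

# What is true of the whole cell vs. what the poll can ever observe.
true_rate = response_rate * answerer_rate + (1 - response_rate) * non_answerer_rate
observed = answerer_rate

print(f"true: {true_rate:.2f}, observed: {observed:.2f}")  # true: 0.53, observed: 0.35
# Reweighting changes how much this cell counts toward the total,
# but the 0.35 measured inside the cell never becomes 0.53.
```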

But, to respond to your big question with another big question: do you not see that polling does work for the purpose of predicting elections?

If sampling bias ruined polling, don’t you think it would also ruin polling for elections?

What do you do when your sample has zero of the segment you’re wanting to make an inference for?

Sounds like the time for my actuarial judgment pen

What do you do when your kids keep asking silly questions!

Try clicking on my links, and analyzing the data for yourself.

And in case you want one more link, here’s a quick discussion of how pollsters try to deal with bias these days. As noted, though, the main point is that election polls just aren’t as bad as everyone feels like they are.