Fox News Viewers Uninformed


Texsox


QUOTE (StrangeSox @ Nov 23, 2011 -> 09:36 AM)
Plus, New Jersey.

LOL, awesome.

 

QUOTE (StrangeSox @ Nov 23, 2011 -> 09:36 AM)
I don't think MSNBC comes near the level of open, blatant and deliberate bias that Fox does. It is nearly impossible to match that.

I think they are damn close. I'd say Fox is maybe a little more openly biased, but I also think MSNBC is sloppier and less journalistic (tallest midget contest).

 


QUOTE (NorthSideSox72 @ Nov 23, 2011 -> 11:32 AM)
2. 612 respondents means you probably have less than a hundred for each news outlet, and I consider that way, way, way too small a data set to be reliable. The t-score calcs are being run on the full 612 basis, which is not at all an accurate use of that test. This poll, in particular, is highly flawed.

 

The poll is pretty shady, I agree... but I think you're wrong about it being 100 respondents per outlet. It's using the number of people who rely on outlet X as the base for that outlet's answers.

 

So for NPR - that's about 120 people (21%)

For Fox News - that's a lot more.


QUOTE (Rex Kicka** @ Nov 23, 2011 -> 09:38 AM)
The poll is pretty shady, I agree... but I think you're wrong about it being 100 respondents per outlet. It's using the number of people who rely on outlet X as the base for that outlet's answers.

 

So for NPR - that's about 120 people (21%)

For Fox News - that's a lot more.

NPR/PBS, CNN, MSNBC, Fox, other... at best, there is enough room in that poll for a little over a hundred each. And if some are a lot higher than 100, then others are a lot lower, so again, the poll is highly sketchy. With this sort of multi-variable framework as a poll's basis, you need to be calling thousands of people.
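The subgroup-size worry can be made concrete with the textbook margin-of-error formula for a proportion. This sketch assumes simple random sampling, which the poll's actual weighting may not match:

```python
import math

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """Approximate 95% margin of error for a proportion estimated
    from n respondents (z = 1.96 for 95% confidence; p = 0.5 is
    the worst case)."""
    return z * math.sqrt(p * (1 - p) / n)

# Full sample vs. plausible per-outlet subgroups:
print(round(margin_of_error(612), 3))  # ~0.040, i.e. about +/-4 points
print(round(margin_of_error(120), 3))  # ~0.089, about +/-9 points
print(round(margin_of_error(100), 3))  # ~0.098, about +/-10 points
```

So even if the overall 612-person figure looks respectable, per-outlet comparisons built on ~100-person subgroups carry margins of error near ten points each way.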

 


QUOTE (Y2HH @ Nov 23, 2011 -> 09:37 AM)
Sure, if they were in fact "random".

 

I can conduct a poll here in Chicago via the phone. Do you know the easiest way to fix that poll? With the phone number prefix. How about I poll 200 "random" people, 100 from the Bucktown area and 100 from the Englewood area.

 

Sure, they're "random", but I bet the results they yield will be VASTLY different.

 

That's why there's background questions before the poll questions to ensure you get a representative sample.

 

Look, you don't need to convince me that there are ways to make s***ty, biased polls (see the recent one in the Rep thread!). I don't particularly care about this poll beyond pressing you on your claims that the questions were deliberately picked to make Fox look bad, because imo if that claim is true that still makes Fox look bad. Maybe even worse if people can intuitively guess which basic current events information Fox viewers don't understand.


QUOTE (NorthSideSox72 @ Nov 23, 2011 -> 09:32 AM)
1. There really is no point discussing any polling data with Y2HH, because he's made it pretty clear that any poll is useless.

 

On your point, it's not that I feel polling data is "useless", it's just untrustworthy. Because I believe people have an agenda, I feel most, if not all polls are compromised.

 

IF their methods, as written, were actually followed, perhaps it's believable data. But how do we know that for sure?


QUOTE (NorthSideSox72 @ Nov 23, 2011 -> 09:38 AM)
LOL, awesome.

 

Even leaving deliberate shots at NJ aside, NJ is not a representative sample of Fox viewers.

 

 

I think they are damn close. I'd say Fox is maybe a little more openly biased, but I also think MSNBC is sloppier and less journalistic (tallest midget contest).

In journalistic integrity or quality? Sure. They're both garbage. But in terms of deliberate bias? No, Fox takes the cake on that one by a long shot.


QUOTE (StrangeSox @ Nov 23, 2011 -> 09:40 AM)
That's why there's background questions before the poll questions to ensure you get a representative sample.

 

Look, you don't need to convince me that there are ways to make s***ty, biased polls (see the recent one in the Rep thread!). I don't particularly care about this poll beyond pressing you on your claims that the questions were deliberately picked to make Fox look bad, because imo if that claim is true that still makes Fox look bad. Maybe even worse if people can intuitively guess which basic current events information Fox viewers don't understand.

 

That's not really my point in all of this. I think that SAME poll can be turned on its ear and yield opposite results just as easily. The sample size here is simply too small. It needs to be FAR more than 4 questions, and FAR more than the number of people they polled, to even come close to an acceptable size.


QUOTE (StrangeSox @ Nov 23, 2011 -> 09:42 AM)
Even leaving deliberate shots at NJ aside, NJ is not a representative sample of Fox viewers.

 

In journalistic integrity or quality? Sure. They're both garbage. But in terms of deliberate bias? No, Fox takes the cake on that one by a long shot.

 

I disagree. I think MSNBC is JUST as biased.

 

I just took a poll of this and 4 out of 5 people agreed with me.

 

So I'm right. A poll said so. :P I'll post my methodology later.


QUOTE (Y2HH @ Nov 23, 2011 -> 09:40 AM)
On your point, it's not that I feel polling data is "useless", it's just untrustworthy. Because I believe people have an agenda, I feel most, if not all polls are compromised.

 

IF their methods, as written, were actually followed, perhaps it's believable data. But how do we know that for sure?

 

How do we know anything for sure? How do I know you're real? How do I know I'm real?! emot-psyduck.gif

 

:P


QUOTE (StrangeSox @ Nov 23, 2011 -> 09:43 AM)
How do we know anything for sure? How do I know you're real? How do I know I'm real?! emot-psyduck.gif

 

:P

 

There are some things we just know.

 

This is how I feel about polls -- in a strange way -- I think the people taking them are more often than not compromised by their own opinions, whatever they may be.

 

Watch this video about religious people being morally compromised: http://videosift.com/video/The-Religious-M...d-Demonstration

 

It's exactly how I feel about MOST pollsters. ;D


QUOTE (Y2HH @ Nov 23, 2011 -> 09:42 AM)
That's not really my point in all of this. I think that SAME poll can be turned on its ear and yield opposite results just as easily. The sample size here is simply too small. It needs to be FAR more than 4 questions, and FAR more than the number of people they polled, to even come close to an acceptable size.

Well, there are mathematical formulas for determining the appropriate sample size for a given confidence level. I don't know what that'd be in this case, but for national political polls it's only a couple thousand. I'd agree that more questions would be better, but question construction is critical in making sure that the right and wrong answers are clear and universally agreed upon and not dependent on a level of nuance.
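The standard worst-case formula referred to here is n = z²·p(1−p)/e², where z is the z-score for the confidence level, p the assumed proportion (0.5 is worst case), and e the desired margin of error. A minimal sketch:

```python
import math

def sample_size(z: float = 1.96, margin: float = 0.03, p: float = 0.5) -> int:
    """Minimum respondents needed so a proportion estimate stays
    within the given margin of error, at the confidence level
    implied by z (z = 1.96 corresponds to 95% confidence)."""
    return math.ceil(z**2 * p * (1 - p) / margin**2)

# Worst-case (p = 0.5) sample for +/-3 points at 95% confidence:
print(sample_size())  # 1068 -- consistent with "only a couple thousand"
```

For large populations the required n barely depends on population size, which is why national polls can get away with a thousand or two respondents; the problem in this thread's poll is that each per-outlet subgroup would need that many on its own.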

Link to comment
Share on other sites

QUOTE (NorthSideSox72 @ Nov 23, 2011 -> 09:45 AM)
Is that a duck-billed pladypus (sp?)?

 

I think it's a pokemon character. It's used as :psyduck: smilie on another forum I frequent to convey that "omg!" sudden realization of your world being turned upside down, but usually ironically.

 

edit: I believe most of the smilies I borrow from that other forum are 'borrowed' from SomethingAwful originally.

 

edit2: did you know that the platypus is a monotreme, an egg-laying mammal?!

Edited by StrangeSox

QUOTE (Y2HH @ Nov 23, 2011 -> 09:46 AM)
There are some things we just know.

 

This is how I feel about polls -- in a strange way -- I think the people taking them are more often than not compromised by their own opinions, whatever they may be.

 

You post-modernist, you.

 

But yes, this is a recognized problem in the field and one that good, legitimate pollsters strive to be cognizant of and correct for.


QUOTE (StrangeSox @ Nov 23, 2011 -> 09:49 AM)
You post-modernist, you.

 

But yes, this is a recognized problem in the field and one that good, legitimate pollsters strive to be cognizant of and correct for.

 

I've explained this before -- at least I think I have. I did my entire college project on poll taking (I got an 'A' BTW). I stood out in the streets (random locations downtown) and took random made up polls for months to do this project. The subject matter was far reaching, anything from questions about finance to government to automobiles. My aim was to show that I could "fix" a poll in any way I wanted to fix it IN SPITE of using random people.

 

Further, the experiment showed that depending on WHO on my team took the poll, it would ALWAYS sway in favor of their own opinions on the specific subject matter, but not intentionally. It just happened, because I believe, subconsciously, they wanted it to.

 

I came out of that experiment NOT liking polls, and having NO trust in them.

 

So in that regard, when it comes to polls -- even completely legitimate ones -- I'm compromised.

Edited by Y2HH

QUOTE (Y2HH @ Nov 23, 2011 -> 09:53 AM)
I've explained this before -- at least I think I have. I did my entire college project on poll taking (I got an 'A' BTW). I stood out in the streets (random locations downtown) and took random made up polls for months to do this project. The subject matter was far reaching, anything from questions about finance to government to automobiles. My aim was to show that I could "fix" a poll in any way I wanted to fix it IN SPITE of using random people.

 

Further, the experiment showed that depending on WHO on my team took the poll, it would ALWAYS sway in favor of their own opinions on the specific subject matter.

 

I came out of that experiment NOT liking polls, and having NO trust in them.

 

So in that regard, when it comes to polls -- even completely legitimate ones -- I'm compromised.

I don't remember you explaining this before, but I appreciate you sharing it.

 

I guess my comment on that would be that a team of undergrads taking polls isn't the same as a professional, competent political scientist taking polls.

 

But this ultimately leaves you with no real way to determine policy preferences or examine any large-scale issues with any degree of confidence.

Edited by StrangeSox

QUOTE (StrangeSox @ Nov 23, 2011 -> 09:55 AM)
I don't remember you explaining this before, but I appreciate you sharing it.

 

I guess my comment on that would be that a team of undergrads taking polls isn't the same as a professional, competent political scientist taking polls.

 

But this ultimately leaves you with no real way to determine policy preferences or examine any large-scale issues with any degree of confidence.

 

I'd say that undergrads would have less at stake than a political strategist.

 

A quick example of poll fixing would be location. I could just cite "Downtown Chicago" in my methodology, and it would be true. But that "downtown Chicago" location could be outside an Apple store. If I polled 200 people in that specific location, what kind of people do you think would end up being the majority polled? I can tell you now, it would be liberal leaning people.

 

Now if I move just a bit north to the Gold Coast...I bet 200 people would yield a much more republican leaning result.

 

It's just a guess. But depending on what results I was looking for, I'd simply move to a location that would lend the best chances at the results I want...

 

And my methodology saying "200 random people in downtown Chicago" wouldn't be a lie.

 

My results, however, would be totally compromised despite my listed methodology.


QUOTE (StrangeSox @ Nov 23, 2011 -> 09:55 AM)
I don't remember you explaining this before, but I appreciate you sharing it.

 

I guess my comment on that would be that a team of undergrads taking polls isn't the same as a professional, competent political scientist taking polls.

 

But this ultimately leaves you with no real way to determine policy preferences or examine any large-scale issues with any degree of confidence.

 

Oh, and sorry for calling you a moron earlier. You're not a moron. I just sometimes let my emotions get the best of me. :D


QUOTE (Y2HH @ Nov 23, 2011 -> 10:01 AM)
I'd say that undergrads would have less at stake than a political strategist.

 

A quick example of poll fixing would be location. I could just cite "Downtown Chicago" in my methodology, and it would be true. But that "downtown Chicago" location could be outside an Apple store. If I polled 200 people in that specific location, what kind of people do you think would end up being the majority polled? I can tell you now, it would be liberal leaning people.

 

Now if I move just a bit north to the Gold Coast...I bet 200 people would yield a much more republican leaning result.

 

It's just a guess. But depending on what results I was looking for, I'd simply move to a location that would lend the best chances at the results I want...

 

And my methodology saying "200 random people in downtown Chicago" wouldn't be a lie.

 

My results, however, would be totally compromised despite my listed methodology.

 

But a competent pollster looking at that would question the actual makeup of your sample and whether it was representative or not. That's why the cross-tabs break down with so much granularity on a good poll. You can't just say you polled 200 random people without verifying that your sample was a representative sampling of the larger group you're trying to extrapolate your data to. I'm still not disagreeing that there are many, many ways to construct a bad poll, some deliberately and some unintentionally.

 

RE: political strategist, obviously a DNC poll is going to be biased, but there are plenty of political scientists out there who are not partisan strategists but take pride in presenting representative data and understanding of a population's views and ideas. And not just in politics either, but brand preference, etc.


QUOTE (StrangeSox @ Nov 23, 2011 -> 10:06 AM)
But a competent pollster looking at that would question the actual makeup of your sample and whether it was representative or not. That's why the cross-tabs break down with so much granularity on a good poll. You can't just say you polled 200 random people without verifying that your sample was a representative sampling of the larger group you're trying to extrapolate your data to. I'm still not disagreeing that there are many, many ways to construct a bad poll, some deliberately and some unintentionally.

 

RE: political strategist, obviously a DNC poll is going to be biased, but there are plenty of political scientists out there who are not partisan strategists but take pride in presenting representative data and understanding of a population's views and ideas. And not just in politics either, but brand preference, etc.

 

There are some scientific polls that are not biased... I openly admit this. I simply have little ability to tell which ones ARE and which are not. In this world, I question almost everything... including scientific polls. How do we know that their posted methodology isn't a complete lie? We don't. We just have to take their word for it.


QUOTE (Y2HH @ Nov 23, 2011 -> 10:08 AM)
There are some scientific polls that are not biased... I openly admit this. I simply have little ability to tell which ones ARE and which are not. In this world, I question almost everything... including scientific polls. How do we know that their posted methodology isn't a complete lie? We don't. We just have to take their word for it.

This is true for any information outside of your own personal perception, and even that is your brain's interpretations of the signals that various sensors are receiving. Unless you want to collapse into a solipsistic view, sometimes you have to trust people not to be lying.

 

One way to check is peer review, though. Especially on large national issues, there will be dozens of polls conducted by numerous different outfits. And the ultimate test is a poll on something put to a vote, since you can see clearly how closely the poll corresponded to the final result it was trying to predict.
