Quoting statistics has long been one of the preferred methods people use when proof of their position is demanded.

These people are often either bald-faced liars or people who’ve been misled by the former.  After all, you can massage statistics to show just about anything you want.  76% of Americans know that.

Now, data doesn’t lie.  Data is neutral; it exists.  The trouble starts when people begin manipulating that data into the form of statistics.

The particular bit of statistics that sparked this post was fairly innocuous; I was reading a piece on gun control, and the issue of limiting suicide methods came up.  The argument was that if a quick and easy means of suicide is eliminated, the suicide rate as a whole will decrease, and it cited the phaseout of coal gas stoves in the UK as an example.

For those who weren’t aware of the issue, Britain used to use coal gas rather than natural gas in its gas lines.  Coal gas contains a very high amount of carbon monoxide, so one of the preferred methods of suicide at the time was to simply blow out the pilot light and stick your head in the oven.  No fuss, no muss, and in theory you just fall asleep and never wake up.

When Britain phased out coal gas, they saw a drop in overall suicide rates.  In theory, then, the conclusion is supported.  Or so you’d think.

The reality, as always, is more complicated.

What happened was that the suicide rate in women dropped noticeably and remained at a slightly lower level afterward.  The suicide rate among men dropped briefly and immediately bounced back upward.  Indeed, the suicide rate among men in the UK actually went up considerably after the elimination of coal gas.

So does it work, or doesn’t it?

The numbers tell us not that means limitation affects suicide rates as a whole, but that eliminating a theoretically painless means of suicide may affect the suicide rate among women in the long term.  Even that is incredibly hard to analyze with any sort of accuracy, because the phaseout of coal gas coincided with the easing of postwar austerity in Britain, a number of important steps in the Women’s Liberation movement, and enough other societal factors that, while yes, we can definitively say the elimination of coal gas caused a marked drop in the number of suicides by coal gas, we can’t really say much more than that.

Chances are that your favorite statistic, the one you quote to prove some arguably controversial or inflammatory opinion that you have, falls within the realm of bullshit.  Or at the very least, is based on an incomplete reading of the data.  Likely intentionally so, though not necessarily by you.

Do women earn 77 cents for every dollar that a man earns in the United States?  Well, that depends on how you look at the data.  The answer is yes, if you fail to take into account differences in things like hours worked, and you average pay across all jobs rather than comparing pay within a given job category.  If you do take all of those things into account, let’s collectively refer to them as “reality”, then the answer is no: women earn around 93 to 96 cents for every dollar that a man earns.  That’s still a gap, but it isn’t quite so blatant and horrifying, so it’s obviously not the statistic that people pushing an agenda are going to use.  And even that reduced figure admittedly fails to take into account a fairly wide array of factors revolving around different cultural attitudes toward work held by women and men.
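To see how the same data can honestly produce both numbers, here’s a minimal sketch with entirely made-up figures (the job names, counts, and salaries are invented for illustration, not real wage data): when two groups are distributed differently across job categories, the aggregate pay ratio can look dramatic even while the within-job ratio is small.

```python
# Purely hypothetical numbers, invented to illustrate the arithmetic:
# a large aggregate gap can coexist with a small within-job gap when
# men and women are distributed differently across job categories.
# (job, men_count, men_avg_pay, women_count, women_avg_pay)
jobs = [
    ("engineering", 80, 100_000, 20, 97_000),
    ("teaching",    20,  50_000, 80, 48_500),
]

men_total   = sum(mc * mp for _, mc, mp, wc, wp in jobs)
men_count   = sum(mc for _, mc, mp, wc, wp in jobs)
women_total = sum(wc * wp for _, mc, mp, wc, wp in jobs)
women_count = sum(wc for _, mc, mp, wc, wp in jobs)

# Averaging across ALL jobs mixes the occupation split into the number.
aggregate_ratio = (women_total / women_count) / (men_total / men_count)
print(f"aggregate: women earn {aggregate_ratio:.2f} per dollar")  # 0.65

# Comparing within each job category tells a different story.
for name, mc, mp, wc, wp in jobs:
    print(f"{name}: within-job ratio {wp / mp:.2f}")  # 0.97 for both
```

Both ratios are computed from the same table; the difference is purely in whether you control for job category before dividing.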

Again, this shit is complicated, people, and if you run around spouting statistics you don’t understand because they sound good, or align with your political beliefs, you’re contributing to the sort of willfully divisive ignorance that prevents us from getting things done.  Because there’s gonna be someone out there who does know the truth behind the numbers game, and when you shove your educated ignorance in their face they’re just going to write you off as another blinkered ideologue.

The most galling moment for me is when people who do, in fact, know that the statistics they’re quoting are radically flawed, built on minuscule sample sizes, misleading surveys, or biased interpretations, turn around and make some variation of the following statement:

“Sure, the statistics aren’t the best, but just because they’re not perfect doesn’t mean that there’s not a problem!”

Because that’s not what they’re saying.  Not really.

What they’re saying is this: “Just because the statistics I’m using to support my position are incorrect, and other, better statistical analyses contradict them, that’s no reason to doubt the existence or scope of the problem I’m trying to describe, even though reality disagrees with me!”

Reality interferes with agenda.

Take, for example, the wage gap.  Base it on reality, not agenda.

How do we “fix” it?

Because I can’t figure out a way that doesn’t involve throwing equality of opportunity out the door in favor of equality of outcomes.  You can’t force women into higher-paying careers, so you’re left with only three options.

You either arbitrarily decide to pay women more for less-demanding work, you make it easier for women (and, by definition then, harder for men) to get into remunerative careers, or you do what the Democrats do, and you put together a government-mandated placebo and let the Republicans shoot it down.

Yes, that happened.

And don’t even get me started on how easy it is to generate a social sciences study that supports exactly the agenda that you’re pushing.  There’s a reason why a recent study of studies found that 75% of social psychology studies couldn’t be replicated.

What could possibly have caused that, I wonder?

It’s trivially easy to skew the results of a social experiment without calling its validity into question, at least among those predisposed to accept its conclusion.  Corrupt the pool, carefully craft the questionnaire, or simply retroactively read more into the results than the respondents intended.  Hell, some of those methods make the study easier to conduct: minuscule sample sizes drawn from non-representative groups are pretty much the norm in biased studies, after all, because it’s pretty easy to leverage your undergrads into participating.
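The small-sample problem is easy to demonstrate for yourself.  Here’s a toy simulation (all numbers synthetic, and the 0.5 “large effect” threshold is an arbitrary choice for illustration): draw two groups from the exact same distribution, so there is no real effect at all, and watch how often a naive comparison of group means produces an impressive-looking difference anyway.

```python
import random

# Toy simulation with synthetic data: both groups come from the SAME
# distribution, so any observed difference in means is pure noise.
random.seed(0)

def fake_study(n_per_group):
    a = [random.gauss(0, 1) for _ in range(n_per_group)]
    b = [random.gauss(0, 1) for _ in range(n_per_group)]
    mean = lambda xs: sum(xs) / len(xs)
    return abs(mean(a) - mean(b))

trials = 1000
# Count how often a null study shows a "large" effect (difference > 0.5).
big_effects_small_n = sum(fake_study(10) > 0.5 for _ in range(trials))
big_effects_large_n = sum(fake_study(500) > 0.5 for _ in range(trials))

print(f"n=10 per group:  {big_effects_small_n / trials:.0%} of null studies look like a finding")
print(f"n=500 per group: {big_effects_large_n / trials:.0%}")
```

With ten subjects per group, roughly a quarter of these no-effect studies throw up a “large” difference by chance; with five hundred per group, essentially none do.  A biased researcher doesn’t even need to fake anything; they just need to run small and publish the runs they like.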

So what’s to be done about it?

Be skeptical.  Realize that much like things that are too good to be true, many things are simply too awful to be true, in no small part because good and bad are simply matters of perspective when we’re talking about agendas.  Look for data, look for methods, don’t just accept the conclusions of a study because it agrees with your own preconceived notions.  If those things aren’t available to you, if the publishers of the study refuse to tell you how they came to their conclusions?

Yeah.

Just walk away.

Don’t quote it, don’t reference it, don’t believe it, because you always need to see the man behind the curtain.
