• References Matter | Click The Links!

    Y’all, one of my favorite people called me yesterday and reminded me to discuss my references soapbox. I can’t believe I haven’t done it yet, because standing on a soapbox is literally the only time I make it to average height.

    Here’s what’s up:

    There are metric tons of misinformation in the media, especially the mother ship of fake news, social media. Most of that misinformation is actually pretty easy to recognize, because it’s not supported with any real references.

    What do I mean by “real references”? Let’s break it down.

    Real References Are Peer Reviewed

    First of all, Gwyneth Paltrow’s blog is not a real reference. I’m pretty sure she doesn’t know that bacteria are actually necessary for our survival, so her 3 am epiphanies about feminine hygiene do not qualify as legitimate information.

    Now, there’s no real way to ensure that any piece of scientific evidence is 100% bulletproof. But the scientific community has several principles that ensure we get as close to bulletproof as possible over time. The first of those principles is peer review. You can read the different takes on peer review from Wikipedia, Elsevier, and Wiley, but they all say essentially the same things.

    Peer review maintains standards of quality, improves the science, and provides credibility for published data.

    The process is pretty simple. An investigator (scientist, doctor, enterprising fifth grader) documents how they did an experiment, their results, and what they believe those results mean. They then submit that documentation to a scientific journal. If the submitted information is strong enough, the journal shares it with individuals who are knowledgeable in the area of investigation. Those individuals (peer reviewers) then go through the entire document with a fine-toothed comb. If they find a problem with the manuscript, they send it back to the original investigator for additional experiments, revised conclusions, and any number of other modifications intended to increase the quality of the information.

    Peer Review Rejects Flawed Attempts At Science

    Most manuscripts are not accepted for publication without changes, and the vast majority go through at least two rounds of revisions before the reviewers deem them high enough quality to share with the world.

    And – the scientific community actually rejects a significant portion of submitted manuscripts outright. That only happens when the reviewers find scientific shortcomings too serious to be fixed through revisions, but it happens fairly often. Rejection protects the integrity of published information by preventing mass distribution of incorrectly reasoned or executed experiments.

    Real References Include Statistics

    Listen, I hate statistics too. Believe me. But much like a Good Housekeeping seal of approval or a great score from Consumer Reports, statistics help us determine whether or not we can trust data.

    Let’s say you’re doing an experiment. You want to see if fifth grade girls carry the same amount of gum as fifth grade boys. You bump into one fifth grade girl who has 2 pieces of gum, and one fifth grade boy who has 4 pieces of gum.

    At first blush, it seems like you could say that fifth grade girls do not carry the same amount of gum as fifth grade boys. But this is where statistics would help. If you did a simple test on your data, you would find out that your data is not significant – i.e. statistics say that the 2 pieces of gum for the girl and the 4 pieces of gum for the boy aren’t necessarily different. Probably because your sample size – i.e. the two kids you polled – was too small.

    Now, if you expand that experiment to polling 500 fifth grade girls and 500 fifth grade boys, you might find out that girls have 2.3 pieces of gum on average. And boys have exactly 3 pieces of gum on average. Well, those two averages are actually closer together than in our first small poll, right?

    But statistics takes into account the fact that your second poll involved 1000 kids instead of just two. And then statistics say that the difference between 2.3 and 3 pieces of gum is significant. Statistically significant.

    That’s an important phrase – statistically significant. It’s your neon flashing light saying “this data matters.” You don’t have to know what a p-value or a chi-squared test is. You can even skip right over t-tests and quartiles. All you really need to know is that published data with real value is statistically significant. And the authors will usually come right out and say, “our results were statistically significant.”
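    If you’re curious, the gum poll above can be sketched in a few lines of Python. This is a rough illustration, not real data – the gum counts, the 1.5-piece spread, and the simulation are all made up – but it shows the core idea: the same-looking difference in averages becomes convincing once the sample gets big. It computes Welch’s t-statistic by hand; the bigger that number, the more likely the difference is statistically significant.

```python
# A sketch of why sample size matters (all numbers here are invented).
# We compute Welch's t-statistic by hand: bigger |t| = more likely the
# difference between the two groups is statistically significant.
import random
import statistics

def welch_t(a, b):
    """Welch's t-statistic for two independent samples."""
    ma, mb = statistics.mean(a), statistics.mean(b)
    va, vb = statistics.variance(a), statistics.variance(b)
    return (ma - mb) / ((va / len(a) + vb / len(b)) ** 0.5)

random.seed(42)

# Tiny poll: a handful of kids per group -- noisy and unconvincing.
girls_small = [2, 3, 2]
boys_small  = [4, 2, 3]

# Big poll: 500 kids per group, girls averaging ~2.3 pieces, boys ~3.
girls_big = [max(0, round(random.gauss(2.3, 1.5))) for _ in range(500)]
boys_big  = [max(0, round(random.gauss(3.0, 1.5))) for _ in range(500)]

print(abs(welch_t(girls_small, boys_small)))  # small |t|: could easily be chance
print(abs(welch_t(girls_big, boys_big)))      # large |t|: likely significant
```

    Same-ish averages both times, but the 1000-kid poll drowns out the noise, so the gap it finds is trustworthy.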

    Statistics involve many more subtleties, and I refuse to execute that deep dive until someone requests it.

    Because I hate statistics too. I just know they’re necessary. Like doing laundry and not wearing yoga pants to work.

    This Blog Includes References and Statistics

    Sort of. I absolutely include references. Any time you see me mention a study, a researcher, or an investigator, the word is a link to the study. Every single time. If ever you can’t find a link, that’s a mistake. So let me know, please!

    I will not include the actual p-values and statistical information you might find buried in the scary parts of my dissertation. If you want information like that, please find the mustiest part of your local university library and go nuts. Or click the links to the studies.

    In this blog, I will include statements about statistical significance. Like in this post, where I say there is a “real statistical difference” between the two groups of women tested. Or in this one, where I make several statements, including the fact that some data was not statistically significant.

    Reliable Data Can Be Repeated

    When a scientist is lucky enough to achieve statistically significant results, either through a deal with the devil or a well-designed experiment, the data still isn’t beyond question. Once published, other researchers around the world execute their own similar experiments, building on or directly repeating the original experiment.

    That’s how people discover what turn out to be the biggest scientific scandals ever. Bunches and bunches of scientists repeat experiments without arriving at the same results as the original publication, and then the internet yells about it. A lot. All leading to the giant discovery that either someone lied about their original data, or they left important information out of their methods.

    Repeatability also leads to what we call scientific consensus. Science Moms actually do a pretty great commentary on the value of scientific consensus in this video. The gist is that over time, scientists all over the world repeat and build upon previously published data. Eventually the scientific community as a whole arrives at a consensus, an accepted agreement that the original data is valid. Because. So many people directly reproduced it or built further work on top of it.

    Is This Reference Legit: A Checklist

    As you’re scrolling through Facebook, wondering why your grandmother posts so frequently about the health benefits of red wine, use this checklist to decide if grandma knows her stuff.

    Or is just an apologetic wino.

    • Is there a reference for the stated “facts”?*
    • If so, is that reference peer reviewed?*
      • Not sure? Go to PubMed and search for it. If it’s in PubMed, it’s probably peer-reviewed.
      • If it’s not in PubMed, try Googling the publisher. If it’s a legitimate journal, they will have a site and an explanation of their peer review process.
      • No site, no explanation of peer review? Not legitimate.
    • Does the reference say the results were statistically significant?*

    If you answer yes to all of the starred questions above, the reference is probably legitimate. As with all things, there are exceptions. But this list will get you to the right conclusion 95% of the time.
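    If it helps to see it stripped down, the starred checklist boils down to one rule: every starred answer has to be yes. Here’s a tiny Python sketch of that rule (the function name and inputs are mine, not an official tool):

```python
def reference_looks_legit(has_reference: bool,
                          is_peer_reviewed: bool,
                          reports_significance: bool) -> bool:
    """All three starred questions must be a 'yes' for the reference to pass."""
    return has_reference and is_peer_reviewed and reports_significance

# Grandma's red wine post: it links to something, but that something is a
# lifestyle blog (not peer reviewed) and never mentions significance.
print(reference_looks_legit(True, False, False))  # False
```

    One “no” anywhere and the whole thing fails – which is exactly how you should treat a shaky reference.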

    Not sure if a reference is legit? Don’t have time to check it out yourself? Just do what my fabulous friends do, and ask me in the comments or send me a message!

    Photo by Sunyu on Unsplash