Without a doubt, you’ve heard about fake news. At this point, the prevalence of fake news stories, which proliferate on venues like social media, has become a major public talking point. But did you know that fake research is likewise a growing problem, especially within the scientific and academic communities?
Especially when it comes to hot-button political issues, such as racial and gender inequality, some organizations stand to gain by making claims supported by studies that are tenuous at best. As professionals, it’s important that we don’t fall for clickbait headlines full of false or misleading information.
Want to know how to spot fake research so you can disregard it? Here are some of the best ways to discern what’s real and what isn’t.
Look for a Methodology Section
Any good publication, regardless of topic, should include a comprehensive methodology section. Even if it’s not an academic paper, there should be a section describing how the author gathered the information they are presenting. It could be a simple reference list, a bibliography, or footnotes.
Somewhere in the document, there should be a section telling you where the information comes from. In this section, the publication addresses the who, what, when, where, why, and how of the research. This could include the sample size, the response rate, the timeframe, the setting, and other details of how the study was conducted.
If, however, the methodology section names only a narrow location or a very limited timeframe, or neglects to explain other crucial details, you should be very wary of the research. An even bigger red flag is a methodology section that is missing altogether.
Another hallmark of fake research is an over-generalized conclusion drawn from a small number of observations. For instance, the researchers may pick a handful of carefully selected candidates for their study and then draw a conclusion about an entire population. Ask yourself: does the population in the research resemble the population you are working with? Do the people you are working with have anything in common (education level, geography, gender, age) with the people presented in the research you’re reading? If they don’t, move on – you can’t easily apply this research to your clients or patients. And if the group in the research is highly specific, ask yourself whether the author could truthfully make such big, bold, generalizable claims based on so narrow a focus.
Aside from checking the methodology to see whether the scope of the study is large enough to support any conclusions at all, you should be wary of any big, bold claims a study purports to ‘prove’. If something seems too good (or bad) to be true, then, as any savvy internet user knows, it probably is.
Watch Out for Leading Questions
Whether it is financially, politically, or otherwise motivated, some research loses its credibility because it asks too many leading or loaded questions. An impartial, unbiased, and credible study asks open-ended questions that don’t steer respondents toward particular answers.
For instance, in a study about workplace behaviours, a fake study may ask employees, “What is it that you don’t like about your boss?” In this case, a dislike of the boss is already presumed. A better question would be, “How do you feel about your boss?” Likewise, asking, “How satisfied are you with your current role?” is a much more objective question than, “Why are you dissatisfied with your current role?” If a study draws its conclusions from this type of leading question, you should consider whether you can trust it.
Use Multiple, Reliable Sources
Just like we’ve learned to do with the news in recent years, we can apply the same lesson to sorting out fake research. If you come across a study in your email inbox or social media feed that you find interesting but whose publisher you aren’t familiar with, it behooves you to take time to look into the source. Have you ever heard of the journal that published the research? You can’t trust everything you read on the internet, so your best sources of information are always going to be well-regarded publications.
Besides using reliable sources, it also helps to cross-check multiple sources. Though this isn’t always possible for groundbreaking new studies, researchers will double-check each other’s work when a finding is significant enough. You should be able to find replications of the study reaching the same conclusion across multiple reliable sources. This is one of the best things you can do to make sure your information is credible.
Look for Biases
Though even the most reliable and trustworthy studies contain some inherent biases, some are definitely more biased than others. If someone went out of their way to create a fake study, chances are they are trying to push a certain agenda.
One way to look for bias in research is to ask whether the conclusions help advance the sponsor’s agenda. If you see weight-loss research paid for by a diet pill company, that should alert you to possible bias. Instead of seeking the truth, as science is meant to do, such studies serve to prove a point the company wants to make in order to sell its products.
In today’s age of information, we need to be wary of the spread of misinformation. It can be hard to discern truth from falsehood, especially when the fake material is designed to closely resemble genuine studies. However, by approaching every study with a critical eye and looking for biases, misrepresentations, and missing elements, you can spot fake research for yourself.
Remember: always double-check your information and use credible sources whenever possible. It’s up to us to stop the spread of dangerous misinformation, so if you receive a clickbait study in your inbox or see a colleague share one on social media, don’t be afraid to point it out and help them see the truth.
The battle against fake news and fake research alike starts and ends with us. The best weapon we have to fight against it is an awareness of its existence and the red flags we can use to identify it.