Research Citation Lifespan
Made an interesting discovery recently regarding research citations. Apparently, all research has an expiration date…sort of. As presented in my Master's program at CSU Global, citing research older than five years would not be viewed favorably by the Thesis Committee. Specifically, the professor offered this in the discussion forum: “Always strive to obtain reliable and valid sources within 5 years whenever possible.”
My response was that this may be a good general rule. However, in some cases, for example with computer science, the window is smaller. References related to coding practices and examples are frequently outdated after two years – less with web technology. In other cases, for example the work done by Taubes (2007), five years is way too short. Much of the critical research referenced by Taubes is from the 1950s and 1960s, with some reaching back into the 1800s.
A subsequent conversation shook out a number of exceptions to The Five Year Rule.
- It is preferable to use primary sources, regardless of how old they are.
- Citing primary (original) sources is appropriate when the research is definitive and no other original research has been done relating to the primary source.
The Five Year Rule was contrary to what I had learned as an undergraduate biochemistry student. But admittedly, my experience with writing rigorous research papers is a bit rusty, so things could have changed. I decided to explore a bit. What I discovered was interesting.
I queried a few of the chemist/scientist types I keep in contact with and their response to a five year limit on references was unanimous: They couldn’t do research with this limit and wouldn’t trust research that did. Looking at the references to research articles in several science journals, such as Science and Nature, was consistent with this view. A visit to the Public Library of Science and looking at the reference list to virtually any of the biology or medical journal research articles was also consistent with this view. Reference lists are lengthy and frequently go back to the 1970’s with a few references back as far as the 1940’s.
While these examples are from the hard sciences, I’m skeptical a philosophy student, for example, would be constrained from citing Aristotle in her thesis. So perhaps there is something different about “academic” research in general or educational research in particular that I’m not understanding. I haven’t been able to find anything as yet that describes the five year limit or the reasons behind such a limit.
What I did discover was an entire specialty (bibliometrics) devoted to the statistical analysis of research paper citations. Pendlebury (2008) describes the value of this research:
Citations, the references researchers append to their papers to show explicitly earlier work on which they have depended to conduct their own investigations, shows how others use a work in subsequent research. Tracking citations and understanding their trends in context is a key to evaluating the impact and influence of research. (The Development of Publication and Citation Analysis section, para. 1)
There is a lot of jargon in the literature, such as the ‘ISI Journal Impact Factor’ and ‘Google Scholar h-index,’ which in itself suggests a level of maturity in the field. Consequently, there is much more material than I have time to figure out. But I did find several references to the age of references as one among many indicators for measuring a research paper’s value. Far and away, however, the most important measure of a paper’s value or impact is the number of citations it has received, i.e. how frequently it has been referenced by other publications. Several sources proposed systems by which to score research papers based on the number of citations, the age of the citations, and the connectivity of the citations. These systems actually analyze the citation pathways through various research papers to determine the knowledge value within a paper.
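As an aside, the h-index mentioned above is one of the simpler of these metrics to pin down: a researcher (or journal) has index h if h of their papers have each been cited at least h times. A minimal Python sketch of that definition (the list of citation counts here is made up for illustration):

```python
def h_index(citations):
    """Return the largest h such that h papers have at least h citations each."""
    counts = sorted(citations, reverse=True)  # most-cited papers first
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:   # the paper at this rank still "supports" h = rank
            h = rank
        else:
            break           # once a paper falls below its rank, h can't grow
    return h

# Hypothetical example: five papers with these citation counts.
# Four of them have at least 4 citations, so the h-index is 4.
print(h_index([10, 8, 5, 4, 3]))
```

Age-of-reference and connectivity measures are considerably more involved, but they build on the same raw material: who cited whom, and when.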
The professor did offer one reference (Olhoff, 2011) which had the following to say:
Citations should usually be seven years old or less. A lot can change in seven years, so if your citations are older than that, your readers will wonder why. However, most topics have classic studies that are old, but regarded as seminal or critically important. All the researchers should build off those classic studies. If your topic has classic studies, you can and should cite them, even if they are older than seven years. (Kindle location 138)
The Five Year Rule in place at CSU Global, then, appears to be somewhat arbitrary and doesn’t exist in the wild as a definitive constraint. It is at best a guideline. Yet it was very clear the challenge I made to The Five Year Rule was novel within the context of CSU Global. I’m left wondering how many institutions of higher education have indoctrinated how many students with the belief this is a hard rule. It might explain a few other things I’ve noticed about the education field.
- Depending on how you look at it, The Five Year Rule either exacerbates or supports the “publish or perish” death march. If research work has a shelf life of five years before it’s relegated to the citation dust bin, the risk is that a researcher pressed for time will generate research that is less valuable, shallow, or perhaps even based on compromised data.
- The Five Year Rule prompts the reinvention of the wheel as older research is intentionally discarded. This might explain an observation that public education “reinvents” parts of itself every ten years or so and therefore frequently does little more than repeat mistakes of the past. As an educator colleague (now retired) recently expressed in exasperation with respect to the resurrection of one particular teaching practice, “We tried that in the 1970s and it didn’t work. Not something ‘like’ that, but THAT exactly. It failed then and it’s failing now.” The Five Year Rule doesn’t support or promote reconsidering older research in the light of new understanding. As a result, there is a lot of “new” educational research which is simply a repackaging of old ideas.
Olhoff, J. (2011). How to write a literature review [Kindle Android version]. Retrieved from Amazon.com
Pendlebury, D. A. (2008). Using bibliometrics in evaluating research. Philadelphia, PA: Thomson Reuters.
Taubes, G. (2007). Good calories, bad calories. New York, NY: Anchor Books.
(Note: I’m in the thick of the capstone project for my Master's. Consequently, posts will be few and far between until after the first of the year.)