Is university research worthless and meaningless to nearly everyone, including the researchers themselves?
In an argument for reforming colleges and universities to include less focus on research, Steven Pearlstein of the Washington Post cited a startling statistic to argue that virtually all academic research vanishes into the void without a trace:
In his new book "Higher Education in America," former Harvard president Derek Bok notes that 98 percent of articles published in the arts and humanities are never cited by another researcher. In social sciences, it is 75 percent. Even in the hard sciences, where 25 percent of articles are never cited, the average number of citations is between one and two.
A new book by a former Harvard president seems like it should be a credible source of information on this topic, but the shocking statistic turns out to be a rehash of a 1991 study of publications from 1984 that suffers from extreme methodological flaws — and that refuses to go away, in part due to revivals like Bok's.
Where this statistic comes from
The research on this issue comes from 1991, when the journal Science published an article analyzing data from the Institute for Scientific Information, a database that tracks citations in the most prestigious academic journals.
The analysis looked at publications from 1984 and counted whether each was cited within five years of publication. And it found the dismal results Pearlstein cites (by way of Bok) for the humanities: 98 percent of articles went uncited, as did 75 percent of articles in the social sciences.
But the data included all items published in journals, not just research. It also counted obituaries, letters to the editor, and meeting minutes, as David Pendlebury, a researcher at ISI, pointed out in a 1991 letter to the editor of Science critiquing the study. And those non-article items were particularly prevalent in the humanities, where they made up 69 percent of the "journal articles" in the citation index.
Using better social science methods makes social science research look better
In other words, some of the uncited "articles" weren't articles, and were probably never meant to be cited in the first place. Excluding those items, as Pendlebury did, didn't improve the results in the humanities much — 93 percent of humanities research was not cited in the first five years after publication — but it made a dramatic difference in the social sciences, where the percentage of uncited articles dropped to 45 percent.
Pendlebury pointed out that social science research appeared to be getting more relevant, not less: The share of articles that were not cited within the first five years after publication had actually dropped in the late 1980s.
In the humanities, 93 percent still seems awfully bad. But as Pendlebury and others have pointed out, books, not journal articles, are often the most relevant publications in those fields. And the share of articles that remained uncited after five years hadn't changed throughout the 1980s.
So the statistic doesn't just describe university research published more than 30 years ago, when many of today's faculty weren't yet publishing. It was also inaccurate — and in the case of the social sciences, dramatically so — even when it was first published.