June 2014
Science in the golden shackles of its imaginary impacts
by Muru Venkatapathi.
The impact of science on our daily lives is ubiquitous, but tracing parts of this
impact back to individual works of science with reasonable certainty is becoming impossible
in most cases. Nevertheless, considering that we spend a notable part of our financial
commons (or GDP) on scientific endeavors, such a microscopic estimate of the impact
of each scientific work seems unavoidable. Even in the case of a work in the fundamental
sciences that is far from any immediate use, an estimate of its impact on our
knowledge is quite pertinent. As a first approximation, measuring the impacts of science
has been relegated to the quantification of citation-impacts.
Let us start with the broad agreements among the scientific community on this issue:
1) Measuring impacts is necessary.
2) Citations earned by a scientific work have a positive correlation with its actual technological and scientific impacts.
3) Citation-based indicators are far from perfect, primarily due to the uncertainty in the relationship between real impacts and citation impacts (notwithstanding any advanced processing of citation data).
The strong disagreements arise over the consequences of (3), especially its long-term
effect on the way we do science [1-
Does every citation indicate an identical impact? Does this fallacy result in a folly?
When one attempts to derive metrics for scientific impacts from citations, the following issues should be pondered.
A] Grades of citation: A citation earned by a scientific work indicates one of three kinds of contributions to the citing publication. The first and most notable kind is a contribution to the methods used in the citing work; the second is a relevant work with comparable/contradictory results; and the third is a related work used to highlight either the historical antecedents or the contemporary significance of the citing work. The first kind is enumerated in the methods and introductory sections of a manuscript, whereas the second kind is typically found in the introduction and results/discussion. The third kind is limited to the introductory section of a manuscript. It is thus natural to require that citations be distinguished by their graded relevance to the citing work, as their real contributions may differ by orders of magnitude. On average, fewer than 20% of the references of a typical manuscript are unique, indispensable citations, and the fraction is even smaller in the applied areas of science.
B] Methods matter: Note also that the current practices of highly visible journals (described in the next section) explicitly discourage a detailed description/verification of methods, replacing it with longer introductory sections and more plots of the results. The questionable justification is that many of today's methods are eventually repeated in the prolific publication of increments, and also that they do not appeal to a wider readership. These factors introduce a large bias against manuscripts describing new, essential analytical/experimental methods that are fundamental and general.
C] Citations can be inherited: Even before the era of search engines, it was shown
that indicators like citations have the characteristics of a greedy propagator,
i.e., the effect of the rich getting richer [6], making the advisors/co-authors of
prominent scientists inheritors of citations that the cited work did not earn on its own.
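The rich-get-richer dynamic can be sketched with a minimal preferential-attachment simulation (an assumption-laden toy model, not the experiment of [6]): every paper here has identical merit by construction, yet each new citation goes to an existing paper with probability proportional to the citations it already holds, so early leaders accumulate a disproportionate share.

```python
import random

def simulate(n_new_papers=1000, seed=42):
    """Toy preferential attachment: citations beget citations."""
    random.seed(seed)
    counts = [1, 1]  # two seed papers, one nominal citation each
    for _ in range(n_new_papers):
        # Pick an existing paper with probability proportional to its count.
        r = random.uniform(0, sum(counts))
        cum = 0.0
        for i, c in enumerate(counts):
            cum += c
            if r <= cum:
                counts[i] += 1
                break
        counts.append(1)  # the new paper enters with one nominal citation
    return sorted(counts, reverse=True)

counts = simulate()
top_share = sum(counts[:50]) / sum(counts)
print(f"top ~5% of papers hold {top_share:.0%} of all citations")
```

Despite identical "quality", the citation distribution becomes heavily skewed toward the earliest entrants, which is the inheritance effect the paragraph describes.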
D] The more the authors, the merrier: One of the most glaring faults in the current
indicators is that the total citation-count of a publication is credited in full to
each of its authors, regardless of how many authors there are.
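One simple remedy, shown here as a sketch with invented numbers, is fractional counting: dividing a publication's citations among its authors rather than crediting every author in full.

```python
# Invented example: two papers with equal citations but very different
# author counts. Under full credit they are indistinguishable per author.
papers = [
    # (citations, number_of_authors)
    (100, 2),
    (100, 25),
]

full_credit = [c for c, _ in papers]      # each author credited with all 100
fractional = [c / n for c, n in papers]   # per-author share

print(full_credit)  # [100, 100] -- current practice
print(fractional)   # [50.0, 4.0] -- per-author shares differ by over 10x
```

Fractional counting is only one possible convention (harmonic or first/last-author weightings are others); the point is that ignoring the divisor entirely inflates every indicator with the author count.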
E] Quality of citations: Recently, there has been an effort to include the apparent quality of a citing publication in determining the impact of a work. In principle, this can be done using the citation data, provided the pitfalls A, C, and D are sufficiently addressed. If these are allowed to linger, impact indicators based on advanced data-processing techniques can only enlarge those lacunae.
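One common way to weight citations by the quality of the citer (a sketch of the general recursive idea, not any specific indicator endorsed by the text) is a PageRank-style iteration: a citation from a high-scoring paper is worth more than one from a low-scoring paper, and the scores are found by iterating to a fixed point.

```python
def quality_scores(cites, n, damping=0.85, iters=50):
    """Recursive quality weighting over a citation graph.

    cites: list of (citing, cited) pairs over papers 0..n-1.
    """
    out_deg = [0] * n
    for src, _ in cites:
        out_deg[src] += 1
    score = [1.0 / n] * n
    for _ in range(iters):
        new = [(1 - damping) / n] * n
        for src, dst in cites:
            # A citation transfers credit proportional to the citer's score.
            new[dst] += damping * score[src] / out_deg[src]
        score = new
    return score

# Papers 1 and 2 differ in WHO cites them: 1 is cited by the well-cited
# paper 0, while 2 is cited by nobody.
edges = [(2, 0), (3, 0), (0, 1), (3, 1)]
s = quality_scores(edges, 4)
print(s[1] > s[2])  # True: paper 1 benefits from its high-quality citer
```

Note that such recursive schemes inherit, and can amplify, the pitfalls A, C, and D above: if the underlying citation data is biased, the iteration propagates the bias through the whole graph.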
F] Blind spot of industrial impacts: One other glaring deficiency of citation-impacts is that an industry using the work reported in a publication has a large disincentive to cite that work, since doing so could reveal its trade secrets.
Conflicts of interest: Science vs. the Journal
Monetizing scientific publications has created a necessity to make journals
highly visible. It is in the interest of the scientific community that such parochial
interests do not trump the larger interests of science. Unfortunately, a high standard of
science does not necessarily correlate notably with a wide readership (which is what
high visibility and citation impacts require). Large increases in the number of doctoral
students and publications, along with this need for journals to distinguish themselves,
have severely stressed the peer-review system.
It is a system designed to publish manuscripts that appeal even to
people who may not understand their contents sufficiently. Based on such
superficial understanding, a vicious cycle of inflation in the publications on any
subject, along with their citation-impacts, can take hold.
Above all, these practices and their negative consequences have been justified
by the imaginary impacts enumerated through the citations accrued to journals. But the
actual signal-to-noise ratio of such publications suggests otherwise.
Finally, a specific example of the uncoupling of citation-impacts from real impacts may be found in [8,9].
References:
1. San Francisco Declaration on Research Assessment (DORA), www.ascb.org/SFdeclaration.html.
2. Luís A. Nunes Amaral, “Measuring Impact: Scientists must find a way to estimate the seemingly immeasurable impact of their research efforts,” The Scientist (Opinion), February 24, 2014.
3. George A. Lozano, Vincent Larivière and Yves Gingras, “The weakening relationship between the Impact Factor and papers’ citations in the digital age,” Journal of the American Society for Information Science and Technology 63, 2140–2145 (2012).
4. Richard Naftalin, “Rethinking Scientific Evaluation: Asymmetry in the Research Excellence Framework in the U.K. is a threat to basic medical sciences within British medical schools”, The Scientist (Opinion), July 16, 2013.
5. Orion Penner, Raj K. Pan, Alexander M. Petersen, Kimmo Kaski, and Santo Fortunato, “On the Predictability of Future Impact in Science”, Scientific Reports 3, 3052 (2013).
6. Matthew J. Salganik, Peter Sheridan Dodds, and Duncan J. Watts, “Experimental study of inequality and unpredictability in an artificial cultural market,” Science 311, 854-
7. Steen RG, Casadevall A, Fang FC, “Why Has the Number of Scientific Retractions Increased?,” PLoS ONE 8(7): e68397. doi:10.1371/journal.pone.0068397 (2013).
8. D. Pines and D. Bohm, “A collective description of electron interactions: I and II,” Physical Review 82, 625-
9. Douglas N. Arnold and Kristine K. Fowler, “Nefarious Numbers,” arXiv:1010.0278 (2010).
I don’t mind if you think slowly, but I object if you write papers faster than you can think.
- Wolfgang Pauli
It was very easy in those days for any second-rate physicist to do first-rate work.
- P. A. M. Dirac