scientific legitimacy in publishing
in Op-ed / Politics / Science
Science, and climate science in particular, has long been at the center of what, post US election, is being described as fake news. Fake news or "post-truth" (more honestly, plain lies) has been shaping the discussion around climate change for years. Over the past years the scale of fake news has grown, and with it mainstream media outlets have lost authority and trust.
This flood of fake news is at its core a form of obfuscation. Obfuscation aims to hide a true message or signal by increasing the noise on the same channel. It clutters the news sphere with a false equivalency: that all information sources, regardless of quality, merit equal weight. The fake-news tactics that have dominated public science discussions are now slowly shifting to the formal academic world of scientific publishing, as fake open access (science) journals become more common.
Over the past few years there has been a push for open access journals. Open access journals rely on academics to pay for the final publishing of the journal article, rather than asking for exorbitant access fees post publication. Although promising in terms of free access to scientific work, the push for open access has led to a flourishing business of shady journals, facilitated by the publish-or-perish culture in academia. As with fake news, fake academic journals and fake science obfuscate valid research results by increasing the number of low-quality research publications one has to wade through.
For example, the journal "Expert Opinion on Environmental Biology" seems like a respectable, if not high-flying, journal with an impact factor of 4.22 (above average in ecology). However, the devil is in the details, as the attached footnote reads:
*Unofficial 2015 Journal Impact Factor was established by dividing the number of articles published in 2013 and 2014 with the number of times they are cited in 2015 based on Google search and the Scholar Citation Index database. If ‘X’ is the total number of articles published in 2013 and 2014, and ‘Y’ is the number of times these articles were cited in indexed journals during 2015 than, impact factor = Y/X
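To make the footnote's arithmetic concrete, here is a minimal sketch of the "unofficial" Y/X calculation it describes. The article and citation counts below are hypothetical, chosen only to illustrate how a self-reported 4.22 could be produced; they are not the journal's actual numbers.

```python
# Sketch of the "unofficial" impact factor described in the footnote:
# X = articles published in 2013 and 2014,
# Y = citations to those articles during 2015,
# impact factor = Y / X.
# The counts below are hypothetical, for illustration only.

articles_2013_2014 = 50   # X: articles published in 2013 and 2014
citations_2015 = 211      # Y: 2015 citations to those articles

impact_factor = citations_2015 / articles_2013_2014
print(round(impact_factor, 2))  # → 4.22
```

Note that nothing in such a self-calculated figure is audited: the citation counts come from the publisher's own Google searches, not from an independent index.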
Generally, journals use citation indices, or impact factors, to indicate their visibility within the academic community. Legitimate journals are mostly listed by the Institute for Scientific Information (currently ISI Web of Knowledge) and summarized in a yearly Science Citation Index report. Most fake journals can't establish these credentials and therefore trick scientists by publishing fake numbers (( What's more, when searching the web for ISI one easily comes across imposters as well. Here the service International Scientific Indexing (or ISIndexing.com; the name is well chosen) offers a service focused on "… to increase the visibility and ease of use of open access scientific and scholarly journals." )). Although such a journal might still contain valid and good research, the tactics used do not instill trust.
More alarming than the profiteering from desperate scientists who chase metrics, and the resulting obfuscation, is a recent trend of acquisitions of more respected journals by fake academic publishers. Here the tactic is to buy small legitimate journals and intersperse them with their lesser variety, borrowing trust. Not only will these mergers make it harder to distinguish good journals from bad ones, they will also increase the chances of low-quality peer review, as solid science was never the motive of these predatory publishers. If this is indeed a new trend, the question remains how to safeguard the scientific legitimacy of open access journals, and of science in general, and what format to use.
I would argue that to solve the issue of shady open access journals we need even more radical openness in science. If one is forced to publish data and code (or at least links describing how to obtain the data from third-party sources), it becomes easier to separate quality research from publications containing nothing but random noise.
The time invested in a fake research article becomes significantly larger, discouraging abuse. In addition, it will force people into good data management, as ugly code and data structures reflect badly on the scientist as well. Furthermore, since all pieces of the research are available, it will also address issues of reproducibility and inter-comparison of research results. Finally, I would argue that similar practices could be used in conventional journalism: reporting all raw data used, sources (where this does not endanger lives) and statistics (if applicable). Transparency is the only way forward in an age of fake news and fake science; a lack of it should be regarded as suspicious.