The month of September was an eventful time in academia. Peer Review Week kept us on our toes, and we were delighted with the great response to our articles and updates about it. But that didn’t keep us from reading about other interesting developments in academic publishing. In this month’s roundup, we have a mixed bag that includes the controversial redefinition of clinical trials, the s-index, a proposed new metric that would measure how many times a scientist cites their own work, and more. Read on!
1. Is a new index needed to measure self-citation? Citing one's previous work is a common practice among researchers. But is there a need for an index to measure self-citation? According to a paper published in Publications by Justin Flatt, a postdoc at the University of Zurich, an s-index should be introduced to measure and report the extent to which an author self-cites. Flatt and his co-authors argue that researchers use self-citations excessively to promote themselves, and that this can have "a negative impact on the scientific workforce." To curb this practice and bring more transparency, they believe the s-index is essential. Phil Davis, a publishing consultant specializing in the statistical analysis of citations, has raised some questions about the need for a new index. He says that the intentions behind self-citation are varied, which would make an s-index score difficult to interpret. "It is puzzling that Flatt argues for a separate s-index over the option to calculate an h-score without self-citation," states Davis. He further adds that it remains unclear how the index would curb the abuse of self-citation, and suggests a more "rigorous curation of what gets indexed and included in the citation dataset" as the solution to this problem.
2. Delays do not end with a manuscript's publication: One of the biggest woes of researchers is the delays they face after manuscript submission. So it is assumed that their worries end once the manuscript is published. However, Elizabeth Gadd, the Research Policy Manager (Publications) at Loughborough University, discusses the problems authors face even after the manuscript is published. Relating her personal experience, she mentions three major challenges she faced: 1] The copyright policies of journals, even those with an open access model, sometimes make it difficult for authors to allow their work to be shared freely. Gadd found that she was unable to permit the republishing of parts of her paper without violating her journal's copyright agreement. 2] Journals may introduce errors into your publications. Gadd expresses her displeasure at the typographical errors she noticed in her paper that were introduced by the publisher. She points out that these revisions were not highlighted for her approval. 3] Delays between the online publication of a paper and its inclusion in a journal issue can affect the publication's visibility and citations. Gadd writes that until her paper was allocated to an issue, it could not be indexed; according to her, this can impact the author as well as the journal. She concludes her post by expressing empathy for academics who inevitably face several challenges both before and after their work is published, and encourages the academic community to fix these issues.
3. What makes science work: You must have read a lot about irreproducibility, but have you watched a video on it? Here's a great video titled "What makes science work?" in which the idea of building a solid body of scientific knowledge is conveyed through some thought-provoking visuals and some great interview snippets with publishing industry professionals. Watch the part where Trevor Butterworth explains the concept of reproducibility using a simple example. Imagine you drive to work every morning. You get in your car, turn the ignition on, and the engine comes to life. By now this is an everyday fact: no matter what, when you turn the ignition on, the car starts. This is based on the basic laws of internal combustion, which will work today, tomorrow, and in the future. This is what the reproducibility of science achieves - the ability to ensure that results are replicated so that they can be applied to finding useful solutions for humanity. The video goes on to show why science is becoming more and more irreproducible (one reason, for example, is researchers' fear of publishing negative results and facing rejection because of them) and suggests some simple solutions that we could follow as responsible members of the scholarly community.
4. The need for education for local Asian journal editors: Dr. Hyungsun Kim, President of the Korean Society for Science Editors and Secretary General of the Council of Asian Science Editors, recently published an interesting piece about "Education for local Asian journal editors." Even though the number of research papers published by Asian authors in local and international journals is on the rise, the citation numbers for these papers aren't high. Also, local academic journals in Asia are commonly published by academic societies, universities, or research institutes. As such, Asian journal editors need to increase their efforts to promote the visibility and accessibility of their publications. However, this is not easy because of some obstacles. First, the fact that English is not the first or official language in many Asian countries continues to pose challenges to publishing professionals from the region. Additionally, many smaller publishers and manuscript editors (MEs) aren't up to date with the latest and best practices in digital publishing. Given these challenges and the urgent need to bring about improvements, Dr. Kim emphasizes the need for training, orienting, and supporting MEs and talks about some of the initiatives being undertaken in this direction. Dr. Kim's passion for the cause of academic publishing in Asia is evident, and we realize that perhaps we need more such voices to support these efforts.
5. The idea of a new preprint server receives mixed responses: Medical researchers at Yale University and Yale School of Medicine are preparing to launch a preprint server, called MedArXiv, where they can post unreviewed results from clinical trials. The plan, presented at the Eighth International Congress on Peer Review and Scientific Publication, has, however, received a mixed response. Many in the medical community are skeptical about the idea, as they fear that such papers might sway clinical practice, or prompt patients to try treatments on their own, before reviewers can vet the findings. arXiv, which hosts physics preprints, and bioRxiv, a repository for the life sciences, are quite popular, so some medical researchers feel that it’s time for the medical community to jump on the bandwagon, as it would speed up research. However, MedArXiv will have a hard time attracting preprints if mainstream medical journal editors decide they won’t publish final versions of the papers. Currently, The BMJ and The Lancet are among the few medical journals that have explicitly said that posting a preprint doesn’t preclude publication, but at the JAMA Network, which publishes a dozen journals, the issue is hotly debated. “There are very strong opinions on both sides,” says Howard Bauchner, editor-in-chief of JAMA. “I suspect we’ll consider papers posted on preprint servers, but we’ll discourage it.”
6. Clinical trials redefined: The term ‘clinical trial’ has traditionally described a study that tests drugs or behavioral therapies in people. But new government rules aimed at increasing the transparency of studies involving people have unsettled researchers. The U.S. National Institutes of Health (NIH) has revised the definition of ‘clinical trials’ and intends to put it into effect starting January 2018. The rules would apply only to research funded by the agency. Under the new definition, the interpretation of ‘intervention’ and ‘outcome’ could apply even to studies of basic biological mechanisms. By this logic, almost all basic research counts as a clinical trial. Although these changes were in the offing for the past three years, not many researchers knew about them. Over 3,500 researchers have signed open letters to NIH director Francis Collins to highlight the need for input from researchers before implementing regulations that have such a huge impact. According to NIH officials, the new rules were designed to make sure that researchers do not needlessly duplicate results.
We’d love to hear from you. Do write to us and tell us if you found these posts interesting. You can also share any thoughts you may have in the comments section below. If you’d like to stay updated about events in the scholarly publishing industry, follow our Industry News segment, where we share regular updates.