It is hard to argue with the desire for anyone, anywhere, to be able to access and use the output of research. If building on the global body of research can deliver new insights into problems ranging from climate change to curing disease, that is something we can all strive to achieve. But what do we do when it looks like this goal gives rise to fake papers and a proliferation of uncertain results?
How we got here
The drive for openness from funders, such as cOAlition S and the OSTP among others, challenged publishers to come up with new models to pay for journal publishing. The introduction of the article processing charge moved the cost of providing content from the library to the researcher or, perhaps more likely, their grant funder. cOAlition S introduced the concept of the transformative journal – one with the goal of moving from subscription to 100% open access (OA) – and the transformative agreement. The latter was designed to help libraries drive the change by making agreements with publishers to pay for both subscriptions and article processing charges for a journal.
The requirement that authors pay for OA publishing to fulfill funding mandates drove the launch of new journals offering a pure OA publishing option. Shortly after individual OA journals were launched, we saw the introduction of dedicated open access publishers such as PLoS, and in time larger commercial publishers such as BMC, Hindawi and, more recently, MDPI and Frontiers. For many of these journals and publishers, success comes from publishing more content (i.e., more articles to generate more revenue). So the first driver of the huge increase in the number of papers came, inadvertently, from the drive for openness.
The rise of paper mills
Unfortunately, so-called paper mills saw an opportunity to make money by placing content in these journals, paid by researchers who needed a published paper but lacked the time or resources to do the work. In 2022,1 Maverick partnered with the Committee on Publication Ethics (COPE) and STM on a paper that researched some of the drivers behind these companies and how they get around journals' normal peer review processes. The findings raise concerns throughout the scholarly communication process that papers that are freely available to read and use may not represent valid research.
One result of paper mill activity is articles being retracted in significant numbers.2,3,4,5 Publishers are increasingly employing dedicated teams and sophisticated technology to identify fake papers, authors, or peer reviewers.
Predatory publishers
In addition to fake articles, we have also seen the relentless rise of the fake publisher, or so-called predatory publisher. These publishers solicit legitimate content from valid researchers, promising peer review and dissemination in return for an article processing fee. Often their journals look like well-known titles but with slightly different wording. Researchers who submit their work generally find it published immediately, without peer review, once they have paid their fee, in journals that are not recognized by their institution, funder, or peers. There are guides6 available to help authors avoid predatory publishers, but they can be hard to spot.
Preprint Servers
Lastly, we have seen the rise of preprint servers.7 These repositories allow researchers to post early versions of their articles either before or as they submit them to a journal. Preprints can be helpful in getting research out to the community quickly, but they are generally not peer reviewed and as such have not been vetted.
Preprint servers play a role in the scholarly publishing cycle. They can help foster communication about research efforts under way, as occurred during the pandemic. Repositories can also provide a home for researchers who need to make their work open and available to others. However, such early access to research that has not been peer reviewed can lead to misinformation and cause confusion if early results are taken out of context. The Directory of Open Access Repositories (OpenDOAR) is the quality-assured global directory of academic open access repositories, enabling users to identify, browse, and search for repositories based on a range of features such as location, software, or type of material held.
Solving the problem
So how does a researcher know whether an article they find on the internet is real: grounded in research that was undertaken as described, and vetted by experts who can advise whether the results are valid and a useful contribution to the body of knowledge? Or, to put it another way, how does a researcher know whether an article they are reading is fake in some way (e.g., fabricated data, manipulated images, or complete fabrication) or has not been vetted by an expert?
Considerable effort is going into finding ways to answer these questions. Increasingly, publishers are investing in dedicated teams of content experts to review any papers flagged as suspect. Technology is also being brought to bear: much like plagiarism-checking software, new tools can process papers and flag areas of concern suggesting that a paper might not be genuine.8,9
These efforts will certainly help to reduce the chance of such papers entering the scholarly record. There will also need to be new ways to indicate whether a published article has been vetted by a machine and/or a human expert, and ideally to show the peer reviewers' comments on the paper.
However, probably at least as important is the need to understand and change the drivers pushing researchers to publish more and more papers. Publishers, academic institutions, and research funders need to work together to find new ways to measure the value of research output. Current methods focus narrowly on where a paper is published and how many people have cited it in their work. But the impact of a piece of research is so much more than a research article. How the research was conceived and undertaken (i.e., the protocols employed) is as important as the data it produced. A new way to measure a researcher's work would consider all of these elements, along with the peer review reports on the output.
In forthcoming posts, Maverick will explore research integrity from the perspective of the researcher, its relationship to technology, and conclude with a deep dive on paper mills.
Maverick offers a program of research integrity services to help publishers achieve and maintain best practices. It can help publishers operationalize research integrity to ensure safeguards are integrated throughout the workflow, from manuscript submission and peer review to publication and data management. For a free consultation, contact your Maverick representative or email info@maverick-os.com.
Download Maverick’s Research Integrity service sheet.
By Jayne Marks, Maverick Senior Associate
Jayne Marks brings over 40 years of scholarly publishing experience to Maverick. She has worked at senior levels in a variety of companies helping to devise and deliver on business strategies tailored for different markets. Throughout her career Jayne has responded to ever changing market environments by developing new product, sales, or content strategies to maximize new opportunities. Jayne’s primary focus has been on understanding the needs of the customers and markets that her products serve and ensuring they evolve to meet changing needs.
Further reading
Highlights from STM’s Research Integrity Master Class
Answer These Five Questions to Ensure an Effective Research Integrity Strategy
1 https://publicationethics.org/resources/research/paper-mills-research
2 https://retractionwatch.com/2023/04/05/wiley-and-hindawi-to-retract-1200-more-papers-for-compromised-peer-review/
3 https://retractionwatch.com/2020/09/17/publisher-retracts-nearly-two-dozen-articles-blocks-nearly-three-dozen-more-from-alias-employing-author-who-plagiarized/
4 https://retractionwatch.com/2014/02/24/springer-ieee-withdrawing-more-than-120-nonsense-papers/
5 https://retractionwatch.com/2022/08/03/exclusive-plos-one-to-retract-more-than-100-papers-for-manipulated-peer-review/
6 https://thinkchecksubmit.org/
7 https://www.surrey.ac.uk/library/open-research/preprints
8 https://www.stm-assoc.org/papermillchecker/
9 https://www.digital-science.com/product/ripeta/