
December 5, 2017

Key trends in Open Access

The ninth annual Conference on Open Access Scholarly Publishing was held in Lisbon, Portugal, in late September. The conference brought together 180 delegates, including funders, publishers, academicians, librarians, technologists, and government policy makers interested in open access publishing. I had three key takeaways from the meeting.

1. Open Access Publishing is evolving as part of an Open Science Movement.

2. Web and software technology is a critical enabler, but it is not the prime driver of change.

3. Resistance to open access publishing is eroding at an increasingly rapid pace. However, cultural resistance to open access publishing and open science remains real within select groups.

Open Access publishing and Open Science

Jean-Claude Burgelman, European Commission, set the tone for the conference with an opening keynote describing the European Union (EU) policies on open access publishing, the timing for 2020 requirements across Europe, and the commission’s H2020 initiative for a publishing platform: Open Research Europe. Open Research Europe will be an EU-centric, Open Science megajournal.

The open access megajournal concept is not new; rather, it is an endorsement of the approach the Public Library of Science initiated in the early 2000s. The concept, open access publishing for governmentally funded research, rests on the tenet that it is good for the advancement of science and the scientific community overall, and on a prediction that in 10 years all science will be open access. The EU commission expects a shift from traditional science toward more inductive science, in which data is published and readily available for generating new hypotheses. My take – this is Open Science, with open support driving it at a governmental level.

I found it interesting that the start-up for Open Research Europe has quite a long ramp-up time. The reasons cited were organizational and participant factors, which Burgelman justified with an old proverb.

If you want to go fast, go alone.
If you want to go far, go together.

It reminded me of the natural tensions that exist among scientists with their passion for discovery and credit, commercial entities like publishers and research-oriented industries, and funders (private and governmental) who want to see the maximum gain from their investments. I know of no scientists who would turn down having the cover article in Nature or Science, or being noted as “first” in a line of discovery, even though their work often builds on work done by others. I know many scientists who realize that compelling research increasingly requires a suite of technical and analytic skills best handled by a team. The way we do science is evolving.

The Web and software as enablers

Burgelman emphasized this point in his keynote by stating that we can do things today on the web that we could not do 15 years ago. Nearly every speaker thereafter had some story to tell about new tools, new services, introducing social networking, or developing new methods of sharing scientific information. The novelty was not in the tools or the XML or the platform. The novelty each emphasized was in what the tools, the XML, and the platforms allow one to do, and how quickly one can do it.

Transitional forces external to pure science have helped mold the direction science is taking. Google, Facebook, and Apple smartphones and tablets have forever changed what it means to be connected and how to communicate. But these technological influences cannot replace the value of human intellect in assessment. Martin Quack (ETH Zürich) emphasized the importance of involving trusted experts in the evaluation of science rather than relying on a journal’s high rejection rate as an indicator of quality.

Web enablement is both an addictive and an evolutionary force for science. Software algorithms for search guide scientists to new data, new work, and fresh conclusions. They also breed a form of impatience. Who among us has not been afflicted by ‘clickitis’, that sudden urge to abandon a link because it is not resolving fast enough? All of us have seen the increase in bibliometric data designed to help sift what is more important from what is less important. My take is that technology is enabling the move to open science, a move I feel strongly positive about, but it is not without risk. Our use of technology is still in its infancy when it comes to ensuring we recognize and reward good science.

Cultural resistance to Open Science

I grew up in academia and trained as a research biologist. It was clear during my training that academicians were interested in sharing data, but not if it negatively affected a professor’s chances for tenure or other academic rewards. Danny Kingsley (University of Cambridge) brought back a flood of memories about how senior academics are slow to embrace the change to open science because the current system has rewarded, and continues to reward, them. This puts tremendous pressure on early career researchers; if not addressed effectively at the institutional level, it threatens their ability to secure funding.

Change is coming in several forms. The most visible example is the growing number of preprint servers. It is likely many publications will never go beyond being available on a preprint server. Preprint servers are good for open science and good for early career researchers as a path toward recognition and reward.

My take is that cultural resistance to open science in academia is a difficult underlying reality affecting the open science initiative, and one that will take time to resolve. It is a generational issue born of a successful practice-and-reward system, and it is influenced by the degree to which outside factors (government and funder decisions, and technology enablement) are putting pressure on institutions.

I am optimistic we are on the right path, and COASP 2017 was an excellent forum for bringing these issues into focus.

About Greg Suprock

Greg is Head of Solutions Architecture at Apex. He has over 20 years of experience in XML workflows, content management, web application development, and prepress. Greg excels at collaborative efforts to achieve project and business goals. He has developed XML workflows for the Public Library of Science, HighWire Press, The Library of Congress, and many more.
