The current appetite for STM research and related content is insatiable; the market is hungry for more: more articles, released faster, with no- or low-cost access for more users via more platforms. If organizations don’t accelerate their processes, they lose subscribers or members to Google search. If researchers don’t publish quickly, they forgo citations and visibility. It’s a considerable challenge to produce a steeply increasing number of publications rapidly without a corresponding increase in spend.
How can societies, associations and other publishers increase the volume and velocity of their offerings without sacrificing quality or blowing their budgets? By improving efficiency and doing more with available resources.
Improving efficiency is an obvious answer, but what does that mean in practice? Re-engineering workflows and business processes? Increasing use of automation? Reconfiguring content management systems? Measuring performance to ensure compliance with best practices? Improving efficiency could entail any of these, but organizations must begin the conversation by understanding their publishing mandate and strategic goals, and then evaluating whether they have the means to succeed.
Leaders must define the value proposition and strategy for their publishing activity and assess whether the infrastructure supports that mandate and is scalable. To do this, it’s necessary to consider the organization’s existing structure, processes, and technology. Many societies have moved to primarily digital development and delivery of content, but they’re still using legacy processes and tools associated with lower-volume, slower-moving print formats. Bottlenecks are inevitable when production is designed for print but the product is destined for online consumption. For example, repeated cycles of Quality Control (QC) typically clog the pipeline instead of adding value. Flawless quality mattered more when the cost of correction involved long lead times and downstream services such as printing and shipping.
Looking at process, organizational design and tools in that order, what should publishers consider when transforming to increase efficiency? Here are some suggestions to get the ball rolling.
What is the process, the content journey?
A deep dive into processes and workflows should focus on value. Map the steps in the existing process and ask of each one: does this step or activity add value? Is it necessary? At every step, is the content being transformed in a way that adds value to the publication? Deprecate or sideline anything extraneous to that value by eliminating rework and reducing touchpoints. Create a streamlined critical path and an exception process to accommodate necessary deviations. Try to avoid creating multiple unique processes.
Who does what when on the content journey?
Ensure you’re making appropriate use of the expertise and experience of staff and vendors, including freelancers. Create capacity by reducing touchpoints, outsourcing or automating some activities, and increasing the use of technology.
Are you managing the peer review process, or do reviewers have à la carte, infinite time-loop choices? Limit peer reviewers’ veto power and move to collaborative peer commentary using any of the dozens of content editing tools available. Clearly define what is subject to peer comment (content) and what is not (font choices). Enforce deadlines in the review period.
What technology tools are in use to produce, deliver, and store the content?
Are the tools utilized to their full functionality? Often, organizations build or license tools that are only partially used. Explore programs and scripts to automate repetitive tasks in content creation and production. Can your content management system (CMS) drive content through the process and reduce manual activity? Perhaps you could expand the use of XML to structure content for delivery to multiple platforms, improve discoverability, automate publication, and archive content assets.
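To make the idea of scripting repetitive production tasks concrete, here is a minimal sketch that checks a folder of article XML files for required metadata before they move to the next stage. It is an illustration under stated assumptions, not a prescribed tool: the element names (loosely JATS-like), file locations, and command-line usage are placeholders to be replaced with whatever your own schema and pipeline use.

```python
# Minimal sketch: flag article XML files that are missing required metadata.
# Element names below are illustrative (loosely JATS-like); substitute the
# elements your own schema actually uses.
import sys
import xml.etree.ElementTree as ET
from pathlib import Path

REQUIRED_PATHS = [
    ".//article-title",   # assumption: title element in your schema
    ".//contrib-group",    # assumption: author/contributor block
    ".//abstract",         # assumption: abstract element
    ".//pub-date",         # assumption: publication date
]

def check_article(xml_file: Path) -> list[str]:
    """Return the required elements missing from one article file."""
    root = ET.parse(xml_file).getroot()
    return [p for p in REQUIRED_PATHS if root.find(p) is None]

def main(folder: str) -> int:
    problems = 0
    for xml_file in sorted(Path(folder).glob("*.xml")):
        missing = check_article(xml_file)
        if missing:
            problems += 1
            print(f"{xml_file.name}: missing {', '.join(missing)}")
    return 0 if problems == 0 else 1

if __name__ == "__main__":
    # Hypothetical usage: python check_metadata.py path/to/articles
    sys.exit(main(sys.argv[1] if len(sys.argv) > 1 else "."))
```

A check like this takes seconds to run on an entire issue’s worth of files, which is exactly the kind of repetitive verification that clogs a pipeline when it is done by hand.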
Most societies would benefit from a release-and-correct approach, which could look very broadly like this:
- Content assessment is done with automated linguistic analysis.
- Initial “machine edit” pass is reviewed by people with expertise, who bring their judgment to bear only where it’s required.
- Peer review is carried out by a collaborating panel, time-boxed with a hard stop.
- Aggregated metadata enables searchable, discoverable content.
- Releases are continuous and automatic after a program has performed scripted QC (a sketch of such a check follows below).
- A customer-facing user interface enables direct, real-time feedback for corrections and necessary improvements.
This high-level process supports nimble, fluid, continuous publishing that’s both efficient and scalable.
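To make the scripted QC step in the list above concrete, here is a minimal sketch of the sort of mechanical checks a program might run before releasing an article automatically, holding it for human attention only when a check fails. The specific checks, patterns, and field names are illustrative assumptions, not a prescribed rule set; a real pipeline would encode your organization’s own acceptance criteria.

```python
# Minimal sketch of scripted QC gating an automated release: a few mechanical
# checks run by a program, with people involved only when a check fails.
# The checks, patterns, and fields are assumptions for illustration.
import re
from dataclasses import dataclass

@dataclass
class Article:
    doi: str
    title: str
    body_text: str

DOI_PATTERN = re.compile(r"^10\.\d{4,9}/\S+$")        # basic DOI shape
PLACEHOLDERS = ("TBD", "XXX", "lorem ipsum")           # leftover placeholder text

def qc_checks(article: Article) -> list[str]:
    """Run scripted QC and return a list of failures (empty list = release)."""
    failures = []
    if not DOI_PATTERN.match(article.doi):
        failures.append(f"DOI looks malformed: {article.doi!r}")
    if not article.title.strip():
        failures.append("Title is empty")
    lowered = article.body_text.lower()
    if any(p.lower() in lowered for p in PLACEHOLDERS):
        failures.append("Placeholder text found in body")
    return failures

def release_if_clean(article: Article) -> bool:
    """Release automatically when all checks pass; otherwise flag for a person."""
    failures = qc_checks(article)
    if failures:
        print("Held for review:", "; ".join(failures))
        return False
    print(f"Released {article.doi} automatically")
    return True

if __name__ == "__main__":
    sample = Article(doi="10.1234/example.5678",
                     title="A sample article",
                     body_text="Full text without placeholders.")
    release_if_clean(sample)
```

The design point is that routine releases require no human touch at all, while exceptions surface immediately with a reason attached; that is what keeps the pipeline free-flowing and scalable.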
It’s vital that you share the purpose of your exploration with staff and trusted service providers. Those closest to the work will have the best ideas of where efficiencies can be gained. Transparency is important in order to get enthusiastic buy-in from stakeholders when the time comes to make changes. Although some investment in technology tools may be necessary upfront, the best ROI will be realized with a free-flowing pipeline and continuous improvement driven by contributions from all stakeholders. You’ll be maximizing resources to publish at pace.
Maverick offers a suite of services for societies and associations. Learn more here.
By Alison Maclean, Senior Associate
Alison Maclean is a publishing operations executive with international experience in concept-to-delivery editorial and content management in trade, education, reference and professional publishing. Most recently, she held the position of VP, Content Enablement at Wiley. Prior to that, she was a tenured professor and program director of Creative Book Publishing at Humber College (Ontario).