Writing wrongs

Article by David Margetts

Digitalisation progress in pharmaceutical manufacturing has been too slow. David Margetts looks at what engineers can do to speed up adoption

WOULD you take a medicine if you knew its production records were written down with pen and paper?

One of the pharmaceutical (pharma) industry’s hidden secrets is its difficulty with, and reluctance about, moving manufacturing data into electronic systems. A recent news article1 reported that paper and “analog systems” are affecting the credibility of India’s pharma companies, the third-largest source of medicines globally. These issues are not restricted to India. While multinational corporations have deployed manufacturing software at their premier “blockbuster” factories, far too few facilities across the world have digitalised their production operations. Why is this? And what can be done to make faster progress towards Industry 4.0?

Why we should move from paper and legacy systems

My mantra is that if it is on paper, it is not data. If you walk through a pharmaceutical factory, you will quickly understand why insiders joke that Good Manufacturing Practice (GMP) actually stands for Great Mounds of Paper!

Relying on paper to guide and record manufacturing creates significant inefficiency, risks human error, and locks away information that could be made useful through systems integration and analytics. Huge numbers of hard-copy forms, standard operating procedures (SOPs), checklists, recipes, test methods, and logbooks are manually filled out and checked alongside the actual production process. Often data is transcribed from paper and duplicated into standalone Excel files as the only electronic “solution” companies have. In 2023 we assessed a typical mid-sized factory and found over 1,500 Excel files, and that over 5,000 manual entries were made on paper for each batch of medicine. This is clearly not sustainable, and increasingly clashes with the modern expectation to access data rapidly and use it for monitoring and continuous improvement.

Pharma’s legacy practices have focused for too long on “perfect” documentation as evidence of product quality. As long ago as 2004, Ajaz Hussain led the US Food and Drug Administration’s team on Process Analytical Technology (PAT) and Quality by Design (QbD) to break this legacy thinking. Hussain succinctly stated: “Today’s pharmaceutical companies rely too heavily on testing document quality rather than Quality by Design.”

The benefits of electronic records in pharma

In 2023, we visited a leading cell and gene contract manufacturer responsible for producing novel and personalised medicines from collected human cells. The organisation assessed the impact of eliminating paper in its cell therapy production: the savings amounted to 30,000 person-hours, 200,000 pages of paper, and over US$1m in costs per year. There are also significant non-cost benefits, as Figure 1 explains.

Figure 1: Helping a cell and gene contract manufacturer digitalise their operations

What is holding back digitalisation?

Cost vs value

The experiences of the few pharma factories that have deployed manufacturing systems show that implementation takes two to three years, a large team, and millions of dollars. Below the large global pharma corporations, smaller companies struggle with budgets and with being sure they can achieve a return on investment. In conversations, owners often admit that they do not fully understand what they will gain from digitalisation, worry about deployment risk, and would rather invest in new equipment to increase output. Engineers need to find and propose cost-effective solutions that can quickly demonstrate achievable value.

Lack of confidence and change culture

Pharma is an inherently change-resistant environment. QbD, for example, carried great potential but was never widely adopted. During software projects in pharma there is a constant need to convince quality assurance staff that a particular piece of software is “GMP-compliant”. Engineers who also understand GMP can help us move on from interpreting regulations as an excuse not to change.

The belief that only new medicines need QbD and digitalisation

Since the early 2000s, the US FDA has been promoting a shift to “current good manufacturing practices (cGMPs) for the 21st century”2 and encouraging continuous verification. Today’s biotech and cell and gene medicines are inherently variable, so PAT/QbD and digitalisation have strong relevance and renewed interest for them. But engineers working in process design for all products should look for opportunities to assure quality in real time. We still need to encourage systems and process engineering that moves away from fixed process setpoints and from quality checking through offline testing.

Lack of supplier innovation and legacy systems

Perversely, one of the barriers to wider digitalisation is the set of large IT solutions already deployed in “Big Pharma”. These projects involved heavily customising already-outdated software platforms from the on-premise, client-server era. Supplier innovation has slowed as a result, because suppliers’ business models and product roadmaps depend on supporting these large, complex installations at multiple sites around the world.

Reliance on legacy interface technologies

Interfacing data is a fundamental prerequisite for Industry 4.0. However, integration is a moving target due to the constant evolution of protocols and methods. Automation and systems engineers need to keep learning and researching what is current and what is trending. The few pharma factories that have fully adopted large IT systems now have an integration spider’s web of legacy and newer protocols. Each connecting line shown in Figure 2 could be a flat-file exchange, a set of direct database queries and inserts, proprietary Remote Function Calls, or supplier-specific adapters of a “standard” such as Open Platform Communications (OPC).

This mess of point-to-point integration results in huge setup and testing efforts, and seemingly endless systems engineering work to maintain the duplication of master data and real-time data across these interconnected enterprise systems.

Figure 2: Too often I’ve seen pharma companies adopt large interconnected IT systems like the one below that produce a complex and ineffective spider web
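To see why this scales so badly, it helps to count interfaces. The short sketch below is illustrative only (the system counts are hypothetical, not taken from any particular site): connecting every pair of systems directly needs on the order of n(n-1)/2 interfaces, while connecting each system once to a central hub or broker needs only n.

```python
# Illustrative only: compare the number of interfaces needed for
# point-to-point integration versus a central hub or broker.

def point_to_point_interfaces(n_systems: int) -> int:
    """Every pair of systems gets its own bespoke interface."""
    return n_systems * (n_systems - 1) // 2


def hub_interfaces(n_systems: int) -> int:
    """Each system connects once to a central broker or hub."""
    return n_systems


# Hypothetical system counts (e.g. ERP, MES, LIMS, QMS, historian, SCADA...)
for n in (5, 10, 20):
    print(f"{n} systems: {point_to_point_interfaces(n)} point-to-point "
          f"interfaces vs {hub_interfaces(n)} broker connections")
```

With 20 systems the point-to-point approach implies up to 190 interfaces to build, validate, and maintain, against 20 broker connections.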

Curing pharma’s digitalisation disease

Taking the aforementioned challenges into account, what are the enablers that can lower the barriers to pharma digitalisation?

Start the war against paper

In my opinion, it is not tenable that, as an industry and as patients, we continue to accept that medicines are manufactured using paper records. Will the need for continuous data and analytics be the turning point that persuades companies to invest? Or will regulators step in and mandate e-reporting requirements? Engineers are uniquely placed to find solutions that fit the business and the available budget, and to demonstrate that things can be done better when we eliminate paper.

Put QbD at the centre of GMP

From 2004 onwards, the US FDA has promoted QbD, PAT, and Continuous Process Verification as a better approach to process control and product quality. Engineers can look to other manufacturing industries that have long applied these principles of using real-time data and control. If food and beverage, tobacco, and oil and gas products are important enough to warrant continuous monitoring and verification, then why not pharma?

Shift to IIoT integration

Industrial Internet of Things (IIoT) protocols such as Message Queuing Telemetry Transport (MQTT) started as lightweight, publish-and-subscribe methods for monitoring data from remote oil and gas wells. Publish-and-subscribe protocols such as MQTT have recently been adopted in factory applications because they are widely supported by the large cloud service providers, easily configured, and offer a decoupled integration of data producers and data consumers. A key difference with publish-subscribe is that consumers send no requests or acknowledgements back to the source device or system. This has benefits for security and scalability, as the multiple applications that want to use the data only have to subscribe to a central broker.
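As a minimal sketch of the pattern, the snippet below assumes the open-source paho-mqtt Python client (v2.x API) and an MQTT broker such as Mosquitto running on localhost; the topic name and payload are illustrative. A probe publishes one reading, and any interested application can subscribe via the broker without the probe ever knowing it exists.

```python
# Minimal publish/subscribe sketch using the paho-mqtt client (v2.x API).
# Assumes an MQTT broker (e.g. Mosquitto) is reachable on localhost:1883.
import json
import time

import paho.mqtt.client as mqtt
from paho.mqtt import publish

TOPIC = "site1/suite2/bioreactor3/ph"  # illustrative topic name


def on_connect(client, userdata, flags, reason_code, properties):
    # Subscribe once the connection to the central broker is confirmed.
    client.subscribe(TOPIC)


def on_message(client, userdata, msg):
    # Any consumer (EBR, historian, dashboard) can react to the same message;
    # nothing is sent back to the publishing device.
    reading = json.loads(msg.payload)
    print(f"Received on {msg.topic}: pH = {reading['value']}")


# Consumer side: an application that wants the data subscribes via the broker.
consumer = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)
consumer.on_connect = on_connect
consumer.on_message = on_message
consumer.connect("localhost", 1883)
consumer.loop_start()

# Producer side: the probe (or its gateway) publishes a reading without
# knowing who, if anyone, consumes it.
time.sleep(1)  # allow the consumer to finish subscribing first
publish.single(TOPIC, json.dumps({"value": 7.1, "units": "pH"}),
               hostname="localhost", port=1883)

time.sleep(1)  # give the consumer a moment to receive the message
consumer.loop_stop()
```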

The potential of a unified namespace

Building on these open and easily accessible IIoT protocols, a new data management concept is emerging in industrial automation, known as the Unified Namespace (UNS). A simple diagram is shown in Figure 3. A UNS is not a particular product or feature, but a conceptual term coined by Walker Reynolds of Intellic Integration3 to describe solutions designed with an ethos of central sharing, distribution, and referencing of data without duplication, avoiding point-to-point connections.

In a UNS example, a pH measurement from a probe can be subscribed to by an Electronic Batch Record (EBR) system, which records the pH value and checks that it conforms to specification. If it does not, the EBR can ask a supervisor to add investigation context, and this EBR exception event can be published back to the UNS for a quality management system (QMS) to record and to raise a Corrective and Preventive Action (CAPA). This simple orchestration of data and events across an organisation’s systems is not easily achievable today because of the integration mess that pharma has become used to.

Figure 3: A Unified Namespace centralises data exchange across an organisation’s factory and enterprise systems
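The sketch below shows how that orchestration might look over a UNS carried on MQTT. It is an illustration under assumptions, not a prescribed design: the topic hierarchy, pH specification limits, and payload fields are hypothetical, and it again assumes the paho-mqtt client and a local broker.

```python
# Illustrative sketch of the EBR exception flow over a Unified Namespace.
# The topic hierarchy, specification limits, and payload fields are
# hypothetical; assumes paho-mqtt (v2.x API) and a broker on localhost:1883.
import json

import paho.mqtt.client as mqtt

PH_TOPIC = "enterprise/site1/line2/bioreactor3/ph"         # probe data in the UNS
EXCEPTION_TOPIC = "enterprise/site1/line2/ebr/exceptions"  # EBR events for the QMS
PH_SPEC = (6.8, 7.4)  # illustrative in-specification range


def on_connect(client, userdata, flags, reason_code, properties):
    # Subscribe to the probe data once connected to the broker.
    client.subscribe(PH_TOPIC)


def on_ph_reading(client, userdata, msg):
    """EBR role: record each reading and publish an exception if out of spec."""
    reading = json.loads(msg.payload)
    value = reading["value"]
    print(f"EBR recorded pH {value} from {msg.topic}")
    if not PH_SPEC[0] <= value <= PH_SPEC[1]:
        event = {
            "source_topic": msg.topic,
            "value": value,
            "spec": PH_SPEC,
            "status": "supervisor investigation required",
        }
        # Publish the exception back into the UNS; a QMS subscribed to this
        # topic can pick it up and raise a CAPA without any direct interface
        # between the EBR and the QMS.
        client.publish(EXCEPTION_TOPIC, json.dumps(event))


ebr = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)
ebr.on_connect = on_connect
ebr.on_message = on_ph_reading
ebr.connect("localhost", 1883)
ebr.loop_forever()  # in practice this would run as a long-lived service
```

Because the QMS (or any other consumer) simply subscribes to the exceptions topic, it can be added or replaced later without changing the EBR logic or the probe publishing the data.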

End-user-configurable cloud alternatives to large out-of-the-box enterprise software

In the last decade, a new breed of configurable applications for industrial use has arisen: no-code and low-code platforms deployed through the cloud. These allow non-programmers to fit the software to their business process using drag-and-drop modelling. Process engineers do not need to be software experts to learn these tools, and engineers working in facilities can quickly demonstrate how to digitalise a process in a bottom-up manner.

While manufacturing elsewhere has rapidly embraced cloud solutions, pharma has been slow to adopt the cloud because of the glacial pace of its digitalisation and perceptions about data security. IT engineers can help decision-makers by creating fair comparisons of cost, security, and capability to get beyond pharma’s emotional reluctance.

Conclusion

Digitalisation in pharma manufacturing has not spread beyond the large global corporations, so tens of thousands of GMP-regulated factories continue to rely on paper. While manual records remain, critical product data is at risk of human error and cannot feed the exciting potential of AI and machine learning to gain value from data. For all pharma companies to move forward, engineers will play a key role in bridging concerns about compliance, change, and cost by designing solutions and integrations using cloud and IIoT-enabled technologies.

It is the job of all industry stakeholders, from suppliers and manufacturers to regulators, to keep chipping away at the barriers to digitalisation, and to champion modern methods to monitor, control, and continuously improve production processes.

References

1. https://s.nikkei.com/4bLEGqR
2. https://bit.ly/3wUOP5r
3. https://bit.ly/3xcAelG

Article by David Margetts

CEO of Factorytalk, and an advisor, consultant, and technical expert in the areas of compliance and information technology
