David Gemmell discusses the evolution of continuous and intensified bioprocessing
This article will give an insight into why the biopharmaceutical industry is driving towards intensified, connected, or continuous manufacturing processes. It will discuss the challenges which must be overcome to operate a controllable process capable of delivering high-quality therapeutics, in a manner comparable to other industries which have been utilising continuous processing for decades.
In many other industries, continuous manufacturing strategies and methodologies are employed to deliver high equipment utilisation, smaller unit operations and a greatly reduced overall cost of goods (COGs). The numerous advantages of this approach easily outweigh the greater design complexity. However, biopharmaceutical manufacturing is inherently more complicated.
Classical small molecule pharma could arguably have moved towards a continuous model already, as product quality is essentially governed by raw material quality and the control of the chemical reactions and unit operations undertaken to make that drug. However, pharmaceutical manufacturers operate in a highly risk-averse industry. This inertia, combined with a historic lack of guidance from regulatory authorities and high profit margins, has contributed to most of the industry choosing caution over innovation.
Biopharmaceutical products, on the other hand, are manufactured in living systems. While these systems can be manipulated in a controlled manner, doing so is significantly more challenging, and the same inertial factors described above also apply.
The US Food and Drug Administration (FDA) has issued guidance around process analytical technology (PAT) to create a framework that helps companies producing innovative solutions to build quality into manufacturing, by ensuring product critical quality attributes (CQAs) are met through control of critical process parameters (CPPs). However, despite this framework being launched in 2004,1 and a general encouragement of innovation from the FDA, the creation and adoption of continuous manufacturing techniques has been slow.
The overarching goal of the industry is to create a connected end-to-end manufacturing process which begins with a perfusion bioreactor and a cell retention device. This allows fresh nutrients to be continuously charged into the bioreactor while waste materials (metabolic byproducts) and the valuable drug substance are removed and the cells are retained. The product solution is fed directly into a downstream purification train designed to complete all the typical purification and polishing activities (clarification, chromatographic separation, viral inactivation/clearance, and final concentration/sterilisation) that the individual batch unit operations would achieve (see TCE 970 for an overview of a common monoclonal antibody manufacturing template).
This process would collect data in real time (or near real time) from a variety of sensors to establish effective control of a continuous process. Conductivity, pH, temperature, pressure, dissolved oxygen, CO2, ultraviolet (UV) absorbance and fluid flow are all commonly-measured parameters, but data from more sophisticated sensors, which can determine not only protein concentration but also size and composition, must also be collected and analysed in an in-line, on-line or at-line manner to provide the real-time response required.
Collected data must be fed into a digital modelling system which has been carefully developed to predict the most appropriate response, based on the process variables, and to monitor the system response. Significant design of experiment (DoE) approaches are required to generate this digital model,2 commonly known within industry as a "digital twin". The process variable data for a given instrument and time point will be compared with a large historical matrix of data generated from previous manufacturing runs, and this assessment will then determine what action must be taken to prevent the system from going out of specification.
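As a simplified illustration of the control logic a digital twin might support, the sketch below compares live sensor readings against a predicted band for each monitored variable and flags when corrective action is needed. The variable names, tolerances and the prediction stub are all hypothetical and stand in for a properly validated, DoE-derived model.

```python
# Illustrative sketch only: a simplified monitoring loop that compares live
# process readings against a (hypothetical) digital-twin prediction band.
# All variable names, limits and the predict_band() stub are assumptions,
# not a real manufacturer's model.
from dataclasses import dataclass


@dataclass
class Reading:
    time_h: float            # elapsed process time
    ph: float
    do_percent: float        # dissolved oxygen
    titre_g_per_l: float     # product concentration


def predict_band(time_h: float) -> dict:
    """Stand-in for a digital twin: expected value +/- tolerance for each
    monitored variable at a given process time (all figures assumed)."""
    return {
        "ph": (7.0, 0.1),
        "do_percent": (40.0, 10.0),
        "titre_g_per_l": (0.05 * time_h, 0.2),  # assumed linear accumulation
    }


def assess(reading: Reading) -> list[str]:
    """Return suggested alerts when a variable drifts outside its predicted
    band; an empty list means no corrective action is indicated."""
    alerts = []
    for name, (expected, tol) in predict_band(reading.time_h).items():
        value = getattr(reading, name)
        if abs(value - expected) > tol:
            alerts.append(f"{name} out of band: {value:.2f} vs {expected:.2f} +/- {tol}")
    return alerts


if __name__ == "__main__":
    print(assess(Reading(time_h=48, ph=6.8, do_percent=38, titre_g_per_l=2.6)))
```

In practice the prediction step would query a multivariate model trained on the historical matrix described above, rather than a hard-coded band.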
As with all modelling, and with computing in general, the output quality of the model is governed by the quality of the input data and of the model itself. This is no mean feat where complex biological process streams are involved. Persuading a mass of cells to synthesise a large foreign protein, which can itself change through interactions with other chemicals, proteins and waste products, is challenging enough, even before trying to develop an in-line, real-time control system capable of determining drug substance yield and quality at all times and adjusting the bioprocess to maintain or optimise them.
One of the most recent advances has been the utilisation of in-line Raman spectroscopy, an optical analytical technique which can discern different molecules based on observing their vibrational properties. A laser is used to illuminate the sample in-line and a spectrophotometer measures the scattered photons. The feedback is analysed, and a unique spectral fingerprint is created using a software model. The output peaks correlate to different molecules and the concentration of the molecule in that mixture is represented in the intensity of the peak. This can be used to assess and control processing in real time and is now being widely considered by manufacturers as a key PAT tool.
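As a minimal sketch of the chemometric step behind this, assuming a single characteristic peak and a linear response, the example below calibrates peak intensity against known concentrations and inverts the fit for new in-line readings. Industrial Raman models are multivariate (for example, partial least squares over the whole spectrum); the peak choice and calibration data here are purely illustrative.

```python
# Minimal sketch: a univariate calibration relating the intensity of one
# (assumed) characteristic Raman peak to analyte concentration. Industrial
# models are multivariate (e.g. PLS over the whole spectrum); this only
# illustrates the principle that peak intensity tracks concentration.
import numpy as np

# Hypothetical calibration set: known concentrations (g/L) and the measured
# intensity of the chosen peak in each calibration spectrum (arbitrary units).
conc_g_per_l = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
peak_intensity = np.array([110.0, 215.0, 430.0, 850.0, 1700.0])

# Fit intensity = m*concentration + c, then invert the line for prediction.
m, c = np.polyfit(conc_g_per_l, peak_intensity, 1)


def concentration_from_intensity(intensity: float) -> float:
    """Estimate concentration from a new in-line peak intensity reading."""
    return (intensity - c) / m


print(round(concentration_from_intensity(640.0), 2))  # roughly 3 g/L here
```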
Batch manufacturing operations allow quality control (QC) departments time to analyse the output from a given unit operation to assess product quality. Offline measurement with HPLC/UPLC (high-performance or ultra-high-performance liquid chromatography), UV absorbance or other techniques generates both qualitative and quantitative data, which can be used to confirm product/feed acceptability. This provides assurance that the drug substance is ready for the next unit operation.
This is vital not only for paramount product quality concerns (and therefore patient safety), but also for process economics. The cost of chromatography media, which is carefully packed into process-scale columns and used to process large batch sizes, can easily run into millions of dollars, with an approximate lifetime of 200 cycles if maintained correctly and contacted only with feed streams of suitable quality.
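To put that lifetime in context, a rough, entirely illustrative cost-per-cycle calculation is shown below; the resin price and packed volume are assumed figures, not vendor data.

```python
# Entirely illustrative arithmetic: cost per cycle for a packed process-scale
# protein A column. The resin price and packed volume are assumed figures;
# the 200-cycle lifetime is the approximate figure quoted in the text.
resin_price_per_litre = 12_000   # USD/L, assumed
packed_volume_litres = 200       # L, assumed process-scale column
lifetime_cycles = 200            # approximate lifetime, from the text

total_resin_cost = resin_price_per_litre * packed_volume_litres
cost_per_cycle = total_resin_cost / lifetime_cycles

print(f"Total resin cost: ${total_resin_cost:,.0f}")  # $2,400,000
print(f"Cost per cycle:   ${cost_per_cycle:,.0f}")    # $12,000
```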
The most sophisticated virus filtration devices could easily be as expensive as brand-new mid-range cars, and these devices are single use, with typical batch sizes requiring multiple devices. This gives you an idea of the kind of consumable costs involved in biopharma. The consumables and equipment are inherently expensive in an industry that demands quality to achieve quality. It is critical to ensure that expensive purification technologies are protected, and understanding the impurity profiles of the process is key to achieving this.
The control of batch manufacture allows only suitable feeds to enter the next processing step. Sampling operations and off-line analysis provide assurance of in-process yield and quality. They also readily facilitate batch definition and traceability, both of which are critical.
However, batch manufacture requires larger individual unit operations, higher labour requirements, more opportunities for human error, slower processing times, more opportunities for product degradation, lower equipment utilisation, many large collection systems, larger facilities in terms of physical footprint and cleanroom HVAC (heating, ventilation and air conditioning) capacity, and larger coldroom storage. The result is that this operating methodology carries significantly higher capital and labour costs and therefore a higher total COGs.
While the most efficient process may be fully continuous, there are options available to achieve some of the benefits while making things a little easier to achieve. Intensified processes are considered to be ones which have adopted techniques such as perfusion-based bioreactor setups with advanced analytics, and downstream purification trains which utilise cutting-edge separation technologies such as multi-column chromatography or membrane chromatography, in-line viral inactivation, and single-pass tangential flow filtration (SPTFF – see TCE 971 for a description of this technique). Some unit operations may be combined, but the entire train is not fully connected. These strategies can deliver significant reductions in process consumables, facility/cleanroom footprint, cycle time and process labour requirements.
Chemical and biomanufacturing engineers must undertake detailed feasibility studies, front-end engineering design and process design reviews to identify the best technological fit when developing or optimising a biochemical process. Detailed modelling of process cycle time, facility fit, mass and energy balances and economic models must all be generated to confirm which unit operation is the most appropriate. This must also be balanced against the expected requirements from the regulators in terms of the quality of the product delivered and process robustness. Fortune favours the brave, but in an industry that prefers to be risk averse it’s often the more established technology which is selected instead of one which is relatively new in the marketplace. Chemical engineers must work closely with process development (PD) scientists undertaking small-scale studies to size the different unit operations. Careful scale up must be undertaken to ensure commercial processing is representative of what happened in the PD lab.
Let’s take a closer look at some of the technologies enabling process intensification. One of the key technologies which is being implemented by established manufacturing companies is the use of multi-column chromatography (MCC) for primary capture of the protein product before virus removal and final product polishing. The operation of an affinity chromatography column typically consists of five distinct phases: column equilibration, column loading, column washing, column elution and column sanitisation.
Column equilibration involves loading the column with a buffer designed to ensure that the Protein A resin is primed and ready to bind the molecule of interest. Once column equilibration is complete, the drug-substance-containing feed is loaded onto the column and the target product binds to the chromatography media; loading then moves on to the next column in the MCC train. Column washing steps remove loosely-bound impurities. Once these loosely-bound or non-specifically bound contaminants are removed, the elution buffer (whose chemical composition differs from the equilibration buffer) is applied. The change in chemical environment causes the desorption of the protein of interest. This can be controlled so that fractions coming off the column are either discharged to waste or collected for further processing downstream. Once elution is complete, the column is sanitised and regenerated, typically with an acidic or caustic cleaning agent depending upon resin type.
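As a simplified illustration of how these five phases can be staggered across the smaller columns of an MCC system so that one column is always loading, consider the sketch below. The phase durations and column count are assumed for illustration only.

```python
# Simplified sketch of multi-column chromatography (MCC) scheduling: the five
# phases described above are staggered across small columns so that one column
# is always loading. Phase durations and column count are assumed figures.
PHASES = [
    ("equilibration", 10),   # minutes, assumed
    ("loading",       30),
    ("washing",       10),
    ("elution",       15),
    ("sanitisation",  15),
]


def schedule(n_columns: int, total_minutes: int, step: int = 5) -> None:
    """Print which phase each column is in at regular intervals, with columns
    offset by one loading period so that loading is continuous."""
    cycle_length = sum(duration for _, duration in PHASES)
    loading_time = dict(PHASES)["loading"]
    offsets = [i * loading_time for i in range(n_columns)]
    for t in range(0, total_minutes, step):
        row = []
        for col, offset in enumerate(offsets, start=1):
            position = (t - offset) % cycle_length  # position within the cycle
            elapsed = 0
            for name, duration in PHASES:
                if position < elapsed + duration:
                    row.append(f"col{col}:{name}")
                    break
                elapsed += duration
        print(f"t={t:3d} min  " + "  ".join(row))


if __name__ == "__main__":
    schedule(n_columns=3, total_minutes=80)
```

With three columns offset by the loading time, the loading windows tile the whole cycle, which is what allows the capture step to accept a continuous feed.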
The rapid reuse and recycling of columns means significantly smaller columns are required, reducing media costs and improving overall process efficiency. While the cost of chromatography media can run into the millions in large-scale manufacture, the cost per unit dose of final drug product is typically low given the large volumes produced and the low dosage requirements of some therapeutics. However, MCC systems can further reduce these costs while achieving other benefits. For new facilities or smaller companies, the significant reduction in chromatography resin volumes (and therefore initial capital cost) may be hugely important. Biomanufacturing engineers will also be responsible for packing the chromatography column with the delicate and expensive resin media required for purification.
Column packing can be a daunting task, and the larger the column, the more difficult this operation becomes, with additional lifting equipment or column packing skids (to pump the media from storage into the column) being required. If the column is incorrectly packed, performance will be poor. In the worst case the resin can be damaged, leading to high costs for corrective actions and facility shutdowns while the packing operation is redone. Smaller columns used in MCC systems are much easier to pack and can often be purchased in a ready-to-use, prepacked format, avoiding the need for in-house packing.
Some aspects of the conventional monoclonal antibody process template are, by their nature, batch based. Bind-and-elute chromatography and viral inactivation by low-pH hold are classic examples. This makes operating a continuous process very challenging without significant surge tank capacity. Flow-through technology fills this gap: the feed stream is never "held" within a unit operation.
Other important advances in chromatographic separation have led to new products in the marketplace that suit the flow-through paradigm. Membrane chromatography devices can give excellent product and impurity separation with extremely short residence times, removing the need for laborious and variable resin packing and enabling rapid cycling. Rapid cycling is a technique in which chromatography units, particularly membranes that tolerate high flow rates and low residence times, are loaded, eluted, regenerated and equilibrated over a matter of minutes. With multiple devices cycling, very small devices can be used, consuming the full cycle lifetime within the few hours of a single batch and providing a truly single-use solution, unlike MCC, which is a multi-use technology.
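The sizing arithmetic behind rapid cycling can be illustrated with some assumed figures: a small device cycled many times, with several devices staggered, can work through a batch in a few hours. All capacities, masses and cycle times below are assumptions.

```python
# Illustrative sizing arithmetic for rapid-cycling membrane chromatography:
# how many cycles a small device needs to process one batch and roughly how
# long that takes. All capacities, masses and cycle times are assumed.
batch_mass_g = 5_000        # g of product in the batch, assumed
device_capacity_g = 50      # g bound per device per cycle, assumed
cycle_time_min = 6          # minutes per full load/elute/regenerate/equilibrate cycle, assumed
devices_in_parallel = 2     # devices cycling in a staggered fashion, assumed

cycles_needed = -(-batch_mass_g // device_capacity_g)  # ceiling division -> 100 cycles
total_time_h = cycles_needed * cycle_time_min / devices_in_parallel / 60

print(f"Cycles required: {cycles_needed}")
print(f"Approximate processing time: {total_time_h:.1f} h")  # ~5 h with these figures
```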
This flow-through processing rationale can also be applied to an in-line viral inactivation step. This technique replaces the traditional 60-minute low-pH hold step after affinity chromatography. The inactivation kinetics have shown that target viruses are generally destroyed after only a few minutes.3 Combining these flow-through technologies gives the potential for a section of the process to be connected and operate continuously using an innovative single-use flow path to provide appropriate validated residence times.
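As an illustrative sizing calculation (not a validated design), the hold-up volume of such a flow path scales with flow rate and the required residence time; real designs must also account for the residence time distribution, so this is only a lower bound. The flow rate and residence time below are assumed.

```python
# Illustrative arithmetic only: sizing the hold-up volume of a single-use
# flow path so the feed spends a validated residence time at low pH.
# Flow rate and residence time are assumed figures; ideal plug flow is assumed,
# so a real design would add margin for the residence time distribution.
flow_rate_l_per_min = 1.5    # L/min feed from the capture step, assumed
residence_time_min = 5       # minutes required for inactivation, assumed

required_volume_l = flow_rate_l_per_min * residence_time_min  # V = Q x t
print(f"Required flow-path volume: {required_volume_l:.1f} L")  # 7.5 L
```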
Flow-through polishing chromatography systems are also on the market, and in development. These include specialised chromatography methods such as frontal displacement chromatography that allow superior separation of closely related contaminants (such as aggregates) whilst retaining much higher productivity than standard bind/elute methods. These systems are ideally suited directly after in-line inactivation. The more flow-through methods that are incorporated into the process, the easier it is to run continuously.
All of these options to decrease physical footprint, initial capital cost and consumable volumes (such as chromatography resins) allow manufacturers to achieve a lower overall COGs while maintaining high levels of quality and process safety.
Bioprocessing has become safer and more operationally efficient over the lifetime of the industry. Connecting unit operations together in a fully-closed fashion eliminates the possibility of adventitious organisms contaminating the feed stream between inactivation or clearance steps, thus offering greater overall levels of product quality. Holistic viral safety is a challenging topic which merits further discussion in the fourth article of our biopharmaceutical series.
Innovation is occurring, even in such a risk-averse industry. Some companies have started to build biopharmaceutical processes designed to fit into a series of shipping containers, incorporating a modular strategy that allows an incredibly small facility (around six containers) to be deployed for the self-contained manufacture of vaccines or other therapeutics. These are being designed to allow developing and emerging countries to produce life-saving treatments locally.4
With the development and increased adoption of continuous processing, modular process and facility design and other modern manufacturing techniques, innovations such as these bio-containers will become more prevalent and allow cutting-edge medicines to reach those who need them.
The challenges discussed in this article cannot be overcome without chemical engineers. We are uniquely placed to provide the guidance necessary to support innovation and technological advancements. By bringing skills and insight from other process industries to bear, together we can turn the scientific developments of today into the technology of tomorrow.
©2022 Merck KGaA, Darmstadt, Germany and/or its affiliates. All Rights Reserved. Merck is a trademark of Merck KGaA, Darmstadt, Germany or its affiliates. All other trademarks are the property of their respective owners. The Life Science business of Merck operates as MilliporeSigma in the US and Canada.
Acknowledgments: Michael Burns, Paul Beckett, Stuart Rolfe, and Tabea Lumpp for providing assistance in writing this article.
1. US Food and Drug Administration, PAT: A Framework for Innovative Pharmaceutical Development, Manufacturing, and Quality Assurance, Guidance for Industry, 2004, FDA-2003-D-0032.
2. Canzoneri, M, De Luca, A, and Harttung, J, “Digital Twins: A General Overview of the Biopharma Industry”, Adv Biochem Eng Biotechnol 2021, 177, 167-184.
3. Gillespie, C et al “Continuous In-Line Virus Inactivation for Next Generation Bioprocessing”, Biotechnol J 2019, 14, (2), e1700718.
4. BioNTech develops modular mRNA manufacturing facilities for Africa, https://bit.ly/3weIA9g.
This is the third in an ongoing series of articles on how chemical engineers contribute to the biopharmaceuticals industry. To read the full series as it develops, visit: https://www.thechemicalengineer.com/tags/chemical-engineers-and-the-biopharmaceuticals-industry/