Process controls get some upgrades to better reflect real-time conditions.
Process analytical technology (PAT), quality by design, and individual company- or process-centric quality initiatives are driving the development of bioprocess sensors and probes. Single-use systems generally suffer from a “sensor gap” related to the need to balance measurement accuracy against the well-known benefits of single-use processing, including the avoidance of cleaning, cleaning validation, and calibration.
For most of the history of therapeutic-protein biomanufacturing, in-process measurement and control have focused on upstream operations. Cell culture and fermentation have historically been the longest-running unit operations, and expression is “where the action is”: the stage at which throughput and volumetric productivity exert the greatest influence on cost of goods.
The desirability of platform purification, best exemplified in monoclonal antibody (mAb) downstream processing, simultaneously creates and reduces the need for extensive downstream monitoring. Platforming, whether it occurs upstream or downstream, relies on controls to maintain operations within specification. Yet the imperative to improve process understanding, even for exquisitely controlled operations, increases the need for sensing and monitoring. Fulfilling this need has only been possible relatively recently with the ability to monitor product directly, more or less in real time, as opposed to managing surrogate parameters such as pH, dissolved oxygen, amino acids, and gases.
Cell viability does not fall within downstream operations, but it can determine when upstream processing ends and purification begins. For Kelsey McNeel, market segment manager, process analytics at Hamilton (Reno, NV), the key issue for bioprocess monitoring and controls is reconciling information obtained at bench- or process-development scale with what may be expected during manufacturing. “The setup is different, the sensors are different,” explains McNeel. “There’s a definite disconnect.”
Because of the tight dependence of cell density on reactor conditions, viable cell count is one of those measurements where the “disconnect” becomes pronounced. Dead cells, cellular debris, and adventitious particles confound conventional optical cell counting, which often over-reports cells that are no longer doing their job.
Manual counting is accurate but involves sampling (avoided whenever possible) and considerable time, effort, and reagents for plating, staining, and counting cells. “By the time you’re done you’re measuring something that existed up to three hours ago; you’re looking into the past and potentially missing an event that caused stress on the cell,” McNeel says. Hence, process scientists desire real-time measurements. Moreover, total cell measurements are most reliable only during the cells’ log phase, when they are expanding.
Hamilton’s online measurement system, Incyte, uses permittivity instead of optical proxies for viable cell density. Think of permittivity as the capacity of cells to hold an electrical charge, much as an electronic capacitor does. Living cells hold a charge while dead cells and debris do not. Where anchorage-dependent cells grown on microcarriers confound optical methods, permittivity quantifies them readily because the microcarriers themselves are coulombically inert.
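As a rough illustration of the principle only (not Hamilton’s algorithm), the capacitance signal above the cell-free baseline scales approximately linearly with viable biovolume, so a calibration against offline counts can convert a permittivity reading into an estimated viable cell density. The function, signal names, and calibration constants in the sketch below are hypothetical.

```python
# Illustrative sketch: estimate viable cell density (VCD) from a permittivity
# reading using a simple linear calibration. Constants and signal names are
# hypothetical; real instruments apply frequency scanning and vendor-specific
# corrections.

def estimate_vcd(permittivity_pf_per_cm: float,
                 baseline_pf_per_cm: float,
                 slope_cells_per_ml_per_pf: float) -> float:
    """Return an estimated viable cell density in cells/mL.

    permittivity_pf_per_cm: in-line capacitance reading (pF/cm)
    baseline_pf_per_cm:     reading of cell-free medium (pF/cm)
    slope_cells_per_ml_per_pf: slope from calibration against offline counts
    """
    delta = max(permittivity_pf_per_cm - baseline_pf_per_cm, 0.0)
    return delta * slope_cells_per_ml_per_pf


if __name__ == "__main__":
    # Hypothetical calibration: 1 pF/cm above baseline ~ 1.0e6 viable cells/mL
    vcd = estimate_vcd(permittivity_pf_per_cm=14.2,
                       baseline_pf_per_cm=2.1,
                       slope_cells_per_ml_per_pf=1.0e6)
    print(f"Estimated viable cell density: {vcd:.2e} cells/mL")
```

Because dead cells and debris contribute little to the signal, such an estimate tracks viable rather than total cells, which is the point of the technique.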
“You can also get information during the cell death phase, which could be relevant for some processes,” McNeel adds. “With total cell counts you get a plateau; you know cells have stopped multiplying but not their viability status.”
As a general strategy for keeping up with sensing and control technologies, contract manufacturer Lonza (Slough, UK) looks to innovator companies that are clients or potential clients. “We need to be in line with their thinking to increase their comfort in outsourcing with us,” says Atul Mohindra, PhD, head of mammalian process research and technology at Lonza.
On the single-use front, the biggest gap in downstream monitoring and control is the lack of appropriate single-use sensors beyond the standard devices for pH, conductivity, and product concentration. “We need to get past those parameters and start developing sensors that quantify product quality and impurity profiles in a way that is actionable,” Mohindra explains.
One complicating factor in the development of such tools is that they must operate more or less in real time, because downstream operations occur on compressed timelines and in rapid succession. To that end, downstream operations force a re-thinking of which properties to measure (e.g., specific quality attributes or impurity concentrations), how to quantify them (particularly in single-use systems), and how to react meaningfully.
For example, during chromatographic purification in gradient mode, a product’s charge characteristics may change. Are such changes meaningful? Do they affect quality? Can they be measured during purification? And if so, what can be done about deviations?
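As a minimal sketch of what an actionable answer might look like (assuming a generic in-line detector signal rather than any specific vendor’s instrument), the loop below polls a quality-attribute reading during elution and flags excursions outside pre-set action limits quickly enough to divert or hold collection. The read_attribute() function is a hypothetical stand-in for a real instrument interface.

```python
# Illustrative sketch: poll an in-line quality-attribute signal and flag
# excursions outside action limits while the downstream step is still running.
# read_attribute() is a hypothetical stand-in, simulated here with random noise.

import random
import time


def read_attribute() -> float:
    """Hypothetical in-line reading of a quality attribute (arbitrary units)."""
    return random.gauss(mu=100.0, sigma=3.0)


def monitor(low_limit: float, high_limit: float,
            interval_s: float = 1.0, duration_s: float = 10.0) -> None:
    """Poll the signal and report excursions in near real time."""
    elapsed = 0.0
    while elapsed < duration_s:
        value = read_attribute()
        if value < low_limit or value > high_limit:
            print(f"t={elapsed:5.1f}s  ALERT: {value:.1f} outside "
                  f"[{low_limit}, {high_limit}]: divert or hold collection")
        else:
            print(f"t={elapsed:5.1f}s  OK: {value:.1f}")
        time.sleep(interval_s)
        elapsed += interval_s


if __name__ == "__main__":
    monitor(low_limit=94.0, high_limit=106.0)
```

The hard part in practice is not the loop but the sensor: a measurement that arrives after the peak has been collected cannot trigger any of these actions.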
Downstream sensing gaps highlight the need to provide the same level of sophistication downstream as upstream, particularly for single-use processes. Techniques that quantify some overarching characteristic come to mind (refractive index in the chemical-pharmaceutical world, for example), but the quality attributes of biopharmaceutical molecules are so much more complex that a one-measure-fits-all approach will almost surely fail.
Whatever techniques are ultimately adopted, their goal must be to eliminate manual sampling and lengthy analyses, or at least to streamline those operations so they become relevant within the timeframe of purification.
Single-use operations are inherently more space-conserving than stainless-steel processes because they eliminate clean-in-place and sterilization-in-place operations and associated piping, instruments, valves, and resource configurations. “However, single-use processing means that there are more manual and complicated set-up steps required that include tubing, bags, sensors, devices, and connectors,” says Michalle Adkins, director of life sciences consulting at Emerson Automation Solutions (Pittsburgh, PA).
To manage these complexities, Emerson is betting on augmented reality, a technology the company has been evaluating in the oil and gas industry and believes could help in the design and validation of biomanufacturing processes. Augmented reality superimposes an idealized or theoretical computer-generated image onto a user’s view of the real world. Unlike virtual reality, which is completely artificial, augmented reality expands perception by adding new information to it.
For now, biomanufacturers must settle for less sophisticated distributed control systems, such as Emerson’s DeltaV, coupled with an operations management system such as Syncade, also from Emerson, to integrate processes and procedures within the increasingly complex single-use processing environment.
“Pulling together materials, set-up, sequences, and process control data along with asset data is essential to provide the necessary analytics to support reliable operations,” Adkins adds.
Continuous processing has long been available in some form for the manufacture of therapeutic proteins, but it has never been completely implemented. Widespread familiarity with single-use equipment has greatly improved prospects for continuous unit operations and elevated “continuous” to official buzzword status. With continuous processing come new issues related to process dynamics and to the management of lots, batches, and materials. With those changes, process modeling to ensure product quality and smooth operation becomes crucial. Quality by design and its enablers, process analytics and advanced control strategies, move to the front of the line of priorities.
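To make the process-dynamics point concrete, consider a deliberately simplified model (an assumption for illustration, not a description of any cited process): a single well-mixed continuous stage in which a step change at the inlet washes through to the outlet only over several residence times. That lag is one reason lot boundaries and deviation propagation need modeling in continuous operation.

```python
# Illustrative sketch: first-order washout dynamics of a well-mixed continuous
# stage, dC/dt = (C_in - C) / tau, where tau = V / F is the residence time.
# A step change at the inlet reaches the outlet only gradually. Values are
# purely illustrative.

def simulate_step_response(c_in: float, c0: float, flow_l_per_h: float,
                           volume_l: float, hours: float, dt_h: float = 0.1):
    """Euler integration of outlet concentration after an inlet step change."""
    tau = volume_l / flow_l_per_h          # residence time (h)
    n_steps = int(hours / dt_h)
    c = c0
    history = [(0.0, c)]
    for i in range(1, n_steps + 1):
        c += dt_h * (c_in - c) / tau       # dC/dt = (C_in - C)/tau
        history.append((i * dt_h, c))
    return tau, history


if __name__ == "__main__":
    tau, history = simulate_step_response(c_in=1.0, c0=0.0,
                                          flow_l_per_h=10.0, volume_l=50.0,
                                          hours=20.0)
    print(f"Residence time: {tau:.1f} h")
    for t, c in history[::20]:             # report every 2 h
        print(f"t={t:4.1f} h  outlet fraction of new material: {c:.2f}")
```

In this toy case material introduced at hour zero still makes up only about two-thirds of the outlet stream one residence time later, which is exactly the kind of behavior a lot-definition or deviation-tracking strategy has to account for.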
“As the industry moves to continuous manufacturing, it is certain that more data and data analytics tools will be needed to ensure product quality and prevent deviations,” Adkins observes. “It is also interesting to see how some companies are moving in the direction of both continuous and single use for some of their processes, adding new layers of complexity and opportunity.”
In an environment of increasingly complex processes and associated monitoring and controls that generate unprecedented quantities of information, data integrity becomes crucial. PAT data, models, traditional process sensor data, material traceability, and tracking of single-use consumables all funnel into this data stream. Ensuring operational success requires integrated control system and manufacturing execution platforms that work across process units, the entire process, and even through various stages of product development.
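As a schematic illustration only (the record fields below are assumptions, not any particular manufacturing execution system’s schema), end-to-end traceability amounts to tying each PAT or sensor reading to the batch, the unit operation, and the lots of single-use consumables it passed through.

```python
# Illustrative sketch: a minimal record structure linking process measurements
# to batch, unit operation, and single-use consumable lots for traceability.
# Field names are hypothetical and do not reflect any specific MES schema.

from dataclasses import dataclass, field
from datetime import datetime
from typing import List


@dataclass
class Measurement:
    timestamp: datetime
    parameter: str          # e.g., "pH", "conductivity", "viable cell density"
    value: float
    unit: str


@dataclass
class UnitOperationRecord:
    batch_id: str
    operation: str                  # e.g., "bioreactor", "capture chromatography"
    consumable_lots: List[str]      # single-use bag/sensor/column lot numbers
    measurements: List[Measurement] = field(default_factory=list)

    def add(self, parameter: str, value: float, unit: str) -> None:
        self.measurements.append(
            Measurement(datetime.utcnow(), parameter, value, unit))


if __name__ == "__main__":
    record = UnitOperationRecord(batch_id="B-2017-031",
                                 operation="capture chromatography",
                                 consumable_lots=["BAG-4471", "SENSOR-0912"])
    record.add("conductivity", 15.2, "mS/cm")
    record.add("UV280", 0.84, "AU")
    print(record)
```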
About the Author
Angelo DePalma is a contributing writer to BioPharm International.
Article Details
BioPharm International
Vol. 30, No. 3
Pages: 24–27
Citation: When referring to this article, please cite it as A. DePalma, "Reconciling Sensor Communication Gaps," BioPharm International 30 (3) 2017.