Setting Up Bioprocessing Systems for Digital Transformation

Publication
Article
BioPharm International, 06-01-2021
Volume 34
Issue 6
Pages: 26–30

Digitalization of bioprocessing operations, equipment, and facilities can improve workflow and output, but maintaining data integrity is a concern.

Image: catalin/Stock.Adobe.com - conceptual image showing a DNA helix interposed with data board circuitry to illustrate the digitalization of biologics

The advancement of digitalization technologies is transforming manufacturing processes and operations in the biopharmaceutical industry. Bioprocessing is no exception, as the industry continuously seeks to improve large-scale commercial production of traditional biologics, in addition to seeking innovative ways to meet the bioprocessing needs of new biotherapeutics.

Evaluating data integrity

With the transformation to digital technologies in bioprocessing operations, one major concern that companies have to address is maintaining the integrity, as well as quality, of their data.

Bob Lenich, director of global life sciences at Emerson, says that manufacturers are well aware they will be audited, and thus recognize the need to have strict compliance in place. The challenge here is that the Industrial Internet of Things (IIoT) has added more information from an ever-wider array of sources. “Plants need strategies and technology to maintain ‘chain-of-identity and meta data’ for data as it moves across various systems,” Lenich says.

He notes that new IIoT devices add a vast array of connectors to traditional process and manufacturing data flow: lab measurements, asset health measurements, building automation measurements, and more. Thus, staying compliant means ensuring that the connectors between systems capture all that is needed and that they do not manipulate data. “Many organizations are looking to data lakes to help meet ALCOA+ [attributable, legible, contemporaneous, original, accurate, complete, consistent, enduring, and available] principles for data integrity. Advanced data lakes use connectors specifically designed to ensure that, as context is added to data, metadata are generated to paint a clear picture of what was added and when,” Lenich states.
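
As a rough illustration of the kind of connector behavior Lenich describes, the sketch below (a hypothetical Python example, not any vendor's data-lake API) keeps the original measurement untouched while appending an attributable, time-stamped entry each time context is added:

```python
# Minimal sketch (not any vendor's API) of a connector that preserves
# chain-of-identity as a measurement moves between systems: every time
# context is added, a metadata entry records what was added and when.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Any

@dataclass
class Record:
    value: Any                                     # the original measurement, never overwritten
    source: str                                    # originating system/instrument
    history: list = field(default_factory=list)    # append-only audit trail

    def add_context(self, key: str, value: Any, user: str) -> None:
        """Attach context without touching the original value (ALCOA+ 'original')."""
        self.history.append({
            "key": key,
            "value": value,
            "added_by": user,                                      # attributable
            "added_at": datetime.now(timezone.utc).isoformat(),    # contemporaneous
        })

# Example: a pH reading enriched with batch context on its way to a data lake
rec = Record(value=7.02, source="bioreactor-3/pH-probe-A")
rec.add_context("batch_id", "B-2021-0614", user="mes-connector")
rec.add_context("unit_operation", "production culture", user="mes-connector")
print(rec.history)
```

The append-only history is what would let an auditor reconstruct what was added, by whom, and when, as the record crosses systems.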

Single-use systems risks

The same principles and need for data integrity apply to single-use systems but at a higher volume, says Lenich. He emphasizes that, in addition to maintaining data integrity as critical information passes across systems, organizations also need to track all the single-use consumables to ensure the right components were used under the right conditions at the right time throughout the entire production process.

Issues with data integrity in single-use systems arise from the fact that a new single-use component is used with every batch, according to commentary provided by CPI’s Harvey Branton, technology lead, Biologics; John Liddell, chief technologist, Biologics; Lukas Kuerten, senior scientist, Data Analytics; and Sean Ruane, scientist, Downstream. There is also the issue of data tracking: the components used must be assigned to the batch in which they were involved, for traceability and for any subsequent process investigations, Liddell states.

Additionally, because all connections are configured again for each batch—with new components—there is a need to check and confirm that the correct connections within and between equipment have been made, he adds. This typically involves an operator manually checking and confirming, which, on a complex unit operation, requires time and cross checking. Data on consumables used should normally be documented in batch records, which can be either paper or electronically based. Paper records, while flexible, are difficult for extracting data and trend information (such as when different media or consumable lots are used and in which batch).
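
Electronic capture also makes the trending described here straightforward. The sketch below is an assumed, simplified illustration of logging single-use consumable lots against a batch; the field names are hypothetical and do not represent any specific electronic batch record schema:

```python
# Illustrative sketch only: recording single-use consumables against a batch
# so that lots can later be traced and trended across batches. Field names
# are assumptions, not a specific electronic batch record schema.
consumables_log = []   # in practice this would live in an eBR/MES database

def log_consumable(batch_id, component, part_number, lot, expiry, operator):
    entry = {
        "batch_id": batch_id,
        "component": component,
        "part_number": part_number,
        "lot": lot,
        "expiry": expiry,
        "logged_by": operator,
    }
    consumables_log.append(entry)
    return entry

log_consumable("B-0101", "2D mixing bag", "SU-2040", "LOT8841", "2022-09", "op.smith")
log_consumable("B-0102", "2D mixing bag", "SU-2040", "LOT9105", "2023-01", "op.jones")

# With electronic records, trending by lot is a one-line query rather than
# a manual search through paper batch records:
batches_using_lot = [e["batch_id"] for e in consumables_log if e["lot"] == "LOT9105"]
print(batches_using_lot)
```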

Given the large-scale switch-over to single-use components in biopharmaceutical manufacturing, there is an urgent need for greater control over manufacturing operations that use single-use components, Liddell asserts.

Another key concern around data integrity when using single-use systems, especially for downstream processes, is incomplete information about the processes, says Tobias Hahn, CEO and chief technology officer, GoSilico. “Vendors typically characterize the components of single-use systems by means of a few general aspects, seeing their products as homogeneous and identical. But, in fact, lot-to-lot variabilities show that these products are not identical,” Hahn states.

Hahn points out that, for instance, pre-packed single-use chromatography columns of the same type may differ in terms of column packing and resin lot capacity. “In practical application, these differences may have a big impact on the process performance of the individual single-use system and lead, for example, to higher product loss on the one hand or lower impurity removal on the other,” he says.

There must be an acknowledgement that process performance is impacted by material and system properties, he adds. In addition, profound understanding is required to predict the magnitude of the impact. “For this purpose, reliable and accurate digital twins based on mechanistic models can be used, but vendors must provide more specific and system-dependent characteristics in advance.”
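
As a deliberately oversimplified illustration of that point, the following sketch uses a plain mass balance with assumed numbers, rather than a calibrated mechanistic model, to show how a lot-to-lot difference in resin binding capacity translates into predicted product loss on a pre-packed column:

```python
# Deliberately simplified illustration (not a full mechanistic chromatography
# model): propagating a lot-to-lot difference in resin binding capacity
# through a mass balance to estimate product loss on a pre-packed column.
def product_loss_percent(load_g, column_volume_l, binding_capacity_g_per_l):
    """Mass loaded beyond what the resin lot can bind is assumed lost to breakthrough."""
    capacity_g = column_volume_l * binding_capacity_g_per_l
    lost_g = max(0.0, load_g - capacity_g)
    return 100.0 * lost_g / load_g

load = 95.0          # g of product loaded (assumed)
volume = 2.0         # L pre-packed column (assumed)
for lot, dbc in [("lot A", 50.0), ("lot B", 45.0)]:   # g/L dynamic binding capacity
    print(lot, f"{product_loss_percent(load, volume, dbc):.1f}% predicted loss")
# lot A: 0.0% loss; lot B: ~5.3% loss from a lot with 10% lower capacity
```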

Data vulnerabilities in mAb bioprocessing

For traditional biologics such as monoclonal antibodies (mAbs), it is important to assess what particular steps in the manufacturing process may be vulnerable to data integrity loss during the process of digital transformation.

As John Liddell, chief technologist, Biologics, at CPI indicates, mAb processes can generate significant amounts of data. In upstream processing, cell culture processes typically run for 14 days, generating significant online data (e.g., pH, dissolved oxygen) and offline data attached to the runs. Meanwhile, in downstream processing, chromatography and tangential flow filtration (TFF) systems also gather different, but significant, amounts of data, which can be used to monitor and control processes. As these data become a key part of the process control strategy, there is increasing focus on control-system reliability, given the risk of batch failure if the monitoring and control systems fail during batch processing, Liddell cautions.

Currently, the values are often recorded manually in the batch manufacturing record rather than captured in electronic records. This, however, means that the full benefit of the recorded data is not necessarily realized, and new algorithms that could be used to demonstrate real-time batch performance against previous runs are not applied, according to Sean Ruane, scientist, Downstream, CPI.
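
One example of the kind of algorithm electronic records would enable is a simple "golden batch" comparison, sketched below with illustrative numbers only: the running batch's online trajectory is checked against the mean and spread of previous good runs.

```python
# Hypothetical sketch of an algorithm electronic records would enable:
# comparing a running batch's online trajectory against the mean and
# spread of previous successful runs (a "golden batch" envelope).
import statistics

# viable cell density (or any online signal) from previous good batches,
# sampled at the same time points (illustrative numbers)
history = [
    [0.5, 1.1, 2.3, 4.6, 8.8],
    [0.6, 1.2, 2.5, 4.9, 9.2],
    [0.5, 1.0, 2.2, 4.4, 8.5],
]
current = [0.5, 1.1, 2.4, 3.1]   # batch in progress; the latest point looks low

for day, value in enumerate(current):
    past = [run[day] for run in history]
    mean, sd = statistics.mean(past), statistics.stdev(past)
    if abs(value - mean) > 3 * sd:
        print(f"time point {day}: {value} outside 3-sigma envelope ({mean:.2f} +/- {3*sd:.2f})")
```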

Meanwhile, for therapeutic applications, processes need to be run to current good manufacturing practice (CGMP) standards and tend to be designed with few decision points, to aid compliance. Harvey Branton, technology lead, Biologics, CPI, offers an example: step elutions from chromatography columns tend to be used so that product purity is based on a fixed buffer conductivity, rather than a gradient elution in which other factors, such as a rising ultraviolet signal, could be used to trigger fraction collection. This approach makes processes robust and straightforward to run but can leave some of the true process potential unrealized.

Hahn agrees that data integrity is more at risk with higher data variability. “This is the case, for example, at the interfaces between upstream and downstream. An unidentified, or even unidentifiable, variability in the outcome of an upstream process may lead to a major loss in data integrity for all subsequent downstream considerations,” he states. “Digital process models can be utilized, however, to perform in-silico plausibility tests. Implausible results in downstream reports can easily be validated and verified using digital representatives with no further experimental effort. Such models can further be used for analyzing root causes, such as upstream variability or lot-to-lot material deviations,” he explains.
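
A plausibility check of this kind could look like the hypothetical sketch below, in which a placeholder linear relationship stands in for the calibrated digital twin purely to show the workflow:

```python
# Hypothetical plausibility check of the kind Hahn describes. A calibrated
# digital twin would supply predict_yield(); a placeholder relationship
# stands in for it here purely to show the workflow.
def predict_yield(feed_titer_g_per_l):
    # stand-in for a mechanistic model prediction (assumption, not real data)
    return 0.92 - 0.01 * max(0.0, feed_titer_g_per_l - 3.0)

def plausible(reported_yield, feed_titer, tolerance=0.05):
    expected = predict_yield(feed_titer)
    return abs(reported_yield - expected) <= tolerance, expected

ok, expected = plausible(reported_yield=0.71, feed_titer=4.2)
print("plausible" if ok else f"implausible: expected about {expected:.2f}")
# An implausible capture-step yield points back to root causes such as
# unreported upstream variability or a deviating resin lot.
```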

Lenich says that with the increased complexity inherent in mAb manufacturing, there is a new focus on process analytical technology (PAT) to ensure not only that measurements are correct, complete, and accurate, but also that the foundations underlying those measurements are solid. “Because many key measurements are now generated via models, organizations need to be able to show that the model is also working correctly and that the environment the model is trained for hasn’t changed,” he states.
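
One simple way to show that the environment a model was trained for has not changed is to verify that its current inputs still fall within the ranges seen during training, as in the assumed sketch below (illustrative limits, not a specific PAT product):

```python
# Sketch of an assumed workflow, not a specific PAT package: before trusting
# a model-based "soft sensor" measurement, verify the model's inputs still
# sit inside the region it was trained on, i.e., the environment is unchanged.
training_ranges = {            # min/max of each input during model training (assumed)
    "temperature_C": (36.5, 37.2),
    "pH": (6.9, 7.2),
    "feed_rate_ml_h": (10.0, 25.0),
}

def inputs_outside_training_envelope(sample: dict) -> list:
    """Return the names of inputs that fall outside the training envelope."""
    out = []
    for name, value in sample.items():
        lo, hi = training_ranges[name]
        if not lo <= value <= hi:
            out.append(name)
    return out

sample = {"temperature_C": 36.8, "pH": 7.4, "feed_rate_ml_h": 18.0}
violations = inputs_outside_training_envelope(sample)
if violations:
    print("model prediction flagged; inputs outside training range:", violations)
```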

Additionally, data integrity has traditionally been about the accuracy of a measurement and making sure nothing modified that measurement. “Today, quality teams must also verify that nobody changed the configuration of the system, which might impact the measurement. It is no longer enough to demonstrate that the temperature value was correct; now we must also prove that nobody changed the alarm limits on temperature levels or other variables that might impact the safety and efficacy of the final product,” Lenich asserts.

Fitting in AI

The use of artificial intelligence (AI) has been another aspect of the discussions surrounding the transformation of a manufacturing facility to a digital platform. But how would this work? Hahn answers that question by explaining that AI, multivariate, and mechanistic models serve to better interpret complex, multidimensional data, thereby enabling improved process understanding. “By generating digital twins based on such approaches, the convoluted behavior of material, system, molecule, and method may be simulated, with the time-consuming, expensive, and labor-intensive experimentation performed in silico,” he states. Thus, deviations can be better investigated, and processes may be developed to be more efficient and robust, with intensified process development, when transformed into the digital space.

“AI in its classical sense is based on a statistical ‘black-box’ approach: statistical approaches such as big data or machine learning utilize statistics to predict correlations, trends, and patterns. All of these models learn from experience provided in the form of data. The more the experience, the better the model will be. Depending on the specific application, however, data may be sparse. In particular, in downstream processing, data are typically rare, which hinders the application of such approaches,” Hahn says. “Such models are further bound to their calibration range and can only predict results within the data space they are calibrated from. In particular, they do not allow any major change in the process set-up.”

In contrast, mechanistic models capture the fundamental phenomena, applying natural laws to explain emergent behavior. “For systems with established profound mechanistic understanding, such as chromatography and the Donnan effect during UF/DF [ultrafiltration/diafiltration], a mechanistic approach is the most efficient with data and enables extrapolation to unseen process conditions,” Hahn adds.

Properly implemented AI is tiered, with each layer building into the next, notes Lenich. The first level of AI is a foundation of models that ensure all assets, such as sensors and machinery elements, are working correctly. “When the organization has confirmed that assets are working as expected, it can begin to implement more advanced modeling focused on the unit operation level. This layer focuses more on specific pieces of equipment or operating areas to ensure that they are performing as expected,” he says. “Once the organization confirms that units are consistently performing as expected, more advanced real-time modeling and simulation predicts how production will behave in the future.”

Lenich explains that high-fidelity models compare what a facility is designed for versus what it is actually doing. High-fidelity models also identify the constraints that cause bottlenecks in throughput. “Any and all of these approaches are in use today. Predicting equipment failures, intensifying bioreactor titer, and improving facility throughput capacity are all examples of companies looking at all or parts of a biopharmaceutical manufacturing process and improving them with artificial intelligence models. The key point is to have a particular problem you want to solve. The best results are achieved when all of these approaches are planned and coordinated,” Lenich says.

Meanwhile, Lukas Kuerten, senior scientist, Data Analytics, CPI, sees multiple areas where AI would fit in achieving a “digital transformation” of a biopharmaceutical manufacturing process. This extends from quite early in the process, from the identification of molecules to move forward into development (better defining a molecule’s developability), through process and clinical development (ensuring data are captured and evaluated), and, finally, to the manufacture of commercial batches.

Kuerten adds that there are multiple areas in which AI can contribute to making process development of biopharmaceuticals more efficient. Biopharmaceutical development involves multiple experiments in upstream and downstream, with online and offline analytics being collected from multiple types of instruments—from pH meters through to high-performance liquid chromatography systems and mass spectrometry. “Today, all these data sets typically exist in multiple spreadsheets, which are not robust, require manual integration, have weak links to the original data, and are difficult to update,” he states.

In both upstream and downstream manufacturing, the operating paradigm for many years has been batch-based processing, with final product analysis being carried out offline, at the completion of a batch. As processes move toward continuous manufacturing, the full benefits of continuous processing can only be achieved through reducing, and eventually eliminating, offline final product batch release. This is an area where AI, in conjunction with online and rapid offline PAT techniques and digital twinning, will allow real-time process control, Kuerten explains.

Legacy transformation

In terms of a legacy manufacturing facility, how would a digital system be fitted to bring such a facility up to speed? Lenich indicates that existing manufacturing plants can make significant headway toward digital transformation by simply taking advantage of wireless networks. This would easily allow the plant to bring in new IIoT data without the significant rework of the physical infrastructure that would otherwise be required to run wires and connections out to where they are needed.

“However, the challenge comes when organizations want to connect to existing legacy systems,” Lenich cautions. “These systems often don’t have the correct interfaces to be easily incorporated into a digital strategy. New technologies, such as digital controllers and data lakes, can help bridge the gap between digital systems and legacy equipment.”

Lenich explains that new digital controllers can be connected to existing standalone equipment and programmable logic controllers and would offer the flexibility to provide customized integration to those legacy systems. “These controllers are then able to connect to the new digital infrastructure and pass on this legacy data. Users can then determine how important it is to ‘integrate everything’ or only do key items,” he says.

The new module type package (MTP) approach from the User Association of Automation Technology in Process Industries provides another way for new equipment to interface more easily to existing plant systems, Lenich adds. “MTP creates standardized content, use, and communication protocols for equipment integration. MTP tells the existing plant automation systems what the new equipment can do and how to communicate with it, significantly decreasing the time it takes to bring new equipment into an existing facility. Leveraging MTP standards is the next step on the path to a ballroom of plug-and-play equipment in a plant,” he says.

The update of a legacy facility may face major challenges if the existing equipment does not feature connectivity for digital integration, which may make costly retrofitting necessary. “However,” Branton points out, “existing technology could be overlaid with new digital capabilities. For example, standard control loops based on proportional, integral, and derivative (PID) control provide good set-point control where the demand on the system is stable but, where the load is changing significantly, the PID terms must be adjusted accordingly. New control algorithms can be developed, which feed off the process data generated but sit above the existing controller and automatically adjust the PID terms based on performance. This enables the updating of control systems to take advantage of more advanced control strategies without replacing the existing hardware.”
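
A minimal sketch of such an overlay is shown below, using simple gain scheduling as the supervisory logic and illustrative tuning values; the underlying PID loop is left in place while the layer above it adjusts the terms from process data:

```python
# Minimal sketch of the overlay Branton describes: a standard PID loop left
# in place, with a supervisory layer above it that adjusts the PID terms from
# process data (here, simple gain scheduling on a measured load). Tuning
# values are illustrative assumptions only.
class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement, dt):
        error = setpoint - measurement
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

def schedule_gains(pid, load):
    """Supervisory layer: re-tune the existing controller as the load changes."""
    if load < 0.3:            # light demand on the system
        pid.kp, pid.ki = 2.0, 0.10
    elif load < 0.7:
        pid.kp, pid.ki = 3.5, 0.20
    else:                     # heavy, fast-changing load
        pid.kp, pid.ki = 5.0, 0.35

pid = PID(kp=2.0, ki=0.1, kd=0.05)
schedule_gains(pid, load=0.8)                 # adjust terms from process data
output = pid.update(setpoint=37.0, measurement=36.4, dt=1.0)
print(f"controller output: {output:.2f}")
```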

Meanwhile, a digital twin of a chromatography process can be fitted into a legacy facility immediately, without any further effort, constraints, or requirements, Hahn says. “The inputs required by such a digital twin are the data and sensors on hand: UV and conductivity detectors are the minimal requirement; pH sensors can be considered in addition. No additional sensors are needed, as every other process parameter can then be calculated using the model itself. Real-time process control and process analytical techniques based on digital twins of course require the availability of data in real time, however,” Hahn adds.
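
As a small example of deriving quantities from sensors already on hand, the sketch below estimates protein concentration from the UV signal via the Beer-Lambert law; the extinction coefficient and flow-cell path length are assumed values, and a full chromatography digital twin of course goes far beyond this single calculation:

```python
# Small illustration of the principle that existing sensors can feed derived
# quantities: protein concentration estimated from the UV signal already on
# hand via the Beer-Lambert law (A = epsilon * c * l). The coefficient and
# path length below are assumptions, not measured values.
def concentration_g_per_l(a280, extinction_coeff_ml_mg_cm=1.4, path_length_cm=0.2):
    """Estimate mg/mL (equivalent to g/L) from A280."""
    return a280 / (extinction_coeff_ml_mg_cm * path_length_cm)

print(f"{concentration_g_per_l(0.84):.1f} g/L")   # -> 3.0 g/L
```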

Maintenance and upkeep

Digitalization can also offer the benefit of improved operations and facility maintenance.

Currently, maintenance is logged electronically, but in the future, predictive maintenance will be implemented, meaning the system will be able to predict the failure of parts, Branton notes. As a next step, the system could be enabled to take remedial action automatically, such as ordering new spare parts from a digitized warehouse. Branton also points out that real run data, and a prediction of future use, can be fed into the digital twin, which can then be used to predict maintenance requirements.

“Mechanistic chromatography models can be utilized to adapt an existing process model to an aged resin, for instance,” says Hahn. “Processes can then be continued even under changed or impaired conditions, which delays the need for maintenance and keeps productivity high. Also, models can be used to assess the anticipated remaining lifetime of a column and predict how many more cycles it will be capable of performing.”
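
A simplified stand-in for that kind of lifetime assessment is sketched below: instead of a mechanistic model, a plain linear trend of step yield per cycle is extrapolated to the point where it would cross an acceptance limit (all numbers are illustrative):

```python
# Simplified stand-in for the lifetime assessment Hahn describes: a plain
# linear trend of step yield per cycle, extrapolated to the acceptance limit.
# All numbers are illustrative assumptions.
cycles = [10, 20, 30, 40, 50]
yields = [0.96, 0.95, 0.93, 0.92, 0.90]     # observed capture-step yield per cycle
limit = 0.85                                 # minimum acceptable yield

# least-squares fit of yield vs. cycle number
n = len(cycles)
mean_x = sum(cycles) / n
mean_y = sum(yields) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(cycles, yields)) \
        / sum((x - mean_x) ** 2 for x in cycles)
intercept = mean_y - slope * mean_x

cycles_at_limit = (limit - intercept) / slope
print(f"yield projected to reach {limit} at about cycle {cycles_at_limit:.0f}")
print(f"roughly {cycles_at_limit - cycles[-1]:.0f} cycles of useful life remain")
```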

“Maintenance is no longer a case of monitoring a single piece of equipment, providing supplier recommended periodic preventative maintenance, and fixing it when it begins to show problems,” adds Lenich. Today, organizations are not only looking at how well equipment works, but how well multiple devices and operational units are working together. “This more focused approach to maintenance requires multivariate analysis that goes beyond what maintenance technicians can easily perform manually,” Lenich states.

Lenich also says that plants are turning to new wireless IIoT measurements, such as vibration monitoring, to provide new information on equipment performance. Plant operators are applying AI and machine learning to data from traditional sources and these new sources to analyze and predict equipment and plant health and performance. “This multivariate data can be compared against equipment performance models to see how well equipment and manufacturing processes performed in the past and to predict how they will perform in the future in real time. When performance starts to trend away from the norm, these same systems can provide actionable advice to help bring processes back in line before quality or safety are impacted,” Lenich concludes.
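
The multivariate comparison Lenich describes can be illustrated with a basic Mahalanobis-distance check of a new reading against a baseline of healthy operation, as in the assumed sketch below; a production system would, of course, rely on validated models:

```python
# Sketch of a multivariate health check using a Mahalanobis distance against
# a baseline of normal operation. Data and threshold are illustrative only.
import numpy as np

# baseline rows: [vibration_rms, motor_temp_C, flow_l_min] during healthy runs
baseline = np.array([
    [0.21, 41.0, 12.1],
    [0.19, 40.5, 12.3],
    [0.22, 41.8, 11.9],
    [0.20, 41.2, 12.0],
    [0.23, 40.9, 12.2],
])
mean = baseline.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(baseline, rowvar=False))

def mahalanobis(sample):
    d = sample - mean
    return float(np.sqrt(d @ cov_inv @ d))

current = np.array([0.35, 44.0, 11.4])       # new IIoT reading to assess
score = mahalanobis(current)
print(f"distance {score:.1f}", "- investigate" if score > 3.0 else "- normal")
```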

About the author

Feliza Mirasol is the science editor for BioPharm International.


Citation

When referring to this article, please cite it as F. Mirasol, “Setting Up Bioprocessing Systems for Digital Transformation,” BioPharm International 34 (6) 2021.
