Streamlining Bioprocesses with Automation

Published on: BioPharm International, July/August 2024, Volume 37, Issue 7
Pages: 7–9

Automation technologies used in the development and manufacture of biopharmaceuticals continue to evolve, providing the potential for reduced costs and time.

The evolution of automation in biopharmaceutical development and manufacturing continues to move forward. Technologies have advanced from possibilities to realities in recent years, according to David Sheedy, head of Life Sciences Manufacturing Ireland at Cognizant. He points to connected smart instruments, digital twins, and facilities that incorporate Industry 4.0 as examples. “Automation integration layers now make it possible to bring huge volumes of control data traffic from distributed sources across OT [operational technology] networks through a central function, making it available for further use to improve manufacturing intelligence,” Sheedy states.

According to Cory Perelman, PE, Lead I&C Engineer and ASME BPE PI Vice Chair at Arcadis, the use of single-use systems will be advanced by new technologies, such as single-use mass flow control, and traditional stainless-steel manufacturing will benefit from smart technologies. “Single-use and cell and gene therapy [facilities], typically associated with many ‘islands of automation’, should be looking at a new ‘spoke-and-hub’ type architecture based on a message queuing telemetry transport (MQTT) data broker within a unified namespace.” He points out that some technologies are specific to one therapy while others can be applied across a range of therapies. “Ethernet-APL [advanced physical layer] is a databus with a two-wire Ethernet topology that will simplify network architectures and allow for [the] passing of a lot of useful information for all plants. For new continuous manufacturing processes, end users should focus on smart instrumentation for process downtime reduction and advanced process insights that can inform maintenance and SOPs [standard operating procedures],” Perelman states.
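To make the ‘spoke-and-hub’ concept more concrete, a minimal sketch of publishing instrument telemetry into a unified-namespace topic on an MQTT broker is shown below, written in Python with the paho-mqtt client. The broker address, topic hierarchy, and payload fields are illustrative assumptions rather than details drawn from any specific facility.

```python
# Minimal sketch, not a reference implementation: publish single-use bioreactor
# telemetry into a unified-namespace topic on an MQTT broker. The broker host,
# topic hierarchy, and payload fields are hypothetical.
import json
import time

import paho.mqtt.client as mqtt

BROKER_HOST = "uns-broker.example.com"  # hypothetical central MQTT data broker

# Unified namespace: one hierarchical topic tree (enterprise/site/area/asset) that
# every "island of automation" publishes into and subscribes from.
TOPIC = "enterprise/site-a/upstream/bioreactor-01/telemetry"

client = mqtt.Client()
client.connect(BROKER_HOST, 1883)
client.loop_start()

payload = {
    "timestamp": time.time(),
    "mass_flow_slpm": 0.52,        # example single-use mass flow controller reading
    "ph": 7.02,
    "dissolved_oxygen_pct": 41.8,
}

# retain=True lets late-joining subscribers (historian, MES, analytics) immediately
# receive the asset's last known state.
client.publish(TOPIC, json.dumps(payload), qos=1, retain=True)

client.loop_stop()
client.disconnect()
```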

Ryan Thompson, senior specialist, Industry 4.0, at CRB, sees open-source automation as a growing trend, in which data are treated as an asset available to other manufacturing systems. “This [change] pushes original equipment manufacturers (OEMs) to offer connectivity to their equipment in the manufacturer’s data ecosystems instead of the typical vendor-locked approach with a separate system for each [piece of] equipment or equipment type,” Thompson says.

Robotic process automation (RPA) allows processes to be automated quickly and repeatedly, which makes it a valuable tool for biopharmaceutical manufacturing, remarks Ronan Fox, PhD, senior vice president, IT R&D, at ICON. “RPA is one tool in the toolbox for automation engineers, and can be used as standalone, fully automated solutions, human-assisted, or used in conjunction with AI [artificial intelligence] and integrations (to mention a few) so that complete solutions can be provided to enable consistency, higher quality, and quicker throughput,” Fox says.

According to Edita Botonjic-Sehic, head of Process Analytics and Data Science at ReciBioPharm, there will be an increase in new process analytical technologies (PAT) that integrate assays online, inline, or at-line. “With the implementation of PAT, drug developers and manufacturers are reducing manual labor efforts through automation, eliminating the need for time-consuming manual sampling steps and removing the travel time associated with sending samples to external laboratories. This efficiency improvement enables what previously took two weeks of offline analyses to be completed in a matter of days,” says Botonjic-Sehic.

Implementation of automation

To implement automation in biopharmaceutical processes, Sheedy emphasizes that process decisions should be made early and rationally to ensure they can be automated. “Teams need to identify the steps that can be streamlined or sequenced differently to allow the overall delivery gain from the incoming automation,” says Sheedy. “Automating processes means significant change for personnel; therefore, investing in organizational change management (OCM) is critical in ensuring all stakeholders are identified, informed, and brought on the journey to minimize the impact on productivity, from go-live and beyond.”

The costs associated with implementation of automation should be looked at holistically, says Thompson, including integration of equipment and the personnel to maintain systems. “Having expertise in your technologies is an important consideration in biopharma manufacturing,” he stresses. “[It is not just a case of] looking at the cost of automation but what it will cost to have a connected factory.”

For more information on implementing automation in biopharmaceutical manufacturing, Perelman suggests the International Society for Pharmaceutical Engineering’s (ISPE’s) Pharma 4.0 guideline (1). “The Digital Maturity Model helps understand where a plant currently is on its path to digitalization. What existing systems and infrastructure (IT and OT) are in place to support digitalization is the first step. From there, it is important that the company has a clear vision and strategy as to where they want to go, and how to get there,” says Perelman. “A culture of change and digitalization across each project should be implemented as soon as possible. Often on the design firm side, we start with individual case studies, small and large, that are suitable for a given project. Then based on project budget, case studies can be selected for implementation, removed, or saved for a later date.”

When implementing automation into analytical processes, understanding the analytical instrument and the data it processes is important, but even more so is the context behind the data, explains Botonjic-Sehic. “Beyond tracking a number, a user needs to understand what part of the process gave that number and how it is related to the process performance. Properly contextualized data [are] critical for troubleshooting and success as drug developers and manufacturers transfer the process from process development (PD) to commercial scale,” she says.

Validating automation

A new system or operation that is being implemented in a regulated environment will require validation to ensure good manufacturing practice requirements are met. Validation ensures controls are in place for the automation of processes that are regulated, says Fox. “We must be able to prove that the automation applied is fit for the intended use, and that the execution of that automation is equivalent to the activities that would have been undertaken by trained humans,” Fox says. “The validation of the automation should match the level of validation that we would perform across any technology that is developed for a regulatory process—be that a bespoke software-based solution or an off-the-shelf purchased technology.”

For any new technology, validation strategies should be considered, notes Perelman, and new analyses should be performed. “[For example,] for cloud-based technologies, Amazon Web Services (AWS) tends to be a risk-averse option with [more] auditable security practices than a lesser-known player,” he specifies. “ISPE is starting to create local GAMP chapters where other regulated end users can get together locally to discuss best practices. For new technologies, risk-based and well-documented validation approaches should be implemented in a logical way, rather than shying away from new technologies and reverting to antiquated ones.”

Thompson also points to GAMP 5 for guidance on how to validate an automation system using a risk-based approach. “Depending on the application, using a Computer Software Assurance instead of a Computer Systems Validation model may be appropriate,” he states.

In what may be a positive shift, according to Sheedy, validation is moving to streamlined procedures that enable multi-product facilities. “The 2020 lockdowns highlighted that there are better, more streamlined ways of doing things. Shifting to a digital approach to validation as standard enables the industry to work in a much more collaborative and flexible way,” Sheedy says.

When it comes to data, secure databases and historians help to streamline data access and integration, according to Botonjic-Sehic. Validation can be performed more quickly when process data are fed directly into a database. “Many processes and analytical instruments in drug development and manufacturing default to providing data in flat files, leaving the onus of building and connecting to secure databases on the users,” says Botonjic-Sehic. “Preventing bottlenecks that can arise as a result and maintaining data integrity hinges on creating an automation platform that seamlessly communicates with the entire workflow, ensuring faster, more reliable data flow from multiple systems on the operational floor. Addressing potential issues at the outset of building this architecture can help prevent the need for rework and minimize data loss.”
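As a rough illustration of the flat-file problem described above, the sketch below loads a hypothetical instrument export into a structured database table with process context attached. The file name, column names, batch identifier, and unit operation are assumptions made for the example.

```python
# Minimal sketch: load a flat-file instrument export into a structured database
# table and attach process context. File name, columns, and identifiers are
# hypothetical.
import csv
import sqlite3

conn = sqlite3.connect("process_data.db")
conn.execute(
    """CREATE TABLE IF NOT EXISTS pat_results (
           batch_id TEXT,
           unit_operation TEXT,   -- process context: which step produced the value
           timestamp TEXT,
           analyte TEXT,
           value REAL,
           units TEXT
       )"""
)

# Instruments that only emit flat files leave it to the user to add context
# before the data land in a secure, queryable store.
with open("hplc_export.csv", newline="") as f:
    for row in csv.DictReader(f):
        conn.execute(
            "INSERT INTO pat_results VALUES (?, ?, ?, ?, ?, ?)",
            ("B2024-013", "capture chromatography", row["timestamp"],
             row["analyte"], float(row["value"]), row["units"]),
        )

conn.commit()
conn.close()
```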

AI and machine learning

The development and advancement of AI and machine learning (ML) have entered the bio/pharmaceutical space. AI can process large sets of data; therefore, it may help to speed up drug development and reduce costs, according to Sheedy. “It also enables early issue identification to reduce waste, creates digital twins with greater accuracy, and forecasts and executes faster supply chain decisions,” says Sheedy. “In manufacturing, the uptake has been focused on data analytics and non-GMP applications within the manufacturing environment for now, but it is only a matter of time before AI infiltrates other aspects. As we move forward with the integration of AI, the interface between automation and AI algorithms will see increases in yield and productivity that drive business benefits and enable clients to [remain] competitive and make patient health improvements.”

Models for real-time monitoring and control can be created using AI and ML, according to Botonjic-Sehic. In-silico models can simulate real-world events and proactively identify risks associated with critical quality attributes (CQAs). “Additionally, these models can determine trends that could lead to CQA deviations, allowing process development adjustments accordingly. As a result, drug developers and manufacturers can proactively minimize failure risks, reduce waste, and ensure a higher quality product,” says Botonjic-Sehic.
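A minimal sketch of this idea, using synthetic data and hypothetical parameter and CQA names, is shown below: a regression model trained on historical process parameters predicts a quality attribute and flags operating points trending toward a specification limit.

```python
# Minimal sketch with synthetic data: relate process parameters to a critical
# quality attribute (CQA) and flag operating points trending toward a limit.
# Parameter names, the CQA, and the specification limit are hypothetical.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Synthetic history: rows = batches; columns = pH, temperature (degC), feed rate (L/h).
X = rng.normal([7.0, 36.8, 1.2], [0.05, 0.3, 0.1], size=(200, 3))
# Synthetic CQA (e.g., purity, %) with a weak dependence on pH and temperature.
y = 98.5 - 4.0 * np.abs(X[:, 0] - 7.0) - 0.5 * np.abs(X[:, 1] - 36.8) \
    + rng.normal(0, 0.1, 200)

model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

# In-process check: predict the CQA at the current operating point.
current_conditions = np.array([[7.12, 37.4, 1.25]])
predicted_cqa = model.predict(current_conditions)[0]
if predicted_cqa < 98.0:  # hypothetical lower specification limit
    print(f"Warning: predicted purity {predicted_cqa:.2f}% is approaching the limit")
```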

According to Perelman, AI and ML can optimize the work of lab scientists. “[For example, AI and ML] can pick ‘green’ solvents for synthesis best suited for the process and environmental impact (both from production of the solvent and production using the solvent).” Processes can be streamlined from pilot to scale using AI and ML, confirms Perelman.

However, regulators and industry will need to develop better validation and qualification procedures, according to Perelman. “For example, is an ML-generated process model that then feeds an ML model-predictive control allowed to change a setpoint or react to a process upset? Will an AI change on a timestamp satisfy 21 CFR [Code of Federal Regulations] Part 11 compliance? Are CPPs [critical process parameters] and tech-transfer documents [that are] largely AI-generated defensible? What kind of testing do we need to prove so? These are questions we need to ask.”
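To illustrate the kind of traceability such questions imply, the sketch below assembles a hypothetical audit-trail record for an ML-initiated setpoint change: the responsible model version, a human approver, a timestamp, the old and new values, and a content hash for tamper evidence. The field names are assumptions, not a 21 CFR Part 11 template.

```python
# Minimal sketch: an audit-trail record for an ML-initiated setpoint change.
# Field names are hypothetical and not drawn from any regulation or product.
import hashlib
import json
from datetime import datetime, timezone

def setpoint_change_record(tag: str, old: float, new: float,
                           model_version: str, approved_by: str) -> dict:
    record = {
        "tag": tag,
        "old_value": old,
        "new_value": new,
        "initiated_by": f"ml-mpc:{model_version}",  # the model, not a person, made the change
        "approved_by": approved_by,                  # human reviewer, if the procedure requires one
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }
    # A content hash gives a tamper-evidence check when records are stored or chained.
    record["sha256"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()
    ).hexdigest()
    return record

print(setpoint_change_record("BRX-01.TEMP.SP", 36.8, 37.1, "mpc-v2.3", "j.smith"))
```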

ML may be able to analyze vast datasets from biological experiments to identify patterns missed by human researchers, says Perelman. In addition, ML models might predict the success of drug candidates. “Like AI, ML models can be difficult to defend to the FDA or other regulatory bodies, may contain insufficient or biased data that can lead to inaccurate models or poor predictions, and rely on expertise to create and implement the models that may be scarce in the organization.”

Data architecture may be a challenge, however, according to Thompson, because operational technology systems do not typically support data models for AI systems to scale. “Data science teams spend the bulk of their time organizing and cleaning data instead of refining models and providing insights. Spending time to define your data strategy and tools to model data will allow for AI/ML to scale,” says Thompson. “The quality of an AI/ML model improves as the data set grows. Using one piece of equipment to make inferences will deliver a lot less value than having a scaled architecture that can support the analysis of multiple pieces of equipment and take advantage of the natural experiments that occur in day-to-day production.”
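One way to read Thompson’s point about data models is sketched below: a common schema into which readings from different vendors’ equipment are normalized so that a single AI/ML pipeline can consume all of them. The vendor payload formats and field names are hypothetical.

```python
# Minimal sketch: normalize readings from different vendors' equipment into one
# common data model so a single analytics pipeline can scale across assets.
# Both vendor payload formats are hypothetical.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class EquipmentReading:
    site: str
    asset_id: str
    parameter: str      # e.g., "temperature", "dissolved_oxygen"
    value: float
    units: str
    timestamp: datetime

def from_vendor_a(payload: dict) -> EquipmentReading:
    # Vendor A reports temperature in degC under "temp" with an epoch timestamp.
    return EquipmentReading(
        site="site-a", asset_id=payload["unit"], parameter="temperature",
        value=payload["temp"], units="degC",
        timestamp=datetime.fromtimestamp(payload["ts"], tz=timezone.utc),
    )

def from_vendor_b(payload: dict) -> EquipmentReading:
    # Vendor B reports the same quantity in degF under "TemperatureF".
    return EquipmentReading(
        site="site-a", asset_id=payload["Tag"], parameter="temperature",
        value=(payload["TemperatureF"] - 32) / 1.8, units="degC",
        timestamp=datetime.fromisoformat(payload["Time"]),
    )

# With every source mapped to the same model, analyses can draw on all equipment
# rather than one "island" at a time.
readings = [
    from_vendor_a({"unit": "BRX-01", "temp": 36.9, "ts": 1720000000}),
    from_vendor_b({"Tag": "BRX-02", "TemperatureF": 98.4, "Time": "2024-07-03T10:15:00+00:00"}),
]
print(readings)
```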

“During biopharmaceutical development, particularly the clinical research phases, AI can play a key role in automating regulatory and non-regulatory processes,” says Fox. “AI, layered on other automation techniques, enables the automation of more sophisticated processes as they tend to leverage more unstructured data sources to proceed.” Fox cautions, however, that “care must also be taken to ensure that bias is minimized in the data and thus the models created. Increasing trust through demonstrable and explainable AI-powered decisions will be key to the further deployment and uptake of those AI-enabled automations.”

Reference

1. ISPE. Baseline Guide Vol 8: Pharma 4.0, 1st Edition (ISPE, December 2023). https://ispe.org/publications/guidance-documents/baseline-guide-vol-8-pharma-40-1st-edition

About the author

Susan Haigney is lead editor for BioPharm International®.

Article details

BioPharm International®
Vol. 37, No. 7
July/August 2024
Pages: 7–9

Citation

When referring to this article, please cite it as Haigney, S. Streamlining Bioprocesses with Automation. BioPharm International 2024, 37 (7), 7–9.