Modeling Comes of Age in Biopharma

Article
BioPharm International, November 2019
Volume 32, Issue 11
Pages: 18-22

Modeling is being used for everything from yield improvement to facility design, but new initiatives plan to broaden its reach, both upstream and downstream.


Over the past few years, process modeling and simulation have evolved from an option into a necessity in more biopharmaceutical development and manufacturing operations. With models now used in manufacturing and development (see interview with Emerson’s Bob Lenich) by companies ranging from large manufacturers to contract development and manufacturing organizations (CDMOs) such as WuXi, Lonza, MilliporeSigma, and Patheon, initiatives are underway to expand their use, both upstream and downstream.

Lonza is using models to guide process economics (i.e., investment decisions); to identify patterns in data that predict expected process outcomes (i.e., machine learning); and to gain a better understanding of how aspects of the process work (i.e., mechanistic models), says Brandon Downey, principal engineer, R&D at Lonza. Currently, process economics modeling tools are being used to decide which elements of continuous processing to adopt and when, he says. 

“In the other two cases,” he says, “we seek process improvements or process risk-mitigation strategies that are directly supported by fundamental process understanding and data. Use of machine learning techniques that incorporate mechanistic process knowledge helps our process experts focus on only the aspects of the process that are most likely to give the outcome we desire, as opposed to only using an empirical approach. We let the models do the pattern recognition, so that our experts focus and deeply understand how key areas of the process work and can be improved,” he says. 

Enabling feedback control

In the future, Downey expects to see more use of mechanistic and machine learning approaches to accomplish feedback control of bioprocesses, particularly when it comes to protein quality attributes. “The more data we gather, and the more mechanistic knowledge we incorporate, the more generalizable we can expect our models to be,” he says. “This quality could allow models to be applied across multiple sites, processes, and molecules, which would drastically reduce the development effort required for new molecules (including more complex, non-standard antibodies). Feedback control would substantially lower the process risk when it comes to quality attributes, which should lead to lower cost as well,” he says.
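To make the feedback-control concept concrete, here is a minimal sketch of a model-in-the-loop controller: a stand-in model predicts a quality attribute from a manipulated variable, and a proportional correction drives the prediction toward its setpoint. The attribute model, gain, and feed-rate variable are hypothetical illustrations, not Lonza’s implementation.

```python
# Minimal sketch of model-based feedback control of a quality attribute.
# The model form, gain, and manipulated variable are assumptions for
# illustration only.

def predict_attribute(feed_rate: float, hours: float) -> float:
    """Stand-in for a mechanistic/ML model of a protein quality attribute."""
    return 0.8 * feed_rate + 0.02 * hours  # assumed toy relationship

def control_step(setpoint: float, feed_rate: float, hours: float,
                 gain: float = 0.5) -> float:
    """One proportional correction based on the model's prediction."""
    error = setpoint - predict_attribute(feed_rate, hours)
    return feed_rate + gain * error  # updated manipulated variable

feed_rate = 1.0
for hour in range(24):  # one model-based correction per simulated hour
    feed_rate = control_step(setpoint=1.2, feed_rate=feed_rate, hours=hour)
    print(f"hour {hour:2d}: feed rate {feed_rate:.3f}")
```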

Currently, the main obstacle to greater use of modeling is data, says Downey, specifically, the ability to generate large amounts of process data economically, and then curate those data automatically in an electronic form so that they can be used for modeling. “Electronic data systems are part of the puzzle, but we also need to be creative about how data are generated so that we can measure the right things and generate more of the right kind of data in less time,” he says. 

Finally, he says, establishing tools that can readily access data from a complete product development program (i.e., from candidate selection to clinical end point results) is another gap that must be filled if modeling is to play a larger role in biopharma.
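As a rough illustration of what automated curation of process data into a model-ready electronic form might look like, the snippet below harmonizes two differently labeled batch records into one tidy table with pandas; the column names and values are invented and do not reflect any particular electronic data system.

```python
# Illustrative only: consolidate heterogeneous batch records into one
# standardized, model-ready table. Column names and values are invented.
import pandas as pd

raw_batches = [
    {"batch": "B001", "Titer (g/L)": 3.1, "pH": 7.0, "Temp_C": 36.8},
    {"batch": "B002", "titer_g_per_L": 2.8, "ph": 6.9, "temperature": 37.0},
]

# Map each source's labels onto one agreed naming convention
rename_map = {
    "Titer (g/L)": "titer_g_L", "titer_g_per_L": "titer_g_L",
    "pH": "ph", "Temp_C": "temp_c", "temperature": "temp_c",
}

frames = [pd.DataFrame([b]).rename(columns=rename_map) for b in raw_batches]
curated = pd.concat(frames, ignore_index=True)
print(curated[["batch", "titer_g_L", "ph", "temp_c"]])
```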

Eliminating data gaps

Development efforts are underway to expand the use and availability of new modeling approaches for biopharma, in both upstream and downstream processing. In late October 2019, the BioPhorum Operations Group (BPOG) was expected to release its first guidance document on modeling for continuous downstream processes. 

In September 2019, the Process Systems Enterprise Group (PSE), based in the United Kingdom, held its Advanced Process Modeling Forum in Tarrytown, NY, where it discussed plans to develop a systems modeling framework for upstream and downstream bioprocesses. Analogous to work that PSE has done with GlaxoSmithKline (GSK) and other pharmaceutical companies in the small-molecule drug space (1), its biopharma initiative (funded in 2018 by InnovateUK, the UK government’s innovation agency) would bring techniques that have been proven in small-molecule operations to biopharmaceutical processing, and also optimize other methods for use with large molecules. So far, its first application projects are focusing on bioreactors and liquid chromatography, according to Ed Close, strategy director for large molecules at PSE Formulated Products.

PSE aims to create a systems modeling capability for biological medicines within its gPROMS Formulated Products software, says Close. Many of the approaches being used in the small-molecule world (e.g., parameter estimation, external model validation, and optimization) are directly applicable to biopharmaceuticals, he says.
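As a generic illustration of the parameter estimation Close mentions (not the gPROMS workflow), the sketch below fits the parameters of a simple exponential cell-growth model to synthetic viable-cell-density data with SciPy, then checks the fitted model against held-out time points.

```python
# Generic parameter-estimation sketch with synthetic data; the model form,
# data, and initial guesses are assumptions for illustration.
import numpy as np
from scipy.optimize import curve_fit

def growth_model(t, x0, mu):
    """Viable cell density under simple exponential growth (assumed form)."""
    return x0 * np.exp(mu * t)

t_data = np.array([0.0, 12.0, 24.0, 36.0, 48.0])   # hours
x_data = np.array([0.5, 0.9, 1.7, 3.2, 5.9])       # 1e6 cells/mL, synthetic

(x0_hat, mu_hat), _ = curve_fit(growth_model, t_data, x_data, p0=[0.5, 0.05])
print(f"estimated x0 = {x0_hat:.2f}, mu = {mu_hat:.3f} 1/h")

# External validation: compare predictions at time points not used in the fit
t_holdout = np.array([6.0, 18.0, 30.0])
print(growth_model(t_holdout, x0_hat, mu_hat))
```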

Where the science is mature, PSE’s work is focusing on science-based mechanistic models that have been derived from conservation laws of physics and chemistry. Where the science is less mature, the company is using data-driven approaches to fill in gaps in scientific knowledge, says Close. Increasingly, data-driven approaches are being combined with mechanistic models in a hybrid approach, he says. 

Use of CFD

In situations that require a high degree of accuracy (e.g., for modeling processes such as mixing, in which chemical phenomena are strongly dependent on local hydrodynamic effects), Close says that PSE is using the Multizonal gPROMS-CFD interface to link multi-zonal unit operation (e.g., bioreactor) models to computational fluid dynamics (CFD) models. This linkage provides a powerful way to model the effects of non-ideal mixing in industrial-scale process equipment, he says, while relying on kinetics models developed from scale-down processes.
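A generic compartment (multizonal) model conveys the idea: each zone carries its own hydrodynamic parameters of the kind a CFD simulation would supply, while the uptake kinetics play the role of a scale-down-derived model. The zone volumes, transfer coefficients, and kinetics below are placeholders, not output of the gPROMS-CFD interface.

```python
# Generic two-zone (multizonal) bioreactor sketch. Zone volumes, exchange
# flow, and kLa values are placeholders for CFD-derived hydrodynamics; the
# oxygen-uptake rate stands in for scale-down-derived kinetics.
import numpy as np

volumes = np.array([800.0, 200.0])   # L: well-mixed bulk zone, poorly mixed zone
kla = np.array([12.0, 3.0])          # 1/h: oxygen transfer coefficient per zone
exchange = 5000.0                    # L/h exchanged between the two zones
c_star, q_o2 = 7.0, 2.5              # mg/L saturation; mg/L/h cellular uptake

c = np.array([5.0, 5.0])             # dissolved oxygen per zone, mg/L
dt, hours = 0.01, 2.0
for _ in range(int(hours / dt)):     # explicit Euler integration
    transfer = kla * (c_star - c) - q_o2            # aeration minus uptake
    mixing = exchange * (c[::-1] - c) / volumes     # inter-zone exchange flow
    c = c + dt * (transfer + mixing)

print(f"dissolved oxygen by zone after {hours} h: {np.round(c, 2)} mg/L")
```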

“Once we have a validated model, we can then use advanced solution techniques to exploit the model’s predictive capabilities, such as simulation, optimization, uncertainty analysis, sensitivity analysis, and deployment in online applications for operations,” says Close.
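In the same spirit, a minimal uncertainty and sensitivity analysis can be sketched generically by propagating an assumed distribution for one fitted parameter through a simple model via Monte Carlo; the model, distribution, and numbers below are placeholders rather than a gPROMS example.

```python
# Monte Carlo uncertainty propagation plus a crude one-parameter sensitivity
# check; the model and parameter distribution are assumed for illustration.
import numpy as np

rng = np.random.default_rng(0)

def titer_at_harvest(mu, x0=0.5, yield_per_cell=0.8, t=120.0):
    """Toy end-of-batch titer as a function of growth rate mu."""
    return yield_per_cell * x0 * np.exp(mu * t) / 100.0

mu_samples = rng.normal(loc=0.045, scale=0.003, size=5000)  # assumed uncertainty
titers = titer_at_harvest(mu_samples)
print(f"mean {titers.mean():.2f}, 5th-95th percentile {np.percentile(titers, [5, 95])}")

# Sensitivity: relative change in predicted titer for a +/-10% change in mu
base = titer_at_harvest(0.045)
print((titer_at_harvest(0.0495) - titer_at_harvest(0.0405)) / base)
```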

 

Hybrid models gain ground

Hybrid statistical and mechanistic modeling is emerging as a promising approach for the biopharmaceutical industry, says Close, because it combines the best of both worlds: the mechanistic structure of the model describes well-known elements of the process, while the data-driven portion simplifies the management of unknown system complexity, such as cell metabolic networks and reaction pathways.
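A minimal sketch of that division of labor, assuming a simple product mass balance and a small scikit-learn regressor, is shown below: the mechanistic part supplies the mass balance, while the data-driven part stands in for the poorly understood specific productivity. The model form, training data, and estimator choice are illustrative assumptions, not PSE’s or any vendor’s implementation.

```python
# Hybrid-model sketch: a learned specific productivity (data-driven part)
# feeds a product mass balance (mechanistic part). All numbers are synthetic.
import numpy as np
from sklearn.linear_model import Ridge

# Data-driven part: learn specific productivity q_p from process conditions
# (pH, temperature). Real training data would come from historical batches.
X = np.array([[7.0, 36.5], [6.9, 36.8], [7.1, 37.0], [7.0, 37.2]])
q_p = np.array([0.020, 0.022, 0.018, 0.017])        # pg/cell/h, synthetic
q_model = Ridge(alpha=0.1).fit(X, q_p)

# Mechanistic part: product mass balance dP/dt = q_p * X_v, Euler-integrated.
def simulate_titer(conditions, cell_density=2.0e6, hours=48.0, dt=1.0):
    q_hat = q_model.predict([conditions])[0]         # learned rate term
    product = 0.0                                    # pg/mL
    for _ in range(int(hours / dt)):                 # cell density held constant
        product += q_hat * cell_density * dt         # conservation of product mass
    return product

print(f"predicted titer: {simulate_titer([7.0, 36.9]) / 1e6:.2f} ug/mL")
```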

The concept may not be new, per se, but the use of hybrid models is particularly promising in biopharma, says Michael Sokolov, CEO and co-founder of DataHow, a spinoff of ETH Zurich that is leveraging hybrid models to develop an “algorithmic toolbox” aimed at extending modeling’s reach in the industry.

“In biopharma, we do not have big data [yet] because each data point takes a lot of effort to obtain,” says Sokolov. “However, especially upstream, we face a lot of uncertainty about the overall process and data, because the process is not yet completely understood,” he says.

Purely data-driven approaches such as machine learning cannot be applied directly because they require sufficient data of good quality, Sokolov explains. “We need to utilize as much of all available information as possible (i.e., not only the data but the knowledge that has been established for the past few decades). This knowledge will help compensate for lack of data and its uncertainty,” he says.

“Hybrid models can be leveraged to help with this because they combine mechanistic approaches (i.e., teaching the digital tool the lessons that have already been learned), and machine learning approaches, which allow users to learn beyond the existing boundaries of knowledge from available data,” says Sokolov. 

Using this approach, Sokolov believes, reliable solutions can be built from limited data while still allowing users to learn beyond what is already known and to find new, nonintuitive combinations of relevant process parameters. “Including prior knowledge and stakeholders’ opinions in the solution makes it easier to change the user’s mindset. Because it is centered on information provided by users, it will not appear as a black box.”

Currently, he says, modeling is particularly important upstream, and it is more widely used in earlier process development stages. Downstream processes are better understood, but in complex cases and given time pressures, modeling offers benefits by enabling in-depth data analysis. In the future, he predicts that companies will see the value of organizing data centrally and in a standardized way, as Amgen did recently with its new facility in Singapore. “This will enable the establishment of knowledge and model libraries, which can effectively learn from data and be accessed directly,” he says. 

Sokolov says biopharma is not far from seeing digital twins being used across many companies’ labs and facilities. “However, the [broader] goal is to apply the concept to cover complete plants (i.e., all unit operations and their data sources) and to provide the model-based solution in a powerful manner so that many stakeholders can use it to support their decisions and plan operations,” he says. Sokolov believes that over the next two or three years, there will be more success stories from companies in the industry that are using this approach. At this point, he says, DataHow is working with a number of industry partners on projects in this area.

Reference

1. A. Shanley, “Can Better Modeling Reduce Pharmaceutical Development Costs?” PharmTech.com, March 16, 2016.


Citation

When referring to this article, please cite it as A. Shanley, “Modeling Comes of Age in Biopharma,” BioPharm International 32 (11) 2019.

 
