Platform processes have improved monoclonal antibody scale-up. Can they do the same for personalized therapies?
Biopharmaceutical processes have always been difficult to scale up. Within the past decade, however, the industry has achieved significant improvement by adopting technologies that allow more predictable outcomes, says Alex Chatel, product manager at Univercells. Bioreactor manufacturers now integrate analytical technologies such as sensors, which allow better prediction and control of changes between scales, into their products. Chromatography and filter manufacturers have developed scalable and reproducible low-volume mimics of their industrial-scale equipment, which allow more accurate scale-up work than ever before, he says.
The availability of single-use equipment and platform solutions, such as fed-batch and perfusion bioreactors and multi-column chromatography, has helped achieve faster, more reproducible scale-up results. “Platforms have had a very positive impact on companies’ ability to develop products and manufacture them less expensively,” says Tom Ransohoff, vice-president and principal consultant with BioProcess Technology Consultants. Used almost exclusively for monoclonal antibodies (mAbs), platforms help shorten development times, particularly early in development, Ransohoff says. They also allow manufacturers to leverage performance data across multiple products, reducing the cost of late-stage process characterization and qualification. In addition, they permit some degree of facility design standardization, improving operational flexibility and capital savings, he says.
Developers are exploring how these platforms might be used in novel therapies such as cell and gene therapies. Although these treatments are in early stages of development, gene therapy has already become a reality at companies such as Spark Therapeutics, which received “breakthrough therapy” status for retinal dystrophy and hemophilia treatments and is collaborating with Pfizer on hemophilia therapies, now in Phase III clinical trials.
The key to making gene therapy accessible to patients will be better manufacturing processes, Peter Marks, director of FDA’s Center for Biologics Evaluation and Research, told attendees at the Galien Forum in New York City on Oct. 25, 2018, in a panel discussion on gene therapy. At this point, FDA has approved one directly administered and two cell-based gene therapies, Marks said, but the agency has received over 700 investigational new drug (IND) applications in the gene therapy field. “Sustained, reliable manufacturing will be required to bring gene therapy to patients worldwide,” said Mikael Dolsten, president of worldwide R&D at Pfizer.
Great improvements have been seen so far in delivering adeno-associated virus (AAV) vectors. These results demonstrate the inextricable link between clinical work and process development, said Kathy High, cofounder and head of R&D at Spark, who also spoke on the program. Good clinical results drive good process development, she said, in turn driving good manufacturing, which then drives more clinical results. “We’re already up by logs from where we were when we started our research,” she said.
Developers are actively exploring ways in which platform technologies could help improve the yield of AAV and other critical components of personalized therapy development and manufacturing. For example, a new collaboration (1) involving GE Healthcare, Cobra Biologics, and the UK’s Centre for Process Innovation is working on new ways to use fiber-based chromatography to improve AAV scale-up. Other development projects are evaluating different forms of chromatography and centrifugation.
“Companies are looking for ways to improve productivity because products on the horizon, such as new treatments for Alzheimer’s disease and antibodies for infectious diseases, could require significantly more material than products on the market today,” says Ransohoff. One way they do this, he says, is by “scaling out” and intensifying processes: for example, using fed-batch and perfusion approaches upstream, and multicolumn operations downstream, whether through higher cycling or more continuous or semicontinuous chromatography (e.g., simulated moving bed [SMB] technologies).
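To see why multicolumn capture raises productivity, it helps to run the numbers. The short Python sketch below is a back-of-envelope model, not vendor data; the binding capacity, load fractions, and cycle times are assumed round figures chosen only to show the direction of the effect.

```python
# Illustrative sketch, not vendor data: why multi-column continuous
# capture can beat single-column batch capture on resin productivity.
# All figures below are assumed round numbers for illustration.

def resin_productivity(load_fraction: float, dbc_g_per_l: float,
                       cycle_time_h: float) -> float:
    """Grams of product captured per liter of resin per hour."""
    return load_fraction * dbc_g_per_l / cycle_time_h

DBC = 40.0  # assumed dynamic binding capacity, g product per L resin

# Single-column batch: loading typically stops well short of full
# capacity (here ~60% of DBC) so product is not lost to breakthrough.
batch = resin_productivity(load_fraction=0.60, dbc_g_per_l=DBC,
                           cycle_time_h=4.0)

# Multi-column (e.g., periodic counter-current or SMB-style) capture:
# breakthrough from the lead column is caught by the next column, so
# loading can approach capacity (~90% of DBC assumed here), and
# smaller columns cycle more frequently (shorter cycle time assumed).
multi = resin_productivity(load_fraction=0.90, dbc_g_per_l=DBC,
                           cycle_time_h=2.0)

print(f"batch capture:        {batch:.0f} g / L resin / h")
print(f"multi-column capture: {multi:.0f} g / L resin / h")
```

In this toy model, loading closer to capacity and cycling smaller columns faster triples the grams captured per liter of resin per hour, which is the basic argument for continuous and semicontinuous capture.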
Although process intensification is still at an early stage of adoption, more companies are actively studying how it might be used in multicolumn chromatography as well as in systems employing membrane adsorbers and fiber-based chromatography supports, Ransohoff says.
“Novel affinity ligand development technologies are getting to a point where companies can consider using a platform approach for a broader range of molecules in their pipelines,” he says. Viral vectors, such as the AAV serotypes used in gene therapies, could benefit from this technology, he says. “This would allow gene therapy developers to use some of the platform technology thinking that is already being used to develop monoclonal antibodies,” he adds.
“Process intensification will enable much quicker pull-through of personalized medicines, especially gene therapies and eventually cell therapies,” says Rick Morris, senior vice-president of product development at Pall Corp. In addition, he says, it will speed up and allow for screening of a much larger number of variants for mAbs, bispecifics, trispecifics, fragments, and recombinant proteins. For some cell therapies, the current process yield is only 30%, says Morris.
In internal testing, Pall scientists optimized process intensification to make 100 g/d of one product, enough for preclinical and toxicology studies. The approach cut the time required by six months compared with traditional processes, says Morris. In addition, the equipment footprint was reduced four- to five-fold, and the systems enabled, at large scale, a 40–50% savings in Protein A resin use, he says.
The cost of current technologies is another challenge, says Chatel. “Many processes rely on the culture of adherent cells for producing a range of biological molecules, including viral vaccines, viral vectors for gene therapy, and oncolytic viruses. Producing these therapies using traditional technology (i.e., static cultureware, roller bottles, or microcarriers in agitated bioreactors) suffers from a number of drawbacks,” he says, “including high operational costs caused by complex processes and the need for manual operations, complex process development, and high facility footprints leading to high capital costs.”
Univercells is applying process intensification and taking a chemical engineering approach to bioprocessing, Chatel says, with its patented scale-X line of bioreactors. The reactors are designed so that a very large surface for cell growth is packed inside a small volume, reducing the number and complexity of operations compared with traditional equipment, he says. This results in a smaller footprint and a more robust process, Chatel adds.
“Direct linear scalability is ensured by applying concepts similar to those found in the scale-up of chromatography columns, whereby the height of the reactor is kept the same with its diameter increased as a function of the scale,” says Chatel. In addition, he says, the physical and chemical conditions are kept the same across scales, ensuring a smooth and risk-free scale-up.
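As a rough illustration of that rule (with assumed numbers, not Univercells specifications), the sketch below holds bed height and linear velocity constant and grows only the diameter. Because cross-sectional area scales with the square of the diameter, a 100-fold increase in volume needs only a 10-fold wider column, while residence time, the ratio of height to linear velocity, stays unchanged across scales.

```python
# Minimal sketch of constant-height scale-up, as used for
# chromatography columns: hold bed height and linear velocity fixed
# and grow only the diameter. All numbers are assumed for illustration.
import math

def scaled_diameter(base_diameter_cm: float, scale_factor: float) -> float:
    """Diameter that multiplies area (hence volume) by scale_factor
    at fixed height: area scales with diameter squared."""
    return base_diameter_cm * math.sqrt(scale_factor)

BED_HEIGHT_CM = 20.0     # held constant across scales
LINEAR_VELOCITY = 300.0  # cm/h, held constant across scales

for scale in (1, 10, 100):
    d = scaled_diameter(base_diameter_cm=1.0, scale_factor=scale)
    area_cm2 = math.pi * (d / 2) ** 2
    volume_l = area_cm2 * BED_HEIGHT_CM / 1000.0
    flow_l_per_h = area_cm2 * LINEAR_VELOCITY / 1000.0
    print(f"{scale:>4}x: diameter {d:6.2f} cm, "
          f"volume {volume_l:8.3f} L, flow {flow_l_per_h:8.3f} L/h")
```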
The use of this bioreactor, in turn, led to the development of a “microfacility” concept, the NevoLine, which is designed to produce industrial quantities of viral vaccines (more than 550,000 doses per batch for a polio vaccine) in a near-continuous manner using 6 m² of floor space.
In more traditional biopharmaceutical development, bioreactors are typically the most difficult pieces of equipment to scale up, says Fredrik Lundström, product manager for bioprocess downstream hardware with GE Healthcare Life Sciences. Use of digital modeling is helping developers understand processes at small scale and assess operations before scaling up, he says.
Bioreactors also present opportunities for scaling out. “The upstream process is where you see major opportunity to intensify the process with either intensified fed-batch or continuous perfusion approaches that can be both scaled up and out,” says Peter Levison, executive director of business development at Pall Corp.
Bioreactor and fermentor technology platforms have had the greatest impact so far, says James Blackwell, principal consultant with the Windshire Group. Cell culture improvements were followed closely by small-scale single-use technologies. As a result, on the bacterial side, developers can now go into production using 100- to 300-L single-use reactors. “You can start working with those technologies at much smaller scales now than was possible in the past,” he says.
Being able to operate 2000-L single-use bioreactors in parallel also means that users can exploit significant improvements in both cell densities and product titer to generate a multi-fold increase in productivity levels, says Levison. “The key to exploiting this intensified upstream process is having downstream processing solutions that can handle the increase in mass and concentration of product and not become the process bottleneck. The adoption of continuous downstream processing unit operations has been a real enabler in addressing the process requirements of these intensified upstream processes,” he says.
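A back-of-envelope calculation shows how quickly that mass adds up. The reactor count and titers in the sketch below are assumptions for illustration, not figures from Levison.

```python
# Illustrative only: downstream mass load from parallel, intensified
# 2000-L single-use bioreactors. Reactor count and titers are assumed.
REACTOR_VOL_L = 2000
N_PARALLEL = 4              # assumed bioreactors run in parallel
TITER_G_PER_L = 8.0         # assumed intensified-process titer
LEGACY_TITER_G_PER_L = 2.0  # assumed legacy single-reactor titer

intensified_kg = N_PARALLEL * REACTOR_VOL_L * TITER_G_PER_L / 1000.0
legacy_kg = REACTOR_VOL_L * LEGACY_TITER_G_PER_L / 1000.0

print(f"legacy harvest:      {legacy_kg:.0f} kg")       # 4 kg
print(f"intensified harvest: {intensified_kg:.0f} kg")  # 64 kg
print(f"downstream must handle {intensified_kg / legacy_kg:.0f}x the mass")
```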
Today, scale-up challenges are more associated with the automation and control of the processes, particularly with hybrid approaches, says Levison. Each part of the process must be properly connected and monitored and operating at a production rate aligned to both the upstream and downstream unit operation, he says. Analytics and control systems become critical to delivering reliable, accurate, and precise data for the duration of the entire batch, says Levison. “Not only strong process monitoring and control, but highly relevant assays are needed to ensure that clinically relevant product specifications are met. Where these are on-line or at-line, a high degree of accuracy and precision is a prerequisite,” says Levison.
As development timeframes shorten, developers can focus so intensely on short-term goals that they lose sight of the big picture. During initial workflow development, for example, the need to produce material as quickly as possible can override the need to optimize process simplicity, reliability, and cost-effectiveness. “Often, developers fail to consider the entire workflow and interactions between unit operations, both of which are critical to ensuring a robust, efficient, cost-effective workflow for manufacturing biological drugs,” says Lundström.
“From time to time, we see biomanufacturers entering manufacturing scale with recipes and specifications that are difficult to operate and achieve at large scale,” he says. Prime examples, he says, are recipes with long and shallow linear gradients, as well as pH specifications for process buffers that can only be achieved off-line in the lab. Developers may also use a wide range of buffers to achieve specific results, which is fine in the lab but can create major problems at industrial scale, says Lundström.
Scale-up and scale-down are codependent, says Chatel. “The synergies between the two are essential and must be embraced by equipment manufacturers,” he says. The obvious target for scale-down models that accurately mimic larger scale equipment is in cases where pilot or industrial runs are most expensive (e.g., chromatography process steps or cell culture). However, with novel products such as cell therapy, the costs of production can be so high (e.g., including the cost of reagents for cell adherence and culture), that the whole process itself is essentially of high value and thus a desirable target for scale-down models, says Levison.
“There is a significant body of literature on how to scale up and scale down the unit operations that we work on most commonly for cell culture, fermentation processes, tangential flow filtration (TFF), and chromatography,” says Ransohoff. “Challenges tend to come with scale-dependent parameters that are more difficult to mimic, such as mixing, heat transfer, and mass transfer,” he says.
There is still a limitation in scale-down models at the sub-milliliter scale that would enable high-throughput screening for process development to work at the earliest stages of drug development, says Chatel. “As interfacial science and data analytics evolve, however, within the next 10 years, process development work may routinely be done on a bench-top,” he says.
Some high-throughput tools are already used early on and in process characterization, allowing users to run many experiments in parallel at smaller scale, with less material and in less time. One example Ransohoff cites is plate-based chromatography resin screening, enabled by technologies such as GE Healthcare’s PreDictor plates, which allow users to evaluate the binding properties of resins and solutions in multiplexed formats.
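As a hedged sketch of what such a screen produces, the code below works through the mass balance for a hypothetical filter-plate experiment; the well resin volume, load, conditions, and measured supernatant concentrations are all invented for illustration, not taken from any GE protocol.

```python
# Hypothetical sketch of plate-based resin screening analysis:
# estimate per-well equilibrium binding capacity by mass balance.
# All layouts, volumes, and concentrations are invented.

WELL_RESIN_UL = 50.0   # assumed resin volume per well, uL
LOAD_VOL_ML = 0.3      # assumed load volume per well, mL
LOAD_CONC_G_L = 2.0    # assumed feed concentration, g/L

# Hypothetical measured supernatant concentrations (g/L) after
# incubation, one entry per (resin, buffer condition) well.
supernatant = {
    ("resin_A", "pH 5.0, 50 mM NaCl"): 0.15,
    ("resin_A", "pH 7.0, 150 mM NaCl"): 0.60,
    ("resin_B", "pH 5.0, 50 mM NaCl"): 0.05,
    ("resin_B", "pH 7.0, 150 mM NaCl"): 0.90,
}

for (resin, condition), c_unbound in supernatant.items():
    # Mass bound = (feed conc - unbound conc) * load volume, in mg.
    bound_mg = (LOAD_CONC_G_L - c_unbound) * LOAD_VOL_ML
    # Capacity in g product per L resin (mg per mL of resin).
    capacity_g_per_l = bound_mg / (WELL_RESIN_UL / 1000.0)
    print(f"{resin} @ {condition}: {capacity_g_per_l:5.1f} g/L resin")
```

Running every well of a 96-well plate through the same mass balance is what lets a screen like this rank dozens of resin and buffer combinations in a single experiment.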
Beyond process development, on the drug product delivery side, personalized medicine will be much more challenging to scale up, says Blackwell. With autologous therapies, he says, scale-up involves a single unit and operations around the flows of a single product. “Scale-up becomes about achieving efficiencies on the operating floor, so you can scale up those processes efficiently in parallel. A new shift in thinking has to happen within the industry,” says Blackwell.
1. Cobra Biologics, “Cobra, GE, and the Center for Process Innovation Collaborate to Advance Gene Therapy,” Press Release, cobrabio.com, Sept. 25, 2018.
When referring to this article, please cite it as A. Shanley, “Scaling Up Novel Therapies,” BioPharm International 31 (11) 14–16 (2018).