The development and production of biologics is a complex process. The manufacture of biopharmaceuticals requires that processes be developed and validated so that product specifications can be met, yielding a safe and effective biologic drug. Steps can be taken at each stage of biologics development and manufacturing to ensure processes are optimized for the best results.
Sponsor companies often look to contract development and manufacturing organizations (CDMOs) to help with process development of these products. The following article provides advice and insight from experts and bio/pharma CDMOs on how process development can be optimized.
Upstream processes
A scalable production system is necessary when considering how to optimize upstream processes, according to Daniel Giroux, vice-president of Biologics Development at Abzena. Typical large-scale systems use 500- to 2000-L bioreactors, and at that scale, running multiple experiments is impractical. “If you have a single, 2000-L reactor, you can’t run multiple experiments for optimization,” says Giroux. A small-scale system is therefore needed in which the growth and production parameters of the larger-scale system can be reproduced.
After the small-scale model is created, a design-of-experiment (DoE) approach can be used to identify parameters that may impact the titer or the product quality. “You create a statistical experiment where you’re varying those parameters in combination and singly. And then you execute that experiment in parallel bioreactor systems, so a small-scale reactor where you have many of them,” Giroux explains. Statistical tools can then be used to analyze the DoE data and build models that identify the input parameters giving maximal titers.
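The parallel DoE screening Giroux describes can be sketched in a few lines. The parameters, ranges, and simulated titer response below are illustrative assumptions, not data from the article (a real study measures titer in parallel bioreactors); the point is how a factorial design enumerates conditions and ranks them:

```python
from itertools import product

# Hypothetical 2-level full-factorial DoE over three upstream parameters.
levels = {
    "temp_C": (34.0, 37.0),
    "pH": (6.8, 7.2),
    "feed_pct": (2.0, 6.0),   # daily feed, % of culture volume
}

def simulated_titer(temp_C, pH, feed_pct):
    """Stand-in response surface; a real study would measure titer."""
    return (3.0
            - 0.15 * (temp_C - 35.0) ** 2   # mild temperature optimum
            - 4.0 * (pH - 7.0) ** 2         # pH optimum near 7.0
            + 0.2 * feed_pct)               # more feed helps, linearly

# Run every corner of the design space "in parallel" and rank by titer.
runs = []
for combo in product(*levels.values()):
    params = dict(zip(levels, combo))
    runs.append((simulated_titer(**params), params))

best_titer, best_params = max(runs, key=lambda r: r[0])
print(f"best condition: {best_params} -> {best_titer:.2f} g/L")
```

A real DoE would also fit a statistical model to the measured responses (including interaction terms) rather than simply picking the best run, but the enumerate-and-measure structure is the same.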
Process intensification is another approach Giroux points to, where higher cell densities are used at inoculation to increase the area under the cell density versus culture time curve. “You want your cell density to be high for a longer period of time,” Giroux says. “And by starting at a higher cell density, it lets you do that, and you basically get 50 to 100% more titer in an intensified fed batch process. That means, though, that you have to develop a perfusion process for your penultimate reactor, your last seed reactor, so that you can actually have enough cells to inoculate your production systems at these higher densities.”
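Giroux’s point about the area under the cell-density-versus-time curve (the integral of viable cell density, IVCD) can be illustrated with a toy growth model. All numbers below, including the logistic growth parameters and seed densities, are assumptions for illustration only:

```python
def vcd_curve(seed_density, days, growth_rate=0.6, peak=30.0):
    """Daily viable cell density (1e6 cells/mL), logistic growth toward a peak."""
    vcd, curve = seed_density, []
    for _ in range(days + 1):
        curve.append(vcd)
        vcd += growth_rate * vcd * (1 - vcd / peak)  # growth slows near the peak
    return curve

def ivcd(curve):
    """Trapezoidal area under the density-vs-time curve (1e6 cells*day/mL)."""
    return sum((a + b) / 2 for a, b in zip(curve, curve[1:]))

# Standard fed-batch seed vs. an intensified run inoculated at high density
# from an N-1 perfusion seed reactor.
standard = ivcd(vcd_curve(seed_density=0.5, days=12))
intensified = ivcd(vcd_curve(seed_density=10.0, days=12))
print(f"IVCD ratio, intensified vs. standard: {intensified / standard:.2f}")
```

In this toy model the high-density inoculation keeps the culture near peak density for most of the run, which is the mechanism behind the titer gains Giroux cites.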
According to Paul Mugford, director, Biologics Process Development at BIOVECTRA, media optimization is an important part of upstream processing, especially in microbial fermentation. Media should be formulated to support optimal growth and productivity of the microorganism, says Mugford, which includes optimizing carbon and nitrogen sources, trace metals, vitamins, and other supplements. “All media components, of course, have to be animal free,” says Mugford. “And we also choose to start our process development with representative materials that will be used at large scale. These are raw materials that meet all the required GMP [good manufacturing practice] requirements, so there’s no issues transferring from small to large scale with respect to materials.”
Other areas that can be optimized are process parameters such as temperature(s), pH, dissolved oxygen, and feed rates; these can be optimized using DoE, Mugford points out. Initial optimization uses early-stage screening equipment, such as shake flasks or higher-throughput microtiter-plate formats.
“Once the initial screening is done, we would then move into our smallest fermentors, which are the Ambr 250 parallel reactor system, or our DASGIP system, where each one of those systems can run eight fermentations in parallel for further optimization of media and feeding,” explains Mugford. “And our goal here is to maximize the growth and also the productivity, so we monitor the process extensively for the carbon source (glucose or glycerol), acetate, ammonia, CO2, and use [these] data for further optimization. After that, we’d move into our larger fermentors, such as the 10-L and 50-L fermentors, before scale-up.”
For viral-vector-based gene therapy products, optimizing upstream processes can yield larger quantities of better-quality virus. “In cell therapy, the key is to scale up to gain higher cell yields while still maintaining the cell morphology and the genetic stability of the cells, so that it’s the same at small scale and large scale,” says Bob Schrock, PhD, senior director, Global Head of Process Development, at Lonza Group AG.
“And [for] viral vectors, keeping [the cells] happy and in culture at bigger scales, maintaining that all-important yield of the number of virus per cell, [and] making sure that it stays similar in the larger and larger vessels you go up to, [is important],” Schrock explains. Multiplicity of infection (MOI), plasmid ratios, transduction amounts, and when to harvest after transduction are other parameters that need to be taken into consideration for viral vectors, says Schrock.
Passage number studies are also important. “Cells can change after multiple passages, and that can introduce variability into your process,” stresses Schrock. “So, you want to make sure that the conclusions you reached within a cell passage of 50 are the same as it is at a cell passage of 100 or whatever constraints you put in there for your minimum and maximum cell passage numbers. So, you don’t want to forget to do that as well to make sure that your process can handle cells at a variety of different passage numbers.”
Cell culture
Optimization of process development for cell culture is dependent on the type of product being produced, according to Schrock. Scale and purity are two important factors to consider, as is whether the product is cell based or single cell.
To optimize a cell culture process, the parameters that need to be optimized must be defined, says Giroux: titer, product quality, or both. In addition, assays should be in place to identify target product quality parameters. “Typically, for product quality, we’re looking at glycosylation patterns on proteins to make sure that they are correct, or in the case of a biosimilar, that they match the originator,” Giroux says. One must then vary input parameters, both individually and in combination, and measure their effect on productivity and product quality.
When it comes to performing process studies for cell culture, Giroux says parameters that vary in DoE studies include choice of basal medium and concentrated feeds. For fed-batch processes, parameters include choice of additives and trace elements, such as minerals, trace metals, vitamins, anti-foam and anti-clumping agents, and surfactants.
“We then also vary or look at temperature, especially shifting the temperature. So, we typically start our productions at 37 degrees, or 36 and a half. But sometimes you get better productivity over a longer culture duration if you reduce the temperature part-way into the run to perhaps 32 degrees or 34 degrees,” explains Giroux. “The effect of pH on productivity is assessed as well, [as is] agitation. And then again, [starting] cell density can have an impact, whether it be an intensified process or not. Sometimes there’s an optimal cell density even in a non-intensified process.”
“So, those are the types of parameters that we’re studying, but we generally [are] studying them in these design of experiments approaches where we can vary them in combinatorial fashion and understand the effects of the parameters individually, but also in combination with the other parameters,” Giroux continues. “So, for example, if a parameter affects your cell density, you may need higher feeds. So, a parameter that you know can’t be optimized by itself [has] to be optimized as a family.”
Schrock says it is important to close open, manual process steps to reduce contamination risks and the cost of production by reducing the required grade of the GMP production cleanroom (i.e., grade C cleanrooms are cheaper to maintain than grade B). Process automation for equipment and perfusion is also important, says Schrock.
“One really great example of this is the Lonza Cocoon Instrument. This is both a functionally closed system as well as automated system for manufacture of autologous cell therapies. This is a really good example of doing both automation as well as closing the process. Once you have a closed process, you move to optimizing parameters … things like pH, dissolved oxygen, agitation, speeds, gasses, and even temperatures, sometimes in certain viral vector types,” Schrock concludes.
Microbial fermentation
Quality considerations, such as regulatory guideline requirements and critical quality attributes, play a role in the design and optimization of fermentation processes, according to Mugford. First, one should decide which microbial strain to use, depending on what type of product is being produced, because each biologic has unique strain requirements. According to Mugford, engineered Escherichia coli (E. coli) strains provide optimal performance for recombinant proteins or plasmid DNA.
“Considerations in strain selection include the protein solubility, whether or not you get correct folding, [and] the expression location, [i.e.,] if the target protein is in the cytoplasm or the periplasm,” says Mugford. “And there are also modified strains that allow [you] to correct for any codon bias or disulfide bond formation. So, selecting the appropriate strain is important up front.”
The design of the plasmid is also important for protein production, says Mugford, specifically selection of the promoter to tightly control expression of genes that encode proteins. Productivity can also be improved by codon optimization of the gene of interest and removing non-essential plasmid elements, according to Mugford. “And also, you may want tunable expression in cases where inclusion bodies are formed for the protein. [Y]ou may want to express the protein slower to minimize inclusion body formation.”
Plasmid DNA products have other challenges that require attention to strain selection and plasmid design, according to Mugford. Plasmid stability can be enhanced, and endonuclease activity reduced, through the mutations carried by E. coli strains designed for plasmid production. A high-copy-number plasmid is also used to maximize productivity. Mugford recommends the use of a fragment analyzer to assess clones in order to select the most appropriate one to produce high-quality plasmid DNA.
“One of the challenges we’ve seen in the plasmid DNA area is when you have repetitive elements in the plasmid,” says Mugford. “Long poly(A) tails are quite common in plasmids destined for in-vitro transcription to make mRNA [messenger RNA]-based therapeutics like the COVID vaccines. Long poly(A) tails are prone to recombination events, and this can give some plasmid instability and shorten the length of the poly(A) tail, leading to different length products. That affects the integrity of the plasmid. An efficient clone selection process for plasmid DNA can identify clones with a single, correct-length poly(A) tail. Another important quality parameter in plasmid DNA is the supercoiling, which is an indication of how much damage has been done to the plasmid.”
The next step in optimizing microbial fermentation is setting the fermentation parameters, such as media composition, feed rates, and other settings, to ensure scalability, says Mugford. Parameters established at small scale need to transfer to large-scale production.
“[A]t the early development scale, our smallest fermentors are the 250-mL Ambr fermentors for condition screening. These increase to fermentors in the 10- to 50-L range in process development, and [then up to] 200-L scale at the pilot stage. At the manufacturing scale, we can scale up to our 100-L and 1000-L single-use fermentors, or our 3000-L and two 17,000-L stainless-steel fermentors for microbial fermentation. To go from the smallest scale to the largest scale, one of the most important parameters is the oxygen transfer coefficient (kLa). Our chemical engineers have developed models to scale up and down while reproducing the same growth rates and providing consistent results across scales.”
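Mugford names kLa as the key scale-up parameter. A minimal sketch of how such a scaling model is used, assuming the common empirical correlation kLa = a·(P/V)^α·(vs)^β (the constants a, α, and β here are placeholders; real values are fitted per fermentor geometry and broth):

```python
# Matching kLa across scales with an assumed empirical correlation.
a, alpha, beta = 0.02, 0.7, 0.3   # illustrative constants, not fitted values

def kla(power_per_volume, superficial_gas_velocity):
    """Volumetric oxygen transfer coefficient from P/V (W/m^3)
    and superficial gas velocity (m/s)."""
    return a * power_per_volume**alpha * superficial_gas_velocity**beta

# Small-scale set point established in development.
kla_target = kla(power_per_volume=2000.0, superficial_gas_velocity=0.02)

# At large scale, gas velocity is constrained by vessel geometry; invert
# the correlation to find the P/V that reproduces the same kLa.
vs_large = 0.04
pv_large = (kla_target / (a * vs_large**beta)) ** (1 / alpha)
print(f"target kLa = {kla_target:.3f} 1/s, large-scale P/V = {pv_large:.0f} W/m^3")
```

Because the large vessel in this hypothetical has a higher gas velocity, the model lets it hit the same kLa at a lower power input per volume, which is exactly the kind of trade-off a scale-up model resolves.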
Mugford recommends single-use fermentors for their ability to quickly transition between products and to reduce cleaning and testing. He also points to heat transfer as an important parameter for scale-up, specifically during aggressive microbial growth. “With rapid growth in E. coli, there’s a lot of heat produced. [U]nderstanding the oxygen transfer rate for the fermentors and the cooling capacities is important when moving from small to large scale,” Mugford says. “[T]here are a lot of scale-independent parameters that can also be evaluated directly using DoE at the process development [stage].”
Cell harvesting
One way to harvest cells in a bioreactor, according to Giroux, is depth filtration, which uses a multi-stage depth filter followed by membrane filters to remove the cells. Large surface areas are required, which produces a lot of waste. “Especially at the large scale, they’re troublesome to install and remove and so on. So, the setup time is [considerable],” Giroux warns. He points to continuous centrifugation as a better approach, which often uses a single-use disk-stack centrifuge to remove the cells and dense components from the media.
“[We] don’t lose very much of the liquid itself [with continuous centrifugation]. So, you get a pretty thick slurry of cells that comes out. It might be 80% cell mass and about 20% media,” says Giroux. “For example, if you had 8% packed cell volume in your reactor, you could basically remove only 2%; you’d remove your 8% cells and then 2% of the media. So, your yield would actually be very high. You generally follow that with a very small area depth filter, sometimes even a specialized depth filter that’s designed to remove any additional cells. These filters can be slightly charged, and they help to remove debris as well.”
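Giroux’s yield figures can be checked with simple arithmetic. Assuming a hypothetical 2000-L reactor at 8% packed cell volume and a discharge slurry that is 80% cell mass and 20% media, as in his example:

```python
# Harvest-yield arithmetic for a continuous disk-stack centrifuge.
reactor_volume_l = 2000.0            # hypothetical production bioreactor
packed_cell_fraction = 0.08          # 8% packed cell volume

cell_volume = reactor_volume_l * packed_cell_fraction    # 160 L of cells
media_lost = cell_volume * (0.20 / 0.80)                 # media trapped in slurry
media_recovered = reactor_volume_l - cell_volume - media_lost

yield_pct = 100 * media_recovered / (reactor_volume_l - cell_volume)
print(f"media lost with slurry: {media_lost:.0f} L, "
      f"clarified-harvest yield: {yield_pct:.1f}%")
```

Only about 2% of the total volume leaves as trapped media, so the clarified-harvest yield stays close to 98%, consistent with the figures in the quote.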
Choosing the correct scale-down equipment is important so that material representative of process scale is generated for filtration, inclusion-body refolding, and downstream processing steps, says Mugford. For harvesting recombinant protein from E. coli, disc-stack centrifuges are often used. A key parameter used in centrifuge scaling and performance evaluation is Q/Σ, the ratio of the flow rate (Q) to the sigma factor (Σ), according to Mugford. “This represents the relationship between the flow rate Q to the sigma factor, which is the equivalent settling area for a given centrifuge,” he says. “It’s a fixed number for each type of centrifuge, from small to large scale, and we find this is the most reliable parameter for scaling the centrifuge harvest.”
“To optimize this step, you would typically look at the percent solids in the input, the flow rate and discharge interval, and the bowl safety factor,” Mugford explains. “Those are important parameters and measurements. And you would monitor the turbidity of the centrate as a key metric of how well the clarification is going. The percent solids is a key metric if they will be carried forward, as in cases where the product is located within the cells.”
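The Q/Σ scaling Mugford describes reduces to holding the ratio constant between machines. The centrifuge settling areas and flow rate below are illustrative assumptions, not vendor figures:

```python
# Scaling a disc-stack centrifuge harvest by holding Q/Sigma constant.
# Sigma is the equivalent settling area (m^2) of each machine.
sigma_dev = 1_500.0        # assumed development-scale centrifuge, m^2
q_dev_l_per_h = 120.0      # flow rate qualified at development scale, L/h

q_over_sigma = q_dev_l_per_h / sigma_dev   # the scale-independent set point

sigma_mfg = 90_000.0       # assumed manufacturing-scale centrifuge, m^2
q_mfg_l_per_h = q_over_sigma * sigma_mfg   # equivalent flow rate at scale

print(f"Q/Sigma = {q_over_sigma:.3f} L/(h*m^2) -> run the large machine at "
      f"{q_mfg_l_per_h:.0f} L/h")
```

Because Σ is fixed for each machine, the only tuned quantity is the flow rate, which is what makes Q/Σ such a reliable transfer parameter.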
Because plasmid DNA is shear sensitive, it requires a different approach for harvesting. During centrifuge discharges, some damage can occur to the plasmid DNA, resulting in a reduced percentage of supercoiling, stresses Mugford. He points to tangential flow filtration (TFF) as a better approach for plasmid DNA harvesting. Mugford says that optimizing the TFF step requires selecting the appropriate cassette type (the micron/molecular-weight cutoff of the cassette).
“Typically, you would use an open channel cassette. You need to determine the loading, so liters per meter squared for the cassette and the flux rate,” says Mugford. “So, an advantage of the TFF harvest is that the cells can be washed during filtration, so this removes any spent media and some other impurities, which gives you a cleaner product into downstream processing. After this, you would typically go into clarification with a series of depth filters followed by cell lysis. For the lysis of cells containing proteins, you would typically use a high-pressure homogenization. Because plasmid DNA is shear sensitive, we developed a different method [in collaboration with Sartorius], using a chemical lysis in a continuous flow reactor to give a mild lysis and preserve quality through the midstream steps, allowing us to extract the plasmid DNA without doing any damage to it.”
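The loading figure Mugford mentions (liters per square meter) translates directly into membrane-area sizing for the TFF step. The harvest volume, loading, and flux below are hypothetical; real values come from cassette screening during development:

```python
# Sizing a TFF harvest step from a loading target.
harvest_volume_l = 200.0     # assumed harvest volume
loading_l_per_m2 = 50.0      # assumed qualified loading for the chosen cassette
flux_lmh = 30.0              # assumed sustainable flux, L/m^2/h

membrane_area_m2 = harvest_volume_l / loading_l_per_m2
process_time_h = harvest_volume_l / (flux_lmh * membrane_area_m2)
print(f"install {membrane_area_m2:.1f} m^2 of membrane; "
      f"~{process_time_h:.1f} h to process the harvest")
```

The same two numbers also expose the trade-off: running the cassettes at higher loading saves membrane cost but lengthens the process time for a fixed flux.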
Conclusion
The complex nature of biologics development and production offers a variety of ways in which companies can maximize their efforts and produce safe and effective medications. CDMOs and other contract organizations can provide sponsor companies with the expertise and infrastructure to streamline these complex processes. Planning ahead and using the correct parameters and approaches can ensure a cost-effective process.
About the author
Susan Haigney is lead editor of BioPharm International®.
Article details
BioPharm International
Vol. 38, No. 1
January/February 2025
Pages: 29–32
Citation
When referring to this article, please cite it as Haigney, S. Steps to Process Development Optimization. BioPharm International 2025, 38 (1), 29–32.