Monitoring of Biopharmaceutical Processes: Present and Future Approaches

Article
BioPharm International, May 1, 2009
Volume 22, Issue 5
Pages: 40–45

This article reviews some of the commonly used approaches for process monitoring as well as the evolution of process monitoring in the Quality by Design (QbD) paradigm.

ABSTRACT

The complexity of biopharmaceutical processes warrants that the gathering of knowledge about the product and the process continue as the product progresses through the different phases of commercialization, i.e., process and product development, process and product characterization, and process validation and routine manufacturing. Process monitoring has emerged as a critical tool for demonstrating that the manufacturing process is in control and for identifying future process improvement opportunities.

Process monitoring is the collection of process data and the statistical evaluation of process parameters to verify and demonstrate that the process is operating in a state of control, to identify possible process changes and shifts, and to promote continuous improvement. Typical process monitoring applications in biopharmaceutical manufacturing incorporate statistical process control. In the United States, the Code of Federal Regulations (21 CFR Part 211) specifies "application of suitable statistical procedures where appropriate," with in-process specifications "derived from previous acceptable process average and process variability estimates."1

Process monitoring has been a topic of discussion in recent guidances from the regulatory agencies. The ICH Q8 guidance states that:

"Collection of process monitoring data during the development of the manufacturing process can provide useful information to enhance process understanding."2

The PAT guidance further states that:

"Process monitoring and control strategies are intended to monitor the state of a process and actively manipulate it to maintain a desired state. Strategies should accommodate the attributes of input materials, the ability and reliability of process analyzers to measure critical attributes, and the achievement of process end points to ensure consistent quality of the output materials and the final product."3

The ICH Q10 guidance identifies monitoring as a key element of the pharmaceutical quality system and states that:

"Pharmaceutical companies should plan and execute a system for the monitoring of process performance and product quality to ensure a state of control is maintained. An effective monitoring system provides assurance of the continued capability of processes and controls to meet product quality and to identify areas for continual improvement."4

Finally, a presentation on the recently released FDA guidance on process validation highlighted process monitoring as a critical follow-on to process validation to verify that the process is in control, add assurance of product quality, and reveal the need or opportunity for improving the process and the control strategy.5

This article is the 16th in the Elements of Biopharmaceutical Production series and reviews some of the commonly used approaches for process monitoring as well as the evolution of process monitoring in the Quality by Design (QbD) paradigm.


TRADITIONAL PROCESS MONITORING: USE OF CONTROL CHARTS

Statistical process control (SPC) is a methodology that uses graphical and statistical tools to identify, analyze, control, and reduce variability in a process.6 The most basic tool is the run chart (or running record), a simple graphical tool used to record and display process data over time. A more advanced version of the run chart is the control (or Shewhart) chart, which is a run chart that also includes upper and lower control limits (UCL and LCL, respectively) and a centerline calculated from the process data.7 These control limits are derived statistically and provide bounds for the natural variability of the process. They are typically established three standard deviations above and below an established process mean, and individual points are evaluated against them. Thus, control charts use the variability inherent in the process to help determine whether observed variability is likely due to chance or the result of some (perhaps as yet unidentified) process shift or change.

After creating a control chart, a distinction needs to be made between common and special causes of variation. Common causes refer to the many unknown sources of variation that go into producing a natural variation that is predictable within limits. Special causes (also known as assignable causes) refer to the sources of variation that are not part of the natural variation. Special causes of variation should be examined to determine if any data point should be excluded from the analysis because of a known cause.

Statistical process control is implemented in three steps. The first step is process trending, in which the first 15 data points are examined for inclusion and the series of data points is analyzed for trends. The second step is preliminary process control, in which those first 15 data points are used to calculate a centerline representing the mean of the data and the UCL and LCL, representing three standard deviations above and below the mean; all three are displayed on the chart. The 16th through 30th data points are then plotted to see whether each point falls within the control limits, and the series of data points is analyzed for trends. The third and final step, called statistical process control, follows the same procedure as preliminary process control except that the centerline and control limits are computed from enough data to be considered statistically significant.
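As a minimal illustration of this staged approach, the Python sketch below (using NumPy; the data and variable names are hypothetical) computes the centerline and three-standard-deviation limits from the first 15 points and then checks subsequent points against them.

```python
import numpy as np

def preliminary_limits(baseline, sigma=3.0):
    """Centerline and control limits computed from an initial set of data points."""
    baseline = np.asarray(baseline, dtype=float)
    center = baseline.mean()
    spread = baseline.std(ddof=1)          # sample standard deviation
    return center, center + sigma * spread, center - sigma * spread

# Hypothetical step-yield data (%): the first 15 lots set the limits,
# and subsequent lots are evaluated against them.
yields = np.array([92.1, 93.4, 91.8, 92.7, 94.0, 92.5, 93.1, 91.9, 92.8, 93.3,
                   92.2, 93.0, 92.6, 91.7, 93.5, 92.9, 96.8, 92.4, 93.2, 92.0])
center, ucl, lcl = preliminary_limits(yields[:15])
excursions = [(i + 1, y) for i, y in enumerate(yields[15:], start=15)
              if y > ucl or y < lcl]
print(f"centerline={center:.2f}, UCL={ucl:.2f}, LCL={lcl:.2f}")
print("points outside control limits:", excursions)
```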

Two different approaches are used in the monitoring of biopharmaceutical manufacturing processes. The first approach focuses on data from a single lot. Such data associated with a particular lot are reviewed before releasing that lot; a data point beyond a control limit for a given process would cause a nonconformance that would have to be accounted for as part of the lot disposition process. The second monitoring approach analyzes process performance data across lots, looking for trends. When looking for trends, process monitoring uses another statistical process control tool called run rules, also known as control rules or run tests. In 1956, the Western Electric Company published the first set of run rules in the Western Electric Handbook, today called the AT&T Statistical Quality Control Handbook.8 The rules were based on dividing the control chart into six segments using one, two, and three standard deviations above and below the centerline. The first four rules defined conditions indicating that the process was not in statistical control; the remaining three rules identified deviations and trends within the bounds of the UCL and LCL. In 1984, Lloyd Nelson published a similar set of run rules in the American Society for Quality's Journal of Quality Technology.9 The Nelson run rules included in-control trends that identified potential problems. These trend rules, particularly Nelson rules 1 through 4, are very useful in process monitoring. The ability of these rules to spot a trend or a deviation is illustrated in Figure 1.
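As a hedged sketch of how such run rules can be automated, the Python function below implements a common formulation of Nelson rules 1 through 4 (one point beyond three standard deviations; nine consecutive points on the same side of the centerline; six consecutive points steadily increasing or decreasing; fourteen consecutive points alternating up and down). The exact point counts differ slightly among published rule sets, so treat the thresholds as illustrative.

```python
import numpy as np

def nelson_rules_1_to_4(x, center, sigma):
    """Return the indices at which each of Nelson rules 1-4 fires (illustrative sketch)."""
    x = np.asarray(x, dtype=float)
    flags = {1: [], 2: [], 3: [], 4: []}

    # Rule 1: one point more than three standard deviations from the centerline
    flags[1] = [i for i, v in enumerate(x) if abs(v - center) > 3 * sigma]

    # Rule 2: nine consecutive points on the same side of the centerline
    for i in range(8, len(x)):
        window = x[i - 8:i + 1] - center
        if np.all(window > 0) or np.all(window < 0):
            flags[2].append(i)

    # Rule 3: six consecutive points steadily increasing or decreasing
    for i in range(5, len(x)):
        diffs = np.diff(x[i - 5:i + 1])
        if np.all(diffs > 0) or np.all(diffs < 0):
            flags[3].append(i)

    # Rule 4: fourteen consecutive points alternating up and down
    for i in range(13, len(x)):
        diffs = np.diff(x[i - 13:i + 1])
        if np.all(diffs[:-1] * diffs[1:] < 0):
            flags[4].append(i)

    return flags
```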

Figure 1

Hotelling's T2 and the multivariate exponentially weighted moving average (MEWMA) are two other approaches widely used in the biotech industry for process monitoring.10 Hotelling's T2 monitors individual process observations, while MEWMA is sensitive to small shifts and drifts in a process. Annamalai, et al., have presented a case study involving monitoring of eight parameters (harvest volume, harvest amount, cSulf RP–HPLC, B. sepharose recovery, overall recovery, specific activity, peptide map sub-unit percent, and DS rapid acidic C4 RP–HPLC) for a protein purification process.11 In their case study, Hotelling's T2 identified an unusual production batch during monitoring that would otherwise have gone unnoticed, and MEWMA revealed small process drifts that were previously hidden. Understanding the origins of these drifts provided opportunities to improve the process further.
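A minimal sketch of the T2 computation for lot-level monitoring is shown below (NumPy; the reference data are simulated and the eight-parameter count is only illustrative). A full implementation would also derive an F-distribution-based control limit and would typically autoscale the parameters first.

```python
import numpy as np

def hotelling_t2(X_ref, x_new):
    """Hotelling's T2 statistic for a new multivariate observation.

    X_ref : (n_lots, n_params) array of historical lot data (reference set)
    x_new : (n_params,) vector for the lot being monitored
    """
    mean = X_ref.mean(axis=0)
    cov = np.cov(X_ref, rowvar=False)
    diff = np.asarray(x_new, dtype=float) - mean
    return float(diff @ np.linalg.inv(cov) @ diff)

# Hypothetical example: eight monitored parameters across 30 historical lots
rng = np.random.default_rng(0)
X_ref = rng.normal(size=(30, 8))
t2 = hotelling_t2(X_ref, X_ref[0] + 0.5)
print(f"T2 = {t2:.2f}")
```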

MONITORING IN QbD PARADIGM

In the QbD paradigm, the concept of design space plays a central role around which the various activities revolve.12,13 Design space has been defined as: "The multidimensional combination and interaction of input variables (e.g., material attributes) and process parameters that have been demonstrated to provide quality assurance. Working in the design space is not considered a change. Movement out of the design space is considered to be a change and would normally initiate a regulatory postapproval change process. Design space is proposed by the applicant and is subject to regulatory assessment and approval."2 After the design space has been defined from process characterization studies, process validation is performed to demonstrate that the process will deliver a product of acceptable quality if operated within the design space. The regulatory filing would include the acceptable ranges for all key and critical operating parameters (i.e., design space) in addition to a more restricted operating space, typically described in the manufacturing procedures for biotech products as the set of operating ranges for the operating parameters.

After approval of a biotech drug has been obtained, process monitoring of the product quality and process performance attributes is performed to ensure that the process is performing within the defined acceptable variability that served as the basis for the filed design space. In the QbD paradigm, process changes within the design space will not require regulatory review or approval and this may facilitate process improvements during the product lifecycle. Excursions outside the operating space would indicate unexpected process drift and may initiate an investigation into the cause of the deviation and a subsequent corrective action. Excursions outside the design space will require a more thorough investigation of the root cause and the impact on product quality. As manufacturing experience grows and opportunities for process improvement are identified, the operating space could be revised within the design space without the need for a postapproval submission. Process knowledge and design space can be updated as understanding is gained over the lifecycle of a product. Changes to design space would require evaluation against the need for further characterization or revalidation.

REAL-TIME PROCESS MONITORING

The future of process monitoring lies in the combined use of powerful analytical tools capable of supporting real-time decision making and sophisticated statistical tools that can analyze complex data sets efficiently and effectively.13 Becker, et al., have recently reviewed new approaches to sensor technology and control strategies applied to a variety of bioprocesses, along with modern aspects of data evaluation for improved monitoring and control.14 Combinations of principal component analysis (PCA) and the exponentially weighted moving average, as well as partial least squares (PLS)-based methods, have been shown to be useful monitoring tools capable of detecting small shifts in biological processes.15–17 The use of multivariate control charts for a cell culture step has also been suggested as a route toward real-time process monitoring and identification of atypical process performance.18

As mentioned above, PCA and PLS are projection models commonly used in multivariate process monitoring. PCA is often chosen when the objective is simply to monitor many variables collected in a continuous manner, for example, a batch cultivation sampled at a frequency determined by the sampling and measurement systems in place, which may range from daily offline sampling to real-time analyzer or probe readings such as those from a pH probe. PLS is more effective when the objective is predictive monitoring; that is, while monitoring many variables, we also want to know how changes in them may affect a process end point such as yield.
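As a hedged illustration of such predictive monitoring, the sketch below fits a PLS model with scikit-learn's PLSRegression on simulated batch data (the variable counts, the data, and the assumed relationship to yield are purely hypothetical) and uses it to predict the end point for a new batch.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# Hypothetical historical data: in-process variables (X) and end-point yield (y)
rng = np.random.default_rng(1)
X_hist = rng.normal(size=(30, 11))            # 30 batches, 11 process variables
y_hist = X_hist[:, :3].sum(axis=1) + rng.normal(scale=0.1, size=30)

pls = PLSRegression(n_components=3)
pls.fit(X_hist, y_hist)

# Predict the end point for a new batch from its in-process measurements
x_new = rng.normal(size=(1, 11))
print("predicted yield:", pls.predict(x_new).ravel()[0])
```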

PCA and PLS are data-driven and require a representative history of batches with known good performance to establish a baseline (or reference model) against which new batches are compared. Both techniques are also very powerful at explaining the overall variability and the correlation structure of all the variables, accounting for missing values, detecting outliers, handling collinearity, and, more importantly, reducing the dimensionality of the problem (while removing measurement noise). Because many variables are measured, and they are often correlated with one another (hence the collinearity), it is important to be able to reduce them to a select few derived variables, typically expressed as principal components or latent variables, that actually drive the overall variability of the process. As shown in Figure 2, PCA uses a mathematical algorithm to fit a model plane to the data cloud: it finds the direction of maximum variability in the process, then the next direction of variability orthogonal to the first component, and so on, until no significant variation is left to be explained except noise. Each principal component therefore has some contribution from the original variables, weighted according to the correlation structure. Real-time multivariate statistical process monitoring uses these derived variables (principal components) to monitor the entire process performance, along with multivariate statistics and charts for fault detection and diagnosis.

Figure 2
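A minimal sketch of building such a reference model is shown below, using scikit-learn's PCA with autoscaling. The 30 batches and 11 variables echo the case study later in the article, but the data here are simulated and the three-component choice is assumed rather than derived.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Hypothetical reference data: 30 good batches x 11 online process variables
rng = np.random.default_rng(2)
X_ref = rng.normal(size=(30, 11))

# Autoscale, then fit the PCA model that defines the reference "data cloud"
scaler = StandardScaler().fit(X_ref)
pca = PCA(n_components=3).fit(scaler.transform(X_ref))

print("variance explained per PC:", np.round(pca.explained_variance_ratio_, 2))
scores_ref = pca.transform(scaler.transform(X_ref))   # principal-component scores
```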

After the process model based on historical data is developed by either PCA or PLS, a number of multivariate statistics and charts can be constructed for monitoring new batches. Commonly used multivariate monitoring and diagnosis charts include:19

  • Squared prediction error (SPE, also known as Q-residuals or DModX) chart: SPE is used for process deviation detection. It is very useful in detecting events that are not captured by the model; in other words, when an SPE limit violation is observed (and there is no T2 violation), it is likely that a new event has occurred that is not represented in the reference model. This can be triggered either by a normal event that is part of the inherent process variability but not captured by the model, or by a process upset.

  • Score time series and Hotelling's T2 charts: As mentioned earlier, these charts are also used for process deviation detection, in this case detecting deviations that are explained by the process model and lie within the overall variability but are unusually large compared with the average. The score time series allows one to monitor performance in each model dimension separately, while T2 allows all of the model dimensions to be monitored over the course of a batch run using a single statistic.

  • Contribution plots: When either or both of the detection charts identify a deviation (a violation of the multivariate statistical limits) from historical behavior, we need to find out which variables are deviating. Contribution plots are then used to drill down to the original variable level and inspect which variable or variables are contributing to the inflated statistic.

SPE, T2, and score time series should be used together to better understand what type of deviation (known or unknown) has been detected.
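As a hedged sketch of how these statistics can be derived from the PCA reference model fit in the earlier sketch (control limits and chart plotting are omitted, and the function name and data are illustrative), the following computes T2, SPE, and per-variable SPE contributions for a single new observation.

```python
import numpy as np

def monitor_observation(pca, scaler, x_new):
    """T2, SPE, and per-variable SPE contributions for one new observation.

    `pca`, `scaler`, and `X_ref` come from the PCA sketch above.
    """
    z = scaler.transform(np.atleast_2d(x_new))
    scores = pca.transform(z)                         # projection onto the model plane
    residual = z - pca.inverse_transform(scores)      # part not captured by the model

    t2 = np.sum(scores ** 2 / pca.explained_variance_, axis=1)[0]   # Hotelling's T2
    spe = np.sum(residual ** 2, axis=1)[0]                          # squared prediction error
    contributions = (residual ** 2).ravel()           # per-variable SPE contributions
    return t2, spe, contributions

# Flag a hypothetical deviating observation and identify the largest contributor
t2, spe, contrib = monitor_observation(pca, scaler, X_ref[0] + 1.5)
print(f"T2={t2:.2f}, SPE={spe:.2f}, top contributor=variable {int(np.argmax(contrib))}")
```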

CASE STUDY INVOLVING REAL-TIME PROCESS MONITORING

As an example of real-time multivariate statistical process monitoring, a PCA-based model has been developed for monitoring a mammalian cell culture bioreactor at commercial scale. In this setting, historical batches are mined from the manufacturing databases to develop a nominal process model for a seed bioreactor train. Eleven process variables that are measured online for 30 batches are used in model building, and the model is able to explain the overall process variability with only three principal components. New production batches are monitored against this model in real time.

During real-time monitoring, one typically looks at the high-level multivariate charts for deviation detection, as mentioned earlier. Figure 3 shows the three main steps involved in monitoring. In step 1, a T2 chart is used to detect a deviation. Step 2 involves diagnosis at the variable level, which indicates that the pH probe is reading below historical averages, i.e., outside of ±3 standard deviations. Finally, in step 3, the pH trace is inspected. This simple three-step process allows scientists and engineers to start troubleshooting the probe and other operational factors to better understand and monitor the process.20

Figure 3

CONCLUSIONS

Statistical techniques such as those discussed in this article have been demonstrated to be capable of rigorous analysis of the complex datasets that abound in biotech applications. Combined with advances in analytical tools that allow online analysis, they can form the basis of real-time process monitoring. Implementation of such systems is likely to result in gains in the consistency of product quality as well as efficiency in the manufacturing of biotech products, and to bring us closer to full implementation of QbD and realization of its benefits.

Paul Konold is a senior engineer and Rob Woolfenden II is a principal engineer in Seattle, WA; Cenk Undey is a principal engineer in West Greenwich, RI; and Anurag S. Rathore is a director in Thousand Oaks, CA, 805.447.4491, asrathore@yahoo.com. All authors are in the process development department at Amgen, Inc. Rathore is also a member of BioPharm International's editorial advisory board.

REFERENCES

1. US Food and Drug Administration. US Department of Health and Human Services. Part 211 current good manufacturing practice for finished pharmaceuticals, Code of Federal Regulations. Title 21; Vol. 4. Latest revision 2008 Apr 1.

2. US Food and Drug Administration. Guidance for industry. Pharmaceutical development. Rockville, MD; 2006 May. Q8 Annex Pharmaceutical Development, Step 3; 2007 Nov.

3. US Food and Drug Administration. Guidance for industry. PAT—A framework for innovative pharmaceutical development, manufacturing, and quality assurance. Rockville, MD; 2004 Sep.

4. US Food and Drug Administration. Guidance for industry. Quality systems approach to pharmaceutical CGMP Regulations. Rockville, MD; 2006 Sep.

5. Hasselbach B. Process validation—a lifecycle approach. 2008 PDA/FDA Joint Regulatory Conference; 2008 Sep; Washington DC.

6. Hotelling H. Multivariate quality control, techniques of statistical analysis. Eisenhart C, Hastay HW, Wallis WA, editors. New York: McGraw-Hill; 1947. p. 111–184.

7. Bersimis S, Psarakis S, Panaretos J. Multivariate statistical process control charts: an overview. Quality Reliability Eng Int. 2007;23:517–543.

8. AT&T statistical quality control handbook. 11th ed. North Carolina: Delmar Printing Company; 1985.

9. Nelson LS. Technical aids: display tables and significant digits. J Quality Technol. 1984;16:175–176.

10. Lowry CA, Woodall WH, Champ CW, Rigdon SE. A multivariate exponentially weighted moving average control chart. Technometrics. 1992;34:46–53.

11. Johnson R, Yu O, Kirdar AO, Annamalai A, Ahuja S, Ram K, Rathore AS. Applications of multivariate data analysis (MVDA) for biotech processing. BioPharm Int. 2007;20(10):130–144.

12. Rathore AS, Branning R, Cecchini D. Design space for biotech products. BioPharm Int. 2007;20(4);36–40.

13. Rathore AS, Winkle H. Quality by Design for pharmaceuticals: regulatory perspective and approach. Nature Biotechnol. 2009;27;26–34.

14. Cinar A, Parulekar SJ, Undey C, Birol G. Batch fermentation: modeling, monitoring and control. New York: CRC Press; 2003.

15. Becker T, Hitzmann B, Muffler K, Pörtner R, Reardon KF, Stahl F, Ulber R. Future aspects of bioprocess monitoring. Advances in Biochem Eng and Biotechnol. 2007;105:249–293.

16. Yoo CK, Lee I–B. Nonlinear multivariate filtering and bioprocess monitoring for supervising nonlinear biological processes. Process Biochem. 2006;41:1854–1863.

17. Undey C, Ertunc S, Cinar A. Online batch/fed-batch process performance monitoring, quality prediction, and variable contributions analysis for diagnosis. Ind Eng Chem Res. 2003;42(20):4645–4658.

18. Undey C, Tatara E, Cinar A. Intelligent real-time performance monitoring and quality prediction for batch/fed-batch cultivations. J Biotechnol. 2004;108(1):61–77.

19. Kirdar AO, Conner JS, Baclaski J, Rathore AS. Application of multivariate analysis towards biotech processes: case study of a cell-culture unit operation. Biotech Progress. 2007;23:61–67.

20. MacGregor JF, Kourti T. Statistical process control of multivariate processes. Control Eng Practice. 1995;3:403–414.

21. Undey C. Are we there yet? An industrial perspective of evolution from post-mortem data analysis towards real-time multivariate monitoring and control of biologics manufacturing processes. IBC's BioProcess International Analytical and Quality Summit. Cambridge, MA; 2008 Jun 2–4.

Other articles from The Elements of Biopharmaceutical Production series:

1. Modeling of Microbial and Mammalian Unit Operations

2. Scaling Down Fermentation

3. Optimization, Scale-up, and Validation Issues in Filtration

4. Filter Clogging Issues in Sterile Filtration

5. Lifetime Studies for Membrane Reuse

6. Modeling of Process Chromatography Unit Operation

7. Resin Screening to Optimize Chromatographic Separations

8. Optimization and Scale-Up in Preparative Chromatography

9. Scaling Down Chromatography and Filtration

10. Qualification of a Chromatographic Column

11. Efficiency Measurements for Chromatography Columns

12. Process Validation: How Much to Do and When to Do It

13. Quality by Design for Biopharmaceuticals: Defining Design Space

14. Quality by Design for Biopharmaceuticals: Case Studies

15. Design Space for Biotech Products

16. Applying PAT to Biotech Unit Operations

17. Applications of MVDA in Biotech Processing

18. Future Technologies for Efficient Manufacturing

19. Costing Issues in the Production of Biopharmaceuticals

20. Economic Analysis as a Tool for Process Development

For the entire series of The Elements of Biopharmaceutical Production, please visit www.industrymatter.com/EBPseries.aspx
