Robust Experimental Strategies for Improving Upstream Productivity

Article
BioPharm International, Volume 23, Issue 7, July 2010

Identify the best experimentation methods for the data you need.

ABSTRACT

Experience using Quality by Design in upstream processes has identified several things that can improve the application of the method. When the experimental environment is diagnosed and strategies are developed to match the environment, experimentation moves more quickly and the critical process variables are identified with higher probability. This article presents a process for developing experimental strategies. The approach focuses on enhancing process understanding by developing the right data, at the right time, and in the right amount to maximize the time-value of the data collected. The development and operation of robust measurement methods that produce high quality data and streamlining of experimentation work processes also is discussed.

With the leadership of the FDA, there has been considerable focus on the use of Quality by Design (QbD) to accomplish the goals of speeding up development, reducing costs in research and development and manufacturing, and producing better quality products and more effective, efficient, and robust manufacturing processes. Much has been learned about the use of QbD since it was first proposed by the FDA.1–4 QbD also can be used to enhance the performance of existing products and processes.

There are many variables involved in improving an upstream process. Hulbert, et al. have shown an effective way to prioritize the collection of variables.5 Now, we need a strategy to deal with the variables identified by the prioritization approach proposed by Hulbert, et al.


Hulbert, et al. point out that there are a lot of data involved in improving upstream productivity. The data can't be collected all at once. The strategy selected must enable us to collect the right data, at the right time, and in the right amount, as discussed by Lonardo, Snee, and Qi.6

Even when good QbD approaches are used, experimentation often is slowed down by poor processes. Personnel, materials, testing, and equipment often are not available when needed, resulting in wasted time and effort. Additional time and effort are wasted when analytical methods are of poor quality or the procedures in the analytical laboratory are inefficient.

Figure 1

Two methods can be used to increase the speed of upstream development, which in turn speeds up the development of process understanding:

  • using Design of Experiments (DOE)-based strategies to design, analyze, and interpret experiments, resulting in getting better information in a timely fashion

  • using Lean principles to streamline the availability of information, materials, equipment, measurements, and personnel for the experimental process, thereby accelerating the flow of the experimental process.

The overall strategy therefore is to speed up the experimental process by adopting strategies that collect the right data when needed and by improving the work processes used for experimentation. The result is that experimentation is accelerated, which in turn speeds up the development of process understanding, which is fundamental to improving upstream productivity (Figure 1). Some critical principles in conducting upstream experimentation are summarized in the sidebar and discussed in the following paragraphs.

STREAMLINING EXPERIMENTATION

Critical to success is the development of a strategy of experimentation that streamlines the experimental process. Such a strategy identifies three experimental environments: screening, characterization, and optimization.7 The objectives of each phase are summarized in Table 1. The strategy sequences and links a variety of experimental designs, enabling scientists to achieve better results than they could by using the same DOE techniques in isolation.

Table 1. Comparison of experimental environments

The strategy used depends on the characteristics of the experimental environment. These characteristics involve program objectives, the nature of the factors and responses, the resources available, the quality of the information to be developed, and the theory available to guide the experiment design and analysis. A careful diagnosis of the experimental environment along these lines can have a major effect on the success of the experimental program.7

Sidebar: Critical Principles for Experimentation

The screening–characterization–optimization (SCO) strategy is illustrated by the work of Yan and Le-he, who describe a fermentation optimization study that used a screening experiment followed by an optimization experiment.8 In this investigation, 12 process variables were studied. The first experiment used a 16-run Plackett-Burman screening design to study the effects of all 12 variables. The four variables with the largest effects were then studied in a 16-run optimization experiment. The optimized conditions increased fermentation yield by 54%.
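
As a rough sketch of how such a screening design might be generated (not the design software or factors used in the cited study), a 16-run Plackett-Burman array for 12 factors can be constructed from a Hadamard matrix, because Plackett-Burman designs with run counts that are powers of two are Hadamard-based. The factor names below are placeholders.

```python
# Sketch: constructing a 16-run Plackett-Burman screening design for 12
# factors from a Sylvester-type Hadamard matrix. Factor names below are
# placeholders, not the variables used in the cited study.
from scipy.linalg import hadamard

n_runs, n_factors = 16, 12
H = hadamard(n_runs)                # 16 x 16 matrix of +1/-1 entries
design = H[:, 1:n_factors + 1]      # drop the all-ones column; keep 12 columns

factor_names = [f"X{i + 1}" for i in range(n_factors)]
for run, row in enumerate(design, start=1):
    settings = ", ".join(f"{name}={'high' if level > 0 else 'low'}"
                         for name, level in zip(factor_names, row))
    print(f"Run {run:2d}: {settings}")
```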

The Screening Phase. The screening phase explores the effects of a large number of variables with the objective of identifying a smaller number of variables to study further in characterization or optimization experiments. Screening studies typically use fractional factorial and Plackett-Burman designs to collect data. More screening experiments involving additional factors may be needed when the results of the initial screening experiments are not promising. The screening experiment often solves the problem.

The Characterization Phase. The characterization phase helps us better understand the system by estimating interactions as well as linear (main) effects. The process model is thus expanded to quantify how the variables interact with each other as well as measure the effects of the variables individually. Full factorial and fractional factorial designs are used here.
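
As an illustration only (the factor names, settings, and response values below are hypothetical), a small full factorial characterization experiment can be analyzed with a model containing main effects and all two-factor interactions:

```python
# Sketch: a 2^3 full factorial characterization experiment analyzed with a
# model containing main effects and all two-factor interactions. The factor
# names and the response values are hypothetical placeholders.
import itertools
import pandas as pd
import statsmodels.formula.api as smf

# Coded factor settings (-1 = low, +1 = high)
runs = pd.DataFrame(list(itertools.product([-1, 1], repeat=3)),
                    columns=["temp", "pH", "feed"])
runs["titer"] = [1.8, 2.1, 2.0, 2.9, 1.7, 2.2, 2.1, 3.3]  # measured responses

# Main effects plus two-factor interactions; (a + b + c)**2 expands to
# a + b + c + a:b + a:c + b:c in the formula language.
model = smf.ols("titer ~ (temp + pH + feed) ** 2", data=runs).fit()
print(model.params)   # estimated effects (regression coefficients)
```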

The Optimization Phase. The optimization phase develops a predictive model for the system that can be used to find useful operating conditions (design space) using response surface contour plots and mathematical optimization. In these studies, response surface designs are used to collect data.
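
A minimal sketch of the optimization phase follows, assuming hypothetical factors and data of the kind a central composite design would produce: a full quadratic model is fit and then searched numerically for promising operating conditions within the studied region.

```python
# Sketch: fitting a second-order (quadratic) response-surface model to data
# from a central composite-style design and searching it for promising
# operating conditions. Factor names and responses are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf
from scipy.optimize import minimize

df = pd.DataFrame({
    "temp":  [-1, 1, -1, 1, 0, 0, 0, -1.41, 1.41, 0, 0],
    "pH":    [-1, -1, 1, 1, 0, 0, 0, 0, 0, -1.41, 1.41],
    "titer": [2.0, 2.6, 2.4, 2.7, 3.1, 3.0, 3.2, 1.9, 2.5, 2.2, 2.3],
})

# Full quadratic model: main effects, interaction, and squared terms
model = smf.ols("titer ~ temp + pH + temp:pH + I(temp**2) + I(pH**2)",
                data=df).fit()

def negative_predicted_titer(x):
    point = pd.DataFrame({"temp": [x[0]], "pH": [x[1]]})
    return -model.predict(point).iloc[0]

# Search the coded experimental region [-1, 1]^2 for the predicted maximum
result = minimize(negative_predicted_titer, x0=[0.0, 0.0],
                  bounds=[(-1, 1), (-1, 1)])
print("Predicted best settings (coded units):", result.x)
print("Predicted titer at those settings:", -result.fun)
```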

The SCO strategy in fact embodies several strategies, each a subset of the overall SCO sequence, including:

  • screening–characterization–optimization

  • screening–optimization

  • characterization–optimization

  • screening–characterization

  • screening

  • characterization

  • optimization.

The end result of each of these sequences is a completed project. There is no guarantee of success in a given instance, only that the SCO strategy will "raise your batting average."7

REDUCING THE EFFECTS OF ENVIRONMENTAL FACTORS

It is important to discuss how the effects of environmental factors are included in the experimental strategy. Part of diagnosing the experimental environment is identifying environmental variables, which often are overlooked. Even the best strategy can be defeated if the effects of environmental variables are not properly taken into account. Environmental variables include factors such as bioreactors, raw material lots, operating teams, ambient temperature, and humidity. Recognizing the effects of variation from environmental factors can go a long way toward ensuring that the resulting data are not biased. Special experimental strategies also are needed to reduce the effects of extraneous variables that creep in when the experimental program is conducted over a long time period.

In one case, a laboratory was investigating the effects of upstream variables using two identical bioreactors. As an afterthought, the scientists decided to use both bioreactors in the same experiment. There was some concern that using both reactors would be a waste of time and resources because they were identical. Data analysis showed, however, that there was a big difference between the results from the two bioreactors. These differences were taken into account in future experiments.

In another situation, an experiment was designed using DOE procedures to study the effects of five upstream process variables. The data analysis produced some confusing results and a poor fit of the process model to the data (R2 values were low). An analysis of the model residuals showed that one or more variables, not controlled during the experiment, had changed during the study, leading to the poor model fit and confusing results. An investigation showed that the experiment had been conducted over an eight-month period. It is very difficult to hold the experimental environment constant for such a long time.
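
A simple residuals-versus-run-order check of the kind described above can reveal such drift. The sketch below uses simulated data with a deliberately introduced time trend; the variable names, model, and numbers are illustrative only.

```python
# Sketch: checking model residuals against run order to look for drift from
# uncontrolled variables when an experiment stretches over a long period.
# The factors, response, and drift below are simulated placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
n = 24
df = pd.DataFrame({
    "run_order": np.arange(n),
    "x1": rng.choice([-1, 1], size=n),
    "x2": rng.choice([-1, 1], size=n),
})
# Simulated response containing a slow drift the model does not include
df["y"] = (2.0 + 0.5 * df.x1 + 0.3 * df.x2
           + 0.05 * df.run_order + rng.normal(0, 0.1, size=n))

model = smf.ols("y ~ x1 + x2", data=df).fit()  # drift term deliberately omitted

plt.scatter(df["run_order"], model.resid)
plt.axhline(0, linestyle="--")
plt.xlabel("Run order")
plt.ylabel("Residual")
plt.title("A trend in residuals vs. run order suggests an uncontrolled variable")
plt.show()
```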

Heilman and Kamm report a study in which the biggest effect was raw material lot variation, introduced by different lots of media.9 This effect was present during the production of more than 50 batches before it was discovered.

In each of these cases, it is appropriate to use "blocking" techniques when conducting upstream experiments.10 Blocking accounts for the effects of extraneous factors, such as raw material lots, bioreactors, and time trends. The experimentation is divided into blocks of runs within which the experimental variation is minimized. In the first case above, the blocking factor was the bioreactor. In the second case, the blocking factor was a time unit (e.g., months). The effects of the blocking factors are accounted for in the data analysis, so the effects of the variables being studied are not biased.
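
A minimal sketch of how a blocking factor enters the analysis follows, using hypothetical bioreactor labels, factor names, and responses: the block is fit as a categorical term so that block-to-block differences are separated from the effects of the process variables being studied.

```python
# Sketch: including a blocking factor (here, the bioreactor) in the analysis
# so block-to-block differences do not bias the estimated factor effects.
# Bioreactor labels, factor names, and responses are hypothetical.
import itertools
import pandas as pd
import statsmodels.formula.api as smf

# A 2^2 factorial in temperature and pH, replicated in two bioreactors (blocks)
rows = [(block, temp, pH)
        for block in ["reactor_A", "reactor_B"]
        for temp, pH in itertools.product([-1, 1], repeat=2)]
runs = pd.DataFrame(rows, columns=["bioreactor", "temp", "pH"])
runs["titer"] = [2.0, 2.5, 2.3, 2.9,     # reactor A
                 1.6, 2.1, 1.9, 2.5]     # reactor B runs consistently lower

# The block enters as a categorical term; its effect is removed before the
# effects of the process variables are judged.
model = smf.ols("titer ~ C(bioreactor) + temp + pH + temp:pH", data=runs).fit()
print(model.params)
```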

POOR MEASUREMENT PROCEDURES SLOW DOWN EXPERIMENTATION

Another problem that can reduce the speed and quality of process development involves the measurement systems used to collect the data, i.e., the availability of good analytical methods and the efficiency with which the analytical laboratory is operated. Measurements enable us to see the effects of the variables driving the process and to build the models that are used to develop the design space.

The role of good measurements often is overlooked. Poor laboratory performance can slow down development if:

  • methods are not available when needed or have been developed poorly, producing misleading test results that slow the development of process understanding

  • laboratory testing procedures and scheduling are inefficient, producing delays in getting the results back from the laboratory. The value of good analytical methods is lessened if it takes a long time to get the samples analyzed.

Poor quality measurements and ineffective and inefficient laboratory procedures result in long timelines and misleading results. Measurement methods must produce high quality, repeatable, and reproducible results; remain stable over time; and be robust to small deviations from the method's standard operating procedure.11 An example of improving the flow of samples through an analytical laboratory is discussed below.
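
As a hedged illustration of what "repeatable and reproducible" means quantitatively, the sketch below estimates the repeatability (within-day) and between-day variance components of an assay from replicate measurements, using the classical balanced one-way random-effects formulas. The data layout and numbers are hypothetical.

```python
# Sketch: estimating repeatability (within-day) and between-day variance
# components for an assay from replicate measurements, using the classical
# balanced one-way random-effects formulas. The data are hypothetical.
import pandas as pd

data = pd.DataFrame({
    "day": ["d1"] * 3 + ["d2"] * 3 + ["d3"] * 3 + ["d4"] * 3,
    "result": [98.1, 98.4, 97.9, 99.0, 99.3, 98.8,
               98.2, 98.0, 98.5, 99.1, 98.9, 99.4],
})
n_reps = 3  # replicates per day (balanced design)

day_means = data.groupby("day")["result"].mean()
ms_within = data.groupby("day")["result"].var(ddof=1).mean()   # repeatability
ms_between = n_reps * day_means.var(ddof=1)
var_between_day = max((ms_between - ms_within) / n_reps, 0.0)

print(f"Repeatability (within-day) variance: {ms_within:.4f}")
print(f"Between-day variance component:      {var_between_day:.4f}")
print(f"Intermediate precision (total) var:  {ms_within + var_between_day:.4f}")
```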

EXAMPLE: STREAMLINING R&D WORK PROCESSES

As noted above, developing good experimentation strategies to design, analyze, and interpret experiments is necessary but not sufficient for speeding up the improvement of upstream productivity. You also must streamline your experimentation work processes to get the full benefit of Quality by Design. A critical issue is the scheduling of experimental work. You lose the benefit if you have to wait to run the experiments.

In one experimental program, a screening experiment was designed and the personnel were assembled and ready to conduct the experimental runs. Unfortunately, the lead scientist couldn't get the process to work properly because of mechanical difficulties. The personnel waited for two days and were then assigned to other projects. The experiment finally began two weeks later, and the scheduling and personnel acquisition had to be repeated. A significant amount of time and resources would have been saved if the process operation had been mastered before the screening experiment was scheduled.

The availability (flow) of information, materials, personnel, measurements, and equipment affects the flow of experimentation. One of the most common inefficiencies is waiting: tasks are performed late, or personnel, equipment, and materials are not available when needed. Another is that standards are not used, making it difficult to determine what was done and to compare results with other work. The solution to these problems is to use Lean principles to streamline the processes and procedures used to do the experimental work.12 Eliminating complexity and wasted time and effort speeds up experimentation and gives scientists more time to do creative work.

Kamm and Villarrubia report on an initiative that used Lean principles to streamline an analytical laboratory.13 The incoming workload on the quality control laboratory was variable in both volume and mix. Throughput time was >15 days.

A 5S program identified and labeled equipment, marked bench space, and delineated storage areas. Lean process design principles were used to dedicate equipment and people to a set of products. The new flow design improved throughput time by 53%, reduced personnel utilization by 25%, and accelerated material release by 14%. This example shows how applying Lean principles can speed up the flow of analytical testing.

CONCLUSION

A critical component of using Quality by Design to increase process understanding and improve upstream productivity is speeding up the experimentation associated with developing new and existing products and improving processes. Experience has shown that this can be accomplished by developing a strategy for experimentation that diagnoses the experimental environment to determine the best experimental design to use. This strategy also takes into account the environmental variables that affect the process. It also has been found that the quality and availability of the measurement system can have a major effect on the speed of the experimental process.

The development process also can be enhanced using Lean principles to streamline R&D work processes by eliminating complexity, non-value–added work, and wasted time. Improving the availability of personnel, materials, measurements, and equipment can improve the flow of experimentation, thereby speeding up the improvement of upstream productivity. The resulting work processes free up scientists to spend more time on scientific work, thereby speeding up the development and process improvement work.

Ronald D. Snee, PhD, is the founder and president of Snee Associates, LLC, Newark, DE, 610.213.5595, ron@sneeassociates.com.

REFERENCES

1. International Conference on Harmonization. Q8, pharmaceutical development, current step 4 version. Geneva, Switzerland, 2005 Nov.

2. Snee RD, Cini P, Kamm JJ, Meyers C. Quality by Design—shortening the path to acceptance. Pharm Process. 2008;25(3): 20–24.

3. Snee RD. Quality by Design—four years and three myths later. Pharm Process. 2009;2:14–16.

4. Snee RD. Building a framework for Quality by Design. Pharm Tech. Online exclusive; 2009 Oct. Available from: http://pharmtech.findpharma.com/pharmtech/Special+Section%3a+Quality+by+Design/Building-a-Framework-for-Quality-by-Design/ArticleStandard/Article/detail/632988

5. Hulbert MH, Feely LC, Inman EL, Johnson AD, Kearney AS, Michaels J, Mitchell M, Zour E. Risk management in pharmaceutical product development. J Pharm Innovation. 2008;3:227–48.

6. Lonardo A, Snee RD, Qi B. Time value of information in design of downstream purification processes—getting the right data in the right amount at the right time. BioPharm Int. Advances in separation and purification: the future of downstream processing. 2010 Mar suppl; pp. 29–34.

7. Snee RD. Raising your batting average. Remember the importance of strategy in experimentation. Qual Prog. 2009;12:64–8.

8. Yan L, Le-he M. Optimization of fermentation conditions for P450 BM-3 monooxygenase production by hybrid design methodology. J Zhejiang Univ Sci B. 2007;8(1):27–32.

9. Kamm JJ, Heilman C. Coming to a biotech near you: Quality by Design Part 2: design space in development and manufacturing. BioPharm Int. 2008;21(7):24–30.

10. Box GEP, Hunter JS, Hunter WG. Statistics for experimenters: design, innovation, and discovery. New York, NY: Wiley-Interscience; 2005.

11. Schweitzer M, Pohl M, Hanna-Brown M, Nethercote P, Borman P, Hanson G, Smith K, Larew J. Implications and opportunities of applying QbD principles to analytical measurements. Position paper: QbD analytics. Pharm Tech. 2010;34(2):52–9.

12. King PL. Lean for the process industries. Boca Raton, FL: CRC Press; 2009.

13. Kamm JJ, Villarrubia Y. Using Lean principles to improve analytical laboratory operations. Personal Communication with the author; 2009.
