As biotechnology organizations have successfully launched new products, the challenges of producing adequate quantities have grown. Many companies are now dealing with multiproduct manufacturing facilities and pushing the limits of their capabilities. One result of this complexity is a loss of production capacity due to inefficiencies.
Many factors contribute to inefficiencies, such as scheduling conflicts in production or supporting equipment, inadequate availability of production resources (both human and machine), lack of available raw materials or subcomponents, and miscalculations of the true capabilities of facilities. In many cases, production line inefficiencies do not result from weaknesses in day-to-day production scheduling, but from the original assumptions built into planning models.
Companies often lack the data to support a systematic capacity analysis of all their production areas. Equipment- and labor-capacity planning exercises are usually inaccurate because they are driven by the pressure to produce next year's budget forecast; they are not geared to provide the information that will drive future planning and scheduling needs. In many cases, broad assumptions are made about the true production bottlenecks, process efficiency figures, and process times that will be realized in manufacturing. The result is production lines that are not appropriately sized or managed around their bottlenecks, that do not account for the variability in day-to-day production needs, and that fail to maximize the use of critical plant equipment.
Planning tools provide a solution, as long as you use the right tool for the right task. Capacity-modeling tools fall into two broad categories: static and dynamic.
Static modeling is the most common approach and the easiest to program. It can be very effective as a first-pass analysis tool when future manufacturing requirements are still highly variable. Its assumptions about planning metrics take a time-independent perspective: for example, a static model uses monthly demand to calculate the labor and equipment needed to support the required volume.
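As an illustration, a static calculation of this kind fits in a few lines. Every figure below (demand, line rate, available hours, efficiency) is a hypothetical assumption, not data from any particular facility:

```python
import math

# Hypothetical static capacity calculation: given a monthly demand,
# estimate how many filling lines are needed. Time-independent by
# design - it says nothing about queueing or scheduling within the month.

def lines_required(monthly_demand_vials: int,
                   vials_per_hour: float,
                   hours_per_month: float,
                   efficiency: float) -> int:
    """Static (time-independent) estimate of equipment needs."""
    effective_rate = vials_per_hour * efficiency          # throughput after losses
    hours_needed = monthly_demand_vials / effective_rate  # total machine-hours
    return math.ceil(hours_needed / hours_per_month)      # round up to whole lines

# Example: 5,000,000 vials/month, 6,000 vials/h, 480 h/month, 70% efficiency
print(lines_required(5_000_000, 6_000, 480, 0.70))  # prints 3
```

Because the result is a single number per period, a static model cannot reveal within-month congestion; that is where simulation comes in.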
Dynamic (simulation) modeling, although more complicated to build and use, provides a more realistic tool for planning a production area. Dynamic models, by definition, are time dependent; they analyze how systems or areas will react to changes over time. Instead of examining the production line and resources on a monthly basis, the model simulates individual lots moving through the line based on a realistic production schedule. This model can then provide a picture of production during the month, including achievable cycle times, sources of delays, inadequate production staffing levels, and shifting bottlenecks.
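The lot-by-lot behavior described above can be sketched as a minimal single-machine simulation. The release interval and processing-time distribution below are illustrative assumptions, not the article's model:

```python
import random

# Minimal dynamic-model sketch: lots are released every `interval` days
# to one machine whose processing time varies. Each lot's cycle time
# includes queueing delay - information a monthly average cannot show.

def simulate(n_lots, interval, mean_proc, sd_proc, seed=1):
    rng = random.Random(seed)
    machine_free = 0.0
    cycle_times = []
    for i in range(n_lots):
        release = i * interval
        start = max(release, machine_free)          # wait if machine is busy
        proc = max(0.1, rng.gauss(mean_proc, sd_proc))
        machine_free = start + proc
        cycle_times.append(machine_free - release)  # release-to-finish time
    return cycle_times

ct = simulate(n_lots=50, interval=4.0, mean_proc=3.5, sd_proc=1.0)
print(f"average cycle time: {sum(ct)/len(ct):.1f} days")
```

Even though average processing time (3.5 days) is below the release interval (4 days), variability causes queues, so realized cycle times drift above the average processing time.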
Figure 1. Modern simulation tools allow visual representation of a production operation with reasonable effort. Animation is standard in most available models. Imagine That, Inc. designed this model in Extend.
Newer software packages make creating dynamic models simpler than before and even offer animation as a standard feature. Also, because this type of modeling primarily supports planning activity (not decision-making activities that affect product quality), it does not need to conform to 21 CFR Part 11. Figures 1 and 2 show an example of a completed model and sample output capabilities.
When choosing between simulation modeling and a static planning tool, the first thing to consider is your needs. Simulation models offer many benefits, including the ability to conduct time-sensitive analyses, provide for variability in workload fluctuations, and help with bottleneck identification (including the ability to analyze shifting bottlenecks). If a company is facing decisions that may benefit from such detailed information, then it should seriously consider using a simulation model.
However, if a company is simply conducting a first-pass capacity analysis in a straightforward environment where the implications of inadequate resources are not severe, a static model may be sufficient. Of course, if you already have a static model in place and you are not getting the information you need, this may indicate that simulation is required.
Another factor to consider is the nature of the process. Simulation requires that clear rules be in place for product routing, resource usage, processing times, and other parameters. If the process you are modeling has too many open-ended choices (regarding what equipment is used in the process or when certain actions are performed, for example), you should consider using a simpler model.
Finally, as with any model, the availability of data is critical to its success. A good simulation requires extensive and accurate data for standard processing times, yield rates, production volumes and mix, and other input parameters. Since a main benefit of simulation is its ability to analyze the effects of real-life variability on resource requirements, service levels, and other performance aspects, it is important that reliable data be available to determine the averages for each parameter, as well as the standard deviation and type of distribution. Table 1 highlights some of the differences between static and dynamic modeling.
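Characterizing an input parameter before it enters the model can be as simple as the sketch below; the fill-time observations are made-up illustrative data:

```python
import statistics

# Sketch of input-parameter characterization (hypothetical data):
# summarize observed values so the simulation can sample from a fitted
# distribution rather than rely on a single average.

fill_times_hours = [3.1, 3.4, 2.9, 3.6, 3.2, 4.1, 3.0, 3.3, 3.5, 2.8]

mean = statistics.mean(fill_times_hours)
sd = statistics.stdev(fill_times_hours)   # sample standard deviation
print(f"fill time: mean={mean:.2f} h, sd={sd:.2f} h")
```

In practice, one would also inspect the shape of the data (normal, lognormal, and so on) before choosing the sampling distribution for the model.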
Simulation is a complex and time-consuming process. Management must be patient and allocate time for design because reworking and redesigning the model after it is already in development can add significant time to the development process.
Table 1. Static vs. dynamic models
The design process should be seen as a cross-functional exercise: all key stakeholders should provide inputs to the model, since they will be affected by it. This collaborative approach to development helps ensure buy-in for the model and that the model will accurately reflect the needs of the affected groups.
Since simulation is a way to mimic reality, the first step in creating a simulation model is to define that reality. This can typically be done through process mapping; process maps focus on the flow of products, people, and materials through the operation. For each step of this flow, determine model inputs and decision criteria that affect product routing, resource usage, yield rates, and other performance parameters. Keep in mind that during this mapping exercise, companies often identify a series of improvements to their operation that can generate significant benefits; the designed model may need to account for these ongoing improvements to the process map.
When mapping the process, distinguish between physical movement and logical process transitions. For example, a product may "move" from a labeling operation to an inspection step. However, if both steps take place in the same room, there may be no physical movement involved. Nevertheless, it is important to identify the logical transition, or movement, of the product from one process step to the next. Whether you consider this logical transition to be the beginning of another step or part of a physical movement will be dictated by how much of an impact that distinction has on labor and equipment resources and the level of detail you desire from the model.
The capabilities of newer simulation packages make it tempting to try to accurately reflect every detail of the operation's reality. Many of today's tools are designed visually, giving the user the freedom to create processes and move steps around with the click of a mouse. However, simulation models that should be relatively simple can become overcomplicated and burdened with insignificant details.
Figure 2. A typical output of a simulation model shows cold room capacity.
"Scope creep" is very common when the model is demonstrated to different groups. With each demonstration comes a request to add yet another analysis or feature. This results in a time-consuming development effort and introduces difficulties in the operation of the model once it is complete. A complicated model is more susceptible to errors, slower to run, and usually requires more detailed inputs. This means that changes to inputs, either as part of standard maintenance (such as process changes, production mix, and volume changes) or as part of scenario analysis and planning, become more frequent and difficult to make. The result is a model that is less flexible and more difficult to use and may not be an effective planning tool overall. Additionally, a complicated model may be more difficult to verify since troubleshooting model inaccuracies can be challenging.
It is important to keep the model as simple as possible. With input from all key stakeholders, review the process flows and create a simplified version of the process that focuses on the most critical resources and operations. Avoid the opposite extreme of oversimplification by obtaining the concurrence of all key stakeholders.
As the simplified flow is developed, there will be many points where a design team will need to make decisions about handling certain scenarios. In a packaging area, for example, if Line A is occasionally used as a backup for Line B but neither is a bottleneck, one might simplify the model by defining the process as having a single available piece of equipment (with slightly more capacity) instead of two separate pieces. Although not completely realistic, this design choice should have minimal impact on the performance analysis. If an operation is not on the process's critical path and does not affect relevant resources, you might ignore it. Sometimes, you may group several operations into one step if they share similar equipment and labor resources.
Needless to say, potential errors must be carefully reviewed when making such assumptions. A piece of equipment that is not considered a bottleneck may turn out to be one (the simulation model can actually help in identifying these), or a noncritical operation may be keeping resources away from a critical one. Use care when deciding what should and shouldn't be modeled. In many cases, a small first-pass static model can be a good starting point for the more complex simulation model.
Similar to simplifying the flow, one needs to consider the amount of detail to be included within each step of the model. Most simulation packages allow a significant amount of detail for each operation. One example is the definition of the production demand for a modeled operation. You can define specific dates for the schedule (for example, "start lot A on Jan-5, lot B on Jan-9") or define demand on an interval-based schedule ("start one lot every 4 days"). Other examples are the size attributes that are assigned to production lots. Each lot can be specified in detail ("lot A is 50,000 vials of product AAA") or have attributes assigned by the model using a distribution based on the production mix ("on average, lots of AAA will be 50,000 vials with a random distribution").
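The two demand-definition styles described above can be sketched side by side. The lot names, dates, and size distribution here are hypothetical:

```python
import random

# Two ways to feed demand into a model: an explicit dated schedule
# versus interval-based releases with lot sizes drawn from a
# distribution. All figures are illustrative assumptions.

# Explicit: (lot name, start day, vials)
explicit_schedule = [("lot A", 5, 50_000), ("lot B", 9, 48_000)]

def interval_schedule(n_lots, interval_days, mean_vials, sd_vials, seed=7):
    """Release one lot every `interval_days`; size is sampled, not fixed."""
    rng = random.Random(seed)
    return [(f"lot {i + 1}", i * interval_days,
             int(rng.gauss(mean_vials, sd_vials)))
            for i in range(n_lots)]

for name, day, vials in interval_schedule(3, 4, 50_000, 5_000):
    print(name, day, vials)
```

The explicit form tracks the real forecast lot by lot; the sampled form trades that precision for simpler data maintenance.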
While the use of precise data allows for more detailed planning and analysis based on the real production forecast, it also requires more data processing and consideration of factors that could otherwise be ignored or simplified (such as holidays, shift structures, and rotating schedules). The pros and cons must be considered against the objectives of the model to determine the right approach.
As with any other model, once the simulation model is created, it should be put through a rigorous verification process to guarantee its performance and accuracy. One good verification technique is to use historical performance data to test the model. After plugging in previous production plans, the user can then compare model results with actual production performance.
A simulation model is not automatically robust: test it under conditions that exceed the normal parameter range (for instance, additional or fewer resources and higher or lower production volumes).
Since a simulation model considers variability that can randomly sway outcomes, a successful model should not be expected to reflect real-life performance case-by-case. Rather, it should show an accurate trend or average performance. For example, the cycle time for a specific lot may be different in simulation than on the floor due to built-in variability, but the average cycle time for several lots should be more accurate.
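An average-level verification check of this kind can be sketched as follows; both cycle-time series are fabricated for illustration:

```python
import statistics

# Illustrative verification check: individual lots need not match
# history, but the simulated average cycle time should fall within a
# tolerance of the actual average. All numbers are made up.

actual_ct = [5.2, 4.8, 6.1, 5.5, 4.9, 5.7]       # days, from the floor
simulated_ct = [5.0, 5.9, 4.7, 5.4, 6.0, 5.1]    # days, from the model

def averages_agree(actual, simulated, tolerance=0.25):
    """Compare means, not lot-by-lot values, per the logic above."""
    return abs(statistics.mean(actual) - statistics.mean(simulated)) <= tolerance

print(averages_agree(actual_ct, simulated_ct))  # prints True
```

The tolerance is a judgment call: it should reflect how much error the planning decisions built on the model can absorb.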
One of the key inputs for such models is not only average labor and equipment processing times for a given product, but the variability of those times. As in real life, the average is rarely, if ever, achieved. A model that accurately accounts for this reality can provide a better picture of the production needs.
Over time, the modeled operation will change, and a formal process should be put in place to update the model. A valuable technique to make this manageable is to assign a model owner in charge of tracking all desired changes. These changes are reviewed semiannually or annually in the context of future operational strategy, and a short design process is initiated to clearly define the scope of the updated model.
Additionally, process data should be collected on an ongoing basis in order to maintain model integrity. Systems should be put in place to capture actual processing characteristics (averages and standard deviations) of key model steps. These figures should be periodically reviewed, and the model should be updated accordingly.
Simulation is a powerful tool that can offer many benefits if done right and used appropriately. Like most powerful tools, this one also needs to be handled with care. Knowing what the model is expected to achieve and correctly identifying the relevant variables and their behavior is the key to ensuring meaningful results.