Adopting a distributed process control and data management system in a cell culture and fermentation lab can deliver maximum gain with minimum disruption.
The Process Science group at a Northern California biopharmaceutical company recently changed its methods of data management and process control from manual to automated. Thanks to the automation now in place, the company's Lab Director reports, "We have realized a 30% savings in time, resulting in substantial development cost reductions."
Meg Kay, Jova Solutions
The actual installation of the automation system throughout the company's bacterial fermentation lab took a single day. As it happened, the installation could be scheduled between experiments; had experiments been running, the reactors could have stayed under manual control for that time. At the end of that day, the reactors were up and running again, under the eye of a networked, automated data management and control system.
The staff attended two days of training on-site, one day geared to researchers and the other to technicians. Over time, various customizations and new instruments were added, nearly all without shutting down the system or interrupting running experiments.
The bacterial fermentation lab did not need to have any of its instrumentation replaced in order to install the new automated system. All the existing instruments were brought together under one unified interface. All sources of data were managed together, all available for display, calculations, and control. New instruments were added later, and they were integrated with the rest.
We want to tell the story of this change because many labs are contemplating automating but are fearful of disruption. This case history shows that the transition can be quick and nearly painless, and that it brings huge advantages.
Biopharmaceutical companies struggle to comply with FDA regulations, good laboratory practices (GLP), current good manufacturing practices (cGMP), and process analytical technology (PAT) standards during process development, scaleup, and production. Manual methods make this difficult, because changes can't be recorded and tracked, procedures can't be repeated exactly, and systems are less secure. The system security and tracking commonly found in FDA 21 CFR Part 11-compliant manufacturing facilities would be valuable in process development, but the high cost and inflexibility of the systems usually selected for manufacturing make them impractical.
However, a good software solution enables FDA-compliant systems for labs and pilot plants, with full security and tracking features, without sacrificing the flexibility and openness required during process development. The tracking is transparent to users of the system, and the security can be configured by each facility to be as free or as rigorous as required. When new instrumentation is purchased, it can be integrated into the same interface, as illustrated in Figure 1.
Figure 1. Automated process control and data management allows easy expansion.
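As a concrete illustration of what such tracking involves, here is a minimal sketch, in Python, of an append-only, Part 11-style audit record: who changed what, when, and why. The AuditTrail class and its field names are hypothetical, invented for illustration, and are not the product's actual interface.

```python
import getpass
import json
import time

class AuditTrail:
    """Minimal sketch of an append-only, Part 11-style change log.
    Every change records who, when, what, and old/new values."""

    def __init__(self, path):
        self.path = path  # append-only log file

    def record(self, parameter, old_value, new_value, reason):
        entry = {
            "timestamp": time.strftime("%Y-%m-%dT%H:%M:%S"),
            "user": getpass.getuser(),    # identity of the operator
            "parameter": parameter,       # e.g. "Reactor3.pH.setpoint"
            "old": old_value,
            "new": new_value,
            "reason": reason,             # justification for the change
        }
        with open(self.path, "a") as log:  # append, never overwrite
            log.write(json.dumps(entry) + "\n")

# Example: a setpoint change is logged transparently to the user.
trail = AuditTrail("audit.log")
trail.record("Reactor3.pH.setpoint", 7.0, 6.8, "optimize growth phase")
```

Because the log is written as a side effect of the change itself, the record-keeping costs the scientist nothing, which is what makes tracking "transparent."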
The most noticeable benefit of incorporating a comprehensive system is that laboratory scientists are freed from the mundane manual methods of data collection, analysis, and control, and can focus on essential research and development. To make things even easier, having a unified interface for all sources of data means the people in the labs don't have to learn all the different systems. A Senior Process Development Scientist at a large West Coast biotech company pointed out the advantage of this: "Having a common data platform saves time and money in training and in the overhead of integrating different data sources and training on different systems."
As pressure on biopharmaceutical companies to accelerate the commercialization of life-saving biotherapeutics and comply with government regulation increases, incorporating advanced process control and data management is essential. Biopharmaceutical companies are facing major challenges and are "...compelled to incorporate tools that can improve their productivity and reduce R&D expenditures," according to analyst Raghavendra Chitta of Frost & Sullivan in a July 2005 report.
One senior process development scientist at a large West Coast biotech company made the benefits of automation clear when he said that this automation software "...gives us profit maximization... We get products to market faster... if we're one month faster to market, that is significant added value—as in more revenue."
In 1990 the Process Science group at this company had a typical fermentation lab, with an eclectic collection of instrumentation. Scientists collected data and managed recipe development for new drugs in the tried-and-true, old-fashioned way. To monitor the development process, they walked around the lab recording data on a clipboard from each instrument, and then manually entered all the data into a spreadsheet. Based on their analysis of this aggregated data, they returned to each controller and adjusted the process by hand, in order to optimize the recipe for maximum effectiveness and best yield.
Figure 2. Fermentation lab
A few years ago, the company acquired the pharmaceutical assets of another company. With this acquisition they got a mixed set of expensive instrumentation. Adding these instruments to the lab resulted in a wide variety of equipment of assorted age and from various vendors. They had reactor vessels connected to either B. Braun Biostats or New Brunswick Scientific BioFlos. They had probes, pumps, valves, meters, gas mixers, and scales. Just about every make and model was represented. Each vendor's instrument had its own proprietary interface and few of the instruments talked to the others. Each biocontroller operated in isolation. The data aggregation, integration, and analysis became even more cumbersome.
This scenario of manual methods for instrument monitoring, control, and data management is still the status quo in many R&D and process development labs and pilot plants today. The methods are "tried and true," and many businesses depend on established, "trusted" procedures rather than risk introducing change. On the other hand, plenty of lab directors would welcome automation, as would the scientists and technicians who do the daily tasks of managing cell culture or fermentation experiments. Often, they believe that manual process control and data collection are the only alternatives available, given the equipment they have to use. Most comprehensive automation systems are large-scale, expensive systems with required hardware (such as PLCs) and fully customized software. Such systems are just not practical in the process development lab or pilot plant, where automation rightfully takes a back seat to experimentation, innovation, and cost savings.
This status quo of manual process control and data management is partly dictated by the instrumentation. Instrument manufacturers focus on excellence in function, accuracy, and speed of their devices. They are typically less concerned with acting as part of a larger whole, sharing and communicating with other instruments in "the big picture." The result is islands of data: isolated measurements and analyses, and isolated subsystems like scattered puzzle pieces. Instrument manufacturers have specific expertise, and concentrate more on perfecting their own island, and less on building bridges. A particular instrument could be a piece in many different puzzles, serving a variety of larger processes. It is a challenge to be a team player in those many different systems.
The Lab Director of Process Science at the company recognized that the costs of managing the isolated islands of data and trying to bridge the gaps between them were hurting his biopharmaceutical business. He foresaw opportunity costs growing steeply, because his products and processes were becoming increasingly complex, patent competition more intense, and time-to-market more critical. For example, Table 1 shows three examples of drugs and their average daily revenue. For each drug, there is a large lost-opportunity cost for every additional day spent in development.
Table 1. Opportunity Costs
The Process Science Lab Director was not going to replace or upgrade all the instrumentation in order to achieve a unified system. After all, that would also be a waste of good equipment. His strategy was to bring old devices, new devices, and future unknown devices under one umbrella.
The Process Science group had been taking small steps toward automation, contracting for custom software that did crude control and data collection for individual reactors, and trying software applications supplied by the biocontroller vendors. The limitations of these led to the decision to put a comprehensive, automated data management and process control software system into place.
The first step was to define the lab's requirements and select a solution. Those discussions lasted several weeks and resulted in the selection of a distributed, hardware-independent process automation system. A comparison of manual and automated methods appears in Table 2.
Table 2. Comparing manual and automated controls
Our automation team worked with the Lab Director to specify the requirements for his Process Science labs. The life science industries lag behind electronics and other manufacturing industries in adopting automated process control and data management systems. Biopharmaceutical firms are aware that huge profit opportunities exist for those who can get higher-quality, higher-yield products to market faster. Automation of process development at these facilities, from a small academic lab to a biotechnology giant, must be much more flexible and affordable than traditional manufacturing automation systems.
During those weeks, the Process Science group defined their requirements and evaluated different solutions. They listed the instruments they would be using, specified some of their control methods and analyses, and talked with their information technology (IT) department about network and backup plans.
The Process Science group selected an instrument-independent, distributed process control and data management software system, specifically designed to increase productivity and accelerate product development in cell culture and fermentation. By tying together existing instrumentation with comprehensive software, they brought the efficiency and accuracy of automation to their labs and allowed the Lab Director to save on his capital equipment budget by making full use of the legacy and acquired instruments.
The automation system this company selected is very cost-effective. Most systems with comparable features and benefits would be more expensive, and would require specific instrumentation: either a particular brand of biocontroller or expensive PLCs. Such systems require months of expensive customization to get a facility up and running. Instead, this software solution offers a full range of capabilities yet is hardware-independent, cooperating with biocontrollers, PLCs, and other instruments as desired. It is quick to install and configure, and it is specifically designed for fermentation and cell culture.
There is cheaper software available from individual instrument vendors. While this software typically works well with the specific instrument for which it was designed, it often lacks the robust features, networked architecture, open integration, and flexibility needed to provide automated process control and data management for the entire lab.
The software was installed in their bacterial fermentation labs on standard Windows desktop PCs, plus one Windows server-class computer, all on a standard local area network. The different brands of biocontrollers were connected to the networked computers via Ethernet or serial ports. The other instruments were connected in various ways: Masterflex and Watson-Marlow pumps via USB data acquisition cards; Sartorius mass flow controllers integrated with the Braun Biostat B reactor controllers; Mettler-Toledo and Ohaus balances to serial ports; and dissolved oxygen and optical density probes through FieldPoint modules.
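A mixed topology like this one can be pictured as a single declarative device registry. The Python sketch below is a hypothetical illustration of that idea; the device names, ports, and addresses are invented, and the product's actual configuration format may look quite different.

```python
# Hypothetical device map reflecting the lab's mixed connection types.
# Names, ports, and addresses are illustrative, not the product's format.
DEVICES = {
    "biostat_b_1":  {"kind": "biocontroller", "link": "serial",     "port": "COM3"},
    "bioflo_2":     {"kind": "biocontroller", "link": "ethernet",   "host": "10.0.1.24"},
    "masterflex_1": {"kind": "pump",          "link": "usb_daq",    "channel": 0},
    "mt_balance_1": {"kind": "balance",       "link": "serial",     "port": "COM7"},
    "do_probe_3":   {"kind": "probe",         "link": "fieldpoint", "module": 2, "channel": 5},
}

for name, cfg in DEVICES.items():
    # One registry, many physical transports: the software layer,
    # not the instrument vendor, decides how each device is reached.
    print(f"{name}: {cfg['kind']} via {cfg['link']}")
```

The point of such a registry is that adding an instrument means adding an entry, not rewiring the system.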
The total installation and configuration took one day, followed by two days of on-site training. Since then, further configuration and extensions, such as new instruments, control methods, and displays, have been added over the years without requiring system shutdowns.
After using the software for some time, the Lab Director reported that the automation system "...provides flexibility, customization, data analysis, and data review that are very accessible and very well done." He described the earlier manual methods as "inefficient and error prone." The system was soon extended into the cell culture lab. This added a dozen Applikon ADI 1030 biocontrollers to the existing array of instrumentation under its control. Process scientists at the company today continue to extend the automation system as needed, modifying and adding new recipes and experiments; measurements and calculations; control methods; and reports and displays.
The benefits of a comprehensive, automated process control and data management system make a long list (Table 2), but they all lead directly to the main points of greater efficiency and shorter time-to-market. Almost all modern instruments are designed with the capability to communicate and cooperate, but what has been lacking is the orchestra conductor that manages and controls all the individual instruments as a unified whole. The software working at the process level bridges the gaps between disparate subsystems. It is independent of the instrumentation and able to communicate with instruments of all brands. It supports all types of physical connections: Ethernet, serial, analog & digital I/O, OPC, and other standard protocols.
Another benefit that comes from device-independence is that one method of measurement can be replaced with a new one, with no interruption to the history of the process value. Here, the phone companies set a good example. A customer can buy a new phone (instrument), but his phone number doesn't change; his identity stays the same. Likewise, when an old mass spectrometer is replaced with a new kind of gas analyzer, the nitrogen measurement has a seamless history across the two.
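One way to picture this device-independence is a logical process value whose name and history belong to the measurement, not to the instrument behind it. The following Python sketch illustrates the concept under that assumption; it is not the product's implementation, and the class and channel names are invented.

```python
class ProcessValue:
    """A logical measurement whose identity outlives any one instrument."""

    def __init__(self, name):
        self.name = name      # the stable "phone number"
        self.history = []     # (timestamp, value) pairs, never reset
        self.source = None    # the current physical instrument

    def rebind(self, instrument):
        # Swap the mass spectrometer for a new gas analyzer:
        # only the source changes; name and history are untouched.
        self.source = instrument

    def sample(self, timestamp):
        self.history.append((timestamp, self.source.read()))

class MassSpec:
    def read(self):
        return 78.1  # percent N2, dummy value

class GasAnalyzer:
    def read(self):
        return 78.3  # percent N2, dummy value

n2 = ProcessValue("Reactor1.offgas.N2")
n2.source = MassSpec()
n2.sample(0)
n2.rebind(GasAnalyzer())  # instrument replaced mid-campaign
n2.sample(1)
print(n2.history)          # seamless record across both instruments
```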
Even with the wide spectrum of open instrument communication, there might still be some external analyses from which data is collected by hand. This "off-line" data can be entered into the unified data management system, where it is treated as an equal citizen: it is available for display, analysis, and feedback control, along with all the automatically collected instrument data. Likewise, all online analyses and calculated results are equally available for display, further analysis, and feedback control. Figure 3 shows a Trends display that graphs process data acquired from online instruments, off-line measurements, and calculated results all together.
Figure 3. Unified data from instruments, offline measurements, and results are all visible on the computer screen.
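In the same spirit, manually collected assay results can be thought of as just another write into the shared store. The sketch below assumes a hypothetical record_point() helper; the channel names and the "origin" tag are illustrative only.

```python
import time

# Hypothetical unified store: one table for every data source.
# "origin" records how a point arrived, not where it can be used.
store = []

def record_point(channel, value, origin):
    store.append({"t": time.time(), "channel": channel,
                  "value": value, "origin": origin})

record_point("Reactor2.DO", 42.5, origin="online")       # from a probe
record_point("Reactor2.glucose", 1.8, origin="offline")  # hand-entered assay
record_point("Reactor2.OUR", 3.1, origin="calculated")   # derived result

# All three kinds are equally available for trends, analysis, and control.
for point in store:
    print(point["channel"], point["value"], point["origin"])
```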
In addition to communication with each different data island, there need to be bridges between the islands. This is accomplished by a distributed architecture, linking all computers, instruments, data servers and archives. This requires a robust network to be in place. A company's IT group has to be an active player here, and should be involved early in the planning. Figure 4 shows examples of connections in this distributed system.
Figure 4. A networked, distributed system links computers, instruments, data servers, and archives across the facility.
The advantages of a distributed system are many. First, all aspects of the process control and data management can be available from any site on the network. A technician can monitor many reactors from one station, or respond to alarms remotely. A researcher can check on and adjust experiments from her office, create reports, and be automatically notified by email or pager when problems arise. Second, instrument resources can be shared. One expensive gas analyzer can serve many reactors. Third, know-how and results can be shared. A recipe or control strategy developed by one scientist can be used by others for scale-up or for other runs. Process data and results can be shared and compared on a live system, even from a meeting room. The Operations Manager of a San Francisco Bay Area biotechnology firm said, "I have immediate access to all the reactors I manage in three separate facilities, and this has streamlined our post-experiment data analysis."
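To illustrate the remote-notification idea, the fragment below sketches an alarm check that emails a researcher when a process value drifts out of bounds. The smtplib usage is standard Python, but the thresholds, addresses, SMTP host, and read_value() stub are all assumptions made for the example.

```python
import smtplib
from email.message import EmailMessage

def read_value(channel):
    # Placeholder for a networked read from the automation system.
    return 6.8

def check_alarm(channel, low, high, notify, smtp_host="mail.example.com"):
    value = read_value(channel)
    if low <= value <= high:
        print(f"{channel} = {value}: within limits")
        return
    # Out of bounds: notify the responsible researcher by email.
    msg = EmailMessage()
    msg["Subject"] = f"ALARM: {channel} = {value}"
    msg["From"] = "automation@example.com"
    msg["To"] = notify
    msg.set_content(f"{channel} is outside the range [{low}, {high}].")
    with smtplib.SMTP(smtp_host) as server:  # needs a reachable SMTP host
        server.send_message(msg)

check_alarm("Reactor5.pH", 6.5, 7.2, notify="scientist@example.com")
```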
It is often essential to bridge the gaps between different subsystems within the bigger process picture: not only to share and communicate with the reactors and other instruments directly involved in fermentation and cell culture experiments, but also to bring the upstream, downstream, and side subsystems into the picture. When a sample from a reactor is taken aside for specialized analysis, valuable results are calculated there. Those results provide critical information about the success of the process and the quality of the product, and they should be integrated into the records and control decisions for the ongoing process. If the sampling, analysis, and results of the various subsystems can be automatically integrated, you have a more efficient process and a higher-quality product.
For example, at one facility, the same bioreactor automation system is controlling a high-pressure liquid chromatograph and recording results of chromatography runs for samples taken from the bioreactors upstream. At another, an in-situ microscope with attached camera takes images of cells in the bioreactor. The system controls the camera and microscope, displays and stores images, and automatically analyzes certain cell features in the images. The open architecture of the automation software allows these kinds of subsystems to be integrated into the larger process.
Once the entire process is linked and unified, bridges can be built to other parts of the business enterprise, such as laboratory information management systems (LIMS). Whenever data from one system must be manually entered into another, there's an opportunity for automation. Of course, it is best to focus first on building bridges across gaps in the system that must be crossed on an hourly or daily basis, then address the more infrequent data transfers.
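Such a bridge often starts as a simple scheduled export. Assuming, for illustration, that end-of-batch results can be pulled from the automation system, the sketch below writes them to a flat CSV file that a LIMS importer could consume; the file name and column layout are invented.

```python
import csv

# Hypothetical end-of-batch summary pulled from the automation system.
batch_results = [
    {"batch": "F2023-117", "channel": "final_OD600", "value": 48.2},
    {"batch": "F2023-117", "channel": "final_titer_g_L", "value": 1.9},
]

# Write a flat file for the LIMS to import, replacing manual re-entry.
with open("lims_import.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["batch", "channel", "value"])
    writer.writeheader()
    writer.writerows(batch_results)
```

Even a modest export like this removes one of the hourly or daily manual transcription steps that the paragraph above identifies as the first gaps worth closing.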
Meg Kay is director of process control products at Jova Solutions (formerly WireWorks West, Inc.), 965 Mission Street, Suite 600, San Francisco, CA 94103; 415.348.1400; fax, 415.348.1414; info@jovasolutions.com