Helium integrity testing can detect defects as small as 10 μm without compromising product cleanliness.
In the biopharmaceutical industry, flexible containers such as plastic bags or liners are often used for bulk intermediate storage, cell culture resuspension, viral inactivation, final formulation, final fill, or as bioreactors. In such applications, the bag is hermetically sealed and sterilized. Sterility of the bag must be maintained to avoid contamination of the product. Any breach of the sterile condition is considered a serious risk and often results in disposal of valuable product, sometimes after significant cost and effort have been expended in making it.
At the moment, disparities exist between the defect sizes that are readily detectable using current on-line technology and the speculated value for the critical defect size (i.e., the defect size at which sterility of a package is lost). Lampi established, and Chen and Keller independently substantiated, that the critical defect dimension for bacterial penetration of flexible bags is 11 μm or less, while Gilchrist determined that the dimension was nearly twice that: 22 μm (1–4). Blakistone later established that the critical defect size was 7 μm (5). Discrepancies in the critical defect size could be attributed to differences in the bio-test methodology, concentrations of test microbes, test times, or positive/negative pressure in the test bag. Therefore, one of the objectives of this article is to determine the critical defect dimension at which sterility breach occurs in biopharmaceutical containers. Such information would provide a foundation for avoiding inherently problematic conditions, as well as for empirically evaluating and tailoring leak-detection equipment to specific needs.
The package's integrity controls microbial ingress into the package and thereby preserves product sterility. The technologies currently employed in the biopharmaceutical packaging industry (e.g., vacuum bubble test, dye-penetration test, pressure-decay test, constrained-plate pressure-decay test) are only capable of reliably detecting leak rates in the range of 10⁻² to 10⁻⁴ cc/s, which is equivalent to detecting defect sizes on the order of 90–500 μm. Thus, industry needs a method of testing the integrity of a flexible container that detects defects corresponding to the minimum size needed to block water-borne microbes. This article describes a new test methodology, called helium integrity testing (HIT), that can detect such defects.
A defect can be defined as an unintended crack, hole, or porosity in an enveloping wall or joint that is meant to contain or exclude fluids and gases, and through which the enclosed medium can escape. The shapes of defects (e.g., cracks, fissures, pores, damage) vary widely and are generally unknown and nonuniform. Therefore, it is impossible to measure their sizes with any precision except in the case of an ideal or artificial leak, as used for calibration. There are many methods for detecting defects; a list of these, along with their sensitivities, can be found in the literature (6–11). Figure 1 summarizes the sensitivity limits of each.
Figure 1: Leak detection sensitivity chart. (ALL FIGURES ARE COURTESY OF THE AUTHORS)
The most widely practiced method in the industry involves observing gas or fluid flow through a test bag under defined conditions of temperature and pressure. Defect sizes are commonly measured by the pressure-decay method or by tracer-gas leak testing. The pressure-decay method is discussed below to illustrate some of its limitations in ensuring the integrity of test bags, and several adaptations of tracer-gas leak testing are then discussed along with their limitations.
Figure 2: Pressure-decay curve.
Pressure-decay test
The pressure-decay test consists of pressurizing the system with a high-pressure gas, usually dry air or nitrogen. The test part is isolated from the gas supply and, after a stabilizing period, its internal pressure is monitored over time. The pressure drop (Δp) is measured over the interval (Δt) (see Figure 2). If the pressure in the system drops quickly, there is a large leak present in that component or section of the system. If the system's pressure drops slowly, there is a small leak present. If the pressure remains the same, that component is leak-free. The leak rate, Q, can easily be computed considering the volume V of the component by using the following equation:

Q = (V × Δp) / Δt
Leak-detection sensitivity is related to the test time, the pressure transducer resolution, and the test bag volume. Several external factors, such as temperature variations and mechanical deformations, affect this test. The internal pressure depends on temperature, and therefore thermal fluctuations may cause changes in pressure, altering the results. Longer test times allow for a more sensitive check, but make the test very time-consuming because smaller defects may require holding periods of several hours. The higher the pressure, the faster leak determination can be made. However, operator safety concerns along with the risk of damaging the test product limit the maximum admissible pressure value.
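To make these trade-offs concrete, the following sketch evaluates the pressure-decay equation in Python for an assumed bag volume, transducer resolution, and hold time; all numeric values are illustrative assumptions, not figures from this study.

# Minimal sketch of the pressure-decay calculation, Q = V * dp / dt.
# All numeric values are illustrative assumptions, not data from the study.

def leak_rate(volume_l: float, dp_mbar: float, dt_s: float) -> float:
    """Leak rate Q in mbar·L/s (roughly cc/s at atmospheric pressure)."""
    return volume_l * dp_mbar / dt_s

volume_l = 50.0        # test-part volume in litres -- assumed
resolution_mbar = 0.1  # smallest pressure drop the transducer resolves -- assumed
test_time_s = 3600.0   # one-hour hold -- assumed

# The smallest detectable leak is bounded by the transducer resolution:
q_min = leak_rate(volume_l, resolution_mbar, test_time_s)
print(f"Minimum resolvable leak rate: {q_min:.1e} mbar·L/s")  # ~1.4e-03

Lengthening the hold time or improving transducer resolution lowers the minimum resolvable leak proportionally, which is exactly the trade-off described above.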
The difficulties increase when using this technique to check for defects in flexible bags. Calibrating the system when the test product is extremely pliant is challenging, so reliable test sensitivity is reduced for flexible bags. This issue can be addressed by using constraining plates to limit bag stretching or internal volume (11, 12). For products with a more complex shape, it may be necessary to make a tool that conforms closely to the product outline. However, it is not always possible to fit the final product into the test assembly, so products are often tested without final assembly elements such as tubing connections or filters, because tubing port connections and manifolds are sensitive to overstretching or bending caused by constraining plates. In the case of large test bags, such as 1000-L or 2000-L sizes, it may take hours to fill the test bag with pressurized gas and to empty it. Even after such long test times, one can only detect defects as small as 500 μm while smaller defects go unchecked. The inability to test the fully assembled product thus undermines assurance of finished-product integrity. It is therefore necessary to have a method that is easy to implement (i.e., not bulky or cumbersome), reliable, and that does not contaminate the interior compartment of the container undergoing testing.
Tracer gas leak testing
The term "tracer gas leak testing" describes a group of test methods to detect and measure a tracer gas flowing through a leak. These techniques differ for the tracer gas used and for the realization technology. The most commonly used tracer gases are halogen gases (e.g., CFC, HCFC, and HFC refrigerant), helium, and a mixture of nitrogen and hydrogen (95% / 5%). Helium has been used successfully as a tracer gas for a long time because of its physical properties (12, 13). It is neither toxic nor flammable and is does not react with other compounds. Helium has low viscosity and relatively low molecular mass, so it easily passes through cracks. In the same environmental conditions, it flows through orifices 2.7 times faster than air. Since its concentration in air is low (i.e., 5 ppm), it is easy to detect an increment of helium concentration. It is important to remember that background concentration in air is a limiting factor for any tracer gas detector.
There are two ways to carry out leak testing with tracer gas: internal detection of tracer gas entering from leaks (i.e., outside-in method) and external detection of tracer gas escaping from leaks of a filled unit (i.e., inside-out method). The inside-out method can be executed with atmospheric sniffing or with vacuum chamber detection, while the outside-in method is generally implemented by putting the unit to be tested in a room containing the tracer gas or, very rarely, spraying the tracer gas on the unit surface.
In the outside-in leak-testing technique, the unit to be tested is put into an enclosure containing the tracer gas. The part is connected to a vacuum pump and evacuated. A tracer-gas detector (i.e., a helium mass spectrometer) is placed in the vacuum line to detect the tracer gas pulled in by the pump. The sensitivity, depending on tracer gas and test time, can reach 10⁻⁶ mbar·L/s. This method can be fully automatic, so it is not operator dependent. The gas-containment hood can be configured to prevent dispersion, which reduces working-area pollution and tracer-gas consumption, and saves money by avoiding the need for a recovery system. The drawbacks include high tracer-gas consumption: in the case of a big leak in the part under test, a large amount of tracer gas escapes and is lost. In addition, a long pumping time could be required to lower the tracer gas in the detector to a level compatible with system function, and the system is unusable during this time. Another disadvantage is that this method does not identify where the leak is; it only determines whether a leak is present. The technique only works for rigid test parts; a flexible part may collapse under vacuum, blocking any tracer gas from leaking.
Sniffing, an inside-out technique, involves moving a probe or wand over the test part; the probe detects the leak as it passes over it. The probe's speed, its distance from the part, and its sensitivity determine the accuracy of leak detection. Sniffing has the distinct advantage of being able to locate a leak on the test part, unlike the other methods described, and can sense leaks as small as 10⁻⁷ mbar·L/s, depending on the tracer gas. Sniffing is not recommended in a high-volume production environment, other than for locating leaks for repair. Disadvantages include a high chance of missing leaks due to operator dependency, fragile equipment in rugged environments, and the rejection of good parts because of the inability to quantify the leak. The minimum leak rate measurable by a sniffer is set by the concentration of the tracer gas in the working area, a value known as the background level. This level may change during the production cycle and increases because of leaking units. In the case of a big leak in the part under test, a large amount of tracer gas escapes and may remain in the working area for a long time, strongly affecting subsequent tests and causing rejection of good parts.
Vacuum chamber inside-out leak testing is the most complex system of leak detection, but it is theoretically suitable for finding very small leaks, using the proper tracer gas. This method involves placing the test bag inside a vacuum chamber connected to the integrated helium leak detection system. The vacuum chamber encapsulates the entire test bag, ensuring that any helium molecules escaping from the bag are captured and directed towards the sensor.
Depending on the vacuum chamber dimensions, the evacuation group could call for a high pumping speed, which could introduce a high level of particles into the test bag. In the case of a large leak in the part under test, a large amount of tracer gas escapes and is lost. A long pumping time could then be required to lower the tracer gas in the detector to a level compatible with system function, during which the system is unusable.
Although use of helium gas as a tracer gas has been well documented in the literature, its use for detecting defects in flexible bags used in the biopharmaceutical industry has been limited because of the operational and functional issues listed below:
1. Long down times due to inefficient process: Helium gas disperses slowly into the atmosphere, so, in the case of large leaks, its high concentration will contaminate the area for a long time, even hours, effectively making it inefficient for production purposes.
2. Raw material/equipment cost: The conventional belief that highly pressurized systems are needed for high detection sensitivity results in large quantities of helium being used and, hence, high costs. The most suitable helium detector is based on a mass spectrometer, which is an expensive and delicate apparatus requiring much care and maintenance. Loss of tracer gas during the test adds further cost.
3. Test time: Detecting smaller leaks can require long test times, which affects production cycle time and, hence, the overall cost of the product.
4. Product performance effect: The testing method can impact product cleanliness and may increase the bioburden level of the test bag.
5. Nature of flexible bags: Traditional hard vacuum leak testing is often difficult to perform on flexible wall parts. The pressures resulting from a full vacuum can damage the part.
To address the issues discussed above, ATMI developed the helium integrity test (HIT) method and apparatus, which allows fine leak testing of parts using a helium mass-spectrometer (MS) leak detector. The technology is based on the inside-out testing principle; however, it addresses the issues that previously prevented its use in leak testing of flexible bags. This method can achieve the desired results at lower part differential test pressures and with faster cycle times compared with traditional helium-accumulation methods.
The HIT apparatus includes the main components described below (see Figure 3).
Figure 3: Helium integrity test (HIT) apparatus.
Leak test method
The HIT apparatus is composed of a test chamber with a closable lid that, when closed, forms a tight seal such that no helium can enter the test chamber from outside. The test unit is a flexible two-dimensional bag with connecting ports that enable fluid communication with the helium source and vacuum pump. The test chamber is large enough to house the unit to be tested. The test chamber is connected to a vacuum pumping group equipped with the tracer-gas detector for chamber evacuation and gas detection. A second vacuum group is used to evacuate the unit under test before filling it with gas. A tracer-gas filling device (i.e., helium supply) completes the testing apparatus. The unit to be tested is put into the vacuum chamber and connected to service hoses; then the vacuum chamber and the unit are evacuated. During chamber evacuation, the test unit is pressurized with the tracer gas. After a stabilization time, the detector is linked to the vacuum line to detect the tracer-gas flow through a leak, drawn in by the pumping group. This method can be made fully automatic, so it depends little on an operator. Its sensitivity can reach flow rates below 10⁻¹⁰ cc/s. To prevent long down times, the HIT system employs an in-line pressure-sensing test as a preliminary leak test that detects gross leaks before the final automatic leak-test operation using tracer gas begins. This approach prevents large quantities of tracer gas from leaking into the atmosphere. A specialized pumping technique reduces the stress on the test unit by lowering the internal pressure of the test part along with the external (i.e., chamber) pressure.
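The sequence above can be summarized as control-flow pseudocode. The sketch below is a hypothetical simplification: the function names, objects, and pass/fail logic are invented for illustration and do not describe ATMI's actual control software.

# Hypothetical sketch of one HIT cycle as described in the text.
# All names and interfaces are invented; this is not ATMI's software.

def run_hit_cycle(bag, chamber, detector, helium_supply) -> bool:
    """Return True if the bag passes (no detectable defect)."""
    chamber.load(bag)

    # Preliminary in-line pressure-sensing test catches gross leaks
    # before helium is admitted, so a badly leaking bag never vents
    # tracer gas into the work area.
    if bag.has_gross_leak():
        return False

    # Evacuate chamber and bag together so the film sees only a small
    # differential pressure, then backfill the bag with helium while
    # the chamber remains evacuated.
    chamber.evacuate()
    bag.evacuate()
    bag.fill(helium_supply)

    detector.stabilize()
    leak_rate = detector.read_leak_rate()  # cc/s
    return leak_rate <= detector.baseline_leak_rate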
To improve test time per test unit, the test chamber was modified to include spacers that allow the simultaneous testing of four units. The spacers are held upright by connecting rods, and each spacer has enough cavities in it such that when a test bag inflates against it, it does not block the path of helium molecules flowing through the defect. The spacers constrain the test bag in the test chamber, further increasing the helium pressure in the bag and resulting in increased sensitivity of leak detection. The spacers also ensure that test bags experience minimal stress during chamber evacuation by limiting bag expansion.
Test bags
The test bags were prepared by welding two sheets made from ATMI's proprietary TK8 film. To create defective bags, one of the two sheets was modified to incorporate a defective patch: a piece of TK8 film (4 in. × 4 in.) with a 10 μm ± 1 μm hole drilled by a laser. The laser-drilled holes were validated for defect size by measuring the flow rate through the defect area. Hereafter, "defective bags" refers to test bags with a 10 μm defect, and "good bags" refers to bags with no defect.
The size of the test bag depends on the size of the two sheets used to make it. The nominal volumes of the test bags in this study were 1, 5, 10, 20, and 50 L. The helium leak rates through defective bags and good bags, measured using HIT technology, are discussed below. After the HIT tests, the test bags were further characterized for product performance, such as liquid particle count (USP <788>), to ensure that the HIT process did not affect product performance.
Microbiological challenge test method
In this study, test bags for microbial challenge were prepared by thermally welding TK8 film sheets having defects of specific sizes (2, 5, 10, 15, 20, and 50 μm). The test bags were aseptically filled with 50 mL of microbial growth medium (trypticase soy broth). The outside of each test bag was sprayed with a 0.9% saline solution containing approximately 10⁶ CFU/mL of Escherichia coli, Staphylococcus aureus, and Bacillus spizizenii and approximately 10⁵ CFU/mL of Candida albicans and Aspergillus brasiliensis. The test bags were then transferred to an incubator maintained at 30–35 °C and monitored for 15 d. The growth medium inside each test bag was periodically checked for microbial growth, which indicates microbial ingress.
Liquid particle count (LPC) test
A test bag was filled with ultrapure water (deionized water filtered through a 0.05 μm filter) and gently shaken to ensure that all the bag surfaces came into contact with the solution. A sample of the solution from the test bag was then passed through particle-measuring equipment (Particle Measuring Systems). The instrument reports the number of particles per mL of solution with particle sizes greater than 10 μm and greater than 25 μm.
Microbiological challenge test
While immersion biotesting has long been used to challenge packages, particularly cans, for pinholes and channel leaks, immersion does not reproduce the conditions a package generally encounters during use. Hence, package integrity-evaluation methods that employ bioaerosols to simulate the conditions the package must tolerate during storage and distribution are more relevant and are gaining prominence. Test bags with defects between 2 and 50 μm were exposed to a microbial environment and served as test samples. Test bags not exposed to bacteria served as negative controls for the aseptic filling. For each organism, a bag containing no defect was injected with 0.1 mL of a 10³ CFU/mL solution of the organism to serve as a positive control. The positive controls exhibited growth after 1 d, thus validating the test conditions for detecting microbial ingress.
As expected, under the conditions of the test, microbial ingress into a package took longer as the defect size got smaller. Test bags with defect sizes of 50 and 20 μm took the same time (i.e., 5 d) to allow microbial growth, meaning that a clear channel for microbial ingress was already established at 20 μm. However, when the defect size was 15 μm, it took 14 d to show any microbial growth (see Table I). The significant slowdown in microbial ingress at 15 μm and the complete cessation of ingress at defect sizes of 10 μm or smaller are interesting, considering that microbial organisms are much smaller than 10 μm and should infiltrate through 10 μm defects just as easily as they did through 20 μm defects. The logical explanation for this observation lies in the threshold pressure (1). To initiate microbial ingress through a defect, the pressure across the defect must overcome the force of the liquid's surface tension (the threshold pressure) and initiate liquid flow through the defect, thus providing a channel for microbes to travel into the bag. The magnitude of the threshold pressure required to initiate liquid flow depends on the location of the defect, owing to differences in static head pressure. As defect size decreases, the threshold pressure for a given liquid increases. Thus, in test bags with 10 μm defects, the available pressure is lower than the threshold pressure imposed by the liquid's surface tension, preventing the formation of a channel through which microbes can travel. One other reason offered in the literature for this behavior is the formation of a biofilm on the film surface, which prevents microbial ingress.
Table I: Microbial challenge test data.
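A back-of-the-envelope estimate illustrates the surface-tension argument above. Assuming a circular defect, a fully wetting liquid, and water-like surface tension (the study's growth medium and film chemistry would shift the absolute numbers), the Young-Laplace relation gives the pressure needed to push liquid through a defect of diameter d:

# Capillary entry pressure dP = 4*sigma/d for a circular defect.
# Surface tension and wetting are assumptions (pure water, full wetting);
# they are not measured values from this study.

SIGMA_WATER = 0.072  # N/m, surface tension of water near room temperature

def entry_pressure_mbar(diameter_um: float, sigma: float = SIGMA_WATER) -> float:
    d_m = diameter_um * 1e-6            # convert micrometres to metres
    return (4.0 * sigma / d_m) / 100.0  # Pa -> mbar

for d in (50, 20, 15, 10, 5, 2):
    print(f"{d:>2} um defect: ~{entry_pressure_mbar(d):.0f} mbar to start liquid flow")
# The required pressure doubles each time the diameter halves, consistent
# with ingress persisting at 20 um but stopping at 10 um and below.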
Although additional studies may be required to confirm the root cause for the lack of penetration of microbes through a 10 μm defect, these results are in agreement with researchers in the food packaging industry such as Lampi and Chen, who have shown the critical dimension for microbial ingress to be about 10 μm (2, 3). The differences in the critical defect size for microbial ingress between different studies could be due to differences in the test methods employed and the contact materials that affect the surface tension of the liquid. Based on these results, it is clear that defects larger than 10 μm cause sterility breach. Therefore, integrity testing for on-line package testing must detect 10 μm defects to ensure product sterility.
Leak detection data
In an ideal world, the helium background in the test would be nonexistent, and a good bag would not leak helium at all, while a defective bag would leak a measurable amount of helium that would be detected, resulting in a distinct separation between the helium leak rates of defective bags and those of good bags. This ideal behavior would provide a high degree of resolution, allowing existing leak-detection methods to be used to detect bag defects.
However, the walls of flexible bags are often made of polymeric materials, which are intrinsically permeable to gases. Helium gas has a smaller molecular size and permeates faster through polymeric materials than air or nitrogen. Thus, even a good test bag can leak a significant amount of helium by diffusion through the bag walls. This diffused helium creates high helium background levels in the test chamber, thus making it difficult to quantify actual helium leaks through the defects in the test bag. The high helium background essentially masks the helium leaking from defects, limiting the lowest leak rate that can be reliably measured. HIT testing ensures that helium flowing through the defects is maximized, while the background helium concentration is minimized. The test time was kept as short as possible to prevent elevation in the helium background as the test progressed.
Data reproducibility
Often, even when a constant amount of helium flows through a defect, the ability of the pumping system to carry helium molecules to the detector can vary. In addition, the helium background varies as the equipment operates over a period of 8 h, which may cause variation in the measured leak rate for the same defect size. To minimize this variation, the volume of the test chamber external to the test bag was minimized and the pumping efficiency of the vacuum pumps was optimized. A single defective bag of each bag size (i.e., 1–50 L) was tested five times to verify the reproducibility of the leak rate through a defect (see Figure 4). The data show a standard deviation of less than 4% in leak rate for all bag sizes tested, except for 1-L bags, which showed slightly greater variation at 10%. The higher variation in 1-L bags was due to the relatively high ratio of detector gas to test-bag volume, which makes the measurement more sensitive to input parameters such as tracer-gas fill volume and helium background. The box plot of the measured leak rates also indicates that the average leak rate for all defective bags, irrespective of bag size, is above a common threshold.
Figure 4: Data reproducibility of HIT system.
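The reproducibility figures correspond to a simple coefficient-of-variation calculation over the five repeat measurements; the readings below are invented stand-ins used only to show the computation.

# Coefficient of variation (CV) across repeat leak measurements of one bag.
# The five readings are invented stand-ins; the study reports CV < 4%
# for 5-50 L bags and ~10% for 1-L bags.
import statistics

readings_cc_s = [8.9e-5, 9.1e-5, 8.8e-5, 9.0e-5, 8.7e-5]  # illustrative only
cv = statistics.stdev(readings_cc_s) / statistics.mean(readings_cc_s)
print(f"CV = {cv:.1%}")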
Effect of test bag position
The test chamber includes a spacer, which splits the test-chamber space into four compartments that can each accommodate one test bag. Of the four compartments, the two closer to the test-chamber wall are designated positions 1 and 4, and the remaining two are designated positions 2 and 3 (see Figure 5).
Figure 5: Test chamber with spacer, illustrating test bag positioning.
During the leak-testing process, as the bag walls push against the container wall, some defects may be blocked by the container wall, reducing or eliminating the helium flow through the defects and, in turn, preventing detection. To minimize this risk, spacer bars were placed between the walls of the test bag and the container wall to provide a path for the tracer gas to escape. Experimental studies were performed to ensure that there was minimal variability in the amount of helium flowing through the defects due to the position of the bag in the chamber. Experiments were conducted with test bags placed in positions 1 through 4. For each position, five distinct sets of defective test bags per bag size were tested for helium leaks. The box plots of leak rates from bags at different positions indicate minimal variation, with the measured leak rates above a common threshold (see Figure 6). A t-test comparing the mean leak rates indicated, with greater than 86% confidence, that there was no significant difference among the mean helium leak rates. Thus, irrespective of the bag's position in the test chamber, helium leaking through a defective test bag reaches the detector without appreciable loss.
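The position comparison is a standard two-sample t-test; a sketch of the computation with invented leak-rate data follows (scipy.stats.ttest_ind is the usual routine for this).

# Two-sample t-test comparing mean leak rates between two bag positions.
# The data arrays are invented stand-ins for the five bags per position.
from scipy import stats

position_1 = [8.9e-5, 9.2e-5, 8.7e-5, 9.0e-5, 8.8e-5]  # cc/s, illustrative
position_2 = [9.0e-5, 8.8e-5, 9.1e-5, 8.9e-5, 8.7e-5]  # cc/s, illustrative

t_stat, p_value = stats.ttest_ind(position_1, position_2)
print(f"t = {t_stat:.2f}, p = {p_value:.2f}")
# A large p-value indicates no significant effect of position on leak rate.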
Figure 6: Effect of test bag position in the test chamber on the helium leak rate.

Validation of HIT technology for 10 μm defect detection
A series of experiments was performed to validate that the HIT technology could detect 10 μm defects. In the first set of experiments, good test bags of all sizes (i.e., 1–50 L) were run through the tester, the helium leak rates were recorded, and a baseline leak rate was established. The defective test bags were then tested, and their leak rates were compared with the baseline leak rate. The testing sequence (defective versus good bags, smallest versus largest bags) was randomized to avoid trending issues. To detect 10 μm defects, the helium leak rate through defective bags must be substantially higher than the maximum helium leak rate through good bags.
The good test bags averaged a helium leak rate of 1.17 × 10⁻⁵ cc/s, while the defective bags averaged 8.94 × 10⁻⁵ cc/s. The variation in leak rates through good bags was largely due to variation in the helium background, while the variation in defective-bag leak rates was largely due to variation in the laser-drilled hole sizes. The maximum leak rate, also referred to as the baseline leak rate, is the sum of the average leak rate of good bags and four times the associated standard deviation. The baseline leak rate varied with bag size, but for further evaluation it was set at the maximum leak rate possible for good test bags irrespective of bag size: 3.4 × 10⁻⁵ cc/s (see Figure 7). If the helium leak rate was higher than the baseline leak rate, there was a defect in the test bag, and the bag would be rejected. The helium leak rate for every defective bag tested was higher than the baseline leak rate, allowing easy distinction between the good and the defective bags (see Figure 7).
Figure 7: Box plot of helium leak rates (cc/sec) of good bags and defective bags.
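The resulting accept/reject rule reduces to simple thresholding against mean(good) + 4·sigma(good). The sketch below uses the averages reported above; the implied standard deviation is derived from them rather than reported directly.

# Accept/reject rule from the validation study: a bag fails if its helium
# leak rate exceeds the baseline, defined as mean(good) + 4 * sigma(good).
MEAN_GOOD = 1.17e-5  # cc/s, average leak rate of good bags (reported)
BASELINE = 3.4e-5    # cc/s, maximum baseline across bag sizes (reported)
# Implied sigma = (BASELINE - MEAN_GOOD) / 4 ~= 5.6e-6 cc/s (derived)

def classify(leak_rate_cc_s: float) -> str:
    return "REJECT: defective" if leak_rate_cc_s > BASELINE else "ACCEPT: good"

print(classify(8.94e-5))  # average defective-bag rate -> REJECT
print(classify(1.17e-5))  # average good-bag rate -> ACCEPT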
A statistical analysis (t-test for means) of leak-rate distributions of defective bags versus good bags indicated clear separation (p < 0.1) of leak rates (see Figure 8).
Figure 8: Distribution curve for helium leak rate through good bags. The curve shows a high probability that good bags will have a leak rate lower than the baseline leak rate.
Liquid particle count
While methods using tracer gas have come close to detecting 10 μm defects, they have failed to maintain the cleanliness of the test unit. Such techniques often involve connecting the test unit to the tracer-gas supply and evacuation pumps, which can be sources of particle generation. In addition, when the test bags are filled with tracer gas, they are pressurized to a great extent, stretching the film and causing particle shedding. The intent of this test was to show that no significant increase in the liquid particulate level occurred due to the use of this technology. The ratio of liquid particles generated to nominal bag volume was highest for the 1-L bag because of its higher surface-area-to-volume ratio, making 1-L bags the most sensitive to a change in particle concentration as a result of the leak-testing process. Therefore, only 1-L bags were tested to verify particulate generation; if 1-L bags do not show a significant rise in particle count, larger bags are not expected to either. A sample of solution from each of the 1-L test bags was run through the particle-measuring equipment, and the data are reported in Table II. The data did not show a significant increase in the liquid particle count, indicating that HIT technology does not cause particulate generation and is safe for point-of-use applications.
Table II: Liquid particle count results (LPC/mL) in 1-L bags.
Existing integrity-test methods for flexible containers can detect defects in the range of 90–500 μm but are inadequate for ensuring product sterility. A microbial ingress study conducted on flexible sterile containers revealed that defects as small as 15 μm can compromise sterility, while defects of 10 μm or smaller did not. A novel helium integrity test method developed by ATMI is capable of detecting 10 μm defects during in-line package testing without compromising the cleanliness of the product. Test bags with 10 μm defects had helium leak rates higher than the baseline leak rate, allowing clear distinction between good bags and defective bags. Cleanliness testing after integrity testing showed that product performance is not affected by the test process.
Vishwas Pethe* is a research engineer, Mike Dove is a manufacturing engineer, and Alex Terentiev is US R&D director, all at ATMI Life Sciences, vpethe@atmi.com
1. S.W. Keller et al., J. Food Prot. 59 (7), 768–771 (1996).
2. R.A. Lampi et al., J. Food Process Eng. 4 (1), 1–18 (1980).
3. C. Chen et al., J. Food Prot. 54 (8), 643–647, (1991).
4. J. Gilchrist et al., J. Food Prot. 52 (6), 412–415 (1989).
5. B.A. Blakistone, et al., J. Food Prot. 59 (7), 764–767 (1996).
6. A. Roth, Vacuum Technology, (Elsevier, NY, 1990).
7. J.F. O'Hanlon, A User's Guide to Vacuum Technology (John Wiley and Sons, NY, 1989).
8. J.M. Lafferty, Foundations of Vacuum Science and Technology, (John Wiley and Sons, NY, 1998).
9. ASTM F1929, Standard Test Method for Detecting Seal Leaks in Porous Medical Packaging by Dye Penetration.
10. ASTM F2096, Standard Test Method for Detecting Gross Leaks in Medical Packaging by Internal Pressurization (Bubble Test).
11. Donbar Industries, "Systems and Methods for Testing Packaging," US Patent 7,565,828 B2, Jul. 2009.
12. A. Nerken, J. Vac. Sci. Technol. A9 (3), 2036 (1991).
13. N. Hilleret, Leak Detection, published proceedings of CERN Accelerator School, Vacuum Technology, (Snekersten, Denmark 1999) pp 203–211.