Biopharma Analysis Benefits from New Technology and Methods

BioPharm International, February 2020
Volume 33
Issue 2
Pages: 10–14, 40

Analytical solutions are improving for raw material testing, process development, drug product release, and more.


Effective analytical methods are essential for the successful development and commercialization of both small- and large-molecule drug substances and drug products. As the complexity of both biologic and chemical drug substances increases, analytical methods must evolve as well. New analytical techniques and methods are therefore crucial to the fast-moving bio/pharma industry.

“Faster, more efficient techniques will give companies an advantage as their products move through the pipeline,” asserts Robin Spivey, director of analytical research and development, Cambrex High Point. Techniques that are more sensitive and more accurate will, she says, better position a company for regulatory acceptance, provided the company is willing to help pioneer those techniques. In addition, such companies will be seen as being at the forefront of the industry.

Major strides in analytical methods

Some of the most noteworthy advances in analytical methods involve the application of mass spectrometry (MS) for process development and product release of both biologics and synthetic drugs; the enhancement of chromatographic techniques, particularly liquid chromatography (LC); microcrystal electron diffraction; and techniques designed for use as process analytical technology (PAT).

For biopharmaceuticals, MS was initially limited to use for protein characterization to provide supplemental information for regulatory filings, according to Amit Katiyar, director of analytical and formulation development for bioprocess sciences at Thermo Fisher Scientific. Process release/stability testing continues to largely depend on conventional analytical methods such as LC, capillary gel electrophoresis (CGE), imaged capillary isoelectric focusing (iCIEF), and enzyme-linked immunosorbent assays (ELISA) due to their simplicity and wide adoption in quality control (QC) labs.

Inclusion of biosimilars, complex non-monoclonal antibody proteins (e.g., fusion proteins), bispecifics, and combination products in the product pipeline, however, is presenting challenges due to the inability to gain a thorough understanding of these molecules using platform methods. “Most of the time, platform methods may not be able to provide the information required to develop and commercialize complex biomolecules. In these cases, MS-based methods are being used for process development and as identity and release/stability indicating methods,” Katiyar observes.

In addition to using peptide-mapping principles in multi-attribute methods (MAMs), major biopharmaceutical companies are now using MS-based identity methods to release biologic drug substances and drug products. “This approach will provide the opportunity to gather more information on the performance of MS instruments in QC labs that can then be used for implementing MS technology for process development, release, and stability testing,” says Katiyar. The current approach for regulatory filing, he adds, is to use a combined package of conventional methods and MS methods to gain more confidence from health authorities and be able to present a future case for submissions based only on MS data.

For Da Ren, process development scientific director at Amgen, MAM is probably the most important emerging analytical technology that has been used in process development and release and stability testing of therapeutic proteins. “MAM is an LC/MS-based peptide mapping assay. Unlike profile-based conventional analytical assays, which focus on whole or partial proteins, MAM can identify and quantify protein changes at the amino acid level and can provide more accurate information on product quality-related attributes,” he explains. Notably, MAM is capable of replacing four conventional assays, including hydrophilic interaction liquid chromatography for glycan profiling, cation exchange chromatography for charge variant analysis, reduced capillary electrophoresis-sodium dodecyl sulfate for clipped variant analysis, and ELISA for protein identification, according to Ren.
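
In practice, MAM quantitation of an attribute is typically reported as the relative abundance of the modified peptide, calculated from integrated peak areas of its extracted ion chromatograms. The minimal sketch below illustrates only that arithmetic; it is not Amgen's implementation, and the attribute names and peak areas are hypothetical.

```python
# Minimal sketch of MAM-style attribute quantitation (illustrative only).
# Assumes peak areas for the modified and unmodified forms of each peptide
# have already been integrated from extracted ion chromatograms.

def percent_modified(modified_area: float, unmodified_area: float) -> float:
    """Relative abundance of an attribute, e.g., % deamidation at one residue."""
    total = modified_area + unmodified_area
    if total == 0:
        raise ValueError("no signal integrated for this peptide")
    return 100.0 * modified_area / total

# Hypothetical peak areas for two monitored attributes
attributes = {
    "deamidation_N387": (1.2e6, 4.1e7),  # (modified area, unmodified area)
    "oxidation_M255": (6.5e5, 5.3e7),
}

for name, (modified, unmodified) in attributes.items():
    print(f"{name}: {percent_modified(modified, unmodified):.2f}% modified")
```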

In the case of small-molecule drug development and commercialization, MS detection systems are no longer considered just research tools and are becoming more widely used for routine QC testing, for example, for determining extremely low-level impurities such as genotoxic impurities and potential genotoxic impurities, according to Geoff Carr, director of analytical development in Canada with Thermo Fisher Scientific.

“These advances are very likely in response to new regulatory guidelines issued by agencies such as FDA and the European Medicines Agency, but also as a result of specific problems that have occurred in the industry, such as recent concerns regarding observations of N-nitrosamine residues in sartans,” Carr explains.
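
To illustrate why such determinations demand MS-level sensitivity, the dose-based arithmetic described in ICH M7 converts an acceptable daily intake into a concentration limit by dividing it by the maximum daily dose. The sketch below uses illustrative intake and dose values, not limits for any specific product.

```python
# Illustrative arithmetic behind "extremely low-level" impurity limits.
# Per ICH M7, the concentration limit in ppm equals the acceptable daily
# intake (ug/day) divided by the maximum daily dose (g/day).
# The intake and dose values below are examples, not product-specific limits.

def limit_ppm(acceptable_intake_ug_per_day: float, max_daily_dose_g: float) -> float:
    return acceptable_intake_ug_per_day / max_daily_dose_g

# Generic threshold of toxicological concern (1.5 ug/day) at a 500-mg daily dose
print(limit_ppm(1.5, 0.5))    # 3.0 ppm

# A hypothetical nitrosamine-class intake in the tens of ng/day (0.096 ug/day)
# pushes the limit into the parts-per-billion range
print(limit_ppm(0.096, 0.5))  # ~0.19 ppm, i.e., roughly 190 ppb
```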

Efficiency gains for analytical workflows

Changes in analytical workflows have the potential to impact productivity and efficiency but may also create challenges depending on the nature of the modifications. These changes may also originate as the result of new technology or new processes and approaches.

For biologics, using MAM through process development and release and stability testing is a revolutionary analytical workflow, according to Ren. “The continuous monitoring and control of product quality attributes at the amino acid level during product and process characterization as well as release and stability testing enhances the understanding of biotherapeutic products and processes,” he asserts.

One driver leading to changes in analytical workflows is the desire to achieve greater efficiencies and thereby reduce operating costs, according to Carr. One approach that many pharma companies have taken, he notes, is to implement operational excellence initiatives within laboratory operations.

Regulatory pressure for improvements in the scientific understanding and quality of drug products is also leading to an evolution in analytical workflows. “We are seeing increasing guidelines focused on analytical development, such as a [Brazilian Health Regulatory Agency] ANVISA guideline on conducting forced degradation studies that is very demanding,” Carr observes.


Automation improves sample prep

Some of the most important advances in sample preparation tools include the increasing application of automation and robotics. Quality-by-design (QbD) approaches to analytical testing often call for multiple rounds of sampling and testing, rather than testing one or two samples per batch, to achieve a more accurate assessment of the total batch, and that added workload makes automated sample preparation increasingly valuable.

Automation of sample preparation for low-throughput methods is also critical to improve turnaround times to support process development activities, adds Katiyar. In general, he notes that automation of all in-process methods for biologics (including size-exclusion chromatography, CGE, iCIEF, N-glycan content, and residual host-cell protein, DNA, and Protein A) to support process development activities is crucial for meeting fast-to-first-in-human trials/quick-to-clinic timelines. “In addition,” he says, “late-stage programs with QbD filings are also exploring better turnaround times to support expanding pipelines.”

In the field of biologics sample preparation, Jill Crouse-Zeineddini, process development scientific director at Amgen, sees acoustic droplet ejection for potency assays as an important advance. Acoustic droplet ejection uses acoustic energy instead of pipette tips to transfer fixed amounts of liquid sample from source to destination plates without physical contact, and it does so with excellent accuracy and precision, she explains. “The significance of this technology resides in its superb dispensing performance at a very low sample volume. This technology performs direct dilutions instead of serial dilutions and prepares each dose independently, improving assay precision and throughput,” Crouse-Zeineddini observes.
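
The precision benefit Crouse-Zeineddini describes follows from preparing each dose directly from the stock rather than propagating errors down a serial dilution chain. The following sketch shows only the volume arithmetic for a hypothetical direct-dilution series; the concentrations, volumes, and plate format are illustrative and not a vendor protocol.

```python
# Illustrative volume arithmetic for direct dilution by acoustic droplet
# ejection: each dose is prepared independently from the stock, so dispensing
# error does not compound down a serial-dilution chain.
# Stock concentration, well volume, and target doses are hypothetical.

STOCK_CONC_UG_ML = 10_000.0   # 10 mg/mL stock (assumed)
WELL_VOLUME_UL = 25.0         # final assay volume per well (assumed)

def stock_transfer_nl(target_conc_ug_ml: float) -> float:
    """Stock volume (nL) dispensed so the well reaches the target
    concentration once made up to WELL_VOLUME_UL with assay buffer."""
    return target_conc_ug_ml / STOCK_CONC_UG_ML * WELL_VOLUME_UL * 1000.0

for target in (100.0, 50.0, 25.0, 12.5, 6.25):
    print(f"{target:6.2f} ug/mL -> dispense {stock_transfer_nl(target):6.1f} nL of stock")
```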

Robust aseptic sampling and automated sample preparation for the purification, desalting, and digestion of protein samples, meanwhile, enable many different product quality analyses. “This technology not only significantly improves operational efficiency, but also eliminates potential contamination and mistakes during manual sample handling,” states Gang Xue, process development scientific director at Amgen.

Another important point, according to Carr, concerns the reliability of the sample preparation procedure. “This issue is not a new one, but it is becoming more apparent as we apply QbD approaches to our analytical procedures. While the greatest emphasis has been applied to chromatographic parameters, we now realize that the sample preparation stage is at least as important and also needs to be developed using QbD,” he comments.

More developments on the horizon

Rapid, scan-based identity-testing capability for biologic drug substances has yet to be developed, however, and simplified identity methods to support release and to establish post-shipment identity of bulk drug substance are still required, according to Katiyar. Currently, peptide mapping and binding ELISA are used as identity methods, but they have long turnaround times. Raman spectroscopy has been evaluated for biologics, but it has not yet been adopted by the industry for release of drug substances and drug products. “Simplification using scan-based methods with better specificity and faster turnaround times would be highly beneficial for biopharmaceuticals,” he says.

When integrated with analytical instruments, aseptic sampling and automated sample preparation have the potential to move in-process and product release testing from offline QC labs to the manufacturing floor, either in-line or online, according to Xue. Beyond enabling real-time monitoring of not only cell growth but also the critical quality attributes of the therapeutic proteins themselves, the technology provides much more granular insight into conventional batch processes and in-flight product, he notes. “More importantly,” Xue states, “it could in the future be crucial for lot definition, process variation detection, and material segregation as required for continuous bioprocessing.”

Carr, meanwhile, expects to see increasing use of LC–MS for routine analytical testing. “This technology is widely applied in chemical drug development labs for various purposes and is also used for biopharmaceutical analytical testing, but less for release testing of products and for testing stability samples. The technology has advanced considerably over recent years, and while these instruments were previously only applied in R&D, they have now become highly suitable for use in routine testing labs,” he remarks.

Short timelines create challenges

There are a number of challenges to the adoption of advances in analytical techniques, some of which vary according to the development phase. Adhering to compressed program timelines is the key challenge to adopting new methods in early-stage development, according to Katiyar. “Fast-to-first-in-human/quick-to-clinic program timelines have been introduced in almost every pharmaceutical organization to provide clinical material for Phase I studies, and these timelines have shrunk from 18 months to less than 12 months during the past five years,” he says.

The shorter timelines are met by relying on platform approaches developed from knowledge generated over years with multiple molecules. “For new molecules that fit the platform methods, there is no scientific justification to explore new technologies,” Katiyar states. In labs operating in high-efficiency environments, there is often resistance to the introduction of new methods and approaches due to concerns about meeting delivery targets, agrees Carr.

Once programs move to late-phase development, organizations are hesitant to introduce any change in the control strategy unless it is absolutely needed. This reluctance is particularly strong if a filing has been made to a regulatory agency and/or if significant data have been collected using the older technique, according to Spivey.

“To be adopted for product quality measurement, the performance of new analytical methods must be equivalent to or better than the methods they replace, and there must be clear evidence that they are reliable and robust across a wide range of operating spaces,” states John Harrahy, director of process development in pivotal attribute sciences with Amgen. The adoption of new technology in the middle of a program, adds Katiyar, requires significant effort to develop the new method, perform bridging studies, requalify the method, perform technology transfer (if outsourced), perform retrospective testing, and define new specifications. Bridging studies cost the sponsor additional money and time, and there is always the risk that a bridging study may show that the methods or techniques are not comparable, adds Spivey.

There is also often a reluctance on the part of drug companies to be the first to make a submission to FDA with a new technique due to the possibility of the validity of the technique being questioned, Spivey notes. “They don’t want the burden of having to defend the technique to FDA or other regulatory agencies,” she says. There can be some risk with introducing new technologies that have had limited regulatory exposure, adds Harrahy, particularly considering the different regulatory expectations and change control requirements from different regulatory authorities worldwide.

“With that said,” Harrahy comments, “evaluating innovative technologies is a vital component to ensuring product quality and value to patients, and the ultimate risk of not evaluating new technologies greatly outweighs remaining stagnant.”

The ideal solution, Katiyar argues, is to explore new technologies as part of improvement initiatives without associating them with any programs. This approach provides the flexibility to explore new technologies without putting program timelines at risk. “Once proof of concept is established and the method is ready to be adopted, a platform approach can be used to implement the new technology,” he comments.

Senior leadership in large organizations, according to Katiyar, must provide guidance to their teams to push innovation without risking program timelines. In addition, it is also important to apply thorough training practices to ensure that scientists really understand the new approaches, says Carr. Continuity of data must also be addressed. “Trend analysis is a widely used tool for monitoring pharmaceutical product quality, and the introduction of new and ‘better’ methods may be perceived to interfere with this trending process,” Carr observes, even though it is more important to apply continuous improvement and accept possible breaks in trends.
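
Carr's trending concern can be made concrete with a toy calculation: if a replacement method carries even a small systematic bias, the batch trend shows an apparent step at the changeover although the product itself is unchanged, which is exactly what bridging data are meant to quantify. The simulation below is a hypothetical illustration with invented numbers.

```python
# Toy illustration of the trending concern: a new method with a small
# systematic bias produces an apparent step in the batch trend even though
# product quality is unchanged. All values are simulated.
import random

random.seed(0)
TRUE_PURITY = 99.2              # hypothetical constant product quality (%)
OLD_BIAS, NEW_BIAS = 0.0, -0.3  # the new method reads slightly lower

old_results = [TRUE_PURITY + OLD_BIAS + random.gauss(0, 0.05) for _ in range(20)]
new_results = [TRUE_PURITY + NEW_BIAS + random.gauss(0, 0.05) for _ in range(20)]

mean_old = sum(old_results) / len(old_results)
mean_new = sum(new_results) / len(new_results)
print(f"Mean of last 20 batches by old method: {mean_old:.2f}%")
print(f"Mean of next 20 batches by new method: {mean_new:.2f}%")
print(f"Apparent shift caused by the method change alone: {mean_new - mean_old:+.2f}%")
```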


Ways to facilitate adoption of analytical technology

In addition to evaluating new analytical methods separately from specific drug development programs, there are several other strategies that can be used to facilitate the adoption of advances in analytical techniques.

The best strategy for adopting a new analytical method in a quality setting, according to Harrahy, is to start with the end in mind. Does the proposed method fit the analytical target profile? Is the method sufficiently capable for the product or products that it will measure? Does the methodology require modification to the available GMP/QC environment?

“The robustness, reliability, and value of introducing any new method must be clearly demonstrated, which is often best accomplished by taking a staged approach: determining the method operable design space in a development laboratory, piloting the method in a development/phase-appropriate setting to monitor ‘real-world’ method capability, performing bridging studies vs. the older method, staging its implementation in QC, and continuously monitoring method performance,” he says. In addition, for regulatory acceptance of novel technologies, early partnered engagement with health authorities is strongly recommended, for example, by participating in FDA’s Emerging Technology Program when the new technology has the potential to improve product quality.

The most important strategy, agrees Spivey, is to provide ample data demonstrating that new methods are reliable and robust and that there is little or no risk to implementing the technique in a regulated environment. Advances that offer a significant advantage over the corresponding currently accepted techniques will also have a greater likelihood of acceptance. However, Spivey stresses that the advantage would need to be significant enough to be worth the time and money required for implementation.

“Ideally,” she says, “the owner of the technique would perform some preliminary legwork with the regulatory agencies demonstrating the capabilities of the technique. The sponsor would then have some assurance that the agencies would accept their data and make it a less risky approach for them.”

Another approach, depending on the nature of the old and new/improved methods, is to run both in parallel for a period of time in order to develop an understanding of how their performance and the resulting data compare, Carr suggests.
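
One simple way to evaluate such parallel data is to test whether the mean difference between paired old-method and new-method results stays within a pre-defined equivalence margin. The sketch below uses invented data and an arbitrary margin; it illustrates the idea rather than prescribing a statistical protocol.

```python
# Minimal sketch of comparing old and new methods run in parallel on the
# same samples: is the mean paired difference within an equivalence margin?
# The data and the +/-0.5 percentage-point margin are illustrative only.
import math
from statistics import mean, stdev

old = [98.9, 99.1, 99.0, 98.8, 99.2, 99.0, 98.9, 99.1]   # old-method results (%)
new = [99.0, 99.2, 99.0, 98.9, 99.3, 99.1, 99.0, 99.2]   # new-method results (%)

diffs = [n - o for n, o in zip(new, old)]
n = len(diffs)
d_mean, d_sd = mean(diffs), stdev(diffs)
se = d_sd / math.sqrt(n)

T_CRIT = 2.365   # two-sided 95% t critical value for n - 1 = 7 degrees of freedom
ci_low, ci_high = d_mean - T_CRIT * se, d_mean + T_CRIT * se

MARGIN = 0.5
print(f"Mean difference: {d_mean:+.3f} (95% CI {ci_low:+.3f} to {ci_high:+.3f})")
print("Difference stays within the equivalence margin"
      if -MARGIN < ci_low and ci_high < MARGIN
      else "Equivalence not demonstrated with these data")
```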

For Heewon Lee, director of analytical research and quality systems in chemical development US for Boehringer Ingelheim Pharmaceuticals, the key to new analytical method adoption is the sharing of use cases between pharmaceutical companies combined with the publication of white papers and communication with regulatory authorities. Katiyar agrees that sharing knowledge is essential. “Peer-reviewed publications, conference presentations, and Biophorum Operations Group-like forums are the best places to share information and exchange ideas to improve and adopt new technologies on a global scale,” he comments.

The need to collaborate

That information sharing should occur between all stakeholders, including contract research, development, and manufacturing organizations; testing laboratories; biopharmaceutical companies; regulatory authorities; and instrument/equipment vendors.

“Innovators and service providers need to be open to new ideas and be willing to invest the time and money to implement new techniques. Service providers also, rather than waiting for clients to request a technique before investing in it, should advocate for the use of new methods with their clients,” Spivey asserts. In addition, Katiyar believes innovator companies working with service providers should form an external working group to share new methods and technology to eliminate knowledge gaps caused during technology transfer of methods. “Most of the time,” he remarks, “innovator companies are not willing to share new methods and technologies and thus delay the adoption of new technologies throughout the pharmaceutical industry.”

Regulators also need to be open to new ideas and willing to work with pharmaceutical companies to ensure that new methods and techniques are acceptable for use in a regulated environment, according to Spivey. It is important for pharma companies and regulatory authorities to remember they have a common goal in identifying new methods and technologies for monitoring and quantifying critical quality attributes that may impact the safety and efficacy of the molecule throughout the lifecycle of the program, adds Katiyar. He points to MAMs as an example where health authorities have accepted data packages consisting of results obtained using conventional approaches supplemented by those obtained using MS-based approaches.

Instrument/equipment vendors, meanwhile, should be prepared to demonstrate that a new technique is sufficiently better than the currently accepted technique to be worth investing in and worth any potential regulatory risks, asserts Spivey. The dilemma here, according to Carr, is how stakeholders all link together.

“If a new analytical technology comes up, it will not be accepted by industry/regulators unless the equipment that is required to use it becomes widely available. Maintenance, qualification, and repair services must also be widely available and reliable. Typically, however, a vendor will not establish this level of availability unless there is a level of confidence that sales targets will be achieved. I think that this is the area where conferences, exhibitions, and publications provide a really valuable platform to get the information from innovators and suppliers circulated to end users,” he says.

Article Details

BioPharm International
Vol. 33, No. 2
February 2020
Pages: 10–14, 40

Citation:

When referring to this article, please cite it as: C. Challener, “Biopharma Analysis Benefits from New Technology and Methods,” BioPharm International, 33 (2) 2020.

