Connected Software Solutions Streamline Biopharma MS Workflows


The integration of modern software tools can make the acquisition and analysis of MS data more efficient, accurate, and compliant.

Mass spectrometry (MS) is a key analytical technique in the biopharmaceutical industry, with applications from early development to quality control (QC). Disjointed software practices, however, are a major impediment for researchers. Integrating advanced software packages can streamline MS data acquisition and analysis, facilitating collaboration and accelerating the pace of innovation.

Thanks to advances in automation, sensitivity, and speed, MS has become an indispensable element of the drug development process. Applications are wide-ranging and include the identification of proteins in complex biological matrices, high-throughput analysis of recombinant proteins, optimization of bioprocesses, protein characterization, rapid screening of drug-target binding, and QC (1).

In each of these cases, the accurate and efficient collection of data is paramount. However, achieving this can be a challenge, as studies often involve collecting and processing data from a wealth of different instruments. A lack of connectivity between instruments makes the process time-consuming and prone to error, as data can be lost along the way.

Part of the problem is the number of different software packages used for data acquisition and analysis. This can lead to so-called “disjointed data,” where results are stored in silos rather than being centrally accessible, requiring manual data transfer that can result in the loss or corruption of data.

In this article, the authors explain how integrating modern software tools can make the acquisition and analysis of MS data more efficient, accurate, and compliant. These software tools offer integrated and connected solutions, so that large amounts of data can be efficiently stored, analyzed, and accessed globally.

Practices for acquiring, organizing, and analyzing MS data are disconnected

MS is typically hyphenated with a separation technique to resolve the components of a sample, reduce matrix effects, and increase sensitivity. A range of chromatography–MS-based workflows, such as liquid chromatography (LC)–MS and imaged capillary isoelectric focusing (iCIEF)–MS, are used in the biopharmaceutical industry for a variety of purposes (2).

Whatever the particular workflow, data acquisition and management is typically performed using a chromatography data system (CDS), a software package or set of software packages that connects directly with, and acquires data from, the chromatography equipment. Simple processing, such as integration of peaks, may also be performed using the CDS. More complex analysis, such as protein characterization, usually requires additional and more specialized software tools.

This means researchers often use multiple, disconnected software platforms to acquire and analyze the data from their MS experiments. Previously, each package was installed as a standalone application, meaning data was stored locally and could not be accessed remotely or by multiple people simultaneously. It also meant that users had to manually transfer data between the instruments and computers running the different software packages. As MS technology and use cases advance, this approach is no longer sustainable, especially for the analysis of large datasets.

Increasingly large and complex datasets present new challenges

The number of MS users, workflows, and opportunities to connect to alternative separation and detection methods continues to increase. Advancing technology also means modern instruments are faster and increasingly automated, accommodating higher throughput and generating larger and more complex datasets from MS experiments. A single instrument can generate gigabytes of data in a month; across a whole lab, this can amount to terabytes of data each month.
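As a back-of-the-envelope illustration of that scaling, a short Python calculation follows; both figures are illustrative assumptions, not measurements from any particular instrument or site.

# Rough estimate of lab-wide MS data volume; both figures below are
# illustrative assumptions, not vendor specifications.
GB_PER_INSTRUMENT_PER_MONTH = 50   # assumed output of one instrument
INSTRUMENTS_IN_LAB = 30            # assumed fleet size at one site

total_gb = GB_PER_INSTRUMENT_PER_MONTH * INSTRUMENTS_IN_LAB
print(f"Estimated lab output: {total_gb} GB/month (~{total_gb / 1000:.1f} TB/month)")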

This deluge of data is driving new scientific insights but also presents challenges for the conventional processes of acquiring and managing data. The first of these challenges is storage, as such large amounts of data cannot be stored locally without risking data loss. Local storage also creates challenges for access, making it difficult to share results between scientists and across sites.

What’s more, when results from different experiments are stored separately, they must be transferred between instruments and software platforms. This is inefficient and time-consuming, and it introduces the risk of information being lost or corrupted along the way.

Data generated and stored using traditional, separated software platforms can also be difficult to search and retrieve. The ability to quickly find relevant results becomes increasingly important as the volume of data increases, but this is challenging without robust organizational structures and comprehensive metadata.

What is metadata?

Metadata can be thought of as the “data that describes other data” and might include the time/date an experiment was performed, the person who performed the experiment, the instrument used, and the processing parameters (3). As well as assisting in data retrieval, metadata can be useful during analysis, for project progression and review, and to maintain regulatory compliance.
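As a minimal sketch of what such a record might look like in practice, the snippet below models acquisition metadata as a simple Python dictionary; the field names are hypothetical rather than taken from any specific CDS schema.

from datetime import datetime, timezone

# Illustrative metadata record for one MS run; field names are hypothetical.
run_metadata = {
    "acquired_at": datetime(2024, 3, 14, 9, 30, tzinfo=timezone.utc).isoformat(),
    "operator": "j.smith",
    "instrument": "LC-MS system 02",
    "method": "intact-mass-screen-v3",
    "processing": {"smoothing": "Savitzky-Golay", "min_peak_width_s": 0.05},
}

# Rich metadata makes results searchable, e.g., all runs by one analyst:
def runs_by_operator(runs, operator):
    return [r for r in runs if r["operator"] == operator]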

Modern software solutions streamline data acquisition and analysis

As MS becomes increasingly routine across the drug development pipeline, it is important to have the right software to support it.

A progressive CDS can manage data from multiple instruments, streamlining chromatography and MS analysis into a single software platform. Meanwhile, cutting-edge protein characterization software can monitor post-translational modifications and other critical quality attributes quickly and efficiently. Modern software solutions are being designed to integrate with one another to enhance data management and analytical workflows.

Utilizing these tools as an integrated software platform has a number of advantages for users. For example, automated and streamlined data acquisition and processing saves time. One central CDS can acquire and analyze data from both chromatography and MS instruments and provide data for downstream analysis. In addition, advanced integrative software is generally more intuitive and easier to use. Multiple instruments are supported on one platform, providing the same user interface and results in the same format. Templates are also available for different workflows, including pre-defined parameters for different separation techniques, which increases ease of use, reduces the likelihood of errors in the analysis process, and ensures results are comparable. What’s more, integrating software solutions eliminates the need to frequently transfer data between different software platforms, mitigating the risk of data being lost or corrupted.
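As a minimal sketch of how such a workflow template might behave, consider the Python snippet below; all parameter names and values are hypothetical and not drawn from any commercial CDS.

# Hypothetical template: pre-defined acquisition parameters for one
# separation technique, so every analyst starts from the same values.
LC_MS_PEPTIDE_MAP_TEMPLATE = {
    "separation": "reversed-phase LC",
    "column_temp_c": 60,
    "gradient_length_min": 45,
    "ms_polarity": "positive",
    "scan_range_mz": (350, 2000),
}

def new_sequence(sample_ids, template=LC_MS_PEPTIDE_MAP_TEMPLATE):
    # Each injection inherits the template, reducing manual-entry errors
    # and keeping results comparable across analysts.
    return [{"sample_id": s, **template} for s in sample_ids]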

The latest software tools also offer specific capabilities for work on complex biotherapeutics, including the identification of proteins and the monitoring of post-translational modifications. Data storage and security are improved, too: with data stored on a secure server, the limits of local storage capacity are removed. Secure storage is also a key component of data integrity, which is important for maintaining regulatory compliance. Data security has become even more important during the COVID-19 pandemic, with an increasing number of scientists working (and accessing data) from home.

Furthermore, centralized data enables remote access, allowing users to retrieve results in the lab, in the office, or from home, with the same performance as local operation. Another advantage is that metadata is generated automatically, simplifying the process of organizing and searching data. This includes the creation of audit trails, which maintain a historical record of system, instrument, and user events. These provide the traceability and data integrity needed to meet good practice (GxP) regulatory requirements. Combined, these benefits mean scientists can spend more time on research and analysis, driving innovation and discovery.
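How an audit trail preserves traceability can be sketched in a few lines of Python; the hash-chaining shown here is one common way to make tampering with history detectable, not necessarily how any given CDS implements it, and the field names are illustrative.

import hashlib
import json
from datetime import datetime, timezone

# Minimal sketch of an append-only audit trail: each entry records who did
# what and when, and is chained to the previous entry by a hash so any
# alteration of earlier entries is detectable.
def append_event(trail, user, action, details):
    prev_hash = trail[-1]["entry_hash"] if trail else "0" * 64
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "action": action,  # e.g., "reprocess", "sign", "export"
        "details": details,
        "prev_hash": prev_hash,
    }
    entry["entry_hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()).hexdigest()
    trail.append(entry)

trail = []
append_event(trail, "j.smith", "reprocess", {"run_id": "R-1042"})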

Integrated software supports innovation in the biopharmaceutical industry

Combining an innovative CDS with modern protein characterization software also generates significant benefits for the biopharmaceutical industry. MS is routinely used across the drug development pipeline for characterization, formulation development, process characterization, stability studies, and QC. Advanced software tools can benefit each of these stages.

Optimized data collection and streamlined analytical workflows increase efficiency and confidence in output, with benefits for individual organizations and the biopharmaceutical industry as a whole. These benefits include reduced training requirements, as more intuitive and consistent software with greater automation reduces the need for specialist training. Such software also reduces the impact of staff absence: the software for multiple instruments can be used by any analyst, and data can be easily searched and retrieved regardless of who generated it.

Another benefit is cost savings, achieved because innovative software tools make the overall pipeline more efficient by reducing the need for manual processing and data transfer. Modern software tools can also automate or partially automate many aspects of the workflow, such as sequence creation, error handling, chromatogram processing (including peak detection and integration), and graph and report creation. What’s more, a CDS provides a comprehensive set of compliance tools that make it easier to meet regulatory requirements such as GxP guidelines. These tools include accurate metadata and audit trails, laboratory monitoring (including the ability to search and filter for high-risk events), streamlined report generation, and data review processes.
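To illustrate the kind of chromatogram processing being automated, the sketch below detects and integrates peaks in a synthetic trace using SciPy; real CDS processing adds baseline correction, calibration, and validated reporting on top, and the thresholds and window width here are arbitrary.

import numpy as np
from scipy.signal import find_peaks
from scipy.integrate import trapezoid

# Synthetic chromatogram: two Gaussian peaks plus a little noise.
t = np.linspace(0, 10, 2000)  # retention time (min)
signal = (np.exp(-((t - 3.0) ** 2) / 0.02)
          + 0.6 * np.exp(-((t - 6.5) ** 2) / 0.05)
          + 0.01 * np.random.default_rng(0).normal(size=t.size))

# Automated peak detection, then a crude fixed-window integration.
peaks, _ = find_peaks(signal, height=0.1, prominence=0.1)
for p in peaks:
    window = (t > t[p] - 0.5) & (t < t[p] + 0.5)
    area = trapezoid(signal[window], t[window])
    print(f"peak at {t[p]:.2f} min, area ~ {area:.3f}")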

Collaboration, which is a major driver of innovation in the biopharmaceutical industry (4), is also supported as partners at different sites can easily access and share data.

Ultimately, the efficiency gains made by streamlined analytical workflows result in an accelerated innovation process, a faster time to market for new biopharmaceuticals, and a quicker return on investment.

Modern, connected software tools can also increase confidence in the quality of the final product thanks to rich, accurate data and built-in compliance and data integrity tools. These include audit trails, which can be used to detect data integrity issues and provide the history behind an action.

As well as traceability, modern software tools also provide enhanced data security, including privileged access systems. Together, these tools can help organizations to meet the requirements set by major regulatory bodies such as FDA, the UK Medicines and Healthcare products Regulatory Agency, and the European Medicines Agency.

Conclusion

The scale of data being generated by increasingly sophisticated MS techniques is both empowering the biopharmaceutical industry and revealing bottlenecks in the industry’s data processing and analysis pipelines. Previous MS workflows with manual data processing, single-user limitations, local storage, and missing metadata are no longer practical.

Innovative CDS and protein characterization software overcome these limitations, and their integration helps future-proof the biopharmaceutical industry. Together they offer a single software platform with seamless connectivity for the generation, storage, and analysis of data, and a centralized, server-based system that ensures information is accessible to everyone who needs it. This also generates productivity gains thanks to streamlined and automated workflows, freeing up scientists to invest more time in their research projects.

Regulatory compliance is a priority for the industry and is supported by modern software solutions. These software solutions can increase industry confidence in meeting GxP guidelines by providing tools for robust documentation, security, and traceability.

Integrated, automated, and cloud-based data management systems can help the biopharmaceutical industry make the most of the escalating volume and complexity of analytical data. MS is now an integral part of the drug development pipeline, and streamlining its role in this process, from discovery to QC, ultimately means a quicker time to market. Biopharmaceutical production is subject to GxP requirements just as traditional pharmaceutical manufacturing is, and the ability to implement GxP guidelines seamlessly contributes to product quality.

References

1. G. L. Glish and R. W. Vachet, Nature Reviews Drug Discovery 2 (2), 140–150 (2003).
2. G. Loos, A. Van Schepdael, and D. Cabooter, Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences 374 (2016).
3. WHO, “Good Chromatography Practices (Draft for Comments),” WHO Drug Information 33 (2) (July 2019).
4. N. Lesser and M. Hefner, Partnering for Progress: How Collaborations Are Fueling Biomedical Advances (Deloitte, New York, 2017).

About the authors

Peter Zipfell is product marketing manager, CDS Software, at Thermo Fisher Scientific, and Michelle English is Data Science product manager at Protein Metrics.
