The Criticality of Manufacturing Data


Industry is searching for ways to assess the criticality of manufacturing data and to ensure data integrity.


Regulators require that data used during biopharmaceutical processes, including drug development and manufacturing, be available to support biologics license applications (1). According to Billy Sisk, life-sciences industry manager in EMEA at Rockwell Automation, manufacturing data such as pressures, feed flow, flux, and pH are generated during harvest, filtration, and purification processes, and these data must be collected and controlled in accordance with regulations. “Biopharmaceutical producers need to ensure that data are recorded, processed, retained, and used for a complete, consistent, and accurate record throughout the data lifecycle. [Companies] need to maintain formal management of the records and data throughout the company and control all phases of the data lifecycle, from initial data creation to destruction,” says Sisk.

Regulations state that data must be secure, reliable, and available for review, but they don’t necessarily tell companies how to achieve this. In the past few years, FDA has made data integrity a focus. The agency has issued guidance on data integrity requirements (2) and has cited a number of companies for inadequate data integrity practices (3). It appears as if data integrity has suddenly become a problem that was not there before, but is that truly the case?

According to Els Poff, executive director–Data Integrity Center of Excellence at Merck, data integrity may not have been well understood until multiple guidance documents were published. It has been a few years since the publication of these guidance documents, and therefore, most companies now have programs in place that ensure the appropriate prevention and detection controls are applied to secure their data.

Are all data equally critical?

There is some concern and debate in the industry, however, about how stringent companies need to be when securing data. The Parenteral Drug Association (PDA) is working with members of industry and regulators to develop a technical report that industry can use to determine what companies need to do to stay compliant with data integrity regulations and requirements.

According to Poff, a member of the working group writing the report, PDA is trying to provide industry with clarity on what should be done regarding data that aren’t directly linked to the quality of the batch. “From a data integrity perspective, [industry is] not 100% sure how they should treat that data. Do the controls for [generating, using, and reporting these data] have to be the same as we would use with the data in our laboratories or for inline testing elements of a manufacturing process?” Poff asks.

The PDA working group is looking to develop levels of criticality for certain types of data. “We as a team are challenging ourselves to see if there are places where the controls are in line with the criticality of that data point. The high critical [data points] are easy [to define] … your critical process parameters (CPPs), your critical quality attributes (CQAs), in-line testing results … but what about everything else? Are there levels of criticality you can assign to [other types of data] that are less critical? … We are trying to identify a number of data integrity controls … based on [the] vulnerability [and] complexity of [the] process … to provide better understanding on what’s really needed based on a risk-based approach,” Poff says.

Going too far?

“Industry struggles with defining what [criticality] is. [Companies may be afraid] that the way they interpret [criticality] may not satisfy the regulator who is going to come and visit them next, so by being overly conservative, they are safeguarding themselves from observations and [avoiding] any position where anything gets called into question,” Poff says.

But all this work might be unnecessary. For example, says Poff, human-machine interfaces (HMIs) on the shop floor may simply turn an asset off or on, but they allow data, such as speed and time, to be entered. Older HMIs may not be capable of having individual logins and passwords, but regulations indicate that any data entered must be attributable to a specific individual. “So, what do we do with that asset? Do we need to rip it out and put something else there?” asks Poff. “Or can you look at what it’s controlling, define the data it’s controlling or generating, and make a rational decision to say this HMI only starts and stops the machine. That’s all it does. So, why are we calling that into the data integrity realm? Or is it setting a line speed that doesn’t really reflect the quality of the batch because it’s only controlling the speed of the line? So, although there are data there, [these data are] not relevant to the quality of the batch. And therefore, controls should be applied that are commensurate with the risk to product quality and/or patient safety. It’s that kind of logic that we are trying to outline for [industry] and give them a framework to evaluate their system, process, and people to determine if everything is needed, or is there a level of control that can be different for this situation?”
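
Poff’s reasoning lends itself to a simple illustration. The following is a minimal Python sketch of a tiered, risk-based mapping of data types to controls; the tiers, data types, and control lists are illustrative assumptions, not the PDA working group’s actual framework.

```python
# Illustrative sketch of a risk-based mapping from data criticality to
# controls. Tiers, data types, and control names are hypothetical.

CONTROLS_BY_CRITICALITY = {
    "high": ["unique user login", "audit trail with review",
             "second-person verification"],
    "medium": ["unique user login", "periodic audit trail review"],
    "low": ["procedural control only"],
}

DATA_CRITICALITY = {
    "critical_process_parameter": "high",  # CPPs
    "critical_quality_attribute": "high",  # CQAs
    "in_line_test_result": "high",
    "line_speed_setting": "low",           # cf. Poff's HMI example
}

def required_controls(data_type: str) -> list[str]:
    """Unclassified data default to 'medium' until formally risk-assessed."""
    return CONTROLS_BY_CRITICALITY[DATA_CRITICALITY.get(data_type, "medium")]

print(required_controls("line_speed_setting"))  # ['procedural control only']
```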

Poff hopes the new PDA report will help companies make some of these choices. “The intent of the tech report is to make sense of some of the black and white rules … Manufacturing is different than the lab, and if we apply a black-and-white rule, we might be putting the business at a disadvantage.”


One size does not fit all

Poff states that the fact that different companies have different data systems complicates the development of guidelines that suit every process or every company. “Some [biopharma companies] are highly automated, highly technology enabled; some are paper based, and the majority are a hybrid where some elements of the manufacturing process are automated and other elements are paper based.”

“There is not a one-size-fits-all solution to ensuring data integrity. In many cases, there will need to be point applications or manual processes, such as second-person verification, to ensure data integrity,” says Melissa Seymour, vice-president, Global Quality Control at Biogen. “Some commercial solutions, such as ComplianceBuilder by Xybion or LDAP [Lightweight Directory Access Protocol], can help manage access control. Additionally, alternative Windows shells can be configured to block users from deleting data on personal computer-connected instruments.”
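
As one illustration of the LDAP approach Seymour mentions, the following Python sketch uses the open-source ldap3 library to authenticate a user and then authorize an action via group membership. The server address, directory layout, and group name are hypothetical.

```python
# Minimal sketch of LDAP-backed access control using the ldap3 library.
# Hostname, directory structure, and group name are hypothetical.
from ldap3 import ALL, Connection, Server

def user_may_modify_data(username: str, password: str) -> bool:
    """Authenticate the user, then authorize via group membership."""
    server = Server("ldap.example.com", get_info=ALL)
    user_dn = f"uid={username},ou=people,dc=example,dc=com"
    conn = Connection(server, user=user_dn, password=password)
    if not conn.bind():          # authentication failed
        return False
    # Authorization: is the user a member of the QC-Analysts group?
    conn.search(
        "ou=groups,dc=example,dc=com",
        f"(&(cn=QC-Analysts)(member={user_dn}))",
        attributes=["cn"],
    )
    authorized = len(conn.entries) > 0
    conn.unbind()
    return authorized
```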

Tools for controlling data integrity

“Producers should understand the data lifecycle, manage data quality, manage reference and review data, and manage quality risks,” says Sisk. Risk assessment, behavioral steps (e.g., culture and training), procedural steps (e.g., audit trail review, control of access, supervision of manual entry), and technical steps (system validation, user access control, rights management, segregation of duties, etc.) are all tools that can be used to ensure data integrity, according to Sisk.
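
A common technical control behind audit trail review is a tamper-evident log. The sketch below is a minimal, generic Python illustration using SHA-256 hash chaining, so that any edited or deleted entry breaks verification; it is not any vendor’s implementation.

```python
# Generic sketch of a tamper-evident (hash-chained) audit trail.
import hashlib
import json
import time

class AuditTrail:
    def __init__(self) -> None:
        self.entries: list[dict] = []
        self._last_hash = "0" * 64        # genesis value for the chain

    def record(self, user: str, action: str, value) -> None:
        entry = {
            "user": user,                 # attributable
            "action": action,
            "value": value,
            "timestamp": time.time(),     # contemporaneous
            "prev_hash": self._last_hash,
        }
        payload = json.dumps(entry, sort_keys=True).encode()
        entry["hash"] = hashlib.sha256(payload).hexdigest()
        self._last_hash = entry["hash"]
        self.entries.append(entry)

    def verify(self) -> bool:
        """Recompute the chain; an edited or deleted entry breaks it."""
        prev = "0" * 64
        for entry in self.entries:
            body = {k: v for k, v in entry.items() if k != "hash"}
            payload = json.dumps(body, sort_keys=True).encode()
            if (entry["prev_hash"] != prev
                    or hashlib.sha256(payload).hexdigest() != entry["hash"]):
                return False
            prev = entry["hash"]
        return True

trail = AuditTrail()
trail.record("jsmith", "set_pH_setpoint", 7.2)
assert trail.verify()
```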

Technology is developing, and industry has been looking at approaches such as automation to help control the integrity of data. Automation creates its own complications, however, and can’t be completely relied upon.

“In recent years, the collection of data during manufacturing has become more automated, which, while providing benefits from a trending and integration perspective, [also] increases risk with respect to data loss and/or data manipulation,” says Seymour. “Additionally, the move towards more on-line and at-line testing to determine chromatographic performance creates more system integrations resulting in more complex data flows and subsequently more complex data mapping. Data [are] becoming less siloed and more integrated. For example, an evaporative light scattering detection (ELSD) system may be coupled with high-performance liquid chromatography (HPLC) and integrated into the manufacturing execution system (MES) at a chromatography step. Compounding the complexity, the MES may be integrated to a laboratory execution system (LES) and one or more enterprise systems for deviations or business management processes. As data become more electronic and more accessible, this provides the industry with tools for better product process and quality understanding. At the same time, the quality of the data then becomes of utmost importance,” says Seymour.
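
Each integration Seymour describes is a point where data can be lost or altered in transit. One generic mitigation, sketched below in Python, is to re-verify at the destination a checksum computed at the source system; the file copy here is only a stand-in for a real instrument-to-MES interface.

```python
# Generic sketch: verify that a raw-data file survived a system-to-system
# transfer intact. The file copy stands in for a real interface.
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    digest = hashlib.sha256()
    with path.open("rb") as handle:
        for chunk in iter(lambda: handle.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def transfer_with_verification(src: Path, dest: Path) -> None:
    """Copy the file and fail loudly if it changed in transit."""
    expected = sha256_of(src)             # checksum at the source system
    dest.write_bytes(src.read_bytes())    # stand-in for the actual interface
    if sha256_of(dest) != expected:       # re-verify at the destination
        raise IOError(f"integrity check failed for {src.name}")
```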

“A validated electronic batch recording (EBR) solution, combined with a single data source, promotes a high level of data integrity,” says Sisk. “EBR supports data integrity through a multitude of validation measures, including monitoring interface communication, integrating the automation layer with standardized interfaces for automated data acquisition, checking consistency during recipe design and execution, and automatically recording changes to electronic records. In addition, fully integrated exception management in EBR supports tracking anomalies during process execution, limiting violation exceptions, overriding exceptions, recording canceled exceptions, providing real-time availability of exception review, and controlling user authorization during review and closure of exceptions.”
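
To make the exception-management idea concrete, here is a minimal Python sketch of raising and closing exceptions against a batch record, with a simple segregation-of-duties check at closure. It is an illustration only, not how any commercial EBR system is built.

```python
# Illustrative sketch of exception tracking in a batch record, with a
# segregation-of-duties check at closure. Not a real EBR implementation.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass
class ProcessException:
    step: str                 # recipe step where the anomaly occurred
    description: str
    raised_by: str            # attributable to a specific user or system
    raised_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))
    reviewed_by: Optional[str] = None
    closed: bool = False

class BatchRecord:
    def __init__(self, batch_id: str) -> None:
        self.batch_id = batch_id
        self.exceptions: list[ProcessException] = []

    def raise_exception(self, step: str, description: str, user: str) -> None:
        self.exceptions.append(ProcessException(step, description, user))

    def close_exception(self, index: int, reviewer: str) -> None:
        exc = self.exceptions[index]
        if exc.raised_by == reviewer:     # segregation of duties
            raise ValueError("reviewer must differ from originator")
        exc.reviewed_by = reviewer
        exc.closed = True

    def open_exceptions(self) -> list[ProcessException]:
        return [e for e in self.exceptions if not e.closed]
```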

Culture of quality

A culture of quality that puts emphasis on data integrity is also essential. Quality culture and data integrity are linked, according to Susan Schniepp of Regulatory Compliance Associates (4). “It is clear that regulatory authorities consider quality culture an important element in establishing the veracity and integrity of the data being generated by companies that support the products they manufacture,” said Schniepp. “Auditing a company to determine if their culture is conducive to generating data that meets the attributable, legible, contemporaneous, original, and accurate (ALCOA) concepts is on the horizon and may become a part of routine audits performed by regulators or industry auditors when evaluating the suitability of a manufacturer, potential partner, or service provider,” she added.

“First and foremost, people are the most critical element in any data integrity initiative,” agrees Seymour. “Organizations must create awareness and educational programs to achieve the appropriate quality culture. Additionally, close collaboration between information technology, quality, and business units is critical in defining the overall structure of a data integrity program. Robust data governance policies that define and standardize processes across the organization are also critical to instilling integrity of data. Key in the development of data integrity programs is the utilization of business process mapping in defining the flow of data and identification of quality information as an asset. This includes the need for data mapping from initial capture through the complete lifecycle, ensuring that systems integrations and electronic transcriptions are well understood, risk assessed, and remediated. Clarity of definitions and procedural controls are also necessary,” says Seymour.


Considerations when outsourcing

The sharing of data becomes equally important in the global pharma environment. Integrity and quality are important when transferring and sharing data with contract manufacturing or research organizations (CMOs and CROs) (1). Biopharmaceutical development and manufacturing data can be complex and can have specific language and formats (1). Sponsor companies must be sure the CMOs or CROs they work with can access and understand these data, and regulatory data integrity requirements make this task imperative.

“Getting the right data landscape in place is now an even more critical business objective,” according to Paul Denny-Gouldson of IDBS. Companies should use a common lexicon or catalogue of terms. “This is perhaps the simplest of things, but it causes the most issues in peer-to-peer data exchange,” stated Denny-Gouldson (1).
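
A common lexicon can also be enforced mechanically. The following minimal Python sketch validates incoming terms against an agreed vocabulary rather than guessing at near-matches; the fields and terms are hypothetical.

```python
# Illustrative sketch: validate partner data against a shared lexicon.
# Fields and vocabulary are hypothetical.
SHARED_LEXICON = {
    "unit_operation": {"harvest", "depth_filtration", "protein_a_capture"},
    "flow_unit": {"L/min", "mL/min"},
}

def validate_term(field: str, value: str) -> bool:
    """Reject any term outside the agreed vocabulary instead of guessing."""
    allowed = SHARED_LEXICON.get(field)
    return allowed is not None and value in allowed

assert validate_term("flow_unit", "L/min")
assert not validate_term("unit_operation", "Protein A capture")  # wrong form
```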

Sponsor companies should consider whether their partners have their own data systems, and the sponsor’s definition of quality might differ from that of its outsourcing partner (1). Specifics about each data point, such as the equipment and software used, lab conditions, buffers and media used, and the training of personnel, should be known (1).

“The utopian view would be a network of software solutions, an ecosystem, that all work seamlessly together to deliver the data collaboration services required,” according to Denny-Gouldson (1). Having a system to ensure quality data are secure and available should be a critical objective in the sponsor/contractor relationship.

Evolving technology and industry

With the ever-evolving nature of technology, how does the bio/pharmaceutical industry ensure the quality and integrity of its data in the years and decades to come? Poff thinks this is a question that is still unanswered.

“Our retention periods are so long. How do you ensure that data remain accessible over [their] lifespan? I think in the short term, we have somewhat found a way to make this possible. But with the rapid change of technologies and more automation/data capture on the shop floor, companies will need to ensure they look at long-term data retention strategies,” Poff says.
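
One piece of a long-term retention strategy is a periodic fixity check: re-hashing archived records against a manifest of digests captured when the records were archived. The Python sketch below illustrates the idea; the manifest format and paths are assumptions, and a real archive program would also address format migration and media refresh.

```python
# Illustrative sketch of a periodic fixity check against a manifest of
# SHA-256 digests captured at archiving time. Paths and the manifest
# format are assumptions.
import hashlib
import json
from pathlib import Path

def fixity_report(archive_dir: Path, manifest: Path) -> dict[str, str]:
    """Return 'ok', 'altered', or 'missing' for every archived record."""
    expected = json.loads(manifest.read_text())   # {file name: digest}
    report = {}
    for name, digest in expected.items():
        record = archive_dir / name
        if not record.exists():
            report[name] = "missing"
        elif hashlib.sha256(record.read_bytes()).hexdigest() == digest:
            report[name] = "ok"
        else:
            report[name] = "altered"
    return report
```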

References

1. P. Denny-Gouldson, “Barriers and Solutions to Effective External Collaboration in Biopharma,” BioPharm International, Biopharma Laboratory Best Practices 2018, EBook (October 2018).

2. FDA, Data Integrity and Compliance With CGMP, Guidance for Industry, Draft Guidance (CDER, April 2016), www.fda.gov/downloads/drugs/guidances/ucm495891.pdf

3. FDA, Warning Letters, FDA.gov, www.fda.gov/ICECI/EnforcementActions/WarningLetters/default.htm

4. S. Schniepp, Pharmaceutical Technology 42 (10) 2018.

Article Details

BioPharm International
Vol. 31, No. 11
November 2018
Pages: 20-22

Citation

When referring to this article, please cite it as S. Haigney, "The Criticality of Manufacturing Data," BioPharm International 31 (11) 2018.
