Removing Gaps in Data Integrity

BioPharm International
Volume 32, Issue 7 (July 2019)
Pages: 41–45

FDA guidance is expected to improve industry practices, but work is also needed to bridge disparate industry and software engineering standards.


Over the past few years, after some notable failures at a number of companies (1), global regulators have been paying much closer attention to how pharmaceutical manufacturers safeguard the integrity of their data, to prevent accidental manipulation as well as fraud. Data integrity is the foundation of the current good manufacturing practices (cGMPs). According to FDA, between January 2015 and May 15, 2016, 21 out of the 28 warning letters issued to pharmaceutical manufacturers centered around problems with data integrity (2).

As Orlando López, Kansas-based senior site automation specialist for a Big Pharma company, has noted (3), connections between different data points (i.e., initial data capture, metadata, and records) cannot be weakened or broken because they provide the only objective evidence that operations meet regulatory requirements and are being managed responsibly. These intact connections are also needed for process validation and continuous process verification.

FDA’s 2016 draft guidance clarified a number of best practices, such as the need for paper and electronic recordkeeping to follow the same basic practices, and for an administrative role for managing data that is separate from the functions generating and recording that data. FDA also used the ALCOA acronym to emphasize that data must be “attributable, legible, contemporaneous, original (or a true copy), and accurate.”
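
For the engineers and IT professionals who must translate ALCOA into system design, a record structure that carries its own provenance is one common starting point. The following Python sketch is purely illustrative: the field names and the simple SHA-256 checksum scheme are assumptions, not something drawn from FDA guidance or any specific vendor system.

```python
# Illustrative sketch only: one way a single cGMP data point could carry
# ALCOA metadata. Field names and the checksum scheme are assumptions,
# not part of any FDA guidance or vendor system.
import hashlib
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone


@dataclass(frozen=True)
class GmpRecord:
    value: str           # original result, stored as captured (original)
    units: str           # unambiguous units (legible)
    recorded_by: str     # unique user or system ID (attributable)
    recorded_at: str     # UTC timestamp written at capture time (contemporaneous)
    instrument_id: str   # source instrument, part of the metadata

    def checksum(self) -> str:
        """Hash of the record contents, so a 'true copy' can be verified (accurate)."""
        payload = json.dumps(asdict(self), sort_keys=True).encode("utf-8")
        return hashlib.sha256(payload).hexdigest()


record = GmpRecord(
    value="7.2",
    units="pH",
    recorded_by="analyst_0042",
    recorded_at=datetime.now(timezone.utc).isoformat(),
    instrument_id="PH-METER-03",
)
print(record.checksum())  # stored alongside the record and any copies of it
```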

In December 2018, FDA updated this guidance (4). “The new guidance serves to clarify data integrity requirements that have consistently challenged organizations. It firms up expectations around data/metadata and the data lifecycle within a risk-based structure of systems and design controls,” says Kir Henrici, CEO of the Henrici Group, a consultant for the Parenteral Drug Association (PDA), which published best practices on laboratory data integrity in 2018 and plans to publish manufacturing data integrity guidance by the end of 2019. FDA has emphasized the need for careful risk management, the importance of audit trails and access controls, and the need to save all records that could be relevant to cGMP compliance.
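
The audit-trail expectation can likewise be made concrete in code. The sketch below shows a minimal append-only audit table in SQLite; the schema, field names, and example values are assumptions for illustration, not a structure prescribed by FDA or PDA, and a real system would also enforce access controls at the database and application layers.

```python
# Illustrative sketch only: a minimal append-only audit trail.
# The schema and example values are invented for illustration.
import sqlite3
from datetime import datetime, timezone

conn = sqlite3.connect(":memory:")
conn.execute(
    """CREATE TABLE audit_trail (
           id INTEGER PRIMARY KEY AUTOINCREMENT,
           record_id TEXT NOT NULL,   -- which cGMP record was touched
           changed_by TEXT NOT NULL,  -- authenticated user ID
           changed_at TEXT NOT NULL,  -- UTC timestamp
           old_value TEXT,            -- value before the change
           new_value TEXT,            -- value after the change
           reason TEXT NOT NULL       -- documented justification
       )"""
)

def log_change(record_id, user, old_value, new_value, reason):
    """Append an entry; existing rows are never updated or deleted."""
    conn.execute(
        "INSERT INTO audit_trail (record_id, changed_by, changed_at, old_value, new_value, reason) "
        "VALUES (?, ?, ?, ?, ?, ?)",
        (record_id, user, datetime.now(timezone.utc).isoformat(), old_value, new_value, reason),
    )
    conn.commit()

# Hypothetical example entry: a documented correction with its justification.
log_change("ASSAY-2019-0117", "analyst_0042", "7.2", "7.3",
           "Transcription error corrected per applicable SOP")
```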

FDA’s latest data integrity guidance also underscores the importance of senior management support and involvement in data integrity efforts. “Leadership and culture are key, and, if underdeveloped, are the starting points for compliance risks and gaps,” notes Henrici.

Training and ongoing communication with senior management are crucial to establishing and maintaining this support, says López. Expressing potential losses or liabilities in dollars-and-cents terms can be especially effective, he says, noting a 2017 hacking incident at Merck & Co., which cost $105 million to fix (5) and might have been prevented if investment had been made in data security. He also recommends keeping senior management aware of regulatory citations in a way that allows them to connect data issues to lost batches, product recalls, incorrect labels, or formulation changes. “Senior managers need to hear about money. It gets their attention immediately,” he says.

Risk-based approach

It is also important to take a systematic, risk-based approach to data integrity programs, says Henrici. Some organizations may roll out data integrity programs in fragments, or as a reaction to a specific event, but then allow efforts to trail off, she says. “Pharmaceutical companies need to structure a systematic, risk-based approach that integrates the quality management system (QMS), which assures the efficacy and sustainability of the entire data integrity initiative.”

Process mapping is especially important in ensuring that the right documentation is easily accessible, not only to staff but also to regulators. “Companies should track their ‘quality decisions’ to identify where non-cGMP records (e.g., emails) will be required to demonstrate cGMP compliance,” says Henrici.

“You need to understand each specific process. Then, if you do a good risk assessment, to record which systems and which data you will focus on, you will be in a much better position to demonstrate full cGMP controls to the authorities,” says López. “If you have done both the risk assessment and the process mapping correctly, there should be no doubt that you are implementing controls to the correct records,” he says.

Decision-making processes must be documented, says Henrici, and not only reported data but any data that support quality decisions; they need to be justified, retrievable, and reviewable, she says. Maintaining a clear audit trail and using the best approaches to computer system validation will be crucial and will require new training and staffing practices, says Henrici. “Next-generation quality assurance will consist of multiple disciplines or working teams to keep pace with data integrity requirements in the era of smart manufacturing, big data management, and artificial intelligence,” she says.
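
To show what the risk-based mapping of processes, systems, and records that López describes might look like in practice, here is a deliberately toy sketch. The process steps, systems, and risk ratings are invented for illustration; a real assessment would follow the company’s own QMS procedures.

```python
# Illustrative sketch only: a toy process-to-record risk map. The process
# steps, systems, and risk ratings are invented examples, not a prescribed model.
process_map = [
    # (process step,          system,            cGMP records produced,            data-integrity risk)
    ("Weigh raw materials",   "Warehouse MES",   ["weigh ticket", "lot genealogy"], "high"),
    ("In-process pH check",   "LIMS",            ["assay result", "audit trail"],   "high"),
    ("Label printing",        "Labeling system", ["label template", "print log"],   "medium"),
    ("Shipping",              "ERP",             ["shipment record"],               "low"),
]

# Focus validation and review effort where the risk assessment points first.
risk_order = {"high": 0, "medium": 1, "low": 2}
for step, system, records, risk in sorted(process_map, key=lambda row: risk_order[row[3]]):
    print(f"{risk.upper():6} {step:22} {system:16} -> {', '.join(records)}")
```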

 

Getting to a deeper level

In a sense, however, data integrity is only the first step in a continuum. “Pharmaceutical industry guidelines often refer to data reliability, which is a step beyond data integrity,” says López. “But, ultimately, the goal for our industry is data quality, which is a step beyond reliability. We cannot reach that point with the guidelines that we have now,” he says. “And cGMP requirements alone will not suffice” (6). Another challenge is that existing pharmaceutical industry data guidelines don’t go deep enough to reach the engineers and IT professionals who are implementing the systems, says López. “ALCOA may give them the ‘why,’ but it doesn’t give them the ‘how,’” he says.

EU GMP Annex 11, first published in 1986, addresses data integrity in a more explicit way than the US 21 Code of Federal Regulations (CFR) Part 11, he notes. López recalls one Big Pharma company that correlated Annex 11 with Part 11 computer system requirements, which made things easier for the engineers and IT professionals implementing systems.

Currently, there are a number of different data integrity guidelines. In pharma, these include regional ones as well as those developed by professional societies such as PDA and the European Compliance Academy (ECA). In addition, the software industry has codes of its own, as does the manufacturing industry in general. The focus of each of the relevant pharma guidelines is different, López says, and the definition of data integrity in software engineering standards (i.e., as mainly about data security and not changing the records) differs somewhat from its definition in the pharmaceutical industry. The result can be confusing to IT professionals, at least initially, and can take time to sort out.

“I would love to see integration between pharma standards and the software engineering standards, because currently the industry is not speaking the same language as software development academics. For example, for the pharma industry, validation is a process associated with the system lifecycle (SLC), which continues from the start of a project until decommissioning. For software development, validation is just the testing phase of the SLC. So when you hire a new graduate, he or she won’t understand the difference,” says López.

Without understanding the pharma validation concept and the relationship between software engineering and software quality, people in pharma’s IT and engineering departments will have different understandings, and the resulting programs will be incomplete, he says. López recalls seeing such a situation when he was asked to consult for one mid-sized pharmaceutical company several years ago. Data security had been made part of the system’s user requirements, but, looking across the processes throughout the product lifecycle, there was one point where the computer system lacked security provisions, so anyone could make changes to the data. “There was testing, but no transparency, and in the design document, there had been no steps taken to ensure security throughout the lifecycle,” he says.
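
One way such a gap can be surfaced, and detected after the fact, is to make records tamper-evident, for example by chaining record hashes so that any silent edit breaks verification. The sketch below is only an illustration of that general technique, with invented example data; it is not a description of the system López encountered.

```python
# Illustrative sketch only: chaining each record's hash to the previous one
# makes silent edits detectable, because re-verifying the chain fails at the
# first tampered entry.
import hashlib

def entry_hash(prev_hash: str, payload: str) -> str:
    return hashlib.sha256((prev_hash + payload).encode("utf-8")).hexdigest()

def build_chain(payloads):
    """Return (payload, hash) pairs, each hash covering all history so far."""
    chain, prev = [], "GENESIS"
    for payload in payloads:
        prev = entry_hash(prev, payload)
        chain.append((payload, prev))
    return chain

def verify_chain(chain):
    """Return the index of the first tampered entry, or None if intact."""
    prev = "GENESIS"
    for i, (payload, stored) in enumerate(chain):
        prev = entry_hash(prev, payload)
        if prev != stored:
            return i
    return None

chain = build_chain(["pH=7.2;user=analyst_0042",
                     "pH=7.3;user=analyst_0042;reason=correction"])
chain[0] = ("pH=6.9;user=analyst_0042", chain[0][1])  # simulate an undocumented edit
print(verify_chain(chain))  # -> 0, the edited entry no longer verifies
```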

Common standards and definitions will be the key to moving the industry beyond data integrity to data quality, López says, explaining that the origin of the International Society for Pharmaceutical Engineering’s (ISPE’s) GAMP 5 guidance (7) is the International Organization for Standardization’s (ISO’s) 9000-3, which is also the foundation of the pharmaceutical industry’s guidelines. “Why are we speaking different languages?” he asks. “We need to synchronize all the different standards (e.g., ISO, ISPE, and the Institute of Electrical and Electronics Engineers [IEEE]) and pharma’s so that we are speaking one common language and so that we will understand each other,” says López. Without that common understanding, gaps will persist, some worse than others, he says.

The pharmaceutical industry’s approach to data integrity and quality will need to evolve as the industry moves to a Pharma 4.0 model and adopts data analytics and artificial intelligence. At INTERPHEX 2019 in April, Henrici spoke of the need for a “Data Integrity 4.0” strategy, a framework for matching IT and data integrity rules. She also noted that FDA approved 12 artificial intelligence algorithms in 2018.

If the industry is to reap the benefits of using algorithms and artificial intelligence, she said, companies will need to create multifunctional data governance teams that bring different perspectives to this effort and facilitate communication between industry and regulators, and data will need to be safe, secure, and relevant (8).

In the end, the required knowledge is already there for implementing new IT approaches, says López, who developed an analytics and business intelligence platform at a Big Pharma company several years ago, before the term “Big Data” had even become a buzz phrase. He advises breaking the problem down to its simplest level: the data wave and input/output (I/O). “With a Big Data project, instead of one, you might have 20 data waves and I/Os. The pharmaceutical industry has been working on all the technologies that industries are exploring (e.g., wireless, industrial internet of things, and Big Data). In the end, each type involves software and hardware, input and output, and there are certain tools best suited for each situation,” he says. “On the fundamental level, we need to understand the relationship between the system, process mapping, data wave sets, and intended use, and simply apply what we have learned during the past 30 years to the new technologies,” adds López (9).

The industry has come a long way in improving data integrity since the Barr decision in 1993 (1), a landmark case that resulted in mandates for recordkeeping and the investigation of out-of-specification conditions. New guidance and better integration of existing best practices, not only those designed for pharma’s end users but also those aimed at implementation specialists in IT and engineering, promise to push pharmaceutical manufacturers beyond data integrity in the future.

References

1. J. Gallant, “Data Integrity: Detecting and Mitigating Risk,” presented at the Life Sciences Trainers and Educators Network annual conference (2017).
2. S. Barkow, “Current Expectations and Guidance Including Data Integrity and Compliance With cGMPs,” presented at the International Society for Pharmaceutical Engineering Data Integrity Workshop (Bethesda, MD, June 5, 2016).
3. O. López, “Defining and Managing Raw Manufacturing Data,” Pharm Tech 43 (6) 32–38 (2019).
4. FDA, Data Integrity and Compliance With Drug CGMP: Questions and Answers, Guidance for Industry (Silver Spring, MD, December 2018).
5. C. Souza, Pharm Exec 38 (12) 28–29, 38 (2018).
6. O. López, “Comparison of Health Authorities’ Data Integrity Expectations,” presented at the 4th Annual Data Integrity Validation Conference (Boston, MA, August 2018).
7. C. Plagiannos, “What is GAMP 5 and How Do I Use It Effectively?,” montrium.com, Nov. 30, 2015.
8. A. Shanley, “PDA Strengthens its Global Presence,” pharmtech.com, April 15, 2019.
9. O. López, “Electronic Records Integrity in Data Warehouse and Business Intelligence” in Data Integrity in Pharmaceutical and Medical Devices Regulation Operations (CRC Press, Boca Raton, FL, 1st ed., 2017), 341–351.

Citation 

When referring to this article, please cite it as A. Shanley, “Removing Gaps in Data Integrity,” BioPharm International 32 (7) 41–45 (2019).
