All potential uses and supporting endeavors of the data must be examined and then defined as a virtual process stream.
The implementation of good quality electronic systems is challenging. In most organizations, business processes are entrenched, personnel are overworked, and financial resources are limited. Sometimes these organizations do not understand that aspects of their electronic environment are out of compliance or, worse, may not even realize the need for compliance.
Douglas Bissonnette
Today's networked, web-enabled biopharmaceutical environments demand data of the highest quality. Do not regard computer systems related to Good Laboratory Practice (GLP), Good Clinical Practice (GCP), and Good Manufacturing Practice (GMP), collectively known as GxP, as a collection of discrete technologies supported solely by individual validation packages and a few SOPs. A company must think of its entire virtual, GxP-related environment as a process stream. This environment must have an entire quality system built around it to meet the intricacies of the law and to produce, manipulate, analyze, transfer, and store high-integrity data.
This article is aimed at individuals working in GxP functional areas, such as laboratories, clinical trials, clinical and commercial manufacturing, quality, supply chain, and information technology (IT) organizations. It outlines an approach for using such thinking to build compliance and quality into the reader's organization. This article will not define how to develop a 21 CFR Part 11 risk document; rather, it will examine Part 11 as an attribute of a larger, frequently overlooked, and extrinsic quality system.1,4 We introduce a methodology to outline, assess, and document this virtual process stream, and we make recommendations on how to close compliance and quality gaps.
Table 1. Basic Quality Attributes
The ubiquity of desktop computers with easy-to-use spreadsheets and databases, web-enabled technology, and high-speed networks has allowed many people to rely on these tools for their daily work. This evolution of technology has been a boon, but it has also made it easy for those working in regulated industries to implement computers into their work practices without forethought about the consequences. As a result, workers in FDA-regulated organizations must keep up with technology changes while still ensuring that the safety, identity, strength, purity, and quality of the product, and ultimately consumer safety, are not detrimentally affected. These challenges apply equally to all individuals working in GxP functional business areas, not just IT specialists.
Many IT departments do not fully understand the requirements of the law, nor do they understand the impact that their work practices and decisions have upon the larger enterprise. Business people, likewise, are often unaware of how their day-to-day practices affect the virtual environment and the data being produced. Both sides may be fooled into thinking that because they have implemented a validated computer system, they are in compliance with the law. This thinking is erroneous, because the days of stand-alone or isolated computer systems dedicated to one function are gone.
The Food and Drug Administration (FDA) has written rules and statements in an attempt to guide industry on the use of computer technology. There is confusion in industry concerning the Part 11 rules for electronic records and electronic signatures. We believe this confusion stems from a widespread reading of FDA's "narrow interpretation" of Part 11 as meaning that FDA has backed off or relaxed its stance. This understanding is incorrect: FDA's own statements make clear that the intent of Part 11 still stands, even though fewer systems now fall within its scope.1 We contend that this "narrow interpretation" has lulled many into believing that when Part 11 does apply, it applies only to discrete systems. Discrete here does not mean solely a stand-alone PC; networked systems within the confines of the organization are also considered discrete in the context of this article.
Table 2. Compliance Attributes of a High-Level System
Part 11, although well-meaning, has placed a burden on industry that has caused many organizations to lose sight of basic quality. With the advent of a risk-based approach to Part 11, many appear to have taken the position that all computer quality is risk-based. This is not true. The intent of the regulations has always been to require a demonstration of repeatability of functions and a demonstration of control and understanding of the systems at any point in time, historical or current. The degree of validation, revealed through risk analysis, determines how repeatability is demonstrated. Demonstration of control and understanding of the virtual stream should always be present, regardless of the regulation.
There has been too great a focus on understanding what Part 11 is and how to apply it. This anxiety has caused many to forget, or even fail to realize, that basic quality practices must be in place regardless of Part 11. These quality control practices should be centered on the attributes listed in Table 1.3 Principally, auditable documentation must exist to show compliance with the requirements of the law. Table 2 outlines compliance attributes of a high-level computer system that act as a benchmark for demonstrating good faith.
With the increased demand for organizations to share information externally as well as internally, it is imperative that quality support the entire environment. Such an environment frequently spans all supporting business functions and regulatory practices (i.e., GLP, GCP, GMP). We have found that common quality assurance practices focus on an organization's discrete systems without taking into account the end uses of the information or how it flows to and across those discrete systems. An organization must consider its entire environment: the technology, supporting business practices, and documentation that affect the quality of its data.
Table 3. Example of an Assessment Questionnaire
Over the past several years, more attention has been devoted to validating network equipment, such as routers. However, information is still frequently shared within and between organizations using non-validated media (e.g., disks, CDs, email files) that exist outside of the validated (or validateable) domain. It is common for such information to be used for important decision-making, investigations, and submissions without regard to possible adverse consequences.
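One practical mitigation, when data must leave the validated domain on such media, is to ship a checksum manifest alongside the files so the recipient can demonstrate that nothing was altered in transit. The following Python sketch illustrates the idea; it is our illustration rather than anything prescribed by the regulations, and the manifest format and function names are hypothetical.

```python
import hashlib
from pathlib import Path

def file_checksum(path: Path, algorithm: str = "sha256") -> str:
    """Compute a checksum of a file, reading it in fixed-size chunks."""
    digest = hashlib.new(algorithm)
    with path.open("rb") as handle:
        for chunk in iter(lambda: handle.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_transfer(manifest: dict[str, str], received_dir: Path) -> list[str]:
    """Compare received files against the sender's checksum manifest.

    Returns the names of files that are missing or were altered in transit.
    """
    failures = []
    for name, expected in manifest.items():
        candidate = received_dir / name
        if not candidate.is_file() or file_checksum(candidate) != expected:
            failures.append(name)
    return failures
```

An empty list from verify_transfer gives the receiving organization a documented, repeatable basis for accepting the dataset, instead of trusting the medium it arrived on.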
Examine all potential uses and supporting endeavors of the data, and then define them as a virtual process stream. This stream should consist of the technology and business practices used across GxP areas to produce, manipulate, analyze, transfer, and store data.
This effort will produce a thorough, documented understanding of the entire stream and will expose gaps in the organization's existing understanding of its environment. It will also ensure that legal requirements are met and that a strong quality system is in place to support the virtual environment of technology, documentation, and supporting business practices producing GxP-related information.
We have encountered a few organizations that did not consider the virtual process stream in their daily practices; a few cases illustrate the point. Many biotechnology and pharmaceutical firms have implemented Enterprise Resource Planning (ERP) systems across their local area networks (LANs) and wide area networks (WANs) that may fully meet the expectations of a high-quality electronic system. However, the generated data may then be shared with an external partner organization. The technology (e.g., an ETL tool, XML, disk) and business practices used to transfer that information, and the use of the information within the partner organization, are often not fully taken into account by the originating organization. Frequently, we have found that once the information leaves the originator's environment, the originator seems to believe that its responsibility for the data has ended.
Table 4. Electronic Quality Systems Guidance Document
Another example concerns the collection and use of data from clinical trials.4 Data are collected at multiple sites and sent to the analyzing or end-use organization. We have witnessed, on many occasions, these data being transcribed (by hand or fax) from paper forms into electronic format, saved to file, and physically or electronically mailed to a receiving organization. The data may undergo several more conversions (paper to electronic, electronic to paper, or electronic to electronic) and cuts for the statistical analysis before being sent to the primary end-user. The data are then prepared for use in important decision-making or a submission.
Although an analysis plan is generated, it is common for the plan not to specify the handling of the data in its entirety, or to specify incorrect or poor-quality procedures for handling the data outside of the statistical processing. The sponsoring organization may have no idea of the actual detailed handling (storage and conversion) and transmissions that occur to support the generation of the final report. Many organizations feel that storing the paper case report forms relieves them of concern about the quality attributes of the downstream electronic, or combined physical and electronic, processes exercised.
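One way to retain control over these repeated conversions is to log each handling step as a provenance record at the time it occurs. The sketch below is a minimal illustration of our own, not a prescribed format; the field names, and the choice to hash each output file, are assumptions rather than requirements from any cited guidance.

```python
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path

def record_conversion(log_path: Path, step: str, source: str,
                      result_file: Path, operator: str) -> None:
    """Append one provenance entry per conversion so the full chain of
    custody behind the final report can be reconstructed later."""
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "step": step,                # e.g., "fax transcription", "export for stats"
        "source": source,            # identifier of the input artifact
        "result": str(result_file),  # identifier of the output artifact
        "result_sha256": hashlib.sha256(result_file.read_bytes()).hexdigest(),
        "operator": operator,
    }
    with log_path.open("a", encoding="utf-8") as log:
        log.write(json.dumps(entry) + "\n")
```

A log like this, however simple, lets the sponsor answer the question an inspector will eventually ask: exactly how did these data travel from the case report form to the final report?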
In our last example, a client assumed that all processing occurred on a single server, which may itself have been validated; in fact, some handling moved the data off the server to another, non-validated location, and no procedures existed to support the functions performed on the data stored there. As a result, a "new" copy of the dataset was created, and that copy may end up being used in a study or a filing without any thought to its origin. An inspector could immediately call the integrity of a decision, submission, or investigation into question.
Companies need to implement a good quality electronic system around their virtual, GxP-related process stream. What follows is a succinct approach for assessing current status and closing quality and compliance gaps.
During the first phase, organize the environment into logical processes. This can be done in many ways to suit the organization. Usually, processes are organized by functional area, for example, supply chain distribution. If the firm has multiple product pipelines with different technologies and business practices, both internally and at differing partner sites, the process flows can be differentiated, for example, as Product X and Product Y.
Thinking of the electronic environment as a stream of logical processes is instrumental to building and maintaining a good, cost-effective quality system. Drawing boundaries around individual computer systems is ineffective and often leads an organization to lose sight of its larger environment, which usually includes interspersed business practices (i.e., handling) that may not be fully recognized.
Develop flowcharts to document and view the logical processes; multiple charts for the same process flow should interconnect. Each flowchart should reference its supporting documents, such as SOPs and validation packages. We recommend that the generated flowcharts be approved and managed in a document control program. We have found Microsoft Visio to be a good tool for this purpose.
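Beyond the drawings themselves, it can help to keep the same structure in a machine-readable form so that missing document references are easy to flag. The following Python sketch is a minimal illustration under our own assumptions; the class and field names are hypothetical, not part of any standard.

```python
from dataclasses import dataclass, field

@dataclass
class ProcessStep:
    """One node in a logical process flowchart."""
    name: str        # e.g., "transcribe CRF to database"
    owner: str       # responsible functional area
    documents: list[str] = field(default_factory=list)  # SOP / validation package IDs

@dataclass
class ProcessStream:
    """One virtual process stream, e.g., 'Product X supply chain distribution'."""
    name: str
    steps: list[ProcessStep] = field(default_factory=list)

    def undocumented_steps(self) -> list[str]:
        """Flag steps that cite no SOP or validation package: likely gaps."""
        return [step.name for step in self.steps if not step.documents]
```

A stream captured this way maps one-to-one onto its flowchart, and undocumented_steps() yields a ready-made worklist for the assessment that follows.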
Also during this first phase, draw up and fill out an assessment questionnaire to identify quality and compliance gaps in each of the logical processes. Design the questionnaire around each attribute listed in Table 1. Questions should probe the existence and suitability of the procedures, policies, business practices, and technology that directly or indirectly support the electronic process stream. Table 3 shows a brief example.
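Once the questionnaire is filled out, the ratings can be tallied mechanically. The sketch below shows one simple way to group negative answers into gaps by quality attribute; the questions and attribute names are illustrative only and would be replaced by the organization's own.

```python
# Illustrative questions, each tied to a quality attribute of the kind
# an organization might list in its own version of Table 1.
QUESTIONS = [
    ("Does an approved SOP govern this data transfer?", "procedures"),
    ("Is the receiving system validated for this intended use?", "technology"),
    ("Is an audit trail retained for each data conversion?", "documentation"),
]

def score_process(answers: dict[str, bool]) -> dict[str, list[str]]:
    """Group the questions answered 'no' (or left unanswered) into gaps,
    keyed by the quality attribute they relate to."""
    gaps: dict[str, list[str]] = {}
    for question, attribute in QUESTIONS:
        if not answers.get(question, False):
            gaps.setdefault(attribute, []).append(question)
    return gaps
```

The resulting gap list, organized by attribute, feeds directly into the recommendations of the phase-two guidance document.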
For phase two, once the assessment has been completed and rated, develop an electronic quality systems guidance document. This document is a design tool, built from the flowcharts and the results of the assessment questionnaire. It provides a summary outline of the organization's environment and makes recommendations for closing any quality and compliance gaps that may exist. It is a living document and should be revisited periodically, preferably yearly. The document should contain the sections listed in Table 4.
Now you are ready to face an audit with confidence. The virtual process stream is understood and under control. Your guidance document clearly outlines all of the technology, business practices, and documentation that support the virtual process stream. As a result, the quality of the virtual stream supporting the GxP areas will benefit the consumers of your products as well as your own corporate harmony. Inspectors will appreciate visual flowcharts linked to organized documents that encompass complex technical environments. The methodology outlined here should tell any inspector that your organization is making every effort to demonstrate control and understanding of the environment supporting its GxP data, and that nothing is hidden in its complexity.
Douglas Bissonnette is an independent consultant, 48 Harvard Lane, Bedford, NH 03110, 617.899.6410, fax 866.511.8961, doug.bissonette@comcast.net
1. FDA. Guidance for Industry, Part 11, Electronic Records; Electronic Signatures - Scope and Application. US Department of Health and Human Services. Bethesda MD 2003 August.
2. FDA. Quality System Regulation. 21 CFR Part 820. US Department of Health and Human Services. Bethesda MD revised 2001 April.
3. Peltier TR. Information Security Risk Analysis. CRC Press. Boca Raton FL 2001 September; ISO/IEC 17799:2000. Information Technology — Code of Practice for Information Security Management.
4. FDA. Guidance for Industry, Computerized Systems Used in Clinical Trials; Draft Guidance; Revision 1. US Department of Health and Human Services. Bethesda MD 2004 September.
5. Chamberlain R. Computer Systems Validation For The Pharmaceutical And Medical Device Industries. Alaren Press. Libertyville IL 1994.
6. Russell D and Gangemi GT Sr. Computer Security Basics. O'Reilly & Associates. Sebastopol CA 1991.
7. Wingate G (ed). Computer Systems Validation – Quality Assurance, Risk Management, And Regulatory Compliance For Pharmaceutical And Healthcare Companies. CRC Press. Boca Raton FL 2004.