Features
January / February 2019

Data Integrity: Beyond the Lab

Nuala F. Calnan, PhD
James G. Davidson, PhD

The 2018 ISPE Quality Manufacturing Conference, held 4–6 June 2018 in Arlington, Virginia, included a well-attended session entitled “Data Integrity—Beyond the Lab,” which reaffirmed continued focus from both industry and regulators on this critical element of assuring product quality and patient safety.

The program, chaired by James Davidson, PhD, Vice President, Science and Technology, Lachman Consultant Services, Inc., included a variety of perspectives shared by Paula Katz, FDA regulatory attorney and former director of the agency’s Office of Manufacturing Quality; Aidan Harrington, PhD, Senior Consultant, DPS Group; and Nuala Calnan, PhD, Senior Associate, Lachman Consultant Services, Inc. The session concluded with a lively audience discussion, in which contributions from both the podium and the floor confirmed that concerns related to data integrity challenges and risks extend beyond the lab, onto the manufacturing shop floor, and into the supply chain.

Tip of the Iceberg

Katz reminded participants that ensuring data integrity is an important component of industry’s responsibility to ensure the safety, efficacy, and quality of drugs, and of the FDA’s ability to protect public health. Data integrity underpins cGMP, the minimum standard required to assure product quality, and lapses can obscure other problems.1 Data integrity issues unearthed during an inspection raise a red flag about the integrity of other quality practices, the level of control and oversight by management, and the levels of qualification, training, and access of frontline staff who may consciously or unconsciously impact data quality and integrity.

Data integrity continues to be a factor in a significant portion of OMQ Warning Letters (WLs), she said. Sharing FDA data for FY 2017 and Q1 2018, Katz showed that data integrity shortcomings appear in just under 50% of all WLs issued. Furthermore, data for FY 2015–FY 2017 show that the detection of data integrity issues during regulatory inspections continues to rise, confirming that they remain a global challenge for the industry. This is despite the fact that five years have elapsed since the UK’s Medicines and Healthcare products Regulatory Agency (MHRA) announced its expectation regarding self-inspection and data integrity in a December 2013 news item.4

Deep and Wide Perspective

An examination by the authors of recently published data integrity deficiencies from both FDA WLs and EU Statements of Non-Compliance shows that a much broader range of ALCOA+ issues extends well beyond the lab, including:

  • Data quality, security, or integrity issues with batch production and control records
  • Record review practices in general
  • Falsified evidence of management oversight on follow-up and closeout of commitments given in earlier inspections
  • Incomplete or falsified recording, review, and closeout of deviations and investigations in aspects of the pharmaceutical quality system
  • Control and use of automatic, electronic, and computerized systems across the plant
  • Falsified cold-chain records
  • Falsified contamination control data associated with the production areas
  • Falsified cleaning records for production equipment

This litany of failures highlights the need for a broader industry perspective when examining potential weaknesses in production systems and business processes, including an honest review of how those systems and processes actually operate within the organization.

* See “QP Not Present in Company: Prohibition of Supply,” EU NCR Report 2017090955 by the Danish Medicines Agency for EuroPharma. This indicates that inspectors are not only seeking data integrity infringements “beyond the lab” but are also looking beyond the shop floor and into the supply chain. https://www.gmp-compliance.org/gmp-news/qp-not-present-in-company-prohibition-of-supply

The Role of Corporate Culture

It could be said that industry’s response to date has largely focused on weaknesses in the technology platforms and systems in use—most specifically in the quality control laboratory—by driving data integrity programs that focus on gap assessments of physical equipment and computerized systems to identify mitigation requirements. What has perhaps been overlooked in the pressure to complete the assessment work is the significant role that the prevalent culture within an organization can play in identifying and preventing data integrity risks.

Organizational culture directly influences day-to-day behaviors and actions, giving rise to data quality and integrity outcomes that matter to the patient and ultimately to the business. Furthermore, responsibility for the health of an organization’s culture lies firmly with its leadership. When leaders have a clear understanding of the desired culture and behaviors, they can consciously and more effectively influence employees by their own behavior. Leaders can achieve this by how they allow, reward, and model the desired behaviors for their associates.5

One of the first steps for success is to ensure that both corporate and site leadership are aware of and fluent in the increasing regulatory expectation for good data governance. They are then more likely to influence their organization toward the necessary actions. A clearly communicated good data governance program enables the entire organization to understand the desired state of protecting patient safety, ensuring product quality, and understanding the role of data integrity. Leaders should share this message broadly and frequently within the organization, both formally and informally. It is essential that they return frequently to the message to sustain behaviors and reaffirm the importance of data quality to overall product quality.

Leaders are also responsible for promoting an environment that is open and free from blame or fear, where ideas to improve quality and data integrity compliance are welcome, and where employees are not afraid to voice data integrity concerns. Many integrity breaches are not intentional, and if employees discover vulnerabilities, they should not be afraid to raise and address them. This “speak-up” culture is a key success factor that mobilizes the entire workforce to seek out and identify potential data integrity issues. This spreads the burden and increases opportunities for success, rather than leaving the task to the data integrity subject matter expert team.

Data Gemba

During the June conference, Dr. Nuala Calnan presented a very practical and effective way to drive the message right down to the shop floor, the lab bench, and the warehouse: Consider introducing the practice of routine data Gemba walks as a means to discuss and highlight data integrity risks. Gemba is a well-known operational excellence practice (which may be either formal or informal) of regular management visits to the shop floor to observe, assess, listen, and coach employees on issues of quality improvement.

Gemba walks confirm that desired quality behaviors are practiced on a day-to-day basis, and that opportunities for continuous improvement are routinely identified and implemented, as appropriate. They offer a safe way to raise “speak-up” concerns or issues, and maintain focus on the importance of data integrity to overall quality outcomes for the area. Data Gemba can be planned either for a physical area or by walking the data life cycle of a critical record through the facility and engaging frontline staff to share their insights. They offer a much broader, alternative perspective than that found by executing asset register assessments on a system-by-system basis.

The new GAMP® RDI Good Practice Guide: Data Integrity—Key Concepts6 includes a data Gemba checklist template that offers both leader self-learning and coaching questions that can be used during a data-integrity-focused Gemba walk.

Understanding the DAM Data!

A fundamental consideration in the proactive communication of data integrity risks is to ensure that everyone in the organization understands what is meant by the term “data” with respect to good data governance and data integrity. A common problem is that the raw data (or result file) is backed up and available, but metadata and associated audit trail files are not secured as part of routine backups. When we talk about “data,” therefore, it is helpful to think about it as the “DAM” data (raw Data, Audit trail, and Metadata). This can serve as a reminder that ALCOA+ principles should ensure that all aspects of the raw data, audit trail, and metadata are complete, consistent, enduring, and available.

This can be a challenge for many older plant floor computerized systems, where backup and archive procedures may capture raw data, but the metadata and associated audit trail for that record are often stored in different areas of the system architecture or file structure. It’s important to remember that the goal of retaining and securing data is to be able to recreate the associated records; this cannot be achieved if the metadata and the audit trail are not also retained and linked to the raw data.
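As a simple illustration of this point, the sketch below checks that a backup set for a given record contains all three DAM components before the backup is considered complete. It is a minimal, hypothetical example: the file names, folder layout, and BackupSet structure are assumptions made for illustration, not a description of any particular system.

    from dataclasses import dataclass
    from pathlib import Path

    # Hypothetical example: the file names and folder layout are assumptions
    # used only to illustrate the "DAM" completeness check.
    REQUIRED_COMPONENTS = ("raw_data", "audit_trail", "metadata")

    @dataclass
    class BackupSet:
        """The 'DAM' view of a record: raw Data, Audit trail, Metadata."""
        record_id: str
        raw_data: Path
        audit_trail: Path
        metadata: Path

    def verify_dam_backup(backup: BackupSet) -> list:
        """Return the missing DAM components; an empty list means the record
        could, in principle, be reconstructed from this backup."""
        return [name for name in REQUIRED_COMPONENTS
                if not getattr(backup, name).exists()]

    batch = BackupSet(
        record_id="BATCH-0415",                        # hypothetical record ID
        raw_data=Path("backup/0415/result.dat"),
        audit_trail=Path("backup/0415/audit_trail.log"),
        metadata=Path("backup/0415/metadata.xml"),
    )
    missing = verify_dam_backup(batch)
    print("Backup complete" if not missing else f"Missing components: {missing}")

In practice, such a check would also need to confirm that the three components remain linked to one another and readable throughout the retention period, so that the record can actually be recreated when needed.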

Some older SCADA,* building management, and stand-alone systems such as PLC-controlled autoclaves, fridges, and freezers also present challenges in terms of capability to meet audit trail review requirements.

Audit Trail Review Expectations

At the June conference, Dr. Aidan Harrington, Senior Consultant, DPS Group, explained that audit trails need to be “available, convertible into a generally intelligible form, and regularly reviewed.”2, 3 Because audit trails tell us WHO did WHAT, WHEN, they should be capable of doing so automatically and contemporaneously. Harrington also noted that audit trails should ideally also tell us WHY the user undertook the action. In principle, he said, audit trails have two purposes:

  1. They provide a history for the data, which helps decide if the data can be trusted.
  2. They should deter wrongdoing.

Harrington added a cautionary note, however: Without adequate review, audit trails provide no meaningful deterrent. For many of the older systems mentioned above, ensuring that the audit trail is both accessible and available for routine review presents real challenges for industry.
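To make the WHO/WHAT/WHEN (and ideally WHY) structure concrete, the minimal sketch below shows one way an audit trail entry might be represented. The field names and example values are illustrative assumptions only, not a reference to the design of any specific system.

    from dataclasses import dataclass
    from datetime import datetime, timezone
    from typing import Optional

    @dataclass(frozen=True)
    class AuditTrailEntry:
        """Illustrative audit trail record: WHO did WHAT, WHEN, and ideally WHY."""
        who: str                   # user or system account performing the action
        what: str                  # the action taken
        when: datetime             # timestamp captured automatically and contemporaneously
        why: Optional[str] = None  # reason for the action, where the system supports it
        old_value: Optional[str] = None
        new_value: Optional[str] = None

    # Hypothetical entry recorded at the moment the change is made
    entry = AuditTrailEntry(
        who="jsmith",
        what="changed sample weight",
        when=datetime.now(timezone.utc),
        why="transcription error corrected",
        old_value="10.21 mg",
        new_value="10.12 mg",
    )
    print(entry)

The frozen dataclass hints at an important property of any real audit trail: once written, an entry should not be editable by the users whose actions it records.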

In looking across the range of regulatory guidance on audit trails, from 21 CFR Part 11,9 right through to the latest MHRA GxP guidance,7 Harrington pointed to the confusing variety of terms used to describe how frequently audit trails should be reviewed: “regularly,” “adequately,” “periodically,” and “routinely.” Navigating a path through these options requires a robust risk assessment to determine the review period relevant to the intended use of the system in question.

Evaluating the different purposes of such reviews should also be part of the risk assessment. A likely scenario could include routine review of the data audit trail associated with a critical record (e.g., reviewing audit trails for nonconformance events associated with each batch record created). Beyond that, there may also need to be periodic checks of the audit trail or technical system logs, which are random or targeted to confirm correct, ongoing system operation by all user groups who have access to a given system, e.g., user, reviewer, system administrator.

Recent guidance documents7, 8 acknowledge that reviewing audit trails on many legacy systems will present a burden that is not sustainable in the longer term. They recommend that a more appropriate way to manage this burden may be to establish validated exception reports that identify and document “predetermined ‘abnormal’ data or actions, that require further attention or investigation by the data reviewer.”7 The PIC/S guidance goes further to recommend that “companies should endeavor to purchase and upgrade software that includes electronic audit trail functionality.”8 Until such time as the systems in use have been upgraded or replaced, it is important not to neglect the expectations for audit trail review and to implement practical “alternative arrangements to verify the veracity of data, e.g., administrative procedures, secondary checks and controls.”7
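As a rough sketch of the exception-report idea described above, the example below scans audit trail entries against a predetermined list of “abnormal” actions and surfaces only the matches for the reviewer’s attention. The action names and the entry format are assumptions chosen for illustration; a real exception report would be defined, validated, and documented against the specific system it monitors.

    # Predetermined "abnormal" actions that should trigger reviewer attention.
    # These action names are illustrative assumptions, not taken from any guidance.
    ABNORMAL_ACTIONS = {
        "result deleted",
        "audit trail disabled",
        "system clock changed",
        "reprocessing without comment",
    }

    def exception_report(audit_entries):
        """Return only the entries whose action matches a predetermined abnormal action."""
        return [entry for entry in audit_entries
                if entry["action"].lower() in ABNORMAL_ACTIONS]

    audit_entries = [
        {"who": "jsmith", "action": "Result approved", "when": "2018-06-04T10:12Z"},
        {"who": "admin1", "action": "Audit trail disabled", "when": "2018-06-04T11:03Z"},
    ]

    for flagged in exception_report(audit_entries):
        print(f"Review required: {flagged['who']} -> {flagged['action']} at {flagged['when']}")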

* Supervisory control and data acquisition systems

Third-Party and Outsourced Services

Finally, it is crucial that the control measures implemented for critical data are not myopically applied only to systems and personnel within the boundaries of the organization. Given the fragmented and complex nature of current pharmaceutical supply chains, it is essential that traditional supplier quality agreements be updated to reflect clear roles and responsibilities related to each data life-cycle activity. Furthermore, supplier and third-party auditing programs should routinely include evidence of good data governance in the day-to-day practices.

It is clear that the extent and impact of data integrity expectations have well and truly extended beyond the lab. Make sure, therefore, that your efforts across your product life cycles are prioritized according to your actual risks.