iSpeak Blog

Best Practices for “True Copy Verification” with Paperless Validation Systems

Philip Jarvis
Mark Drinan
Dave O’Connor
Chris O’Halloran
R Daniel Harrison
Thao Phan
Digitization

Members of the Paperless Validation Subcommittee created this blog post to discuss and recommend "true copy" verification practices for paperless validation systems that satisfy current data integrity guidance, and to discuss how to eliminate paper fully from validation processes without compromising compliance with current regulations.

As momentum grows in the rollout of digital tools to aid verification and validation in the pharmaceutical space, many questions arise about the correct application of the true copy process to digital evidence within protocols in a paperless validation system. A true copy is an exact copy of original documentation that preserves the same content, meaning, and attributes of the original; it is an electronic copy maintained in an electronic document management system.

A list of the current regulations and guidance used to create this document is outlined below:

  1. WHO Annex 5 - Guidance on Good Data and Record Management Practices
  2. PIC/S - Good Practices for Data Management and Integrity in Regulated GMP/GDP Environments (1 July 2021)
  3. FDA - Data Integrity and Compliance With Drug CGMP: Questions and Answers, Guidance for Industry (December 2018)
  4. MHRA - GXP Data Integrity Guidance and Definitions

This blog post also seeks to provide discussion and guidance on some other topics related to "true copy" practices:

  1. The correct application of second person verification in the true copy process
  2. How to compliantly destroy original copies after true copy verification
  3. How to handle different types of data
  4. Dynamic data requirements

True Copy Verification Practice Guidance: Is 2nd Person Verification Required?

The WHO guidance covers the retention of "true copies" in detail under a section on "special risk management considerations for review of original records". It clearly states that a "2nd person verifier or technical verification process is required to compare the electronic copy to original" and that this 2nd person verification "should be done in a manner that is securely linked to the True copy".

However, this does not align with the PIC/S or MHRA definitions of true copy, and the discussion of practices in those guidances reserves 2nd person verification as a requirement for the transcription of critical data only (i.e., filling in a paper-based batch record or analytical result).

PIC/S (section 5.3.2) also states: "The effort and resource assigned to data governance should be commensurate with the risk to product quality and should also be balanced with other quality resource demands."

Our opinion is that, based on the level of risk of the data, 2nd person verification may not be required. For example, C&Q activities are conducted well before any batch is manufactured, and the level of checks within the C&Q process provides sufficient control of any risk, so 2nd person verification is not required.

Table 1 below outlines a risk assessment for two common CQV data types and the level of controls within the system that ensure a low risk to the patient if data is tampered with after the first-person "true copy" verification.


Table 1: Risk analysis for common CQV Data
| Type of CQV data | Failure mode | CQV process controls | Other QMS controls | Comments | Final risk level |
|---|---|---|---|---|---|
| MOC cert (paper or static PDF record) | MOC is wrong because the certificate is falsified after "true copy" verification, or a compliance observation for data integrity | MOC is reviewed as part of evidence for the IV test case; reviewed by multiple approvers, including Quality, for completeness against acceptance criteria | Preventive maintenance and maintenance procedures; final batch release testing (impurities); paperless validation system audit trail | There are multiple layers of control after the first-person verification of the true copy | Low |
| Temperature trend (CPP) | Incorrect temperature control functionality (batch out of specification), or a compliance observation for data integrity | Temperature trend verification test case (OV); reviewed by multiple approvers, including Quality reviewing all PUR-related data, for completeness against acceptance criteria | Calibration system; review of CPPs for batch release; paperless validation system audit trail | There are multiple layers of control after the first-person verification of the true copy | Low |

As the table above shows, a sufficient level of control can be exerted by verifying that the attachment in the paperless system is scanned in alignment with ALCOA principles, and by the approvers of the verification document again checking, at approval, that the evidence is complete, accurate, and fulfills the acceptance criteria of the verification. This in itself can be counted as a robust control: once stored in the paperless system, the attachment cannot be edited without an audit trail entry, and that audit trail is reviewed as part of verification documentation approval.
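The WHO guidance's alternative to a 2nd person check, a "technical verification process", can be as simple as comparing cryptographic digests of the attachment over time. The sketch below is a minimal, hypothetical illustration in Python (the file names and the use of SHA-256 via `hashlib` are our assumptions, not a feature of any particular paperless validation system): if the digest recorded at upload still matches the stored attachment, the copy has not been altered.

```python
import hashlib


def sha256_of(path: str) -> str:
    """Return the SHA-256 digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()


def verify_true_copy(original: str, stored_copy: str) -> bool:
    """The stored attachment is bit-identical to the original scan
    if and only if the two digests match."""
    return sha256_of(original) == sha256_of(stored_copy)
```

A system could store the digest in the audit trail at upload time, so that any later recomputation that disagrees flags a modified attachment.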

Once CQV documents have been approved, the original records can be destroyed in a controlled manner (see the discussion on destruction below).

However, for higher risk data (i.e., process validation records, and attachments of analytical test results) 2nd person verification may be required, as this type of record poses a greater risk to the patient. This would be in keeping with the regulatory guidance that data integrity practices / controls should be commensurate with the risk of the data.

Destruction Of Original Copies After True Copy Verification

Once true copy verification has been completed, the original records can be destroyed. Based on the regulatory guidances, it is recommended that a site or corporate procedure be created and followed for the destruction of the original records, including a record of destruction.

This would satisfy the requirement of PIC/S section 8.11, "Disposal of original records and true copies", which states: "A documented process for the disposal of records should be in place to ensure that the correct original records or true copies are disposed of after the defined retention period."

One other recommendation is that the company define when the original records are destroyed after true copy verification.

We recommend that for mature organisations, where true copy principles are robust and well applied, the original records may be destroyed immediately after true copy verification is completed. This removes the need to store paper evidence and to manage these hardcopy records under GMP document management.

An alternative, and possibly more conservative, approach applied in industry is to await approval of the associated document (i.e., after post-approval of a CQV document, once all reviewers have reviewed all attached data). This allows referral back to the originals if a true copy query is raised during review.

Types of Data

The PIC/S guideline categorizes data into two categories:

  1. Static data - a record format, such as a paper or electronic record, that is fixed and allows little or no interaction between user and record content.
  2. Dynamic data - records, such as electronic records, that allow an interactive relationship between the user and the record content.

The MHRA guidance introduces the idea of original data (dynamic or static) being:

"The first or source capture of data or information e.g., original paper record of manual observation or electronic raw data file from a computerized system, and all subsequent data required to fully reconstruct the conduct of the GXP activity."

Using these classifications, and the GXP activity to be achieved during verification, Table 2 below gives recommendations on how to approach attaching data into a paperless validation system.

Data can also be categorized according to risk, and most of the guidance documents are aligned on this approach. The PIC/S guidance states: "The Pharmaceutical Quality System should be implemented throughout the different stages of the life cycle of the APIs and medicinal products and should encourage the use of science and risk-based approaches".

This is in line with ICH Q9 principles, and on that basis Table 2 includes a column for the criticality/risk of the data. The general concept applied was the following:

  1. If the risk to the patient from falsification/modification of the data is low (i.e., there are multiple controls in place before product would reach the patient, or the data is not critical to the process), then the data is in a low-risk category.
    An example of this is C&Q data, where batches have yet to be produced and multiple controls mitigate the risk of modified data affecting the patient.
  2. If the risk to the patient from falsification/modification of the data is high (i.e., there are no controls in place before product would reach the patient, or the data is critical to the process), then the data is in a high-risk category.
    An example of this is QC release data, where batches of product are released to the patient based on review of the data.
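As a thought experiment, the two rules above can be expressed as a simple decision function. The sketch below is purely illustrative (the `downstream_controls` count and the threshold of two controls are our assumptions; a real assessment would follow a documented, ICH Q9-style risk process):

```python
def data_risk_category(downstream_controls: int, critical_to_process: bool) -> str:
    """Classify data risk per the concept above: data is low risk only when
    multiple controls sit between it and the patient AND the data is not
    critical to the process; otherwise it is high risk."""
    if downstream_controls >= 2 and not critical_to_process:
        return "low"
    return "high"


# C&Q data: batches not yet produced, several mitigating controls downstream.
cq_risk = data_risk_category(downstream_controls=3, critical_to_process=False)

# QC release data: batches are released to patients based on this data.
qc_risk = data_risk_category(downstream_controls=0, critical_to_process=True)
```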

Dynamic Data Requirements

Section 7.7.3 of the PIC/S guidance states that for dynamic data you "must retain dynamic nature of data" and "include all metadata/audit trails", but it does not discuss how to do this.

However, the PIC/S guidance states (in section 7.7.2), the following:

It is conceivable for raw data generated by electronic means to be retained in an acceptable paper or pdf format, where it can be justified that a static record maintains the integrity of the original data. However, the data retention process should record all data, (including metadata) for all activities which directly or indirectly impact on all aspects of the quality of medicinal products, (e.g., for records of analysis this may include: raw data, metadata, relevant audit trail and result files, software/system configuration settings specific to each analytical run, and all data processing runs (including methods and audit trails) necessary for reconstruction of a given raw data set). It would also require a documented means to verify that the printed records were an accurate representation. This approach is likely to be onerous in its administration to enable a GMP/GDP compliant record.

This is in alignment with the MHRA guidance under section 6.11.2 “True copy”.

To meet this requirement for OV-type verification attachments, such as temperature trends that may come from a validated system (e.g., PI or DeltaV) and have metadata (such as applied filters, scaling factors, and audit trails), we recommend the following practice:

  1. Print out the trend, annotate it if required with the reference to the validated data repository, and attach it as a true copy into the paperless system under the associated verification test case.
  2. Capture in the paperless system the reference needed to recreate the data in the validated system (if not annotated), so that the metadata can be reviewed in the validated system (e.g., an API-based reference or link to the validated system).

For data coming from an unvalidated system, there should be some provision made in the verification protocol for storing and referencing the data so that it can be re-created in the future.
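One way to make such a provision concrete is to store a small structured reference record alongside the static trend, with enough context to re-create the data later. The sketch below is a hypothetical example in Python (the field names, tag identifier, and JSON layout are our assumptions and are not prescribed by any guidance or historian API):

```python
import json
from dataclasses import dataclass, asdict


@dataclass
class TrendReference:
    """Enough context to re-create a trend in the source data system."""
    source_system: str  # e.g., the plant historian or SCADA node
    tag: str            # point/tag identifier of the trended parameter
    start_utc: str      # ISO 8601 start of the trend window
    end_utc: str        # ISO 8601 end of the trend window
    filters: str        # any filtering/averaging applied to the data
    scale: str          # axis scaling used on the static printout


# Hypothetical example values for a temperature trend.
ref = TrendReference(
    source_system="PlantHistorian-01",
    tag="TT-101.PV",
    start_utc="2024-03-01T08:00:00Z",
    end_utc="2024-03-01T20:00:00Z",
    filters="1-minute average",
    scale="0-100 degC",
)

# Serialize for storage alongside the static trend in the test case.
record = json.dumps(asdict(ref), indent=2)
```

Attaching this JSON next to the static printout gives reviewers a precise recipe for regenerating the trend from the source system.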

The scenarios referenced in this section are discussed further in Table 2 below, which covers most scenarios encountered during verification and validation processes.


Table 2 - Examples of evidence and “True Copy” verification methods for Paperless validation systems
| Format of attachment/evidence | Data criticality/risk | Format of record (dynamic/static) | Type of attachment/evidence | GXP activity the attachment is used for | What is considered the original record? | Is true copy verification required? (Y/N) | Method of true copy verification | Justification | Controls in place | Can original record be destroyed? (Y/N) |
|---|---|---|---|---|---|---|---|---|---|---|
| Electronic or hardcopy | Low (far back in the process from the patient; other controls in place to manage risks of contamination from wrong MOC) | Static | PDF of MOC certificate or paper copy | Verification of MOC compliance to engineering standards or acceptance criteria | The PDF file in the engineering system (i.e., EPIC/EIDA) is the original copy | N: the original record is uploaded into the paperless validation system and its content verified there, so the only verification needed is that the scanned copy is accurate and legible | On upload into ETOP in the paperless validation system, the person scanning the original verifies the accuracy of the scan/attachment | The paperless validation system copy is the copy of record used for verification and is retained in a validated system, making it an enduring record; the MOC cert supplied by the vendor cannot be verified as an original copy from the material supplier | Supplier assessment of equipment suppliers ensures their QMS systems are suitable and therefore the certification supplied is not falsified | Y |
| Paper | Low risk (data could be used to verify instrument functionality, but other controls are in place, such as calibrations and daily standards checks in the lab) | Static | Paper printout from a gauge (pH meter in lab); printout may be on thermal paper and not enduring | Verification of gauge functionality, for C&Q purposes | The paper printout | N: only the accuracy of scanning the document into the system needs checking | When the paper copy is uploaded, it is verified by the person scanning the document and attaching it into the paperless validation system | The electronic copy then becomes the enduring record | Verification of accuracy of the copy attached into the paperless system | Y |
| Electronic data | High risk (close to the patient; verification of CPPs) | Dynamic | Trend of parameters (temperature/pressure) | OV verification of functionality (i.e., temperature control) | The data in the validated system (e.g., data historian) | N: a true copy would have to maintain the dynamic format of the data (i.e., filtering and scale), so no true copy should be made | Verification in the paperless validation system provides a reference to the validated data source; a static copy of the trend may be taken to aid review only | By referencing back to the validated data source, the static document can be re-created | Procedures should ensure that static data under review is compared to the validated data source | N: the original data in the electronic repository should be retained |
| Electronic data | High risk (close to the patient; verification of CPPs) | Dynamic | Trend of parameters (temperature/pressure) from an unvalidated source (e.g., unvalidated SCADA before final qualification) | OV verification of functionality (i.e., temperature control) | The data in the SCADA system | N: a true copy would have to maintain the dynamic format of the data (i.e., filtering and scale), so no true copy should be made | Verification in the paperless validation system should reference the original data, and the original data should be re-creatable from it (use a suitable data storage medium, i.e., a backup file of the data on a validated server); a static copy of the trend may be taken to aid review only | By referencing back to the data source, the static document can be re-created | | N: the original data must be maintained as an enduring record in line with ALCOA principles |
| Electronic data | High risk (close to the patient; verification of CQAs) | Dynamic | Analytical test result from LIMS for a process validation protocol | Evidence of successful process validation | The data in the LIMS system | N: not needed, as the original data will always be available; no true copy should be made | Verification in the paperless system should reference the original data, and the original data should be re-creatable from it (use a suitable data storage medium, i.e., a backup file of the data on a validated server); a static copy of the data may be taken to aid review only | By referencing back to the validated data source, the static document can be re-created | | N: the original data must be maintained as an enduring record in line with ALCOA principles |
| Paper | High risk (close to the patient; verification of CPPs) | Static | Analytical test result from the QC lab for a process validation protocol (e.g., a standalone pH meter that does not retain data, where the analog reading is transcribed onto a paper record or printed onto a paper ticket) | Evidence of successful process validation | The paper-based lab record | Y | True copy conducted as part of upload as an attachment into the test script via e-signature; when the paper copy is uploaded, it is verified by the person scanning the document and attaching it into the paperless validation system | | True copy procedure, and instruction in the test protocol | Y |

Conclusion

  1. We recommend that 2nd person verification is not required for true copy, with the exception of "critical data" as discussed above.
  2. Table 2 recommends approaches for "true copy" verification, including whether true copy verification is required, depending on the format and data criticality of the record.

Do your organization's procedures align with the concepts in this blog post, or do they categorize data differently and apply different rules around true copy? The team would like to hear your opinion on the current guidances and whether you agree with our conclusions.