Technical
March / April 2026

Audit Trail Review: Regulation and Practice in GxP Environments

Anna Dachs Soler
Sion Wyn

Since the US Food and Drug Administration (FDA) introduced the requirement for secure, computer-generated, time-stamped audit trails in 21 CFR Part 11, audit trail functionality has been recognized as a critical component of data integrity.

Background

The 2005 US FDA enforcement action against Able Laboratories1 highlighted the importance of reviewing audit trail entries as part of routine data verification, reinforcing that audit trails are not just technical features but essential compliance tools. The action followed internal investigations that uncovered systematic manipulation of analytical data, lack of audit trail review, and improper data verification practices. European Union (EU) GMP Annex 11 (2011) further emphasized that audit trails must be available in an intelligible form and subject to regular, risk-based review.

Yet despite these longstanding requirements, the effective and consistent review of audit trails remains a widespread challenge in GxP environments. Recent guidance from the FDA, the UK Medicines and Healthcare products Regulatory Agency (MHRA), and the Pharmaceutical Inspection Convention and Pharmaceutical Inspection Co-operation Scheme (PIC/S) PI 041, together with industry best practices, including the ISPE GAMP® 5 Guide: A Risk-Based Approach to Compliant GxP Computerized Systems (Second Edition) and the ISPE GAMP® RDI Good Practice Guide: Data Integrity by Design, clarifies that meaningful audit trail review must be integrated into the data life cycle and routine operation, not treated as an isolated or box-ticking activity. These regulatory and industry documents emphasize this need through detailed requirements on audit trail relevance, periodicity, and scope of review.2, 3

ISPE GAMP® 5 Guide (Second Edition) makes a critical distinction between data audit trails (linked to GxP-relevant records) and system or technical logs and promotes the use of validated review by exception tools to increase efficiency and reliability. To ensure a complete understanding of audit trail review tools, it is important to clarify the concept of exception reports. According to the MHRA Data Integrity Guidance and Definitions3, an exception report is: “A validated search tool that identifies and documents predetermined ‘abnormal’ data or actions, that require further attention or investigation by the data reviewer.”

This definition highlights the need for system-generated tools that help proactively detect potential data integrity issues and trigger timely review actions.

This article, grounded in real-world experience supporting small and mid-sized pharmaceutical and biotech companies, explores practical barriers, recurring misconceptions, and applied strategies to align audit trail review practices with regulatory expectations. It draws from key industry publications and regulatory guidance, offering a balanced and practical perspective to support compliance and operational maturity. The article also integrates lessons learned from GAMP Community insights and real-world implementation experiences. The intent is to support GAMP’s mission to align practice with principle, especially in resource-constrained settings.

Audit Trail Review: What Regulators Ask vs. What Industry Applies

Audit trail requirements are embedded in foundational GxP regulations and guidelines, such as 21 CFR Part 11 Subpart B (11.10(e)) in the US, EU GMP Annex 11 (Section 9), and PIC/S PI 041-1 (Section 9.6)1, 3, 6. These documents define audit trails as secure, computer-generated records that must:

  • Capture the who, what, when, and (where applicable) why of data actions
  • Be time-stamped, protected, and tamper-evident
  • Be reviewed as appropriate and necessary and retained for the entire retention period of the associated electronic record

However, although the regulations mandate the existence and review of audit trails, they often do not provide specific operational guidance on how to conduct those reviews meaningfully or efficiently. That role has been taken on by industry frameworks such as ISPE GAMP® 5 (Second Edition) and the ISPE GAMP® RDI Good Practice Guide: Data Integrity by Design, which offer detailed strategies for aligning system functionality with compliance needs. For example, GAMP guidance distinguishes clearly between data audit trails and system technical logs, emphasizing that only audit trails associated with GxP-relevant records require routine review.4, 5

In addition, GAMP guidance, based on regulatory guidance from FDA and MHRA, recommends that audit trail review should be part of the routine data review and approval process, and that personnel responsible for record review should review the relevant audit trail as part of the review. It also promotes the use of review by exception. This is supported by validated search functions that automatically flag deviations or abnormal events—thus avoiding unnecessary manual effort and reducing the risk of oversight4, 5. Despite the existence of this practical guidance, it is evident from industry experience and regulatory observations that companies still struggle to apply these principles consistently. The gap lies not in the lack of regulation or guidance, but in the translation of those principles into practical, everyday procedures.
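As an illustration only, the review-by-exception concept can be sketched as a simple filter over audit trail entries. The entry fields, action names, and flagging rules below are hypothetical assumptions, not taken from any specific system; in a regulated environment, the equivalent search tool and its criteria would be defined, documented, and validated per the organization's procedures.

```python
# Hypothetical predetermined "abnormal" actions; a real exception report
# defines and validates these criteria during system implementation.
ABNORMAL_ACTIONS = {"delete", "modify_after_approval", "manual_reintegration"}

def exception_report(audit_entries):
    """Return entries matching predefined 'abnormal' criteria for reviewer attention."""
    flagged = []
    for entry in audit_entries:
        if entry["action"] in ABNORMAL_ACTIONS:
            flagged.append(entry)
        elif entry.get("old_value") is not None and entry.get("reason") in (None, ""):
            # A change to an existing value without a documented reason
            flagged.append(entry)
    return flagged

entries = [
    {"user": "analyst1", "action": "create", "record": "LAB-001",
     "timestamp": "2026-03-01T09:12:00", "old_value": None, "reason": None},
    {"user": "analyst2", "action": "modify", "record": "LAB-001",
     "timestamp": "2026-03-01T10:05:00", "old_value": "98.1", "reason": ""},
    {"user": "admin", "action": "delete", "record": "LAB-002",
     "timestamp": "2026-03-02T14:30:00", "old_value": "99.0", "reason": "duplicate"},
]

for e in exception_report(entries):
    print(e["user"], e["action"], e["record"])
```

The point of the sketch is that the reviewer's attention is directed only to the flagged entries, rather than to every line of the audit trail.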

Regulators such as the FDA and MHRA have echoed these expectations in their guidance documents:

  • The FDA states “Audit trails that capture changes to critical data must be reviewed with each record and before final approval of the record”2
  • The MHRA specifies that audit trail review should be “performed as part of the review of the record and not as a separate exercise”3
  • PIC/S PI 041-1 calls for the identification of “audit trail entries of significance for review” and stresses the need for integration into the routine workflow6

In practice, however, many companies:

  • Review technical system logs indiscriminately, confusing them with GxP audit trails
  • Lack clear standard operating procedures (SOPs) defining review frequency, scope, and criteria
  • Do not leverage system capabilities for filtering, flagging, or documentation

As a result, the process either becomes an excessive burden or a superficial check, neither of which fulfills the regulatory expectation of meaningful review. The challenge, then, is not just in meeting the letter of the regulation, but in achieving data integrity and data quality in a way that is efficient, risk-based, and sustainable. The following sections explore these operational gaps further, supported by field examples and recommendations to help organizations close the persistent gap between regulation and practice.

System vs. Data Audit Trails: Why This Distinction Matters

In practice, one of the most common misinterpretations observed during audits and implementation projects is the failure to differentiate between types of audit trails. This may seem like a technical detail, but it has real consequences on compliance, efficiency, and data integrity assurance. Some things that may be referred to as “audit trails” are not what regulatory agencies expect from a regulated data audit trail review.


A data audit trail records the actions that impact GxP-relevant data, such as creation, modification, or deletion of a laboratory result or a manufacturing entry. These are the audit trails that, by regulatory expectation, must be reviewed as part of the record verification process. On the other hand, system audit trails or technical logs capture system-level events, including user logins, password changes, configuration adjustments, or role management. Although important for security and control, these are not typically required for routine review unless defined by risk-based processes.
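The distinction above can be illustrated with a minimal classification sketch. The event names and record fields below are hypothetical assumptions for illustration; a real system defines its own event taxonomy, and the split between routine-review entries and system-level events would follow a documented, risk-based assessment.

```python
# Hypothetical event names; actual taxonomies are system-specific.
DATA_ACTIONS = {"create", "modify", "delete"}            # act directly on GxP records
SYSTEM_EVENTS = {"login", "logout", "password_change",
                 "config_change", "role_assignment"}     # system-level events

def split_entries(entries):
    """Separate entries requiring routine data review from system-level events."""
    data_trail = [e for e in entries
                  if e["event"] in DATA_ACTIONS and e.get("record_id")]
    system_log = [e for e in entries if e["event"] in SYSTEM_EVENTS]
    return data_trail, system_log

log = [
    {"event": "login", "user": "analyst1"},
    {"event": "modify", "user": "analyst1", "record_id": "BATCH-042"},
    {"event": "config_change", "user": "admin"},
    {"event": "create", "user": "analyst2", "record_id": "LAB-007"},
]

data_trail, system_log = split_entries(log)
print(len(data_trail), "data audit trail entries for routine review")
print(len(system_log), "system log entries for risk-based or periodic handling")
```

Only the entries tied to a GxP record land in the routine review stream; the system-level events are routed to separate, risk-based processes such as periodic review or incident investigation.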

Unfortunately, in some organizations, all “audit trail” entries are treated equally. Reviewers are expected to examine system logs with the same rigor as data entries, often without adequate tools or context. This leads to either:

  • Exhaustive and low-value reviews of nonrelevant information, or
  • Superficial reviews that overlook critical changes to regulated records

This confusion can usually be traced back to SOPs that do not define review scope clearly, or to a lack of alignment between quality assurance (QA), IT, and business users. ISPE GAMP® 5 (Second Edition)4 makes this distinction explicit, advising that only audit trails linked to predicate records (the subject of GxP regulations) require routine review. Similarly, PIC/S PI 041-16 advises organizations to assess which audit trails are significant and relevant, and to define appropriate review strategies based on data criticality. The purpose here is not to reduce oversight, but to ensure that effort is focused where it matters most. In regulated environments, applying a one-size-fits-all approach to audit trail review wastes resources and also obscures true compliance risks. A mature approach requires training, system configuration, and procedural clarity. Distinguishing between data and system audit trails is a foundational step toward meaningful, risk-based data integrity management.

Table 1: Common situations and targeted recommendations.
Real Issue | GxP Risk/Concern | Recommended Action
Legacy system without audit trail functionality | The bigger risk to patients is the inability to identify and review changes in the data to understand their potential impact on product quality | Plan for system replacement or retirement; migrate to a compliant solution with validated audit trail functions
Audit trail exported and reviewed manually | Patient safety is the primary concern with the loss of data integrity and the lack of review traceability | Perform review within a validated environment; enable electronic review tracking in system if possible
Reviewers checking login history routinely | Misuse of resources; not required unless risk-justified | Restrict to investigation scenarios or periodic system-level review per risk assessment
Configuration change logs reviewed without clear criteria | Unclear linkage to change management, engineering, administration, or other control processes | Ensure configuration changes are managed through established change control processes
System generates thousands of entries with no filtering | Overwhelming review volume, reduced effectiveness | Implement exception-reporting or risk-based sampling strategies
Reviewers unclear on what to check | Inconsistent review, audit readiness gaps | Provide targeted training; define clear criteria and examples in SOPs
Use of hybrid, non-validated review methods (e.g., printouts, Excel) | Adds complexity and is error prone; noncompliance with ALCOA+ | Avoid hybrid methods; review and document findings within validated systems only

For example, some companies still perform routine review of system logs such as login history, treating it with the same level of criticality as data audit trails. This practice raises an important question: what value does reviewing login records on a periodic basis bring to data integrity assurance? According to regulatory guidance and GAMP principles, such technical logs are not required to be reviewed routinely or periodically unless justified by risk. Their purpose is better aligned with security investigations, incident management, or demonstrating systems are under control during periodic review, but not with ongoing routine GxP data review processes.

Similarly, the review of system audit trails to detect configuration changes, such as updates to timeout settings or security parameters, should be governed by the organization’s change control procedures. If a change or other activity is controlled through an effective change control, configuration management, administration, engineering, or maintenance process, the existence of a corresponding system log entry, even if called a “system audit trail,” may serve as supporting evidence; however, it does not imply that this log must be reviewed periodically. Reviewing such entries routinely does not add value. The focus and effort should be on ensuring that such processes are effective and robust.

In both scenarios, clarity in procedures and alignment with risk-based principles are essential. Organizations should ensure that their SOPs reflect a pragmatic and value-driven approach to data audit trail review and other system log reviews, avoiding unnecessary effort while maintaining control and compliance and managing quality risk.

Persistent Gaps Observed in Practice

Although regulatory guidance and GAMP principles are clear, there remains a significant gap between what is expected and what is consistently applied in the field. This section highlights recurring gaps observed during audits, consulting work, and informal discussions with QA and IT professionals in GxP-regulated environments.

A frequent issue is the existence of SOPs that require “audit trail review” without providing specific instructions or criteria. Terms like “review audit trail” or simply “review” are often used generically, creating significant confusion for those responsible for the task. Reviewers are left uncertain about what type of activity is expected: whether the requirement refers specifically to the data audit trail review mandated by regulations, to other system or technical log reviews, or to a broader, more general verification activity. This lack of precision represents a persistent gap in practice, as the reuse of the term review without explanation or definition results in inconsistent approaches. Some reviewers may miss critical regulatory checks, whereas others spend time on irrelevant system records.

Moreover, even when organizations define in detail what should be reviewed, certain checks may not make sense, depending on system design. For example, some SOPs include requirements to verify audit trail entries related to data reprocessing. However, if a system does not technically allow reprocessing of results, such a review point adds no value and only creates confusion. This illustrates that definitions should be clear and also tailored to the actual capabilities and risks of the system in scope.

Another gap lies in the lack of reviewer training. Individuals tasked with “audit trail review” often lack a full understanding of the data life cycle, the purpose of data audit trails, or the system’s technical capabilities. As a result, reviewers may miss critical data changes. Conversely, they may over-focus on irrelevant system logs or technical records not intended to serve as audit trails by the regulations.

From a systems perspective, many platforms still fall short in providing tools to support meaningful audit trail review. Systems that do not allow filtering or keyword search may make it extremely difficult to carry out a focused and efficient review. In some cases, companies resort to printing PDF exports or scanning raw log files manually. These practices undermine data integrity and also introduce unnecessary complexity and room for error.

Another commonly observed challenge is the persistence of hybrid or workaround approaches. For instance, in some environments, audit trails are exported and reviewed outside the validated system, with reviewers annotating printed copies or spreadsheets. Although perhaps well-intentioned, such practices are complex, error prone, and highly vulnerable to manipulation. They are also not aligned with current data integrity expectations outlined in PIC/S PI 041-16 and GAMP guidance.4 Moreover, exporting audit trails often breaks the linkage to the associated data records. In well-designed systems, audit trail entries are relationally linked to the corresponding record, allowing reviewers to trace the impact of changes directly and in context. Once exported, this relationship is lost, making it harder to verify the integrity of the data or understand the significance of modifications without full system visibility.

Finally, system design limitations remain a significant root cause. Some “audit trail” modules generate thousands of entries per month across fragmented files, the majority irrelevant to data integrity or quality, making meaningful manual review impossible. In these cases, companies must apply risk-based sampling strategies or invest in exception-reporting tools to ensure that review activities remain focused and value-driven. Ultimately, the gaps observed are not due to a lack of intention. Rather, they result from a combination of system constraints, procedural vagueness, unclear objectives, and underestimation of the complexity involved in reviewing such information, whether in actual data audit trails or in the many and varied system and technical logs.

This article contributes recent examples and perspectives gathered from implementation support and field observations, with the intent of reinforcing a shared understanding across the community.

Practical Examples From the Field

Real audit trail scenarios encountered in regulated environments often vary by system maturity, available functionality, and interpretation of expectations. Table 1 summarizes common situations and provides targeted recommendations, particularly in the context of system audit trail review during periodic assessments.

This approach reinforces that not all audit trails require the same review frequency or depth. When organizations decide to review system logs, such as login events or configuration changes, as part of their periodic review strategy, this activity should be understood not as a direct control of product-related data, but rather as a way to ensure the system remains in a validated state. This aligns with the principles in ISPE GAMP® 5 (Second Edition)4 and PIC/S PI 041-16. For example, when configuration changes are reviewed during periodic audit trail reviews, it is not to reapprove changes (as that is handled via change control), but rather to confirm traceability and that the change control process is consistently followed.

In situations where audit trail exports, which in fact are system logs, result in large numbers of fragmented files (e.g., monthly CSV logs across multiple instruments), full review is not feasible due to system limitations or other reasons. Organizations may define risk-based sampling strategies internally. To do this, records are reviewed on a sample basis and selected by subject matter experts applying critical thinking and considering several factors, including system risk, complexity, size, novelty, relevant incidents, and operational and change history. These system “audit trail” reviews are distinct from routine second-person verification of data and should be defined separately in internal procedures.
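A risk-based sampling approach of this kind can be sketched in simplified form. The factor names, scoring weights, and sampling fraction below are purely illustrative assumptions; in practice, sample sizes and selection criteria would be justified by a documented risk assessment and applied by subject matter experts.

```python
import random

def sample_size(risk_factors, total_files):
    """Scale review coverage with a simple risk score (illustrative weights only)."""
    score = sum(risk_factors.values())  # each factor scored 1 (low) to 3 (high)
    fraction = min(1.0, 0.05 * score)   # e.g., a score of 8 implies reviewing 40% of files
    return max(1, round(fraction * total_files))

def select_sample(files, risk_factors, seed=None):
    """Draw a reproducible random sample sized by the risk score."""
    rng = random.Random(seed)           # seeded so the documented selection can be reproduced
    n = sample_size(risk_factors, len(files))
    return sorted(rng.sample(files, n))

# Hypothetical risk factors and monthly log files for a single system
factors = {"system_risk": 2, "complexity": 3, "recent_incidents": 1, "change_history": 2}
monthly_logs = [f"hplc_{i:02d}_2026-03.csv" for i in range(1, 21)]
print(select_sample(monthly_logs, factors, seed=42))
```

Seeding the random selection is a deliberate design choice here: it makes the sample reproducible, so the basis of the review can be documented and defended later.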

This perspective complements existing guidance and highlights how audit trail review, when applied correctly, contributes to a defensible and sustainable data integrity model. Recommendations for GxP-regulated environments are the following:

  • Begin with process analysis: Identify where critical data are generated or modified, assess the risk of each business process, and determine where audit trail review is necessary. This ensures that SOPs focus on areas of highest impact and relevance
  • Define SOPs clearly: Specify what to review, when, and by whom. Include review of data audit trails in data verification; define periodicity for system audit trails
  • Train reviewers: Focus on data life cycle, system behavior, and what constitutes a red flag
  • Specify system capabilities in requirements specification/procurement: Filtering, keyword search, flagging changes, electronic review documentation
  • Use exception reporting where validated and appropriate

The Role of Suppliers

Audit trail review is not just a procedural or operational challenge; it is also a design and capability issue. Designing systems that enable effective audit trail review is more than a convenience; it is a regulatory and business imperative, and suppliers of GxP-regulated software play a foundational role in ensuring that systems support data integrity by design.

Figure 1

Many of the challenges surrounding audit trail review originate from procedural or training gaps or from system design limitations. If a software solution does not offer intuitive, searchable, and context-rich audit trail functionality, such as user-friendly filtering, timestamp clarity, and clear linkage to data records, even highly trained reviewers will encounter difficulties performing compliant and efficient reviews that meet regulatory expectations.

To support meaningful and efficient audit trail review, software vendors should consider the following points (see Figure 1):

First, these recommendations are more than technical. They reflect a quality by design (QbD) approach to software development that anticipates user needs and regulatory scrutiny. As PIC/S and GAMP emphasize, audit trail review requires more than simply having a log; the log must support defensible, efficient, and meaningful review.

Second, manufacturers should consider including use cases for audit trail review as part of the requirement specifications and trace those through design, development, and validation. Involving QA and end users early in the specification phase helps ensure that the audit trail functionality is usable in practice, and not just compliant on paper. Third, by supporting smarter design, suppliers help enable safer processes, better product quality, and ultimately, stronger regulatory compliance and smoother inspections.

Bringing Practice Closer to Principle

Audit trail review is not a checkbox. It is a critical quality step that ensures data reliability, traceability, and regulatory compliance. But for it to be effective, organizations must distinguish between types of audit trails, apply fit-for-purpose procedures, and invest in system capabilities that enable meaningful review. Suppliers, QA professionals, and system owners each have a role to play. By aligning system design, procedures, and training with ISPE GAMP® 5 (Second Edition) and regulatory guidance, we can bridge the gap between policy and practice and build stronger, more trustworthy digital environments for GxP data.7

Conclusion

This article examines the persistent gap between regulatory requirements and practical implementation of audit trail review in GxP environments. First, the distinction between data audit trails and system/technical logs is fundamental. Organizations must recognize that only audit trails linked to GxP-relevant records require routine review as part of data verification, whereas system logs serve different purposes and should be managed through separate, risk-based processes. Second, meaningful audit trail review requires clarity at three levels: procedural (well-defined SOPs with specific criteria), technical (systems with adequate filtering, search, and documentation capabilities), and organizational (trained personnel who understand both the regulatory intent and system functionality).

Third, suppliers bear significant responsibility in enabling effective audit trail review through QbD principles. Systems that lack intuitive, context-rich audit trail functionality will continue to create compliance challenges regardless of procedural improvements. Finally, the evolution from paper-based to digital environments demands a corresponding evolution in how we approach data integrity controls. Audit trail review is not a retrospective check but an integral part of the data life cycle that ensures reliability, traceability, and ultimately, patient safety. By aligning system capabilities, procedures, and training with current regulatory guidance and GAMP principles, organizations can transform audit trail review from a compliance burden into a value-adding quality activity.8
