March / April 2021

The History & Future of Validation

Anthony J. Margetts, PhD
Line Lundsberg-Nielsen, PhD

Across every industry today, digitalization is driving the use and value of data to disrupt traditional business models and ways of working. In pharmaceuticals, the capabilities promised by Industry 4.0 are expected, and needed, to finally modernize the legacy approaches that have evolved since the 1970s. Validation is an obvious target for digital disruption because of the inefficient, document-heavy methods in place, the huge costs and time they waste, and because validation as practiced is a barrier to efficient and effective technologies that can advance safer, better-quality products. This article reflects on the history of validation and anticipated future directions.

The lead author of this account has used personal experiences to help tell the story. For this reason, the article uses the first person in parts of the narrative.

The First 50 Years

This history begins with the perspective of a leading figure in validation, James Agalloco, who just achieved a great milestone: four decades of being involved with ISPE. He has stated that the origins of validation in our industry can be traced to terminal sterilization process failures in the early 1970s.1 One case was the 1971 Devonport incident, in which a batch of 5% dextrose IV bottles that were not correctly sterilized reached the market and were administered to patients. Sadly, five patients at a Devonport, England, hospital died after receiving the contaminated solution.2 I knew the manager involved, and such tragedies refocused everyone in the industry on the fundamental importance of the safety of our drug manufacturing processes.

The first UK “Orange Guide,” titled “Guide to Good Pharmaceutical Manufacturing Practice,” was published in 1971. The edition released in 1983 included wording on validation. Today, the UK Orange Guide covers EU GMP, rather than British GMP.3 Such international efforts have encouraged the standardization of regulations.

In the US, the GMPs for drugs (21 CFR Parts 210 and 211) and medical devices (21 CFR Part 820) were first published in 1978 and, like the Orange Guide, included validation as a central term in 1983. Current versions of the GMPs are available from the US FDA website.4

At the Parenteral Drug Association Annual Meeting in 1980, Ed Fry of the US FDA gave a talk titled “What We See That Makes Us Nervous,” in which he expressed the need to improve pharmaceutical manufacturing processes. The FDA recognized that processes were not robust, and throughout the 1980s, the regulators considered how to make companies validate their processes more effectively and published a series of seminal guidance documents, such as the 1983 guide to inspection of computerized systems in drug processing.5 The FDA’s discussions included concepts of scientific understanding based on process development. Despite these discussions, when the FDA published the “Guideline on General Principles of Process Validation” in 1987, the guideline did not mention the design of the process.6

In 1984, however, Ken Chapman published a paper about process validation,7 which introduced the life-cycle concept and explained that the ability to successfully validate commercial manufacture depends on knowledge from process development. Chapman was also very active in the early days of computer validation, and he developed the idea that a computerized system consists of software, hardware, operating procedures, people, and equipment—and sits in an operational environment that has to be managed. This model is very important and relevant today.

In 1987, with increased understanding that computer systems were being used in manufacturing, the US FDA sent four inspectors to a master of science program in applied computing at the University of Georgia, Athens. In 1991, an FDA inspector visited Glaxo and Imperial Chemical Industries Pharmaceuticals manufacturing sites in the UK and Italy and, for the first time, the regulators raised concerns about the lack of validation of computer systems. These inspections led to the formation of the GAMP® Community of Practice to develop an industry-wide response to meet the US FDA’s expectations. (For a history of GAMP, see reference 8.)

Table 1: Stages in US and EU guidance on the process validation life cycle.

Stage | US | EU
1 | Process design | Pharmaceutical development or process design (ICH Q8)
2 | Process qualification (PQ) | Qualification and validation
2.1 | Qualification of equipment and utilities: installation qualification (IQ), operational qualification (OQ), performance qualification (PQ) | Qualification (Annex 15)
2.2 | Process performance qualification (PPQ) | Process validation (PV) or continuous process verification (CPV)
3 | Continued process verification (CPV) | Ongoing process verification (OPV)

In the early 1990s, the FDA launched their preapproval inspections to affirm that commercial materials had their basis in the pivotal clinical trial process and materials. I had the experience of witnessing an inspector stop an audit because we could not demonstrate that the process being operated was the one used for the clinical trials. In the same inspection, the inspector asked specifically for validation plans and validation summary reports, now considered a central element of the quality system needed for manufacture of drug products.

A sequence of FDA investigations of Barr Laboratories that started in 1989 became a huge problem for the company, as inspectors repeatedly observed Barr’s failure to follow cGMPs while the company disputed those findings. Ultimately, the conflict landed in the US District Court for the District of New Jersey. In the 1993 case, United States v. Barr Laboratories, Inc., Judge Alfred Wolin declared that process validation is required by GMPs.9

In 2004, the FDA published “Pharmaceutical cGMPs for the 21st Century—A Risk-Based Approach.”10 This included a reference to the revised compliance policy guide (CPG) for process validation.11 Then, in 2011, 30 years after Ed Fry raised concerns and 25 years after Ken Chapman published his paper, the FDA published “Guidance for Industry: Process Validation: General Principles and Practices.”12 In this guidance, the FDA adopted a life-cycle approach, structuring process validation in three stages: Stage 1, Process Design; Stage 2, Process Qualification; and Stage 3, Continued Process Verification.

Between 2005 and 2009, the International Council for Harmonisation (ICH) produced a series of quality guidelines emphasizing the importance of pharmaceutical development, the life cycle, and the framework of quality risk management:13

  • ICH Q8 Pharmaceutical Development (2005; minor updates 2009)
  • ICH Q9 Quality Risk Management (2005)
  • ICH Q10 Pharmaceutical Quality System (2008)

Among the ICH quality guidelines, Q6 (1999), Q7 (2000), Q9, and Q10 specifically require assessment and approval of suppliers. Use of approved suppliers is an important part of the quality process. Q7 covers the life-cycle approach for active pharmaceutical ingredients.

In 2007, the American Society for Testing and Materials (ASTM), with ISPE involvement, published the standard ASTM E2500, Standard Guide for Specification, Design, and Verification of Pharmaceutical and Biopharmaceutical Manufacturing Systems and Equipment.14 This standard introduced a risk-based approach to qualification of unit operations in GMP manufacturing that leverages engineering activities to reduce qualification risk.

In 2015, Annex 15: Qualification and Validation was published as part of the EU Guidelines for Good Manufacturing Practice for Medicinal Products for Human and Veterinary Use.15 The next year, the EMA published two process validation guidelines.16,17 These guidelines used a life-cycle approach similar to the FDA’s; however, the staging terminology varies (see Table 1).

In FDA guidance, activities covered by “continued process verification” include routine monitoring of process parameters, trending of data, change control, retraining, and corrective and preventive actions (CAPA). In EMA definitions, “continuous process verification” is an alternative to traditional process validation in which process performance is continuously monitored and evaluated.
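The trending activity named in the FDA's continued process verification stage can be illustrated with a minimal control-limit sketch. The assay values, the ±3-sigma rule, and the function names below are hypothetical illustrations, not taken from any guidance:

```python
from statistics import mean, stdev

def control_limits(history, k=3.0):
    """Derive mean +/- k*sigma control limits from historical batch results."""
    m = mean(history)
    s = stdev(history)
    return m - k * s, m + k * s

def check_batch(value, limits):
    """Return True if a new batch result falls within the control limits."""
    lo, hi = limits
    return lo <= value <= hi

# Hypothetical assay results (%) from 20 historical batches
history = [99.8, 100.1, 99.9, 100.3, 99.7, 100.0, 100.2, 99.6,
           100.1, 99.9, 100.0, 100.4, 99.8, 100.2, 99.9, 100.1,
           99.7, 100.0, 100.3, 99.9]

limits = control_limits(history)
print(check_batch(100.1, limits))  # within limits -> True
print(check_batch(102.5, limits))  # far outside -> False
```

In practice, a CPV program would apply richer statistical process control rules (run rules, trend rules) rather than a single limit check, but the principle of deriving limits from historical performance and flagging excursions is the same.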

At the same time that regulatory authorities were producing guidelines and standards, the pharma industry and others introduced many improvement initiatives, including operational excellence, lean manufacturing, and Six Sigma. Around the world, companies outside of pharma adopted ISO 9000 quality management standards18 as a basis for their quality system improvements and saw the benefits in their supply chains. Some companies recognized the value of understanding the process as part of validation, but this stood in stark contrast to much of the pharma industry, where most did not see process validation as a benefit: they saw only the necessity to perform three consecutive process validation batches and document that performance.

Throughout the early decades of validation history, I watched the battles between regulatory teams trying to get processes registered with as much information as possible, and production teams that did not want to be too specific because they knew that they might fail in process validation, or later during commercial manufacturing. Much of the resistance to specificity stemmed from the burden of filing regulatory variances for what should be minor process changes operating as part of continuous improvement.

Since the new millennium, with the help of the FDA process analytical technology (PAT) initiative and ICH, more of us in the pharma industry have realized the importance of process development, risk assessment, and process understanding, and have come to understand that allowable limits for critical quality attributes (CQAs) and critical process parameters (CPPs) can establish a rational validation framework to help manufacture safe and effective products reliably.

In the era of science-based process validation and personalized medicine, the number of process performance qualification or process validation (PPQ/PV) batches must be justified for small molecules, large molecules, and advanced therapy medicinal products. We now realize that these processes require real-time monitoring of each batch to maintain them in a state of control. Fortunately, the EMA has stated that continuous process verification may provide a practicable method of managing batch-to-batch consistency, quality assurance, and quality control.16

ISPE’s Contributions

No history of validation can overlook the significance of ISPE’s role in establishing GAMP and commissioning and qualification (C&Q) concepts.


GAMP

GAMP introduced a number of concepts that are important in validation today:

  • The life-cycle model concept, which is now seen as fundamental for process validation.
  • The expectation to see validation activity defined upfront in validation plans and closed off by formally signed validation reports produced by the regulated company.
  • The concept of the user requirement specification (URS) as a basis of qualification. This was developed further by ASTM E2500 and by the ISPE commissioning and qualification guide.14,19
  • The concept of using approved suppliers, introduced in 1994.
  • The concept of risk assessment, introduced in 2001.
  • The V model to link specifications to verification, introduced in 1994. At that time, some companies wrote installation qualification (IQ) and operational qualification (OQ) documents that did not refer to any specifications. This link between specifications and verification is an important part of validation today.
  • Key terms to help to focus risk assessment, including patient safety, product quality, and data integrity. In 2017, GAMP published an important guide dealing with data integrity,20 which is a fundamental part of process validation.

C&Q Concepts

The ISPE Baseline Guide Vol. 5: Commissioning and Qualification, originally published in 2001, was revised in 2019.19 The guide describes how systems are commissioned and how critical aspects (CAs) and critical design elements (CDEs) are qualified. Critical aspects and critical design elements are linked to CQAs and CPPs. Facilities, equipment, and systems supporting processes should be qualified using these concepts to reduce the burden of non-quality-impacting documentation and repeat testing that were notable in the past.

Key aspects of C&Q include:

  • Commissioning is executed and documented according to good engineering practice (GEP).21
  • Good engineering practice verifies that the URS requirements are all incorporated, have been approved in the design review, and have been tested and documented as working in the acceptance and release report or qualification report.
  • Under good engineering practice, everything is tested to ensure the system is fit for purpose.
  • Systems are 100% tested (GEP) during commissioning, with approximately 10% of testing focused on the CAs/CDEs for qualification.
  • The focus for qualification is on robust testing and documentation of the CAs/CDEs, as appropriate to the level of risk controls applied.
  • Lists of tests, test scripts, acceptance criteria, and traceability are all covered by good engineering practice.
  • Computer systems controlling equipment are qualified with the equipment.
  • The commissioning and qualification guide is clear that quality does not approve commissioning documents; quality approves the commissioning and qualification plan and the acceptance and release report.
  • Typically, major pharmaceutical companies cover all the engineering associated with a new project in one commissioning and qualification plan and one final acceptance and release report, so the role of quality assurance is limited to approving these documents and relying on approved subject matter experts who oversee the qualification work.
  • Much of the qualification supporting data can be provided by approved suppliers. Supplier assessment is an important step in deciding the validation strategy, and the validation plan should refer to the use of supplier qualification practices as much as possible.

Looking Forward

The following are important to incorporate into the proposed new “Validation 4.0” framework that will enable Industry 4.0 changes in the pharmaceutical industry.

Leveraging the Product Life Cycle

The life-cycle model concept builds on the importance of data from pharmaceutical development as a foundation for process validation. Requirements are an output from development and are needed as a baseline for everything—including processes, facilities, utilities, systems, and equipment—to define the CQAs, CPPs, CAs, and CDEs so that these can be verified later. Requirements can be treated as processes and understood more clearly by describing them with illustrative process maps. Processes are further detailed using data maps showing the flow and relevance of information at each step and activity across the end-to-end product life cycle.

Risk Assessment and Controls at Design

This part of the Validation 4.0 framework focuses on aspects of the process or system that are important to patient safety, product quality, and data integrity, and it allows the validation effort to be focused on critical areas.

Process and data maps are used to better understand the risks to the process, and the risks to data. Risk assessment and controls analysis should be started as early as possible during process and system development and specification. The control strategy is an important part of the design, and doing this work early allows for generation of suitable options that lower risk and a clear identification of the data that must be measured to ensure the state of control. Risk assessment can be used to evaluate data integrity to show where controls are needed to ensure that processes are operating correctly.
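As an illustration of how risk assessment can focus validation effort on critical areas, an FMEA-style scoring sketch ranks failure modes by a risk priority number. The failure modes, the 1-5 scales, and the scores below are entirely hypothetical examples, not drawn from any guidance:

```python
def risk_priority(severity, occurrence, detectability):
    """FMEA-style risk priority number (RPN) on hypothetical 1-5 scales."""
    for score in (severity, occurrence, detectability):
        if not 1 <= score <= 5:
            raise ValueError("scores must be on a 1-5 scale")
    return severity * occurrence * detectability

# Hypothetical failure modes with illustrative scores
risks = {
    "sensor drift": risk_priority(4, 3, 2),
    "manual data entry error": risk_priority(3, 4, 4),
    "filter integrity failure": risk_priority(5, 1, 1),
}

# Focus validation and control effort on the highest-scoring failure modes first
for name, rpn in sorted(risks.items(), key=lambda kv: kv[1], reverse=True):
    print(name, rpn)
```

Whatever scoring scheme is used, the point made in the text stands: performing this analysis early, against the process and data maps, identifies where controls and data collection are needed to maintain the state of control.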

Data-Driven Process Validation

As shown in Table 1, the US FDA’s structure for process validation has three stages:

  • Stage 1 is the essential link to the development stage, covering process design and establishing the control strategy. It also includes the design of equipment and automation systems, assessment of input material attributes, process dynamics and variability, and development of strategies for process monitoring and control.
  • Stage 2 has two parts: Stage 2.1, qualification of the equipment, utilities, and facility, demonstrates the equipment and systems work as intended. Stage 2.2 demonstrates the robustness of the manufacturing process and the adequacy of the control strategy (i.e., verification of the control strategy).
  • Stage 3, continued process verification, provides continual assurance that the process remains in a state of control during commercial manufacture.

Annex 15 of the Pharmaceutical Inspection Convention/Pharmaceutical Inspection Co-operation Scheme (PIC/S) GMP guide22 describes the requirements for process validation in some detail and includes the points described earlier from US regulations. The PIC/S guide also states that for products developed by a quality by design approach, where it has been scientifically established during development that the control strategy provides a high degree of quality assurance, continuous process verification can be used as an alternative to traditional process validation.


Validation is here to stay—it is an integral part of regulatory requirements and of the manufacturing component of the healthcare environment. The added value of validation must be to demonstrate that the manufacturing system is fit for the intended use, and that the control strategy clearly reduces the risk to patient safety. Also, validation in itself should not be a barrier to innovation.

Continuous process verification is a key target for Validation 4.0. We need to develop methods that encompass the continuous monitoring of data, from the process and the risks to the control strategy, to ensure our processes are always valid. By building in feedback to the process, we enable a control model that can develop and respond to change, and we can monitor processes in real-time.
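One way to picture this kind of continuous verification is a rolling process-capability check: each new result updates a window of recent data, and a drop in capability triggers investigation. This is a hypothetical sketch; the class name, spec limits, window size, and 1.33 threshold are illustrative assumptions, not regulatory requirements:

```python
from collections import deque
from statistics import mean, stdev

def cpk(values, lsl, usl):
    """Capability index: distance from the mean to the nearest spec limit in 3-sigma units."""
    m, s = mean(values), stdev(values)
    return min(usl - m, m - lsl) / (3 * s)

class RollingCpkMonitor:
    """Maintain a rolling window of results and flag when capability degrades."""
    def __init__(self, lsl, usl, window=30, threshold=1.33):
        self.lsl, self.usl = lsl, usl
        self.window = deque(maxlen=window)
        self.threshold = threshold

    def add(self, value):
        """Record a result; return True (capable), False (investigate), or None (too little data)."""
        self.window.append(value)
        if len(self.window) < 2 or stdev(self.window) == 0:
            return None  # not enough data, or no observed variability yet
        return cpk(self.window, self.lsl, self.usl) >= self.threshold

monitor = RollingCpkMonitor(lsl=95.0, usl=105.0, window=5)
print(monitor.add(100.0))  # None: too little data
print(monitor.add(100.1))  # True: well within limits
print(monitor.add(90.0))   # False: excursion degrades capability
```

A real CPV system would feed such a check from real-time process data and route alerts into the quality system, which is exactly the feedback loop the Validation 4.0 concept anticipates.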

Because parts of the model may change during operation, monitoring of the process and risks is necessary and will ensure that we constantly learn more about the process as it becomes mature through the product life cycle. Establishing this concept early and systemizing it in tools is expected to be an effective way to move toward the application of digital twins. A digital twin is a replica of an intended or operating process, which can be used to plan and analyze the process and understand the effect of design and proposed changes.

A stated goal of Validation 4.0 is to potentially eliminate Stage 2 of process validation (verification of the control strategy by testing). By bringing R&D and Stage 3 operations closer together and moving to continuous verification from real-time data, we can speed up the validation process, keep up with innovation in the new digital world, and reduce risks to patient safety.
