Special Reports
November / December 2017

Prepare for Regulatory Audits with Your Supplier

Heather Longden
ISPE Pharmaceutical Engineering

Data integrity continues to be a very hot topic for both regulators and the pharmaceutical industry. With the increased observations about data integrity in laboratories, could it be that analysts have changed how they do science in the laboratory? Are analysts working differently today? Have they suddenly started disregarding the importance of the data they generate? Can regulators no longer trust laboratory results?

Experienced lab managers are unlikely to observe any significant change in analysts’ behavior that could account for the increase in the number of US Food and Drug Administration (FDA) Form 483 observations and warning letters1 published each month. No one believes that the majority of analysts are falsifying results, intentionally or otherwise. But it is clear that regulators no longer have the same level of trust in scientists’ motivations and behavior.

While based on the actions of only a few laboratories, this lack of trust seems justified. In a small number of cases, products or studies that should have been rejected based on scientific evidence have had that evidence hidden or manipulated to “push the data through” and deceive the quality unit into allowing it to pass. In a larger number of cases, it has become normal practice to polish results that almost pass to avoid the tedious work of either providing official scientific justification for the invalidation of out-of-specification (OOS) results or of instigating a full OOS investigation into failed products or studies.

In significantly more cases, however, lax habits, insufficient care, poor understanding, or lack of knowledge meant that laboratories in particular were not subject to rigorous quality-unit oversight of the data they created during the analytical process, provided the outgoing paper reports gave the appearance of being in compliance.

Why is it that in the last three years regulators have been finding issues in electronic data, specifically chromatography data? Something has changed. To better understand analytical electronic data, regulatory agencies including the FDA began hiring experienced and knowledgeable scientists and training them on how electronic systems are designed, how the technical controls work, and what records and metadata might be found in the electronic data capture (EDC) systems used in laboratories. Chromatography data systems (CDS) are the most common.

Inspectors are now more aware of how a computerized laboratory system works and are able to look for evidence of:

  • Missing technical controls that are explicitly defined in the regulations
  • Insufficient quality oversight in cases where scientists must make scientific decisions that affect data accuracy
  • Deliberate falsification of data
  • Obscuring OOS results in nonreported orphan data

If not familiar with a given system, inspectors will expect that laboratory staff can help them understand how EDC systems work. Laboratory reviewers should be using these same tools to look for potential data integrity gaps or issues.
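The kind of check a reviewer might run can be sketched in a few lines of code. The sketch below is purely illustrative: the CSV layout, column names, and injection IDs are hypothetical and do not reflect any real CDS export format. It compares the injections a system recorded as acquired against the results that were actually reported, surfacing the nonreported orphan data described above.

```python
import csv
import io

def find_orphans(acquired_csv: str, reported_csv: str) -> set:
    """Return injection IDs that were acquired but never reported.

    Both inputs are CSV text with an 'injection_id' column; the
    column name and layout are invented for this illustration.
    """
    acquired = {row["injection_id"] for row in csv.DictReader(io.StringIO(acquired_csv))}
    reported = {row["injection_id"] for row in csv.DictReader(io.StringIO(reported_csv))}
    return acquired - reported

# Illustrative data: three injections acquired, only two reported.
acquired = "injection_id,sample\nINJ-001,Batch42\nINJ-002,Batch42\nINJ-003,Batch42\n"
reported = "injection_id,result\nINJ-001,99.1\nINJ-003,98.7\n"

orphans = find_orphans(acquired, reported)
print(sorted(orphans))  # INJ-002 was acquired but never made it into a report
```

In practice such a comparison would be run against the system’s own audit trail or database, but even a simple reconciliation like this makes the gap between “data created” and “data reported” visible to a reviewer.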

In an article published earlier this year, Barbara Unger writes, “How quickly can the audit trails be provided to an auditor? When it takes four staff members a half hour to locate them, it suggests the audit trails are not routinely evaluated.”2 So how can a lab manager be sure that his or her staff knows the application at least as well as the auditor?

Is your vendor knowledgeable about electronic records regulations and regulatory compliance?

The scope of expected technical controls has been defined for almost 20 years. Vendors serious about serving regulated companies will have equipped their customers with tools to help meet electronic record compliance rules. According to guidance provided by the UK Medicines & Healthcare products Regulatory Agency (MHRA), regulated companies still using software without audit trails have until the end of 2017 to address this issue.3

Vendors must have expertise in what the rules mean, how technical controls can help meet them, and how laboratories ought to leverage those tools to help manage or monitor users’ behavior. Furthermore, vendors can advise when scientists should be trusted to be scientists, and when quality reviewers need to perform quality reviews.

Vendors also have general insight into how companies similar to yours have addressed data integrity. While nobody expects vendors to divulge competitors’ secrets, they will have had the opportunity to experience many different approaches to meeting compliance needs, and will know which are successful and practical.

Yet how often are the laboratory and quality unit staff able to leverage that expertise? Did the company try to save money by instigating “train-the-trainer” programs, whereby a handful of people were trained "a long time ago" by the vendor, but everyone else was trained “on the job”? Unfortunately, many regulated companies are conservative and resistant to change. The software version deployed in 2002 is often still in use, unchanged and never updated, as “it seems to do the job well enough.”

Vendors should always be consulted for additional training and updated knowledge. The worst time to call a vendor for advice, however, is in the middle of an audit or inspection. There are a large number of caveats to consider before you pick up the phone:

  • Does the laboratory run a standard version of software that your vendor can easily answer questions about?
  • Is there anything customized or unique in how the software is configured and used?
  • Are there procedures (documented, validated, and in use) to manage the data and secure user access in a manner that the vendor might describe as “normal use”?
  • Does the vendor have any special knowledge about your company or your use of the software?
  • Are the vendor’s representatives trained in your SOPs and audit processes?
  • Is there any chance that your vendor representative might just make matters worse, despite good intentions?
  • Given that you may not know how many of the answers are “No,” is it risk-free to ask vendors to interact live with your auditor?

What about talking to the war room and providing documentation during or after the audit?

In this instance, you should consider your understanding of your electronic systems and the timeliness of your answers.

If you really do not know the answer to a question and cannot respond in a timely manner, it is likely that this aspect of the system is little known and therefore little used. For some questions, this may be acceptable. If it is a task you perform relatively infrequently, and only a handful of people know the content of that particular procedure, not having an immediate response may be considered understandable. But not knowing whether you have audit trails enabled, or where to find them, is a more serious issue.

Deferring an answer until after the inspection or audit, and then promising a “letter from the vendor on company letterhead,” is equally full of risk:

  • It indicates lack of knowledge in your organization.
  • You are now relying on the vendor to help complete your regulatory response.
  • Your vendor may not be able to respond in the timely manner that is required.
  • Any such response may require detailed knowledge of your use of the application and possibly user actions related to a specific “event.”
  • The response may provide additional evidence that reinforces the auditor’s view that you are not in control of your data.


SUPPLIER ASSESSMENT When your computer-system validation (CSV) will depend largely on vendor testing, it is essential to perform a detailed supplier assessment, ideally long before any order is placed. During this process you can assess how knowledgeable your vendor is about your regulations, how detailed their software development life cycle and verification are, and how responsively they can answer or escalate questions. This is also the time to evaluate other professional services they may offer: training, consultancy, and regulatory good practice advice.

DEPLOYMENT PLANS Most laboratory vendors understand that deployment cost and time should be kept to a minimum. Yet a vendor’s realistic projection for all phases of a deployment plan is probably based on dozens, if not hundreds, of similar deployments. Insisting on shortcutting deployment plan proposals, whether to meet urgent deadlines or to save money, will introduce compromises that may put your laboratory at risk. Unless you already have expert users of these systems in your laboratory, it makes good sense to accept offers of assistance that will help you make the most of any new computerized system.

A company may have its own project managers or preferred third-party project managers to drive deployment plans. If the vendor offers such services (at least for the initial rollout), however, consider them to gain access to the vendor’s solution-specific experience.

QUALIFICATION AND VALIDATION SERVICES As noted in the MHRA’s March 2015 guidance document, “acceptance of vendor-supplied validation data in isolation of system configuration and intended use is not acceptable … vendor testing is likely to be limited to functional verification only.”4 The guidance highlights the issue, stating, “Computerised systems should comply with regulatory requirements and associated guidances, and be validated for their intended purpose. This requires an understanding of the computerised system’s function within a process.”4

How can a laboratory adopt and validate a new computerized system when they may not have full understanding of how this system will eventually be used? This conundrum is why vendor assistance in the qualification and validation of new systems is critical. Vendors (or knowledgeable third parties) should be able to offer learning experiences as they assist any regulated laboratory with CSV exercises.

Understanding how much documented verification is completed at the vendor site before release is a critical part of supplier assessment, and will connect directly with appropriate qualification and validation testing. If the vendor can provide documentation or a summary of testing either during or after assessment, the documentation or summary might guide a risk-based validation effort.

It is very common to leverage installation qualification and operational qualification services from vendors. Because they can vary in detail and scope, make sure you understand what is offered, how long it might take to execute, and how much of your own verification testing it might cover. Additionally, if you intend to cover these topics in user acceptance tests, be sure to establish that overlap in advance.

During the validation consulting process, a laboratory is very likely to design exactly how it will use any system or software application. Defining, documenting, and exercising standard operating procedures (SOPs) are critical pieces of the validation process and are unlikely to be included in the services the vendor can provide.

TRAINING AND CONSULTANCY Will training come before validation, during that process, or afterward? Learning is continuous during the deployment of any new computerized system, yet many laboratories treat training as an optional extra—something that can be skipped to minimize deployment costs or added on as a last-minute exercise. Keep in mind that part of validation is the transfer of the knowledge of the product from the vendor to the company.

With today’s focus on a wide variety of data systems, and with auditors and inspectors gaining experience with these systems, it is essential that all staff (including IT support staff, department managers, and your quality unit) be knowledgeable in your deployed applications.

Review of paper records is no longer acceptable as a review of “complete data.” All of the recently published guidance discusses the risks of relying on review of either paper or PDF records (static data) alone. Ensuring quality personnel are comfortable reviewing electronic data comprehensively is a major step beyond examining printouts.

Training a large number of expert users in a company is often seen as the best approach when introducing a brand-new computerized system. Subsequent user training can then be a combination of product training and laboratory-specific SOP training. But note that after that initial phase, relying on internal training alone has risks.

As use of the system changes, it is important to ask the vendor’s advice about how to best approach those changes. These may simply be new users with new requirements or a new software version, or it might be a shift in how the software is used: for example, from using chromatography software as an electronic peak integrator (with all further calculations performed in a laboratory information management system, an electronic lab notebook, or Excel) to automating those calculations in the chromatography software application. By simply continuing to use the software in the same way, you may be missing opportunities for further automation and for the elimination of risky manual steps. Asking the vendor’s advice to help design new ways of working and devise new training material for expert teams can only improve efficiency and reduce errors in the long run.


SOFTWARE PLATFORMS It is very common for regulated laboratories or manufacturing plants to invest significant time installing and validating a computerized system and then be ultraconservative regarding updates—or even service releases and hotfixes.

Designing validation protocols to permit regular updates “when they make business sense” will prevent a company from relying on software that inevitably is missing new features and may contain uncorrected (but now known) defects. Vendors are keen to improve functionality and address defects, yet the very users who report the defect or suggest the enhancement are often denied access to the new versions by management’s reluctance or IT’s inability to implement the new software.

Being aware of all changes and enhanced functionality included in new software releases is key if business units are to evaluate the effectiveness of any potential update. Too often, upgrades are not “permitted” unless some wider global IT or platform change requires it. The users’ effectiveness or compliance appears to be subservient to the IT department’s schedule. Vendors should be able to help you fully understand the productivity enhancements as well as the concerns that running severely out-of-date software can bring.

One of these concerns is the vendor’s ability to support the users and quality unit in case of an audit. Release notes for each software version are normally widely available and may be read by the various regulatory agencies as well as by the quality units of other pharmaceutical companies that might wish to audit you. Ensuring that staff and support channels are aware of which service releases and patches you have deployed, and which you have chosen not to deploy, is critical when addressing technical questions.
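A simple way to keep that awareness current is to maintain a machine-readable inventory of what has been deployed and reconcile it against the vendor’s published release list. The sketch below assumes invented release and patch names (no real vendor’s releases are implied) and simply reports which released updates a site has chosen not to deploy, so support staff can answer version questions immediately.

```python
# Hypothetical patch inventory check: the service release (SR) and
# hotfix (HF) names below are invented for illustration only.
released = {"SR1", "SR2", "HF2.1", "SR3"}  # from vendor release notes
deployed = {"SR1", "SR2"}                  # from your deployment records

not_deployed = sorted(released - deployed)
print("Deployed:", sorted(deployed))
print("Released but not deployed:", not_deployed)
```

Keeping this reconciliation in the audit “war room” documentation means the answer to “which patches are you running?” is a lookup, not an investigation.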

INDUSTRY TRENDS Regulators are increasingly aware of the vulnerabilities of specific systems and vendors are well positioned to help companies fix or address these issues. Any reputable vendor will be watching the regulatory news and assessing the latest guidance, changes, and public regulatory findings, just as your own quality units will be doing. When anything new occurs or is cited as a concern, your vendor should be able to help you understand the root cause of that citation, the true concern of the regulator, and how it might affect use of similar software in your company. This is an opportunity to tap into your vendor’s knowledge about data integrity and prepare your teams for the next inspection. Software vendors have a major interest in your continued success and should be able to review how you intend to keep ahead of these industry trends.


Vendors are often asked, “Do you provide training to the health authorities to help them identify issues within regulated companies?” No vendor wants to see their customers get into deep water with any agency or sponsor company that may be looking for confirmation of data integrity. Users or quality assurance teams are often tasked with “training the investigator” on software during the stressful time of an audit. This is especially difficult when the visitor’s experience of that kind of system is limited.

Training provided to regulators directly from the vendor, outside of an audit situation, should ease the audit process rather than make it more uncomfortable. On the other hand, being prepared to clearly and confidently explain the software capabilities—and how your company leverages the functionality and tools to ensure data integrity in your operation—will enhance the auditor’s impression of your understanding and control of the data supporting your quality products or research.