How Robust Is Your Process Capability Program? Part 2

This article was originally published in the January-February 2018 issue of Pharmaceutical Engineering® magazine.

By:  Philippe Cini, PhD; Gretchen Allison; Gerald Leister; Eda Ross Montgomery, PhD; Julia O’Neill; Paul Stojanovski; Michael Thomas; and Arne Zilian, PhD


There were 15 respondents from 11 "big pharma" companies (annual sales > $1 billion); their experience using process-capability indices, level of involvement, and statistical understanding were rated good to excellent. Of these respondents, 53% used Cpk for calculating process capability, 27% used Ppk, and 20% used both. The demographics of the 15 respondents are further described in Figure 3 and Figure 4.
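The practical difference between the two indices is the sigma estimate: Cpk uses a within-subgroup (short-term) estimate, while Ppk uses the overall (long-term) standard deviation of all results. A minimal Python sketch of both calculations, using entirely hypothetical assay data and specification limits (not taken from the survey) and the common moving-range estimate of within sigma for individual values:

```python
import statistics

def ppk(data, lsl, usl):
    """Ppk: uses the overall (long-term) standard deviation."""
    mean = statistics.fmean(data)
    s = statistics.stdev(data)  # overall sigma across all results
    return min(usl - mean, mean - lsl) / (3 * s)

def cpk(data, lsl, usl):
    """Cpk: uses a within-subgroup (short-term) sigma. For individual
    values, a common estimate is the average moving range / d2 (1.128)."""
    mean = statistics.fmean(data)
    moving_ranges = [abs(b - a) for a, b in zip(data, data[1:])]
    sigma_within = statistics.fmean(moving_ranges) / 1.128
    return min(usl - mean, mean - lsl) / (3 * sigma_within)

# Hypothetical assay results (% label claim), specs 95.0-105.0
assay = [99.8, 100.2, 99.5, 100.7, 100.1, 99.9, 100.4, 99.6, 100.0, 100.3]
print(f"Ppk = {ppk(assay, 95.0, 105.0):.2f}")
print(f"Cpk = {cpk(assay, 95.0, 105.0):.2f}")
```

When a process drifts between batches, the overall sigma exceeds the within-subgroup sigma, so Ppk falls below Cpk; comparing the two is one quick indicator of that drift.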

Figure 3: Type of organization interviewed
Figure 4: Scope of involvement

Participants had measured process capability for between 3 and 15 years; all reported that it took a minimum of 2 years to see benefits.


Global Procedure

The goal of a global procedure is to define process-capability standards covering the application scope, the capability calculation, and the response to low-performing processes.

On a scale of 1 to 5, survey participants rated their current state at an average of 4.2, indicating that process-capability SOPs exist globally or at the business-unit level, capability analysis is done for the product portfolio, and a response is defined for low-performing products.

Some respondents stated that their process-capability programs began several years ago; as one put it, the program now includes "… all drug substances and products, global, includes third parties, fairly matured, structured process for years." For those in an earlier phase of the journey, as one respondent indicated, "policy expectations are defined, [but] contract manufacturers may not be up to speed yet."

Because of the small body of data available, companies typically struggle to include products still in development. Some respondents in commercial manufacturing recommended starting with control charts for all products and expanding later to evaluate process capability.

Participants clearly recognized the benefits: "Because of the procedures we have in place to address low-capability products, we have seen our capabilities rise over the years, greatly reducing our number of out of specifications (OOSs)."

When asked where they would like their program to be in 2–3 years, the average response was 4.9. To achieve this level of improvement, process capability should be evaluated not only at internal manufacturing sites, but also at contract manufacturers, testing laboratories, and in development.

On their journey to achieve this higher level of maturity, some survey respondents expressed the intent to extend their capability analysis to low-volume products, as well as older and local market products. Once a low-performing product is identified, process issues must be differentiated from testing issues. Cpk and Ppk indices (probabilistic measures) should also be compared to the actual OOS rate.
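Because Cpk and Ppk rest on an (approximately) normal distribution of results, one simple cross-check is to translate the index into a predicted OOS rate and compare it with the rate actually observed. A sketch under a normality assumption, with entirely hypothetical numbers:

```python
from statistics import NormalDist

def predicted_oos_rate(mean, sd, lsl, usl):
    """Two-sided tail probability outside the specs, assuming normality."""
    nd = NormalDist(mean, sd)
    return nd.cdf(lsl) + (1 - nd.cdf(usl))

# Hypothetical process: mean 100.0, sd 1.2, specs 96.0-104.0 (Ppk ~ 1.11)
pred = predicted_oos_rate(100.0, 1.2, 96.0, 104.0)
observed = 12 / 2000  # hypothetical: 12 OOS results in 2,000 lots tested
print(f"Predicted OOS rate: {pred:.4%}")
print(f"Observed OOS rate:  {observed:.4%}")
# An observed rate far above the prediction points toward testing issues
# or special causes rather than ordinary process variation.
```

A large gap in either direction is informative: more OOS results than the index predicts suggests testing or special-cause problems; far fewer suggests the normality or stability assumptions behind the index deserve scrutiny.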

Several opportunities for further development were also mentioned. One respondent discussed the scope of variables: "Go beyond quality control data and find additional leading indicators of potential batch rejections." Another suggested that process capability may be just one of several indices for monitoring: "[Craft] a comprehensive quality scorecard for the product with more than Ppk."

Data Management

Data management is a foundational element of the process-capability pyramid, and it must be addressed in a satisfactory manner for the program's benefits to be realized. This seemed to be well understood by survey respondents, as this area showed the largest gap between the current and desired future states.

The goal of data management is to:

  • Capture, organize, control, and distribute product and process data across organizational boundaries
  • Support collaboration and decision-making among strategic partners, suppliers, and customers
  • Make on-spec product in a reliable and efficient manner

Data gathering and management can be arduous if the data are recorded on paper-based systems (e.g., batch records, certificates of analysis, printouts) and then transcribed into a secure electronic database. This process is prone to errors and requires that the data integrity be checked.

On a rating scale of 1 to 5, survey participants rated themselves an average of 3.1 for this area. At this level, databases are structured consistently across products and across sites; data compilation is in part manual and in part automated. Participant comments included: "Some databases are structured; some are manually entered/updated. Network roll-up not available (in some cases 100% manual). Product-specific databases may or may not be automated."

When asked where they would like their data management programs to be in 2–3 years, the average response was 4.5. To accomplish this, manual processes that require data verification must evolve into automated processes that authenticate data integrity. Unfortunately, pharmaceutical data management has not kept pace with industry changes and expansions over the past three decades. Local systems are still being used, even though a global system would allow data aggregation and comparison within and between product groups. It is difficult to make real-time decisions or obtain information on demand using slow off-line systems.

As pharmaceutical and biologics companies become increasingly virtual, relying on contract manufacturing organizations and contract laboratories, linking external data sources will become even more critical.