InTouch
January / February 2026

Digital Manufacturing and Supply: An ISPE D/A/CH Workshop

Ursula Busse, PhD
Viktor Mettler
Andreas Kleeb
Laura Kuger
Roger Nosal
Nick Lee, PhD
Christian Woelbeling
Arne Seeliger
Margaux Penwarden

The 2025 ISPE D/A/CH workshop on “Pharma’s Journey to Digital Manufacturing and Supply” united pharma leaders, technology experts, and regulatory authorities for three days of intense exchange on the digital transformation of our industry.

For the first time, the workshop adopted a unique format, combining keynote presentations, panel discussions, gallery walk poster sessions, dedicated tours, and four artificial intelligence (AI)-supported interactive sessions. These sessions focused on generative AI (GenAI) in GxP environments, predictive process modeling, digital twins, and digital tools for precision medicines. This format enabled all participants to actively engage, contribute, and learn from each other throughout the workshop. The workshop proved to be a major milestone in fostering networking and collaboration between industry and regulators to shape the future of Pharma 4.0™, highlighting ISPE’s role as a catalyst for advancing digital maturity across the pharma industry.

This article summarizes the key findings and practical insights generated during the workshop’s interactive sessions and panels.

A Convergence of Minds in Basel

Hosted by Roche in Basel, Switzerland from September 15–17, the ISPE D/A/CH workshop on “Pharma’s Journey to Digital Manufacturing & Supply” brought together over 130 pharmaceutical leaders, technology experts, and regulators for three days of intense engagement and productive exchanges. The event drew an impressive lineup of representatives from prominent European Health Authorities, including the European Medicines Agency (EMA), Swissmedic, the Irish Health Products Regulatory Authority (HPRA), and the Dutch Medicines Evaluation Board (MEB), creating a unique forum for in-depth dialogue.

The workshop’s primary goal was to accelerate the industry’s digital transformation of manufacturing operations by moving beyond theoretical discussions. Participants explored real-world digital use cases, shared lessons learned, and collaborated on practical solutions to overcome adoption challenges. The format encouraged direct interaction, hands-on problem-solving, and cross-functional knowledge sharing. This helped to build a shared understanding of the journey to digital transformation. Ultimately, the workshop marked a major milestone in fostering networking and collaboration, particularly between industry and regulators. It served as a powerful catalyst for stakeholders to build a more agile, intelligent, and patient-centric manufacturing and supply ecosystem, while working together to break down silos to shape the future of Pharma 4.0™.

Workshop Outcome

The workshop focused on four main themes: GenAI in GxP environments, predictive process modeling, digital twins, and digital tools for precision medicines. Deep dive sessions were framed by keynote presentations, regulatory panel discussions, gallery walk poster sessions, dedicated tours, and networking opportunities.

An Innovative Format to Cultivate Unprecedented Collaboration

The strategic design of the workshop’s highly interactive format was crucial to its success, enabling a level of open dialogue and deep engagement rarely seen at industry conferences. The structure was engineered to dismantle traditional barriers between presenters and the audience, ensuring every participant could contribute and learn from the collective expertise in the room. At the core of this approach were “world café style” roundtable discussions. Participants were divided into four color-coded groups and rotated across numbered tables to address predefined questions. These questions focused on risk management, scalability, life cycle management, and inspection readiness.


Participants discussing digital use cases in one of the breakout sessions.


“Genuine thanks to the organizing committee for proposing this workshop format, which fostered a unique environment of open discussion, round tables, and intensive exchange of knowledge. I do not recall attending a similar event where participants so readily set aside company affiliations and titles, engaging humbly with a genuine intention to contribute.”

Alvaro Avivar-Valderas, Takeda 

Each deep dive session started with short presentations of real-world digital use cases. A moderator at each table facilitated the discussion and entered the group’s consolidated answers into online forms, creating a digital record of the collective insights. During networking breaks, a GenAI tool consolidated these responses in real time, producing concise summaries that highlighted areas of agreement and disagreement across groups. These summaries were then fed into the subsequent panel discussions. Panelists included use case owners and regulatory representatives.

Using the AI-generated insights, panelists could address the audience’s most pressing questions, turning the panels into dynamic, interactive problem-solving sessions. The workshop’s unique format enabled focused, substantive exploration of the most critical digital topics in pharma manufacturing and supply. Attendees widely praised the format for its ability to generate actionable insights and foster collaboration.

Workshop Deep Dive 1: Generative AI in GxP Environments

While the promise of GenAI is immense, its application in GxP environments introduces a fundamental tension between its probabilistic nature and the deterministic requirements of regulatory compliance. The workshop tackled this challenge directly, anchored by practical use cases from Sanofi, which demonstrated how GenAI can support quality business processes for annual product quality review writing, and Roche, which showcased its use in accelerating the deviation handling process. The panel, featuring Sina Berndl of Swissmedic and Roberto Conocchia of the EMA, helped frame the conversation around regulatory expectations with the following main outcomes:

  • Human accountability is non-negotiable
  • AI assistants can be used in GxP environments for non-critical, supportive steps in quality business processes
  • A robust AI control strategy is essential
  • Shift human focus from data gathering and content generation to critical review
  • Foster and train critical thinking

Justification and Scope of Use

A clear consensus emerged that GenAI’s current role is as a supportive tool for non-critical steps within quality business processes. A prime example is its use in drafting initial deviation descriptions before formal entry into a Quality Management System, shifting human effort from content generation to value-added critical review. The justification for its use in a GxP context is anchored by two non-negotiable pillars: a robust AI control strategy and an accountable “human in the loop” who verifies all outputs and remains fully responsible for the final decision.

Key Risks and Mitigation Strategies

Discussions identified several primary risks associated with deploying GenAI in GxP settings, alongside specific, practical mitigation strategies that go beyond standard procedures. Table 1 summarizes the audience’s consolidated views.


Table 1: Summary of audience views

Risk: Inaccurate content (i.e., AI hallucinations)
Mitigation: Comprehensive user training on critical review, ensuring a qualified human reviewer is accountable for the final output, and implementing a “four-eyes principle” before formal submission.

Risk: Poor input quality (“garbage in/garbage out”)
Mitigation: Implementing strong data governance and curation for training data, establishing clear prompt engineering procedures, and validating input data where possible.

Risk: User over-reliance (“blind trust”)
Mitigation: Continuous performance monitoring of the model, combined with the crucial strategy of periodically challenging reviewers with known incorrect outputs to test their vigilance and prevent automation bias.

GenAI Validation and Life Cycle Management

While specific standard operating procedures (SOPs) for validating GenAI are still largely nonexistent, the prevailing approach is to adapt existing computer system validation (CSV) frameworks, such as the V-Model, rather than reinventing the wheel. The key challenge is maintaining a validated state for a dynamic system. The core life cycle management strategies discussed include:

  • Continuous monitoring: Actively tracking model performance against predefined Key Performance Indicators and thresholds, often visualized on dashboards to detect performance degradation or model drift.
  • Human-in-the-loop review: Implementing regular, sample-based reviews by subject matter experts to verify outputs. The novel strategy of challenging reviewers with known incorrect outputs was identified as a critical control to ensure sustained vigilance.
  • Update triggers: Defining clear criteria for when a model needs to be updated. Common triggers include performance degradation below a set threshold, the availability of new and better technology, or significant user feedback indicating a decline in utility.
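The monitoring-and-trigger logic above can be sketched in a few lines of Python. The KPI (here, the acceptance rate from human review), the window size, and the threshold are illustrative assumptions, not values from the workshop; a real implementation would take them from the model’s validation plan and monitoring SOP.

```python
from statistics import mean

# Illustrative values only; a real system would define these in the
# model's validation plan and monitoring SOP.
KPI_THRESHOLD = 0.90   # minimum acceptable rolling acceptance rate
WINDOW = 20            # number of recent reviewed outputs per check

def check_for_drift(review_outcomes: list[bool]) -> bool:
    """Flag an update trigger when the rolling KPI drops below threshold.

    review_outcomes: chronological human-review verdicts
    (True = output accepted, False = output rejected).
    """
    if len(review_outcomes) < WINDOW:
        return False  # not enough data to evaluate yet
    rolling_kpi = mean(review_outcomes[-WINDOW:])
    return rolling_kpi < KPI_THRESHOLD
```

With three rejections in the last 20 reviews, the rolling KPI is 0.85, below the 0.90 threshold, so an update review would be triggered; the same statistic could feed the performance dashboards mentioned above.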

Ensuring Inspection Readiness of GenAI Applications

To be inspection-ready, a GenAI application requires the standard CSV documentation package, including the user requirements specification, validation plan and report, and a requirements traceability matrix. However, this must be supplemented with documentation that addresses the unique nature of AI: a thorough risk assessment file and clear SOPs governing the system’s use that respect the ALCOA+ principles (attributable, legible, contemporaneous, original, and accurate, plus complete, consistent, enduring, and available). The challenge of maintaining a validated state for dynamic GenAI models leads directly to the core issues addressed in predictive process modeling, where model evolution is a central feature.

Workshop Deep Dive 2: Predictive Process Modeling

Predictive process modeling was framed as a critical enabler for Pharma 4.0™, offering the ability to move from reactive to prospective manufacturing control. The session explored compelling use cases, including feedback control methods in manufacturing from Johnson & Johnson, smart yield optimization using self-learning AI engines from Takeda, and a detailed look at critical quality attribute multivariate analysis in cell culture.

Implementation and Scalability

The discussions distilled a clear set of strategies for implementing predictive models successfully and ensuring they can be scaled across a global manufacturing network. These include the following:

  • Adopt a structured life cycle approach: Models must be managed with the same rigor as any other critical asset, with a formal process and systemic maintenance that governs their journey from initial concept through to commercial deployment and eventual retirement.
  • Standardize data platforms: A centralized data platform is essential to handle diverse data inputs from multiple sites and systems using a common vocabulary and terminology. Establishing standard nomenclature and data structures is a prerequisite for building scalable and reliable models and for ensuring the precision and accuracy of model outputs.
  • Employ hybrid models: For greater robustness and transparency, the use of hybrid models was strongly recommended. This approach combines mechanistic, first-principles models (which describe the underlying physics and chemistry) with data-driven components to create a system that is both powerful and more easily explainable.
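As a toy illustration of the hybrid idea, the sketch below pairs a first-order decay law (standing in for the mechanistic, first-principles part) with a least-squares linear correction fitted to the residuals (standing in for the data-driven part). The kinetics, rate constant, and residual form are all assumptions for demonstration, not any use case presented at the workshop.

```python
from math import exp
from statistics import mean

def mechanistic(t, c0=10.0, k=0.3):
    """First-principles part: first-order decay, c(t) = c0 * exp(-k * t)."""
    return c0 * exp(-k * t)

def fit_hybrid(t_obs, c_obs):
    """Fit a linear data-driven correction to what the mechanistic model misses."""
    resid = [c - mechanistic(t) for t, c in zip(t_obs, c_obs)]
    t_bar, r_bar = mean(t_obs), mean(resid)
    slope = sum((t - t_bar) * (r - r_bar) for t, r in zip(t_obs, resid)) \
        / sum((t - t_bar) ** 2 for t in t_obs)
    intercept = r_bar - slope * t_bar
    # Hybrid prediction = mechanistic backbone + learned residual correction
    return lambda t: mechanistic(t) + slope * t + intercept

# Synthetic observations: the real process drifts linearly away from ideal kinetics
t_obs = [i * 0.5 for i in range(21)]                  # 0.0 .. 10.0
c_obs = [mechanistic(t) + 0.1 * t for t in t_obs]
hybrid = fit_hybrid(t_obs, c_obs)
```

Because the residual is exactly linear here, the fitted correction recovers it and the hybrid prediction matches the observations. In practice the data-driven component would be a richer model, but the mechanistic backbone is what keeps the system physically interpretable and more easily explainable to regulators.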

Life Cycle Management and Governance

A key theme was the need to adapt existing governance frameworks, such as GAMP, to accommodate the dynamic nature of predictive models. Robust change control is paramount. For technical governance, participants suggested using established Machine Learning Operations (MLOps) tools, provided they are operated in a GMP-compliant manner. Specific recommendations included using Git for version control of model code and MLflow for creating a model registry to track different versions and trends in performance.
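The bookkeeping behind a model registry can be illustrated with a minimal in-memory sketch. In practice this role is filled by MLOps tooling such as MLflow’s model registry, backed by Git commits for the model code; everything below (class names, fields, stage labels) is a simplified assumption to show the idea, not the MLflow API.

```python
from dataclasses import dataclass

@dataclass
class ModelVersion:
    version: int
    git_commit: str          # ties the registered model to version-controlled code
    kpi: float               # performance metric recorded at registration
    stage: str = "staging"   # lifecycle stages: staging -> production -> archived

class ModelRegistry:
    """Minimal registry tracking versions and performance trend for one model."""

    def __init__(self, name: str):
        self.name = name
        self.versions: list[ModelVersion] = []

    def register(self, git_commit: str, kpi: float) -> ModelVersion:
        mv = ModelVersion(len(self.versions) + 1, git_commit, kpi)
        self.versions.append(mv)
        return mv

    def promote(self, version: int) -> None:
        """Move one version to production, archiving the previous one."""
        for mv in self.versions:
            if mv.stage == "production":
                mv.stage = "archived"
        self.versions[version - 1].stage = "production"

    def kpi_trend(self) -> list[float]:
        """Performance across versions, e.g., for change-control review."""
        return [mv.kpi for mv in self.versions]

registry = ModelRegistry("yield-predictor")
registry.register("a1b2c3d", kpi=0.91)
registry.register("e4f5a6b", kpi=0.94)
registry.promote(2)
```

Under GMP-compliant change control, each `register` and `promote` call would correspond to a documented, approved change, with the Git commit providing traceability back to the exact model code.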

Ensuring a Continuously Validated State

The conversation highlighted a fundamental shift from reliance on traditional, point-in-time inspection to achieving a state of continuous validation. This evolution supports a move toward forward-looking concepts like “quality control by exception” and, ultimately, “real-time release testing.” Critically, achieving this state requires a strong, collaborative framework between R&D, where models are often initially developed, and manufacturing, where they are implemented, used, and monitored over their life cycle. As these predictive models grow in complexity and scope, they form the foundation for their more comprehensive evolution: digital twins.

Workshop Deep Dive 3: Digital Twins

Digital twins—dynamic, virtual replicas of physical processes—were presented as a technology with the potential to revolutionize process development, optimization, and real-time control, shortening time to market and improving product quality. The session explored this potential through two use cases. The first use case was from Körber Pharma, which demonstrated an end-to-end twin for adaptive process control. The second use case was from Merck KGaA, which detailed its use for predictive control of cell culture experiments. The panel discussion, joined by Marcel Hoefnagel of the MEB and chair of the EMA Quality Innovation Group, delved into practical challenges and opportunities.

Data as the Foundation

The discussions confirmed that data is the foundation of any successful digital twin, and significant challenges remain in ensuring its quality and governance.

Data quality

There was a strong consensus that data quality is highly variable. In automated, GMP-regulated settings, data is often of high quality (rated 8/10), but in manual development processes, a lack of standardization frequently leads to inconsistent and poor-quality data (rated 3–5/10).

Data requirements

Although the core data types needed—such as critical process parameters and raw material attributes—are generally known, the specific data requirements for a model often evolve during its development. This makes the non-negotiable involvement of process experts essential for identifying and defining data needs.

Data governance

To address these challenges, effective data governance is crucial. Key solutions discussed included implementing robust master data management, creating a “single source of truth” for all process data, and leveraging structured data warehouses with clear ontologies to ensure consistency.

Broader Implementation Challenges

Beyond data, the roundtables identified several other significant hurdles to widespread adoption. A knowledge and skills gap was cited as a major barrier, with a shortage of personnel possessing dual expertise in both pharmaceutical manufacturing and data science. Furthermore, integration and technical debt present a substantial challenge, as connecting modern AI solutions to legacy systems is often complex and resource intensive. Finally, deep process understanding and a quality by design approach are prerequisites for any sound application of digital twins.

Technical Feasibility and Architecture

There was a strong consensus among participants that real-time data access and model application are technically feasible with current technology. The debate then shifted to architectural choices. A hybrid model emerged as the overwhelmingly preferred approach, balancing the benefits of on-premises and cloud solutions. This allows companies to leverage on-premises systems for their enhanced data security and low latency—critical for real-time control—while using the cloud for its superior computing power and scalability, which are ideal for model training and large-scale data storage.

The Regulatory Landscape

The workshop reached a unanimous and unambiguous conclusion: the current regulatory requirements for model validation and life cycle management are considered unclear. Participants described the landscape as fragmented. They expressed a desire for clearer, harmonized global guidelines, such as a new International Council for Harmonisation of Technical Requirements for Pharmaceuticals for Human Use (ICH) standard. Such guidelines could help reduce regulatory uncertainty and provide a more predictable framework for innovation. The application of these powerful digital tools becomes even more critical in the highly complex and individualized world of precision medicines.

Workshop Deep Dive 4: Digital Tools for Precision Medicines

The final deep dive session addressed the unique challenges and opportunities of applying digital tools to individualized therapies. The conversation revolved around a diverse set of use cases. These included Takeda’s use of AI in advanced therapy medicinal products manufacturing and statistical process control (SPC) for control limit calculations in cell therapy. In addition, Galapagos’ work on decentralized manufacturing for autologous cell therapies and Genentech/Roche’s application of machine learning (ML) for individualized mRNA immunotherapies were discussed.

Scaling Individualized Therapies

To scale production and dramatically reduce turnaround times for individualized therapies, participants identified a clear set of strategies. These include a push for end-to-end automation to minimize human variability, process standardization through concepts like validated platform approaches to avoid revalidating for each patient, and the adoption of decentralized manufacturing models where production occurs closer to the patient. In this manufacturing setup, governance is maintained through a centralized manufacturing execution system (MES).

Ensuring Chain of Identity and Custody

For individualized therapies, maintaining the chain of identity and custody is a paramount, non-negotiable requirement. The proposed solutions rely on robust, end-to-end digital systems. This includes, for example, an MES to track a unique patient identifier throughout the entire life cycle, from initial sample collection to final administration. The use of advanced technologies like blockchain was also suggested as a potential way to further enhance data integrity and ensure an immutable audit trail.
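The immutable audit trail idea can be sketched with a simple hash chain, the core integrity mechanism behind blockchain: each entry’s hash also covers the previous entry’s hash, so editing any past record breaks verification of everything after it. The event fields and patient identifier below are hypothetical illustrations, not a specific system discussed at the workshop.

```python
import hashlib
import json

def add_event(chain: list[dict], event: dict) -> None:
    """Append an event whose hash also covers the previous entry's hash."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    payload = json.dumps({"event": event, "prev": prev_hash}, sort_keys=True)
    chain.append({"event": event, "prev": prev_hash,
                  "hash": hashlib.sha256(payload.encode()).hexdigest()})

def verify(chain: list[dict]) -> bool:
    """Recompute every hash; any edited entry breaks the chain."""
    prev_hash = "0" * 64
    for entry in chain:
        payload = json.dumps({"event": entry["event"], "prev": prev_hash},
                             sort_keys=True)
        if (entry["prev"] != prev_hash
                or entry["hash"] != hashlib.sha256(payload.encode()).hexdigest()):
            return False
        prev_hash = entry["hash"]
    return True

# Hypothetical chain-of-identity events for one patient identifier
trail: list[dict] = []
add_event(trail, {"patient_id": "PAT-0001", "step": "sample collection"})
add_event(trail, {"patient_id": "PAT-0001", "step": "manufacturing start"})
add_event(trail, {"patient_id": "PAT-0001", "step": "final administration"})
```

In a real deployment these events would be recorded by the MES; the hash chain simply makes retroactive alteration of any step in the chain of identity detectable.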

Evolving Control Strategies

A key discussion point was the trade-off between traditional statistical process control (SPC) and modern AI. The consensus was that SPC remains the preferred tool for simple, well-understood processes due to its interpretability and strong regulatory acceptance. In contrast, AI brings significantly more value to complex, multivariable processes where relationships are not yet fully understood. Crucially, a nuanced counterpoint emerged: although AI is often associated with large datasets, it can also be valuable in data-scarce situations (like n=1 therapies) by leveraging prior knowledge, whereas SPC inherently requires abundant historical data to be robust. These focused discussions culminated in a clear vision for the path forward.
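The SPC side of this trade-off can be made concrete with classic Shewhart control limits, which also shows exactly why SPC needs abundant historical data: the limits themselves are estimated from it. The batch measurements and the three-sigma convention below are illustrative assumptions.

```python
from statistics import mean, stdev

def control_limits(historical: list[float], k: float = 3.0) -> tuple[float, float]:
    """Shewhart-style limits: process mean +/- k standard deviations."""
    mu = mean(historical)
    sigma = stdev(historical)   # needs enough history to estimate reliably
    return mu - k * sigma, mu + k * sigma

def out_of_control(value: float, lcl: float, ucl: float) -> bool:
    """Flag a measurement outside the lower/upper control limits."""
    return value < lcl or value > ucl

# Illustrative historical batch measurements of one quality attribute
history = [9.8, 9.9, 10.0, 10.1, 10.2, 10.0, 9.9, 10.1]
lcl, ucl = control_limits(history)
```

For this toy history the limits land at roughly 9.61 and 10.39, so a batch measuring 10.6 would be flagged while 10.2 would not. The appeal to regulators is that every number in this calculation is directly interpretable, which is precisely what complex multivariable AI models give up in exchange for predictive power.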

The Path Forward: Regulatory Considerations

The workshop’s regulatory panels provided a measured appreciation of the regulatory expectations associated with the adoption of AI and ML tools and applications in manufacturing. They comprised regulators from European national agencies and EMA. These included EMA’s Head of Quality and Safety of Medicines, Evdokia Korakianiti. Also on the panel were chemistry manufacturing and controls experts Nick Lee (HPRA), Helen Thomas (Swissmedic), and Dolores Hernan (EMA); and GMP experts Sina Berndl (Swissmedic) and Roberto Conocchia (EMA). Gert Thurau (Roche) joined the panels to provide valuable regulatory and quality insights from an industry perspective. He highlighted the importance of considering the learnings from the introduction of past innovative approaches, namely process analytical technologies and near infrared spectroscopy (NIR) guidelines, to effectively understand the benefits and challenges in implementation and adoption.

The following recommendations and considerations were provided:

  • AI is a tool, and like all tools should be fit for its intended purpose.
  • While AI offers powerful capabilities, it is crucial to focus on solving real-world problems rather than applying technology for its own sake.
  • AI can be leveraged to support non-critical GxP steps, provided there is a solid AI control strategy in place.
  • Structured data is a fundamental criterion for the implementation of AI-driven approaches where quality, reliability, and effective change management are unequivocal priorities. Robust data governance and standardized processes are essential to maintain data integrity throughout the life cycle of digital applications.
  • Human accountability and critical evaluations of application risks and overreach should be the focus of responsible oversight.
  • Critical thinking is an imperative to ensure that AI applications are suitably justified, contextualized, implemented and governed, and are controlled by humans (i.e., human in the loop).
  • Engaging regulators early in the development process and maintaining the dialogue throughout the development of digital innovations encourages alignment, fosters trust and affords companies relevant and valuable guidance.
  • Furthermore, understanding the current regulatory frameworks and what is possible is also an important factor.

Although panelists were primarily focused on current EU initiatives, including the pending EU GMP Annex 22 draft, all acknowledged the importance of international harmonization as an incentive and imperative to encourage digital innovation.

Conclusion

The ISPE D/A/CH workshop was a resounding success, serving as a powerful catalyst for advancing digital maturity across the pharmaceutical industry. Thanks to the generous hospitality of Roche and the expert facilitation by ISPE, the event transcended typical conference formats. It was a true collaborative forum where industry leaders and regulators jointly tackled the future of digital manufacturing and supply. Hopefully, the practical insights and clear consensus points generated during these sessions will positively impact the development of key regulatory guidance, particularly the upcoming revisions to EU GMP Annexes 11 and 22.

As the digital journey continues, events like the ISPE D/A/CH workshop on “Pharma’s Journey to Digital Manufacturing and Supply” are essential. They provide the critical platforms needed for all stakeholders to navigate complexity, share knowledge, and drive innovation together. By fostering this spirit of collaboration, the industry can continue to jointly shape a more efficient, intelligent, and patient-focused future for pharmaceutical manufacturing and supply.
