iSpeak Blog

The Laboratory Automation Plug and Play (LAPP) Framework – The Case for Standardization in Research and Development (R&D) and Quality Control (QC) Laboratories: Part One

Ádám Wolf
David Wolton
Miguel Alvariño Gil, PhD
Eugene Tung
Christian Stirnimann, PhD
Youjie Zhang
Cornelia Steinhauer, PhD
Jeff Van Doren

Executive Summary

Laboratory automation in the pharmaceutical industry is rapidly evolving due to increasing regulatory demands and the need for efficiency, flexibility, and data integrity. However, fragmented systems, proprietary protocols, and inconsistent validation frameworks continue to limit progress. The LAPP initiative proposes a standardized, modular, and interoperable approach to automation with the aim of achieving true plug-and-play capability. This paper outlines the challenges and opportunities in R&D and QC environments and provides practical recommendations for the path forward.

Part One

This article marks the first part of a three-part series on the LAPP Framework—a cross-industry initiative to enable modular, interoperable, and compliant automation in laboratories across the pharmaceutical value chain.

Here, we lay the foundation by examining why standardization is urgently needed in both R&D and QC laboratories. We look at how fragmented architectures, proprietary systems, and inconsistent validation practices limit scalability, flexibility, and data integrity.

This first part establishes the context and drivers for harmonization, highlighting the shared challenges across R&D and QC and the need for common architectures, open interfaces, and shared semantics. It sets the stage for Part Two, which will explore how the convergence of information technology (IT) and operational technology (OT), together with modular validation approaches, can address these gaps, and which will introduce the LAPP Framework—a concept structured around three pillars: semantics, information models (digital twins), and a reference architecture model. Part Three will then dive into the technological building blocks that make this vision achievable through interoperability and data standardization.

Introduction

The modern pharmaceutical industry is undergoing a fundamental shift toward automation in QC laboratories. As regulatory requirements grow stricter and the demand for higher throughput increases, laboratories are turning to robotics and digitalized workflows to enhance efficiency, ensure consistency, and reduce operational costs. However, despite these advancements, the lack of standardized system architectures, software topologies, and interoperability protocols remains a major barrier to achieving true plug-and-play automation.

Current laboratory automation solutions are often proprietary and fragmented, limiting interoperability across different platforms. Many instruments are designed for standalone operation, with vendor-specific control systems and proprietary communication protocols. These issues are not new, as comprehensive overviews of automation in analytical assays have long acknowledged the challenges of system integration and interoperability.1 These challenges lead to inefficiencies, higher integration costs, and a reliance on custom engineering solutions that limit system flexibility. Even when integration is achieved, maintaining it over time presents an additional burden, as every software update or system modification can require extensive reengineering and revalidation efforts, and in some cases, ongoing compatibility may not even be technically feasible.

To address this issue, the pharmaceutical industry must establish a common vision for a standardized automation ecosystem in QC laboratories—one that enables plug-and-play integration between laboratory instruments, robotic systems, and digital infrastructure.

Standardization efforts must focus on three key areas:

  1. System Architecture and Software Topology: A well-defined reference architecture model should serve as a blueprint for laboratory automation, providing clear guidelines on how laboratory instruments (devices), orchestrators, and control systems interact. By adopting modular and scalable frameworks, laboratories can reduce development overhead and enable seamless deployment of automation solutions across multiple sites.
  2. Interoperability Protocols: To eliminate proprietary silos, laboratories must adopt open communication standards that ensure device compatibility across vendors. Existing initiatives, such as SiLA 2 and the Laboratory and Analytical Device Standard (LADS), offer a foundation for enabling standardized control and information exchange between laboratory instruments and automation software.2,3 Adopting these standards is critical to fostering a truly interoperable automation ecosystem. A minimal sketch of such a vendor-neutral interface follows this list.
  3. Information Models (Ontologies and Semantics): In addition to hardware and software interoperability, a shared digital representation of laboratory devices and workflows is essential for automation orchestration and real-time decision-making. Digital twin technology enables the creation of virtual representations of laboratory assets, capturing real-time operational states, spatial relationships, and functional capabilities. Digital twins have been systematically characterized in the manufacturing domain as key enablers of predictive and responsive automation, with lessons applicable to laboratory environments as well.4 Additionally, semantic frameworks and ontologies provide a common language for defining laboratory processes, ensuring that different systems can interpret and execute workflows consistently. Standardized information models will facilitate machine-readable descriptions of laboratory activities, allowing for automated workflow adaptation, intelligent scheduling, and predictive maintenance.
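
To make these ideas tangible, the following minimal Python sketch shows how a vendor-neutral device abstraction, a digital-twin style state record, and a simple orchestrator might fit together. It is illustrative only: the class and method names (LabDevice, DeviceState, run_workflow, and so on) are invented for this example and are not taken from the LAPP Framework, SiLA 2, or LADS.

```python
# Illustrative sketch only: a hypothetical, vendor-neutral device
# interface and orchestrator. All names are invented for this example.
from abc import ABC, abstractmethod
from dataclasses import dataclass, field


@dataclass
class DeviceState:
    """Digital-twin style snapshot of a device's operational state."""
    device_id: str
    status: str = "idle"  # e.g., idle / busy / error
    properties: dict = field(default_factory=dict)


class LabDevice(ABC):
    """Common interface that every vendor adapter would implement."""

    @abstractmethod
    def state(self) -> DeviceState:
        """Return the device's current twin state."""

    @abstractmethod
    def execute(self, command: str, **params) -> dict:
        """Run a standardized command and return a structured result."""


class AcmePlateReader(LabDevice):
    """Invented adapter wrapping one vendor's proprietary protocol."""

    def __init__(self) -> None:
        self._state = DeviceState(device_id="reader-01")

    def state(self) -> DeviceState:
        return self._state

    def execute(self, command: str, **params) -> dict:
        # A real adapter would translate 'command' into vendor API calls.
        self._state.status = "busy"
        result = {"device": self._state.device_id, "command": command,
                  "params": params, "ok": True}
        self._state.status = "idle"
        return result


def run_workflow(steps, devices):
    """Minimal orchestrator: dispatch each step to the device that owns it.

    It depends only on the LabDevice interface, never on a vendor API.
    """
    return [devices[dev_id].execute(cmd, **params)
            for dev_id, cmd, params in steps]


if __name__ == "__main__":
    devices = {"reader-01": AcmePlateReader()}
    workflow = [("reader-01", "read_absorbance", {"wavelength_nm": 450})]
    for step_result in run_workflow(workflow, devices):
        print(step_result)
```

The point of the pattern is that the orchestrator depends only on the shared interface, so swapping one vendor's adapter for another's requires no change to the workflow logic.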

The LAPP Framework aims to address these standardization challenges by defining a structured approach to system integration. By leveraging hierarchical workflow decomposition, digital twin technology, and a reference architecture model, LAPP provides a blueprint for achieving seamless, scalable, and vendor-neutral automation in life science laboratories.

Through collaboration between pharmaceutical companies, automation vendors, and industry consortia, the adoption of standardized system architectures and interoperability protocols will enable laboratories to move beyond fragmented, hardcoded integrations toward flexible, future-proof automation solutions. Establishing a shared vision for plug-and-play laboratory automation is not just a technical necessity—it is an industry-wide imperative for ensuring efficiency, compliance, and innovation in pharmaceutical quality control.

The Need for Standardization in R&D Laboratory Automation

In R&D laboratories, automation plays a crucial role in accelerating drug discovery, optimizing high-throughput screening, and improving experimental reproducibility. Program-guided methods for designing and evaluating experiments are increasingly supported by automation frameworks that combine liquid handling, sensor integration, and automated data pipelines.5 Unlike QC and manufacturing, where processes are well-defined and highly regulated, R&D workflows are dynamic, evolving, and frequently reconfigured to accommodate new assays, technologies, and research objectives. This constant adaptation demands a flexible and modular automation infrastructure that can integrate emerging technologies without extensive reengineering.

Currently, R&D laboratories rely on a diverse ecosystem of instruments, robotic platforms, and data infrastructure tools, many of which operate as standalone systems with proprietary software and limited interoperability. This fragmentation creates barriers to adaptive workflow automation, requiring manual intervention for instrument setup, sample handling, and data exchange. While standalone automation solutions exist, they are often custom-built for specific tasks and lack the connectivity and scalability to support a continuously evolving experimental landscape.

To enable true automation-driven discovery, R&D laboratories need standardized integration frameworks that allow instruments, robotics, and informatics systems to communicate seamlessly while remaining adaptable to new scientific advancements. Key priorities for standardization in R&D include:

  1. Standardized System Architecture

    Common standards and harmonized guidelines increase professionalism, interoperability, and efficiency, benefiting the broader development ecosystem. Moreover, they promote the creation of reusable building blocks, which can significantly speed up future integrations and development efforts by providing ready-to-use components that adhere to industry standards. System architectures like ARChemist exemplify this approach, demonstrating how modularity and vendor-agnostic interfaces enable flexible orchestration of robotic chemistry workflows.6

  2. Data-Centric Automation with Digital Twins and Ontologies

    The experimental nature of R&D demands real-time feedback loops that enable the rapid adjustment of workflows. Digital twin technology can model laboratory environments, predict system performance, and optimize instrument configurations before physical execution. Meanwhile, semantic ontologies ensure that experimental protocols are described in a machine-readable format, allowing automation systems to intelligently adapt to changing conditions. Recent efforts to formalize biological workflows into open, machine-executable formats have shown promise in enabling protocol reusability, intelligent orchestration, and AI-supported research.7 These technologies are essential for enabling self-optimizing, AI-driven research environments, where robotic systems can adjust variables based on real-time assay results, increasing efficiency and scientific insight.

  3. Semantics: Cross-Vendor Scripting Framework

    Harmonizing scripting guidelines, e.g., for liquid handling systems across vendors, ensures consistency, readability, and maintainability of code, making it easier for different teams to collaborate and integrate their workflows effectively. Such guidelines reduce the likelihood of errors by promoting best practices in coding, validation, and error handling, leading to higher code quality and more robust scripts across the industry. Additionally, unified guidelines support scalability, enabling the codebase to grow and evolve without becoming unmanageable. They also speed up the onboarding process for new developers and facilitate the use of automated tooling for code formatting and testing. One such initiative is the eXtensible Description Language (XDL), an open and platform-agnostic standard designed to represent chemical protocols in a structured, machine-readable form. XDL enables the unambiguous definition of liquid handling operations, facilitates automated verification, and promotes reproducibility across laboratories and automation platforms.8 A brief illustration follows this list.
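
As a brief illustration of what a machine-readable, vendor-neutral protocol can look like, the following Python snippet assembles a simplified, XDL-inspired document using only the standard library. The element and attribute names are approximations for illustration; the authoritative schema is defined by the XDL specification itself.

```python
# Simplified, XDL-inspired protocol document built with the standard
# library. Element and attribute names approximate published XDL
# examples; the real schema is defined by the XDL specification.
import xml.etree.ElementTree as ET

xdl = ET.Element("XDL")
procedure = ET.SubElement(xdl, "Procedure")

# Steps are declarative descriptions of intent, not vendor commands.
ET.SubElement(procedure, "Add",
              vessel="reactor", reagent="buffer_A", volume="5 mL")
ET.SubElement(procedure, "Stir",
              vessel="reactor", time="10 min", stir_speed="300 rpm")

print(ET.tostring(xdl, encoding="unicode"))
```

Because the document describes intent rather than vendor commands, any compliant execution engine can translate the same protocol to its own hardware.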

By adopting standardized system architectures, open interoperability protocols, and intelligent information models, pharmaceutical R&D can fully leverage next-generation automation to accelerate discovery, enhance scientific insight, and streamline the transition from research to development.

The Need for Standardization in QC Laboratories

The case for plug-and-play automation in QC laboratories is especially pressing. Unlike R&D environments, where there is room for experimentation and flexibility, QC labs operate under stringent regulatory oversight and must prioritize capacity, compliance, and reproducibility. Despite increasing interest in automation, most QC laboratories are still far from realizing the vision of a truly digital, modular, and interoperable environment.

Today, many QC labs continue to rely heavily on paper-based documentation and isolated digital systems that do not communicate effectively. The shift from paper to paperless is ongoing, but progress is often fragmented, with many implementations limited to non-standard, vendor-specific solutions. These "small island" systems may digitize individual processes but fail to support end-to-end data integration or orchestrated automation. Without a unifying architecture or semantic framework, the result is a patchwork of incompatible components that hinders scalability and cross-site harmonization.

From a financial perspective, the implementation of automation in QC laboratories poses unique challenges compared to R&D settings. A major factor is the significant overhead associated with commissioning, qualification, and validation (CQV), which alone can jeopardize the return on investment. While capital expenditure—including the cost of equipment, integration engineering, and facility modifications—is a substantial one-time cost, CQV expenses must also be accounted for. These vary depending on the complexity of the automation solution, the number of systems it touches, and any required changes to existing methods or procedures. Furthermore, integration with both IT and OT systems in QC—and often with broader enterprise systems such as manufacturing execution systems (MES); systems, applications, and products in data processing (SAP); or document management systems—can further increase costs. In addition to capital costs, operational expenditures such as maintenance and software updates must be factored into the total cost of ownership. Many of these cost drivers are directly linked to the complexity of the solution and the integration effort required. Mitigating these challenges demands a long-term commitment to harmonization, standardization, and modularization—principles that this paper strongly advocates.

A key prerequisite is integrated data flows between instruments. While individual devices may offer digital interfaces, they rarely provide standardized application programming interfaces (APIs) or data models, making it difficult to automate result capture, context-aware decision-making, or workflow handovers. As a result, valuable data remains siloed, and opportunities for advanced capabilities such as real-time release testing, predictive maintenance, or adaptive scheduling are lost.
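
As an illustration of what standardized result capture could enable, the snippet below sketches a context-rich result payload of the kind a common instrument API might return. The field names are hypothetical, invented for this sketch rather than drawn from any published standard.

```python
# Hypothetical, context-rich result payload that a standardized
# instrument API might return; all field names are invented here.
import json
from datetime import datetime, timezone

result = {
    "sample_id": "S-2024-0417",
    "instrument_id": "hplc-03",
    "method": "assay_hplc_v2",
    "measurements": [{"analyte": "API", "value": 99.2, "unit": "%"}],
    "status": "passed",
    "timestamp": datetime.now(timezone.utc).isoformat(),
}

# Because the structure is shared across vendors, downstream systems
# (LIMS, MES, schedulers) can consume it without bespoke parsers.
print(json.dumps(result, indent=2))
```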

Furthermore, true digital transformation in QC requires more than connectivity—it requires meaning. Without a semantic layer that governs how data is interpreted and used, automation remains brittle and overly reliant on hardcoded logic. This becomes especially important when sharing data with other companies, where a semantic layer greatly simplifies the interpretation of external data. Standardized ontologies and machine-readable information models are critical to enabling intelligent orchestration, exception handling, and long-term system maintainability.
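
The following JSON-LD-style fragment sketches what such a semantic layer can look like in practice: each field is linked to a term in a shared vocabulary, so a receiving system can interpret the value without hardcoded, instrument-specific logic. The IRIs are placeholders, not references to a real ontology.

```python
# JSON-LD-style semantic annotation: every field points at a shared
# vocabulary term. The IRIs below are placeholders, not a real ontology.
import json

record = {
    "@context": {
        "sample_id": "https://example.org/lab-ontology/sampleIdentifier",
        "turbidity": "https://example.org/lab-ontology/turbidityNTU",
    },
    "sample_id": "S-2024-0417",
    "turbidity": 0.42,  # interpreted via the vocabulary, not by convention
}

print(json.dumps(record, indent=2))
```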

While greenfield laboratories—those built from scratch—can more readily adopt modular, standards-based automation architectures, it is unclear whether most companies truly consider automation from the outset in greenfield projects. In many cases, QC is involved too late in the design process, turning what could have been a greenfield setup into what is, in practice, a brownfield challenge. Most QC automation projects are conducted at true brownfield sites with legacy infrastructure. In these contexts, integration efforts must be pragmatic and capacity-driven, minimizing disruption to ongoing operations while progressively building toward a plug-and-play future.

Encouragingly, the growing adoption of standardized APIs and interoperability frameworks offers a pathway forward. As pharmaceutical companies and vendors converge on shared standards, the foundation is being laid for scalable, modular automation that can evolve alongside laboratory needs.

Once data flows from different instruments and systems are consolidated, QC labs can begin to model their processes and identify opportunities for improvement while recognizing the inherent constraints of the current setup. This level of operational transparency is not commonly available today but is essential for driving continuous improvement and informed decision-making.

To bridge the gap between vision and reality, the industry must prioritize standardization not only as a technical enabler but as a strategic imperative for modernizing QC laboratories. The LAPP Framework addresses this by offering a structured, scalable model for laboratory automation integration—capable of meeting today’s regulatory demands while unlocking tomorrow’s digital potential.

Conclusion

Both R&D and QC laboratories struggle with fragmented systems and non-standardized validation approaches that hinder digital transformation. Overcoming these barriers requires a unified strategy built on standardization and shared understanding.

In Part Two, we will move from the why to the how—reviewing the aspects of IT-OT convergence and validation and introducing the LAPP Framework with its three pillars: semantics, information models (digital twins), and a reference architecture model.

In Part Three, these architectural ideas will be translated into practice, exploring the interoperability protocols, ontologies, and data standards that can turn plug-and-play laboratory automation into reality.

View Part Two of the series.

Disclaimer

iSpeak Blog posts provide an opportunity for the dissemination of ideas and opinions on topics impacting the pharmaceutical industry. Ideas and opinions expressed in iSpeak Blog posts are those of the author(s) and publication thereof does not imply endorsement by ISPE.



References