The Laboratory Automation Plug and Play (LAPP) Framework – Information Technology-Operational Technology (IT-OT) Convergence, Validation, and the Three Pillars of Laboratory Automation Plug and Play (LAPP): Part Two
Building on the discussion in Part One about the need for standardization across research and development (R&D) and quality control (QC) laboratories, this second part of the series explores how laboratories can progress toward plug-and-play automation.
The first section of this blog post reviews IT-OT convergence—how harmonizing information and operational technologies through layered architectures, standardized APIs, and modular ownership structures creates scalable and maintainable systems. It also examines validation strategies, showing how modular qualification and risk-based approaches can reduce compliance overhead while maintaining data integrity.
The second section introduces the LAPP concept, which unifies these principles into a structured framework built on three foundational pillars.
Together, these elements provide a blueprint for modular, validated, and interoperable automation in regulated laboratory environments. Part Three will build upon these pillars to explore the technological building blocks—interoperability protocols, data standards, and ontologies—that bring LAPP to life.
Introduction
IT-OT standardization
Achieving true plug-and-play automation in QC laboratories requires a harmonized integration of IT and OT. Historically, IT and OT have operated in separate domains, with IT focused on enterprise systems and data management, while OT managed real-time control and instrumentation. Today, laboratory automation increasingly spans both layers, making IT-OT convergence essential for scalable and maintainable system architectures.
A well-defined reference architecture model that clearly delineates the different layers—ranging from physical instrumentation to control logic, orchestration, data aggregation, and enterprise integration—is foundational to this convergence. By defining interfaces and responsibilities for each layer, the architecture supports a modular approach to automation design. This modularity, in turn, allows for separation of ownership and qualification: individual modules can be developed, validated, and maintained independently, significantly reducing the overhead of system changes or expansions.
Standardized APIs between layers are key enablers in this framework. They ensure that data and control commands can flow reliably across the stack, even with multi-vendor components. This not only enhances interoperability but also simplifies commissioning, qualification, and validation (CQV) efforts. When modules communicate through well-defined, validated interfaces, laboratories can introduce new components without requalifying the entire automation stack.
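As a minimal illustration of this decoupling, consider the Python sketch below. All class and method names are hypothetical, not taken from any standard: the point is only that an orchestration layer written against an abstract device interface can accept a new instrument without any change to upstream code.

```python
from abc import ABC, abstractmethod


class AnalyticalDevice(ABC):
    """Hypothetical device-layer interface; any instrument implementing
    it can be swapped in without changing the orchestration layer."""

    @abstractmethod
    def start_run(self, method: str, sample_id: str) -> None: ...

    @abstractmethod
    def get_result(self, sample_id: str) -> dict: ...


class HplcSystem(AnalyticalDevice):
    """One vendor-specific implementation behind the shared interface."""

    def start_run(self, method: str, sample_id: str) -> None:
        print(f"HPLC: running method {method!r} on sample {sample_id}")

    def get_result(self, sample_id: str) -> dict:
        return {"sample_id": sample_id, "peak_area": 1234.5}


def orchestrate(device: AnalyticalDevice, sample_id: str) -> dict:
    """Orchestration layer: coded against the interface, not the vendor."""
    device.start_run(method="assay_v2", sample_id=sample_id)
    return device.get_result(sample_id)


print(orchestrate(HplcSystem(), "S-001"))
```

Replacing the HPLC with a different qualified instrument then means validating one new module against the interface, not revalidating the orchestrator.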
This is the essence of true plug-and-play capability: integrating new technologies—such as a novel analytical instrument or a supportive robotic solution—into existing workflows without disrupting upstream or downstream processes. It enables QC labs to innovate incrementally while preserving validated systems.
The ecosystem of standards supporting this model is growing but remains fragmented. Protocols such as Standardization in Lab Automation (SiLA2) and the Laboratory and Analytical Device Standard (LADS) provide strong building blocks for device-level interoperability, offering state-based control and metadata-rich communication. At higher levels of abstraction, emerging standards aim to describe workflows, semantic data layers, and orchestration logic. While the landscape is still evolving, early adoption of well-supported, community-driven standards can provide immediate value while laying the groundwork for broader convergence.
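The idea of state-based control can be pictured with a toy example. The sketch below is not actual SiLA2 or LADS code; it is a minimal Python illustration of the underlying pattern, in which commands are only legal in well-defined states, so an orchestrator can reason about any conforming device in the same way.

```python
from enum import Enum, auto


class DeviceState(Enum):
    # Simplified state set; the real standards define richer state models
    IDLE = auto()
    RUNNING = auto()
    ERROR = auto()


class StateBasedDevice:
    """Toy device whose commands are accepted only in certain states,
    mirroring the state-based control idea behind SiLA2 and LADS."""

    def __init__(self) -> None:
        self.state = DeviceState.IDLE

    def start(self) -> None:
        if self.state is not DeviceState.IDLE:
            raise RuntimeError(f"cannot start from {self.state.name}")
        self.state = DeviceState.RUNNING

    def finish(self) -> None:
        if self.state is not DeviceState.RUNNING:
            raise RuntimeError(f"cannot finish from {self.state.name}")
        self.state = DeviceState.IDLE


dev = StateBasedDevice()
dev.start()
dev.finish()
print(dev.state.name)  # back to IDLE after a complete run
```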
In this vision, standardized interfaces and semantic models act as building blocks—interchangeable puzzle pieces that fit together into a coherent whole. The LAPP framework proposes a structured approach to selecting, layering, and integrating these blocks, enabling laboratories to move from proprietary silos to flexible ecosystems built on open, modular, and verifiable components.
Validation Aspects
As laboratory automation systems become increasingly complex, the distinction between CQV and computerized system validation (CSV) becomes more nuanced—especially when automation components blur the lines between IT systems, analytical instrumentation, and engineering equipment.
A key challenge lies in defining ownership and qualification responsibilities for supportive robotics and automation systems used in QC labs. These systems—such as sample handlers, mobile robots, or collaborative arms—do not always fit neatly into traditional categories. Are they part of the analytical instrument? Do they fall under the computerized systems framework governed by IT? Or are they engineering assets, similar to heating, ventilation, and air conditioning (HVAC) or building automation systems?
The answer often depends on the internal structure and competencies within a given organization. In some companies, automation is managed centrally under engineering; in others, it resides within QC or IT. This lack of a consistent framework can complicate validation strategies and can slow the adoption of innovative technologies.
IT security and data integrity considerations play a major role in determining classification. If a robotic system generates or manages GMP-critical data, it may fall under CSV requirements, demanding rigorous risk-based validation aligned with data integrity principles (e.g., FAIR or ALCOA+). On the other hand, if the system only performs mechanical support functions without directly impacting regulated data, it may be qualified under engineering protocols or handled as auxiliary equipment.
There is no single best approach. Each case must be evaluated individually, taking into account:
- The functional role of the automation system within the lab workflow
- The impact on product quality and data integrity
- The ownership structure and expertise of the organization
- The applicability of GAMP® 5 categories and other relevant frameworks
What is clear, however, is that increased standardization and modularity can help simplify these decisions. Systems designed with clear interfaces, encapsulated functions, and standardized data flows make it easier to define boundaries of responsibility, apply the appropriate validation methodology, and streamline compliance efforts.
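Purely as an illustration, the criteria listed above can be imagined as inputs to a first-pass triage. The Python below is hypothetical and grossly simplified; in practice the classification is made case by case through cross-functional review, not by a function.

```python
def classify_validation_path(generates_gmp_data: bool,
                             impacts_product_quality: bool) -> str:
    """Grossly simplified first-pass triage of the criteria above;
    a real assessment also weighs GAMP 5 category, ownership, etc."""
    if generates_gmp_data:
        return "CSV: risk-based computerized system validation"
    if impacts_product_quality:
        return "CQV: equipment qualification with quality impact assessment"
    return "Engineering qualification as auxiliary equipment"


# A transport robot that never touches regulated data:
print(classify_validation_path(generates_gmp_data=False,
                               impacts_product_quality=False))
```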
Ultimately, reconciling CQV and CSV approaches in laboratory automation requires cross-functional collaboration between QC, IT, and engineering. Establishing a common language and shared responsibility model is essential for enabling both regulatory compliance and operational agility in automated QC environments.
The LAPP Concept
The LAPP framework is one example of how modular integration and semantic alignment can be achieved.1 While not the only path forward, LAPP demonstrates how reference architectures, digital twins, and standardized semantics can reduce configuration time, support cross-platform compatibility, and simplify validation.
A fundamental component of LAPP is its semantic workflow decomposition, which introduces a hierarchical framework for structuring laboratory tasks. By drawing inspiration from project management and industrial automation, this methodology systematically breaks down complex laboratory workflows into distinct levels, ranging from high-level experimental procedures to low-level robotic actions. Each task is structured in a way that enables clear assignment to specific laboratory devices or human operators, ensuring a coherent and modular approach to automation.
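A minimal sketch of what such a decomposition might look like as a data structure is shown below. The level names, tasks, and device identifiers are invented for illustration and are not taken from the LAPP specification.

```python
from dataclasses import dataclass, field


@dataclass
class Step:
    """Lowest level: a single action assignable to a device or operator."""
    action: str
    assigned_to: str  # e.g., "mobile_robot_1" or "human_operator"


@dataclass
class Task:
    """Mid level: a coherent unit of work composed of steps."""
    name: str
    steps: list[Step] = field(default_factory=list)


@dataclass
class Workflow:
    """Top level: the full experimental procedure."""
    name: str
    tasks: list[Task] = field(default_factory=list)


# A tiny decomposition in the spirit of LAPP (all names are invented):
wf = Workflow("content_uniformity_assay", tasks=[
    Task("sample_transport", steps=[
        Step("pick_rack_from_storage", "mobile_robot_1"),
        Step("place_rack_on_hplc_tray", "mobile_robot_1"),
    ]),
    Task("analysis", steps=[
        Step("start_hplc_method", "hplc_1"),
    ]),
])
print(len(wf.tasks), "tasks,", sum(len(t.steps) for t in wf.tasks), "steps")
```

Because each step names its executor explicitly, the same workflow description can be dispatched to robots or flagged for human operators without restructuring.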
To streamline integration and eliminate the need for manual robot setup, LAPP incorporates a digital twin (DT) framework.2 DTs serve as virtual representations of laboratory devices, storing critical interfacing information that allows robots to interact with them seamlessly. Through the use of optical markers (such as QR codes and fiducial markers), laboratory instruments can be recognized and localized autonomously. After fetching the relative robot positions for sample handover, the robot can physically interact with the device right away, making manual robot teaching unnecessary. This approach not only accelerates deployment but also ensures bidirectional data flow between physical devices and their digital counterparts, enabling real-time monitoring, process optimization, and adaptive automation.
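The interfacing lookup can be pictured as follows. This Python sketch is hypothetical: a digital twin registry, keyed by fiducial-marker ID, returns the relative handover pose so that no manual robot teaching is required. Field names and values are invented.

```python
from dataclasses import dataclass


@dataclass
class HandoverPose:
    """Robot pose relative to the device's optical marker (metres, radians)."""
    x: float
    y: float
    yaw: float


# Hypothetical digital-twin registry keyed by fiducial-marker ID; in a
# real deployment this would be served by the DT platform, not a dict.
twin_registry: dict[str, HandoverPose] = {
    "marker_042": HandoverPose(x=0.35, y=0.00, yaw=1.57),
}


def handover_pose_for(marker_id: str) -> HandoverPose:
    """Camera detects a marker; the twin supplies the interfacing pose."""
    return twin_registry[marker_id]


print(handover_pose_for("marker_042"))
```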
LAPP also introduces a Reference Architecture Model (RAM) that maps the hierarchical workflow decomposition onto a structured control architecture.3 Each level represents a logical and functional abstraction, allowing the entities at that level to be qualified independently. Between levels, the authors advocate standard, open APIs to connect the layers and transport data. Threaded through all layers is a common ontology that promotes data transparency and consistent terminology. LAPP-RAM is crafted specifically for the realm of laboratory automation, with the overarching goal of serving as a comprehensive reference architecture, akin to the RAMI 4.0 model,4 encompassing both vertical and horizontal integration structures.
By integrating with existing automation standards such as SiLA and OPC-UA, this model provides a scalable and modular foundation for robotic systems in laboratory environments. The reference architecture ensures that experimental protocols can be executed across different platforms while maintaining compatibility with a wide range of laboratory instruments and robotic systems.
The framework’s feasibility was validated through two key implementations:
- Academic prototype using a research-oriented TIAGo++ mobile manipulator, demonstrating the core principles of LAPP in a controlled laboratory setting.
- Industrial prototype using mobERT, a market-ready mobile manipulator, which was specifically adapted for labware transfer, confirming LAPP’s ability to support real-world automation workflows by enabling plug-and-play integration with existing laboratory infrastructure.
By standardizing how laboratory automation systems communicate and interact, LAPP represents a significant step toward eliminating manual configuration and fostering true plug-and-play robotics. Beyond efficiency gains, it enhances interoperability, simplifies automation deployment, and provides a scalable solution for the future of laboratory automation.
A recent concept paper by the ISPE Pharma 4.0™ Plug and Produce Subcommittee further demonstrates how such standardization can be achieved using the Asset Administration Shell (AAS) as a DT framework for laboratory devices.5 This approach enables simplified, vendor-agnostic integration of pre-qualified instruments through a shared semantic layer and interoperable service interfaces. The use of the AAS to encapsulate both operational data and qualification metadata aligns closely with the LAPP framework’s principles of modularity, standardization, and lifecycle integration. By combining LAPP’s hierarchical workflow decomposition with AAS-driven device descriptions, laboratories can implement plug-and-play robotics that are both technically integrated and compliant by design.
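The sketch below gives a loose, hypothetical impression of this idea: operational data, qualification metadata, and service interfaces living side by side in machine-readable submodels. The structure is far simpler than the real AAS meta-model, and all field names and values are invented.

```python
# Loose sketch of an Asset Administration Shell entry for a lab device.
# Structure and field names are illustrative only, not the AAS meta-model.
device_shell = {
    "asset_id": "urn:example:hplc-1",
    "submodels": {
        "operational_data": {
            "state": "Idle",
            "current_method": None,
        },
        "qualification": {
            "iq_oq_completed": True,
            "last_requalification": "2024-11-03",
            "gamp_category": 4,
        },
        "service_interface": {
            "protocol": "OPC UA",
            "endpoint": "opc.tcp://hplc-1.lab.example:4840",
        },
    },
}

# An orchestrator could check qualification status before scheduling work:
assert device_shell["submodels"]["qualification"]["iq_oq_completed"]
```

Keeping qualification metadata in the same machine-readable shell as operational data is what makes "compliant by design" integration conceivable: a scheduler can refuse to route samples to a device whose requalification has lapsed.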
Conclusion
By combining IT-OT convergence and modular validation with the introduction of the LAPP framework, this article outlines how laboratories can transition from fragmented systems to structured, interoperable ecosystems. LAPP’s three pillars—semantics, information models (DTs), and a reference architecture model—form the foundation for plug-and-play automation that is both agile and compliant.
In Part One, we discussed why standardization is essential. In Part Three, we will examine what concrete technologies can make it possible—the interoperability standards, data formats, and ontological frameworks that operationalize the LAPP concept.6