The Laboratory Automation Plug and Play (LAPP) Framework – The Case for Standardization in Research and Development (R&D) and Quality Control (QC) Laboratories: Part Three
This blog post focuses on the technological enablers that make this vision tangible. The authors examine interoperability protocols, such as Standardization in Lab Automation (SiLA) 2,1 and the Laboratory and Analytical Device Standard (LADS),2 data standards, including Allotrope3 and the Analytical Information Markup Language (AnIML),4 and ontologies that allow instruments, robots, and software to work together seamlessly. The article concludes with a perspective on the path forward—how the industry can align around open standards, modular architectures, and shared semantics to achieve truly plug-and-play laboratory automation.
Building Blocks
Integrating standalone laboratory devices into semi-automated or fully automated ecosystems is essential for achieving optimal efficiency and scalability. Beyond automating material flow—such as through robotic solutions—true automation requires seamless interoperability between devices. This is accomplished through orchestration and scheduling systems that manage communication, synchronization, and task execution across various instruments.
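To make the orchestration idea concrete, the minimal sketch below shows a scheduler dispatching workflow steps to the right device in order. It is a simplified illustration, not a real scheduling product; the device names, step names, and the `Orchestrator` class are invented for this example.

```python
from collections import deque

# Minimal sketch of lab orchestration: a scheduler dispatches queued
# workflow steps to the correct device and preserves execution order.
# Device/step names and the Orchestrator class are illustrative only.

class Orchestrator:
    def __init__(self, devices):
        self.devices = devices        # name -> callable device action
        self.queue = deque()          # FIFO queue of pending steps

    def submit(self, device, step, payload):
        self.queue.append((device, step, payload))

    def run(self):
        """Execute queued steps in order, returning an audit log."""
        log = []
        while self.queue:
            device, step, payload = self.queue.popleft()
            result = self.devices[device](step, payload)
            log.append((device, step, result))
        return log

devices = {
    "liquid_handler": lambda step, p: f"{step} {p} ok",
    "plate_reader":   lambda step, p: f"{step} {p} ok",
}
orch = Orchestrator(devices)
orch.submit("liquid_handler", "dispense", "plate-1")
orch.submit("plate_reader", "read", "plate-1")
print(orch.run())
```

Real orchestration systems add the hard parts this sketch omits: parallel branches, device locking, error recovery, and synchronization across instruments.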
The figure below illustrates the ecosystem of standardization efforts underpinning the LAPP framework,5 highlighting how foundational standards in industrial automation and lab automation converge to enable modular, interoperable systems. On the left, core standardization domains—ranging from system architecture and interoperability protocols to ontologies and data formats—form the conceptual foundation. These domains are realized through industrial automation standards such as O-PAS,8 ISA 95,9 and RAMI 4.0,10 alongside modular protocols like OPC-UA,11 and Module Type Package (MTP). The laboratory automation space builds on these foundations with domain-specific standards like SiLA 2 and OPC-UA LADS, complemented by ontologies (e.g., LAB-OP, Allotrope) and workflow description languages (e.g., XDL). Together, these components enable the implementation of key elements of the LAPP framework: the Reference Architecture Model (LAPP-RAM),7 Robotic Activity Representations (LAPP-RARs),7 and Digital Twins (LAPP-DT).6 This layered structure ensures vertical and horizontal integration while enabling plug-and-play functionality across laboratory devices and systems, bridging industrial best practices with laboratory-specific requirements.
Figure 1: The ecosystem of standardization efforts in laboratory automation (Authors' own work)

Data Protocols
Within the LAPP framework, standardized data protocols form a foundational layer that enables interoperability, traceability, and seamless integration across modular laboratory systems. Harmonized data formats ensure that devices, software, and workflows communicate effectively, regardless of vendor or function, while preserving scientific context and regulatory compliance. To achieve this, industry-wide adoption of structured, semantically rich, and machine-readable data standards is essential.
One of the key enablers in this space is the Allotrope Foundation, whose members have co-developed a universal data format that captures experimental context through linked data. This approach connects raw data, results, metadata, and provenance in a way that eliminates ambiguity and enhances scientific reproducibility. The Allotrope™ Ontologies and Data Models formalize experimental parameters and establish relationships between data, people, equipment, processes, and studies. By embedding context and traceability directly into the data layer, the Allotrope framework supports real-time data integrity and regulatory compliance—making it easier to automate quality decisions and fuel advanced analytics. In essence, Allotrope provides a connected, archive-ready digital experiment—evidence you can trust and act upon.
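The core of the linked-data approach can be illustrated with a small sketch: facts about an experiment are stored as subject-predicate-object triples, so raw data, metadata, and provenance stay connected and queryable. Allotrope itself builds on RDF and its own ontologies and data format; the identifiers and predicates below are invented stand-ins, not real Allotrope terms.

```python
# Simplified sketch of the linked-data idea behind Allotrope: facts are
# (subject, predicate, object) triples, so results stay connected to the
# run, analyst, and instrument that produced them.
# IRIs like "ex:run-42" are illustrative, not real Allotrope identifiers.

triples = [
    ("ex:run-42",    "performedBy",    "ex:analyst-jdoe"),
    ("ex:run-42",    "usedInstrument", "ex:hplc-07"),
    ("ex:run-42",    "producedResult", "ex:result-1"),
    ("ex:result-1",  "hasValue",       98.7),
    ("ex:result-1",  "hasUnit",        "percent"),
]

def query(triples, subject=None, predicate=None, obj=None):
    """Return all triples matching the given (optional) pattern fields."""
    return [
        t for t in triples
        if (subject is None or t[0] == subject)
        and (predicate is None or t[1] == predicate)
        and (obj is None or t[2] == obj)
    ]

# Provenance question: which instrument produced result-1?
# Follow the links back from the result to its run.
runs = [s for s, _, _ in query(triples, predicate="producedResult", obj="ex:result-1")]
instruments = [o for _, _, o in query(triples, subject=runs[0], predicate="usedInstrument")]
print(instruments)  # ['ex:hplc-07']
```

Because every value is linked to its context, questions like "who ran this, on what, with which settings?" become queries rather than archaeology—which is exactly the traceability property described above.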
Complementing this, AnIML is an open standard developed under the American Society for Testing and Materials (ASTM) that defines a vendor-neutral, XML-based format for analytical data. It is designed to capture the full scope of instrument outputs across techniques such as chromatography, spectroscopy, and mass spectrometry. AnIML enables consistent storage, transfer, and interpretation of complex analytical results, facilitating interoperability between laboratory software, electronic laboratory notebooks, and long-term archives.
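The sketch below shows the shape of such a vendor-neutral record: an XML document is built, serialized, and then read back by a consumer that knows only the shared structure, not the instrument. The element names follow the general layout of the published AnIML schema (samples plus experiment steps with results) but are simplified here and not schema-complete.

```python
import xml.etree.ElementTree as ET

# Sketch of an AnIML-style, vendor-neutral XML record. Element names
# approximate the public AnIML schema's layout but are simplified and
# not schema-valid; values are invented for illustration.

root = ET.Element("AnIML", version="0.90")
sample_set = ET.SubElement(root, "SampleSet")
ET.SubElement(sample_set, "Sample", name="Caffeine standard", sampleID="S-001")

steps = ET.SubElement(root, "ExperimentStepSet")
step = ET.SubElement(steps, "ExperimentStep", name="UV absorbance")
ET.SubElement(step, "Technique", name="UV/Vis")
result = ET.SubElement(step, "Result", name="Absorbance at 273 nm")
value = ET.SubElement(result, "Value")
value.text = "0.482"

xml_bytes = ET.tostring(root, encoding="utf-8")

# A standards-aware consumer recovers the result with a generic path,
# with no vendor-specific parsing logic.
parsed = ET.fromstring(xml_bytes)
absorbance = float(parsed.find("./ExperimentStepSet/ExperimentStep/Result/Value").text)
print(absorbance)  # 0.482
```

The point of the round trip is that the producer and consumer share only the format—precisely the decoupling that makes long-term archiving and cross-system transfer workable.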
Together, Allotrope and AnIML exemplify an emerging ecosystem of interoperable, semantically enriched laboratory data. While Allotrope focuses on linked data and ontological structure, AnIML provides a lightweight yet robust format for raw analytical measurements. Implemented within the LAPP framework, these standards support plug-and-play integration by ensuring that all modules—from instruments to informatics systems—speak a common data language.
Interoperability Protocols
Laboratory devices are typically controlled via application programming interfaces (APIs), which enable automated execution of experimental sequences (assays). However, most vendor-provided APIs are proprietary and lack adherence to industry standards, creating a fragmented automation landscape. To overcome this, system integrators often develop proprietary software drivers that act as wrappers around these APIs to enable compatibility with their respective automation platforms. While this allows for integration within specific ecosystems, it significantly limits cross-platform compatibility and increases development overhead when integrating new devices.
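The integrator-built "wrapper" pattern described above can be sketched as a thin adapter layer: each proprietary vendor API is hidden behind a common driver interface so the automation platform programs against one abstraction. The vendor SDK, class names, and methods below are invented for illustration.

```python
from abc import ABC, abstractmethod

# Sketch of the wrapper/adapter pattern used by system integrators:
# proprietary vendor APIs are hidden behind a common driver interface.
# AcmeSdk and all method names are hypothetical.

class PlateReaderDriver(ABC):
    """Common interface the automation platform programs against."""
    @abstractmethod
    def read_plate(self, plate_id: str) -> list[float]: ...

class AcmeSdk:
    """Stand-in for a proprietary vendor SDK with its own call style."""
    def acquire(self, barcode):
        return {"barcode": barcode, "wells": [0.1, 0.2, 0.3]}

class AcmeDriver(PlateReaderDriver):
    """Wrapper translating the common interface into Acme's API."""
    def __init__(self):
        self._sdk = AcmeSdk()

    def read_plate(self, plate_id):
        # Adapt the vendor-specific call and result shape to the
        # platform's neutral interface.
        return self._sdk.acquire(plate_id)["wells"]

driver: PlateReaderDriver = AcmeDriver()
print(driver.read_plate("PLATE-001"))  # [0.1, 0.2, 0.3]
```

The overhead the text describes is visible here: every new vendor API needs its own adapter, and each integrator writes those adapters for their platform alone—work that open protocols aim to eliminate.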
A more effective approach is the adoption of open, standardized communication protocols that support plug-and-play compatibility across laboratory automation systems. Standardized interfaces eliminate redundant integration efforts, streamline automation implementation, and ensure that laboratory equipment can be flexibly deployed in diverse environments.
One leading initiative addressing this challenge is run by the SiLA Consortium. SiLA is a membership organization of software providers, system integrators, pharmaceutical and biotech companies, and academic institutions. It has developed the SiLA 2 standard, which defines how laboratory instruments should interface with automation software.
Within the SiLA ecosystem, device capabilities are described as feature definitions in XML format. While SiLA provides conventions for structuring these features, it leaves flexibility depending on the device type and laboratory use case, leading to some variation across vendors. For instance, the same type of mobile manipulator (MoMa) might have different feature definitions depending on whether it is developed by the Fraunhofer Institute or Astech Projects.
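To give a feel for what such a feature definition looks like, the abridged fragment below describes a hypothetical mobile-manipulator feature. It follows the general shape of the SiLA 2 Feature Definition Language (a feature with identified, described commands and typed parameters) but is simplified and not schema-complete; all identifiers are invented.

```xml
<!-- Abridged, illustrative sketch of a SiLA 2 feature definition.
     Structure approximates the Feature Definition Language;
     identifiers are hypothetical and the fragment is not schema-valid. -->
<Feature SiLA2Version="1.0" FeatureVersion="1.0" Originator="org.example">
  <Identifier>MobileManipulatorController</Identifier>
  <DisplayName>Mobile Manipulator Controller</DisplayName>
  <Description>Moves labware between stations in the lab.</Description>
  <Command>
    <Identifier>TransferLabware</Identifier>
    <DisplayName>Transfer Labware</DisplayName>
    <Description>Move a plate from a source to a target position.</Description>
    <Observable>No</Observable>
    <Parameter>
      <Identifier>SourcePosition</Identifier>
      <DisplayName>Source Position</DisplayName>
      <Description>Named position to pick the plate from.</Description>
      <DataType><Basic>String</Basic></DataType>
    </Parameter>
  </Command>
</Feature>
```

Because each vendor chooses its own identifiers and command granularity within these conventions, two functionally similar devices can expose differently shaped features—the variation noted above.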
Another emerging standard is LADS, which builds on the widely adopted OPC UA framework. Originally developed for industrial automation, OPC UA offers extensive capabilities for state-based control, modularization, and object-oriented representations of equipment (e.g., the spatial hierarchy of robotic arms). LADS adds a laboratory-specific layer to this, defining how lab instruments are represented as functional and physical modules. This allows for improved orchestration, basic automation, and asset management in laboratory environments. Governed by the German lab equipment association Spectaris, LADS released its first specification in November 2023. While adoption is still in early stages, it shows promise by leveraging mature industrial standards for laboratory settings.
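The hierarchical device representation LADS layers on top of OPC UA can be sketched as a browsable tree: a device exposes functional units, each with functions an orchestrator can discover and address by path. The class and node names below are illustrative only, not the normative LADS information model.

```python
from dataclasses import dataclass, field

# Sketch of a LADS-style hierarchical device model on OPC UA:
# device -> functional units -> functions, browsable by path.
# Class and node names are illustrative, not the normative model.

@dataclass
class Function:
    name: str              # e.g. a set point or sensor reading
    value: float = 0.0

@dataclass
class FunctionalUnit:
    name: str
    functions: list[Function] = field(default_factory=list)

@dataclass
class LabDevice:
    name: str
    units: list[FunctionalUnit] = field(default_factory=list)

    def browse(self):
        """Flatten the model into OPC UA-style browse paths."""
        return [f"{self.name}/{u.name}/{f.name}"
                for u in self.units for f in u.functions]

shaker = LabDevice("IncubatingShaker", units=[
    FunctionalUnit("ShakerUnit", [Function("TargetSpeed", 300.0)]),
    FunctionalUnit("HeaterUnit", [Function("TargetTemperature", 37.0)]),
])
print(shaker.browse())
```

An orchestrator that understands this generic structure can browse any conforming instrument the same way—the asset-management and orchestration benefit the text attributes to LADS.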
Both SiLA 2 and LADS aim to bridge the integration gap between laboratory devices and control systems, aligning with the goals of the LAPP framework. Their adoption will be key to enabling modular, interoperable laboratory environments that are automation-ready by design.
A Path Forward
To move beyond fragmented, vendor-specific solutions, the pharmaceutical industry must align around open standards, modular architectures, and semantic integration. The principles and concrete steps below can help stakeholders build connected, future-ready laboratory environments. To enable this shift, organizations should act on a few key priorities:
- Standardize first: Broader adoption of open protocols—and convergence on a smaller set of core standards—will reduce integration overhead and accelerate implementation.
- Modularity by design: Build systems from loosely coupled, clearly defined components. This allows for phased deployment, easier validation, and incremental upgrades without requalifying entire stacks.
- Shared responsibility: The convergence of IT and OT requires rethinking traditional ownership models. Clear interfaces between automation layers support division of responsibility and streamlined compliance, whether via commissioning, qualification, and validation (CQV) approaches owned by engineering, or computer system validation (CSV) owned by QC.
- Semantic integration: Beyond connectivity, automation systems must share a common understanding of tasks, context, and data. Ontologies and digital twins are key to enabling intelligent orchestration and scalable, data-driven workflows.
- Pilot, scale, and iterate: Early adopters play a vital role in shaping best practices. Industry-wide progress will depend on sharing lessons learned, publishing reference implementations, and working collaboratively to refine standards and architectures.
- Environmental, health, and safety (EHS) alignment: EHS requirements must be addressed early in the automation lifecycle. EHS concerns—such as safe handling of chemicals, equipment containment, or emergency procedures—can derail otherwise viable automation initiatives if not integrated from the start. A proactive, cross-functional EHS strategy ensures automation designs are not only compliant but also practically deployable in regulated lab environments.
As the figure below illustrates, the mission can be visualized as assembling a complex jigsaw puzzle, where each piece represents a different element of the lab automation ecosystem—from interoperability protocols for instrument control, to standardized data formats, ontologies, digital twin information models, and reference architecture models capturing system functionality in context. These pieces must fit together to form a coherent, scalable, and future-proof automation landscape. The recommended strategy mirrors how one would approach a physical puzzle: by first identifying the edge pieces to frame the system boundaries and then building around distinctive features—or “islands”—where standards, models, or protocols already exist.
Progress must happen in parallel from both directions:
- Top-down, by creating a high-level conceptual blueprint or reference architecture model and hierarchically decomposing it to locate the place of the low-level pieces.
- Bottom-up, by working on the building blocks such as interoperability protocols and their domain-specific semantic descriptions (e.g., for supportive robotics). These should each converge into an overarching high-level ecosystem.
This dual approach—framing the system while also building local detail—supports both vision and execution, helping the industry converge on an integrated and semantically aligned automation framework.
Figure 2: The fragmented standardization landscape in laboratory automation (Authors' own work)

Ultimately, the path forward is not defined by any single framework or technology, but by a collective commitment to openness, modularity, and interoperability. Through shared standards and coordinated action, the industry can unlock more agile, efficient, and future-ready laboratory operations.
Next Steps for Industry Stakeholders
- Identify modular boundaries and standardization opportunities in existing lab infrastructure.
- Require compatibility with open standards in new procurement processes.
- Invest in semantic modelling and cross-platform digital twin strategies.
- Engage in pilot projects and contribute learnings to the broader community.
- Facilitate collaboration between peers.
By acting on these priorities, the pharmaceutical sector can unlock more agile, scalable, and compliant laboratory operations.
Conclusion
Interoperability and data standardization are where the LAPP framework’s three pillars take practical form. Together, they transform conceptual architecture into working, validated systems capable of adaptive, data-driven operation.
Across this series, we have seen how:
- Part One identified the need for harmonization in R&D and QC
- Part Two reviewed how IT-OT convergence and modular validation support standardization and introduced the LAPP Framework built on semantics, information models, and a reference architecture model
- Part Three outlined what technologies can implement these ideas—the interoperability and data standards that form the backbone of future laboratory ecosystems
The path ahead calls for continued collaboration among pharmaceutical companies, integrators, and standards organizations to make plug-and-play laboratory automation not just a vision but the operational standard of tomorrow.