• 9 Mar 2026
  • Rob Crook

High Throughput Experimentation: Fully Integrated or Modular System?

  • Chemical Development
  • High Throughput Experimentation
  • Process R&D
  • Thought Leaders

Introduction

In the latter half of the 20th century, automation began transforming research laboratories worldwide. By the 1990s, robotics and miniaturisation had enabled the first High Throughput Experimentation (HTE) systems, allowing thousands of molecules to be tested weekly in microplates. Originating in molecular biology, HTE evolved from manual multichannel pipettes to fully automated platforms. In the 2000s, powerful advances in computing and analytical technologies further strengthened these systems. Over the past decade, the integration of Artificial Intelligence (AI) and predictive models has enabled researchers to extract greater value from rapidly expanding datasets.

In process chemistry, particularly within the pharmaceutical industry, increasing pressure to deliver complex molecules on more compressed timelines has driven adoption of HTE. Compared with traditional one-factor-at-a-time optimisation, HTE enables rapid, structured and parallel exploration of broader chemical space while minimising material consumption. It accelerates identification of scalable conditions and supports early de-risking by assessing robustness and critical process parameters early in development.

Modern HTE platforms are now more reliable and efficient, replacing labour-intensive manual weighing and dispensing with precise automation. Enhanced accuracy and miniaturisation allow rapid, data-rich optimisation at every stage of synthesis. When combined with advances in AI and machine learning, HTE can significantly accelerate development timelines, often reducing screening time by half or more, while improving reaction performance.

However, implementing HTE infrastructure requires careful strategic consideration. Teams must choose between fully integrated systems and modular approaches, each offering distinct advantages and trade-offs. Following the recent refresh of HTE capabilities at CatSci (2024/25), this article outlines key considerations and practical insights to support informed decision-making and share best practice across the chemical community.

Fully integrated systems

Fully integrated HTE systems are now widely available; they consolidate multiple workflow steps into a single platform. These instruments typically incorporate solid and liquid dispensing, temperature-controlled reactors with stirring capabilities, and automated sampling. Many can also be directly coupled to analytical tools such as LCMS or GCMS, creating a seamless path from reaction setup to analysis.

Advantages of Fully Integrated HTE Systems

One of the primary advantages of a fully integrated system is workflow continuity. Housing plate generation, reaction execution, sampling, and analysis within one unit reduces the risk of miscommunication between instruments and enables smooth transitions between steps. This consolidation often results in a more compact footprint compared to modular alternatives—an important consideration when systems are housed inside gloveboxes, where space is limited.

Software integration is another significant benefit. A unified control system simplifies operation, enhances coordination across process steps, and reduces training time for routine users. Centralised data capture within a single platform also strengthens data integrity and traceability—critical factors in research and development environments where audit trails and reproducibility are essential.

Limitations of Fully Integrated HTE Systems

However, these advantages must be weighed against several potential challenges.

Fully integrated systems typically require substantial upfront investment, which may be prohibitive for smaller companies or start-ups. Flexibility can also be restricted. Such platforms are generally designed around standard workflows and may struggle to accommodate specialised applications such as pressure chemistry, photochemistry, or enzymatic reactions. In these cases, manual user intervention may be required, undermining the efficiency gains HTE is intended to deliver. Modular systems, by contrast, can often be adapted more easily to niche requirements.

Vendor dependency presents another consideration. Reliance on a single supplier for hardware, consumables, and servicing increases exposure to pricing changes, supply chain disruptions, or product discontinuation. If maintenance becomes challenging or parts become unavailable, organisations may be forced into costly system replacement.

Finally, resilience is a key operational factor. When a fully integrated unit is offline, all HTE activity halts. In contrast, in a modular setup, individual components can continue operating independently, preserving productivity and data generation.

Summary

Ultimately, fully integrated HTE systems offer streamlined workflows and operational simplicity, but organisations must carefully assess cost, flexibility, supplier risk, and long-term resilience before committing to this approach.

Decentralised Modular System

An alternative to a fully integrated platform is a decentralised, modular HTE architecture, where separate instruments are deployed for each workflow stage. A core configuration typically includes a solid handler, a liquid handler, and a plate reactor, with analysis performed offline via LCMS or GCMS systems. From this foundation, additional specialised modules can be introduced to address specific chemistries or process needs, such as photochemistry reactors, orbital shakers, cryogenic reactors, or crystallisation platforms.

Advantages of a Decentralised Modular System

One of the defining strengths of a modular setup is its flexibility. Organisations can build the platform stepwise, aligning investment with budget cycles and evolving project demands. Because each instrument represents a smaller initial cost than a fully integrated system, companies—particularly those with tighter capital constraints—can expand capabilities progressively while still improving efficiency and throughput. This approach is especially attractive to teams seeking versatility, as modular systems are typically easier to adapt to diverse reaction types. Manual interventions can also be incorporated into automated workflows when needed, offering additional practical flexibility.

Scalability is another key advantage. Modules can be added organically as requirements grow, without replacing the entire infrastructure.

Furthermore, modular systems offer operational resilience; if one instrument is unavailable due to maintenance, breakdown, or supply issues, the remaining components can often continue operating. This redundancy ensures that data generation and project delivery can proceed, at least in part, rather than halting entirely, as might occur with a single integrated platform.

Limitations of a Decentralised Modular System

However, these benefits come with trade-offs.

The principal challenge of modular systems lies in integration. Coordinating multiple instruments requires a central control or data-management platform capable of reliable workflow sequencing and data transfer between systems. This introduces technical complexity and may necessitate specialist expertise, including software configuration or coding skills.

Users must also be trained on multiple instruments and software environments, raising the barrier to widespread adoption within a team.

Data management presents an additional risk. With outputs generated across different platforms, the potential for data fragmentation or loss increases unless interfaces are carefully designed and maintained.
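The central coordination and data-consolidation challenge described above can be pictured, in a deliberately simplified sketch, as an orchestrator that runs each instrument step in sequence and writes every output into a single shared experiment record, so results are never scattered across separate systems. All instrument names, fields, and functions here are hypothetical illustrations, not the internals of any specific vendor platform or of CatSci's CHETAH workflow:

```python
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class ExperimentRecord:
    """One consolidated record per plate, shared by every workflow step."""
    plate_id: str
    data: dict = field(default_factory=dict)

# Each modular instrument is wrapped as a step that receives the shared
# record and appends its own output (illustrative stubs only).
def solid_handler(rec: ExperimentRecord) -> None:
    rec.data["solids_dispensed"] = True       # e.g. catalyst charged per well

def liquid_handler(rec: ExperimentRecord) -> None:
    rec.data["liquids_dispensed"] = True      # e.g. solvent and substrate added

def plate_reactor(rec: ExperimentRecord) -> None:
    rec.data["reaction_complete"] = True      # e.g. heat and stir for a set time

def offline_lcms(rec: ExperimentRecord) -> None:
    rec.data["analysis"] = "LCMS results attached"

def run_workflow(plate_id: str,
                 steps: list[Callable[[ExperimentRecord], None]]) -> ExperimentRecord:
    """Sequence the instrument steps and keep all outputs in one record,
    avoiding the data fragmentation risk of per-instrument silos."""
    rec = ExperimentRecord(plate_id)
    for step in steps:
        step(rec)
    return rec

record = run_workflow("PLATE-001",
                      [solid_handler, liquid_handler, plate_reactor, offline_lcms])
```

In practice each step would call an instrument's own control software or API, but the design point survives the simplification: the orchestrator, not the individual instruments, owns workflow sequencing and the authoritative data record.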

Summary

In summary, while a modular HTE strategy offers flexibility, scalability, and resilience, it demands thoughtful integration, robust data infrastructure, and broader technical expertise to realise its full potential.

Conclusion

There is no universal solution; the optimal HTE architecture depends on organisational scale, workflow diversity, capital structure, and internal technical expertise.

In my view, a modular HTE approach offers greater flexibility and is particularly well suited to smaller research teams, such as start-ups, SMEs, or academic groups. These organisations often benefit from organic growth and stepwise investment, enabling a gradual increase in productivity without the significant upfront cost of a fully integrated platform. For many, the capital outlay required for an all-in-one system—combined with the need to tailor workflows to specific research priorities—can be prohibitive.

That said, successful implementation of a modular system depends on careful integration of its individual components. Without thoughtful design, users risk inefficiencies or data fragmentation. However, access to coding expertise and system-integration tools is becoming increasingly straightforward. At CatSci, we addressed this challenge through the development of our CHETAH workflow, enabling seamless coordination between instruments and data streams.

Fully integrated systems undoubtedly have their place. They are particularly effective in environments with consistent, high-throughput workflows focused on similar chemistries, and where laboratory footprint is less constrained. They can also be advantageous in organisations with limited in-house IT or coding expertise, as the integrated software and predefined workflows reduce the operational burden on users.

High-Throughput Experimentation at CatSci

During the refresh of our HTE capabilities in 2024/25, it became clear that there is no one-size-fits-all solution, so we evaluated both approaches extensively. Our decision between a fully integrated and modular system was guided by the need for versatility across diverse chemistries, including photochemical, gas-handling, and temperature-variable reactions, alongside our team’s strong expertise in coding and automation.

Ultimately, we selected a modular setup, underpinned by our CHETAH workflow, which ensures seamless coordination between instruments and structured data management while supporting a broad range of reaction development and optimisation challenges.

By combining flexible physical infrastructure with our integrated digital CHETAH workflow and in-house expertise, we achieve accelerated data generation and decision-making in line with our customers’ goals.

As we continue to evolve our HTE capabilities, our focus remains on expanding specialist applications and delivering excellence in support of faster, more efficient medicine development.

If you’re looking to speed up your reaction screening or would like to know more about our CHETAH platform, I’d be happy to talk to you about how CatSci’s strategically designed high-throughput framework can support you.
