Guest Column | May 28, 2025

How To Automate Bioprocesses Without Locking Out Future Upgrades

A conversation with Tiago Aguiar


Process upgrades can carry hidden costs — not in financial terms but in missed chances when better technology comes along. It’s hard to know when the next advancement will turn today’s state-of-the-art into an expensive doorstop.

While there’s no perfect way to predict the future of automation, you can make smart decisions now that preserve your ability to upgrade later, says Tiago Aguiar, a process development and automation expert.

An earlier conversation with Aguiar, focused on timing automation upgrades, sparked another question: how can manufacturers keep the door open for future improvements? We followed up — here’s what he told us. His answers have been edited slightly for clarity.

To what extent does automation constrain process development once a process is automated? How much room is left for ongoing optimization, especially when processes evolve rapidly between early and late phase development?

That's a great question to start with because, as much as I like introducing automation into the process, it is very important to maintain flexibility to continue developing and optimizing the process. So, in short, I think the answer is that yes, automation will constrain your process development to some extent, but there are ways you can minimize that constraint.

When you're automating your process, I think you have to carefully balance the automation you're implementing with maintaining the freedom to continue developing and optimizing your process through the life cycle of the product. That balance can be difficult to strike, especially at early stages when it's still not clear what your commercial process is going to look like.

When you automate early, you certainly freeze quite a few elements of the process, but a lot of platforms actually separate software from hardware. Depending on how flexible those protocols or software programs are, you might still have room to iterate your process, optimize process parameters, and run DOE studies while staying within validated software, which doesn't require revalidation at every change.
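To make that idea concrete, here is a minimal, hypothetical sketch of what "iterating within validated software" can look like in practice: process parameters are varied only inside pre-validated operating windows to build a small DOE, while the automation protocol itself stays unchanged. The parameter names, ranges, and levels below are illustrative assumptions, not values from any specific platform.

```python
from itertools import product

# Hypothetical validated operating ranges for an automated culture step.
# Staying inside these windows means the protocol itself does not change,
# so the study does not trigger revalidation.
VALIDATED_RANGES = {
    "seeding_density_cells_per_ml": (2e5, 1e6),
    "culture_volume_ml": (70, 250),
    "perfusion_rate_ml_per_min": (0.1, 1.0),
}

def full_factorial(levels_per_factor):
    """Build a simple two-level full-factorial DOE from the chosen levels."""
    names = list(levels_per_factor)
    runs = []
    for combo in product(*(levels_per_factor[n] for n in names)):
        run = dict(zip(names, combo))
        # Guard: every set point must stay within the validated window.
        for name, value in run.items():
            low, high = VALIDATED_RANGES[name]
            assert low <= value <= high, f"{name}={value} is outside the validated range"
        runs.append(run)
    return runs

# Two levels per factor, chosen inside the validated ranges (illustrative values).
levels = {
    "seeding_density_cells_per_ml": (3e5, 8e5),
    "culture_volume_ml": (100, 200),
    "perfusion_rate_ml_per_min": (0.2, 0.8),
}

for i, run in enumerate(full_factorial(levels), start=1):
    print(f"DOE run {i}: {run}")
```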

So, it is important to have clarity on these questions before locking your process into a certain device. As I mentioned in my earlier conversation, automating early avoids the painful path of revalidating your process at a later stage when you're dealing with scaling up and with other commercialization commitments.

I think real constraints appear when the hardware itself limits and defines the process and the product.

For example, consider all-in-one units, like the CliniMACS Prodigy, which integrates most of the manufacturing process for CAR-T manufacturing in a single device. While it's a great tool for the job, the hardware itself restricts process parameters such as your culture volume, how many cells you can load into the system for enrichment, and how much process monitoring you can have.

By contrast, a modular setup lets you use whatever technology best suits your process at every step.

It allows you to swap a cell washer; it allows you to upgrade the bioreactor or add inline/online analytics with minimal knock-on validation. So, in practical terms, and especially in modular settings, I think the best approach is to first automate your high-touch process steps that are less prone to change and leave the more biology-sensitive unit operations open. For example, you may start by automating cell washing or cell selection, which are steps that might not change significantly over the lifetime of your product, and for each there are already established manufacturing technologies available in the market. Leave steps such as cell activation or cell transduction, which are more biology dependent and more prone to change, open and flexible at first. This, to me, would be the best approach.


How do you balance the need for modularity and flexibility in automation design with a push toward fully integrated closed systems? Is it realistic to build platforms that can adapt to different cell types or evolving process requirements without requiring a full system overhaul?

There's no clear answer to that question. Choosing between more modular or more integrated processes comes down, at the end of the day, to your vision of how you think manufacturing will look in the future.

They both have their own advantages and disadvantages. Integrated systems are quite attractive for their plug-and-play characteristics and for being fully closed, so you can run nearly your whole process with minimal operator interactions. They're especially useful for CAR-T therapies, for example, where you need to manufacture at volume and where manufacturing processes and workflows are stabilizing and becoming more and more standardized across the industry, since CAR-T is a product that is obviously quite ahead of the curve within the scope of cell therapies.

So, integrated systems enable that but, as we discussed earlier, also reduce the flexibility to change your process in the future.

Since you might not be able to change just one step of the process at a later stage, you might have to change the whole platform. Going with modular processes gives you that flexibility. If at some point in development you decide to make major changes to your cell transduction step, for example, you can switch that module out for one that suits your new process better without having to change upstream or downstream steps or technologies; those can all remain in place. And even if your manufacturing process stays stable with only a few changes, which is rare, a modular setting still gives you the opportunity to watch the market for new technologies. You can basically swap out any outdated device for new technology that might be best in class five years from now.
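As a rough illustration of that modular idea, the sketch below models each unit operation behind a common interface so that one step (here, transduction) can be swapped without touching the rest of the train. The class names, parameters, and material record are hypothetical and not tied to any real platform or process.

```python
from abc import ABC, abstractmethod

class UnitOperation(ABC):
    """Common interface every module in the manufacturing train must satisfy."""

    @abstractmethod
    def run(self, material: dict) -> dict:
        """Take the incoming cellular material record and return the processed record."""

class CellWash(UnitOperation):
    def run(self, material: dict) -> dict:
        return {**material, "washed": True}

class StaticTransduction(UnitOperation):
    def run(self, material: dict) -> dict:
        return {**material, "transduction": "static", "moi": 5}

class AutomatedTransduction(UnitOperation):
    """Drop-in replacement: only this module changes; upstream and downstream steps do not."""
    def run(self, material: dict) -> dict:
        return {**material, "transduction": "automated", "moi": 2}

class Expansion(UnitOperation):
    def run(self, material: dict) -> dict:
        return {**material, "expanded": True}

def run_train(modules, material):
    """Run the material through each module in order."""
    for module in modules:
        material = module.run(material)
    return material

# Original train ...
train_v1 = [CellWash(), StaticTransduction(), Expansion()]
# ... and the same train with only the transduction module swapped out.
train_v2 = [CellWash(), AutomatedTransduction(), Expansion()]

print(run_train(train_v1, {"batch": "B001"}))
print(run_train(train_v2, {"batch": "B001"}))
```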

On the flip side, one consequence of modular processes is more operator interactions. For every modular device in your manufacturing process, you might have five or six interactions: you need to load the tubing set, transfer your cellular material, and harvest the outputs.

Modular processes are also usually more sample-intensive and QC-intensive. So, it is critical to ensure that the process remains as closed as possible in this modular setting.

This means ensuring that consecutive steps and devices are compatible in terms of tubing. You should be able to sterile-connect or sterile-weld one device or one step to the next with your liquid handling because you really want to avoid introducing open steps in the process. That would mean more time in grade B or grade A isolators.

And that means an increase in GMP operational costs.

Just one final point on that question: I think from the perspective of a supplier developing these devices and tools, this is not an easy field to navigate, either. The industry is still very much split between modular and integrated processes. There's a huge variety of therapies being developed, and even within the same type of therapy, you still have high variability in the manufacturing processes that different companies run. So, the requirements differ immensely from user to user. That's why it's important to keep evaluating the available technologies out there, considering the process in question, and to stay updated and on top of new developments as the industry matures, not just from the companies developing the therapies but also from the suppliers developing these manufacturing technologies.


We touched on this earlier, but what common pitfalls do companies face when digital controls or data layers are bolted on after physical systems are already in place?

That's a great final piece of the automation puzzle because, if you think about it, that digital layer is almost like the nervous system of any automated platform. Everything is increasingly digitized, from electronic batch records to data analysis and trending to digital twins, and these are all platforms that benefit hugely from being linked as much as possible to all of the data generated by manufacturing and by QC. Without that native data capture, you lose some of the real-time control, some of the traceability, and some of the analytics that ultimately drive continuous improvement to your process.

When your equipment, your central control, your dashboards, and your record-keeping software are all designed to talk to each other from day one, you automatically get end-to-end traceability. You get paper-free batch records, and you get the option in some cases to release products based on near real-time data.

And that's becoming more and more of a possibility. Now, I think the real trouble starts when the digital backbone is bolted on after the equipment is already up and running, and unfortunately you see this too often. When those considerations come only afterward, the equipment has already been chosen and the process defined in a way that ends up creating data silos. It forces manual transcriptions, and it forces operators to physically download log files from these devices with flash drives. It also turns every software update into a validation nightmare. Worst of all, certain quality events, such as deviations, might very well end up paper-based, and anything paper-based is a huge obstacle to scaling up operations because it just means more operator time, more people, and more operational costs.

So, a guiding rule from me would be to blueprint the data backbone from day one and agree on naming conventions for your process variables, process parameters, samples, and testing, including all of the units, so everything is standardized from day one. Choose communication protocols that are open format, and budget for projects such as digital record-keeping systems and electronic batch records at the same time as you start automating your process, because it costs less over the long run and it prevents those last-minute scrambles to rebuild electronic records in the middle of a pivotal trial when your process is already established.
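One way to picture that "blueprint from day one" is a small, shared schema for process data with agreed names and units, serialized in an open text format that any downstream system can read. The field names, allowed parameters, and units in the sketch below are purely illustrative assumptions, not a standard from any particular vendor or system.

```python
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

# Agreed naming convention: snake_case parameter names with the unit embedded,
# so every device, dashboard, and batch record uses the same vocabulary.
ALLOWED_PARAMETERS = {
    "culture_volume_ml",
    "viable_cell_density_cells_per_ml",
    "dissolved_oxygen_percent",
}

@dataclass
class ProcessDataPoint:
    batch_id: str
    unit_operation: str
    parameter: str
    value: float
    timestamp_utc: str

    def __post_init__(self):
        # Reject anything outside the agreed vocabulary at the point of capture.
        if self.parameter not in ALLOWED_PARAMETERS:
            raise ValueError(f"Unknown parameter name: {self.parameter}")

point = ProcessDataPoint(
    batch_id="B001",
    unit_operation="expansion",
    parameter="viable_cell_density_cells_per_ml",
    value=1.2e6,
    timestamp_utc=datetime.now(timezone.utc).isoformat(),
)

# Open, text-based serialization keeps the record readable by downstream
# systems such as electronic batch records, trending tools, or digital twins.
print(json.dumps(asdict(point), indent=2))
```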


About The Expert:

Tiago Aguiar is a bioprocessing professional with over a decade of experience in process development and GMP manufacturing operations for advanced therapy products. He has led multidisciplinary teams in the automation and scale-up/out of manufacturing processes, contributing to successful clinical and commercial programs. His experience spans technology transfer and CDMO management across multiple phases of development. With a background in biological engineering, he has held key technical roles at Orchard Therapeutics, Autolus Therapeutics, and Achilles Therapeutics and has authored peer-reviewed publications on the bioprocessing of advanced therapies. Connect with him on LinkedIn.