Materials Matter: Reframing Raw Material Strategy In Cell And Gene Therapy Development
By Life Science Connect Editorial Staff

In cell and gene therapy (CGT), few topics are as deceptively simple, or as operationally consequential, as material classification. Terms such as raw materials, starting materials, and ancillary materials are widely used across development programs, yet their definitions remain inconsistently applied across organizations and even across regulatory frameworks. While this ambiguity is often treated as a matter of terminology, in practice it shapes entire downstream control strategies.
Starting materials are generally understood as those that become part of the final drug product, while ancillary materials support the manufacturing process without being intended to remain in the final product. Raw materials, depending on the framework, may encompass both categories or be defined more narrowly. This lack of global harmonization is not inherently problematic, but the same inconsistency within a single program is: when teams shift definitions midstream or fail to align internally, the result is often a fragmented approach to risk assessment, testing, and documentation.
More importantly, classification is not a static labeling exercise; it forms the foundation for risk-based decision-making. Each category carries implicit expectations around qualification, testing rigor, and supplier oversight. A material classified as a starting material will typically require deeper characterization and tighter controls than one designated as ancillary. Without clear and consistent classification, those expectations become blurred, and the justification for control strategies weakens in parallel.
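The link between classification and control expectations can be made concrete in a team's own tooling. The sketch below is purely illustrative, assuming hypothetical class names and control lists rather than any regulatory standard; the point is that once a material is classified, its baseline expectations follow mechanically instead of being renegotiated case by case.

```python
# Illustrative sketch only: the classes and control expectations below are
# hypothetical examples for internal alignment, not a regulatory standard.
from dataclasses import dataclass, field

@dataclass
class MaterialClass:
    name: str
    becomes_part_of_product: bool
    controls: list[str] = field(default_factory=list)

STARTING = MaterialClass(
    "starting material",
    becomes_part_of_product=True,
    controls=["full characterization", "identity testing per lot",
              "supplier audit", "change-notification agreement"],
)
ANCILLARY = MaterialClass(
    "ancillary material",
    becomes_part_of_product=False,
    controls=["certificate-of-analysis review", "residual clearance data",
              "risk-based supplier qualification"],
)

def expected_controls(material_class: MaterialClass) -> list[str]:
    """Return the baseline control expectations tied to a classification."""
    return material_class.controls
```

Encoding the mapping this way forces the cross-functional alignment the article describes: a team cannot reclassify a material midstream without the change in control expectations becoming visible.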
The challenge, then, is not simply to define materials, but to embed those definitions into a coherent operational framework. This requires cross-functional alignment within CMC teams and a deliberate effort to link classification decisions to control strategies. In the absence of this integration, even well-intentioned frameworks can fail to translate into meaningful oversight.
Complexity Multiplied: How CGT Materials Redefine Risk
The material landscape in CGT is fundamentally different from that of traditional biologics. In monoclonal antibody production, years of process refinement have created a high degree of predictability; inputs are well characterized, analytical methods are mature, and variability is tightly controlled. CGT operates in a markedly different environment, one defined by biological complexity, evolving modalities, and limited standardization.
This complexity is particularly evident when comparing ex vivo and in vivo therapies. In ex vivo systems, cells are manipulated outside the body, providing an opportunity to control conditions and remove impurities before administration. Ancillary materials, while critical to process performance, can often be cleared to acceptable levels prior to patient exposure. In contrast, in vivo therapies collapse this separation. Components such as vectors, plasmids, or delivery vehicles are administered directly, making them part of the patient-facing product or its impurity profile. The margin for error narrows significantly, and expectations for purity, consistency, and safety increase accordingly.
Compounding this challenge is the role of inherently variable inputs, such as human cells. Unlike synthetic or recombinant materials, patient-derived cells introduce variability that cannot be fully predicted or standardized. This variability propagates through the workflow, influencing yield, performance, and ultimately product quality. As a result, CGT manufacturing becomes not just a matter of controlling processes, but of managing biological uncertainty.
These dynamics elevate the importance of materials from passive inputs to active determinants of product performance. Their impact is not isolated to a single step, but cascades across the entire manufacturing process. A minor inconsistency in an upstream material, whether due to particulates, variability in composition, or supplier differences, can amplify downstream, affecting everything from cell growth to vector efficiency to final product attributes.
The Analytical Gap: When Measurement Lags Behind Innovation
If material complexity defines the challenge, analytical capability is the constraint. Across CGT, there is a growing recognition that the tools used to measure material quality have not kept pace with the sophistication of the materials themselves.
This gap is particularly evident in the characterization of plasmid DNA and viral vectors. These materials are central to many CGT platforms, yet their analytical assessment remains fragmented. Methods for quantifying viral particles often rely on indirect measurements or legacy assumptions that fail to capture functional state. Similarly, sequencing approaches may identify gross errors but lack the sensitivity or consistency needed to detect subtle variations that could impact performance.
The result is a disconnect between what developers believe they are measuring and what they can actually understand. Specifications may be set based on incomplete data, and comparability assessments may rely on methods that are not directly aligned. In some cases, two organizations measuring the same attribute may arrive at different conclusions simply due to methodological differences.
This lack of harmonization creates challenges not only for internal development but also for regulatory interactions. Without standardized approaches, demonstrating consistency across batches, sites, or suppliers becomes significantly more difficult. Moreover, as regulators deepen their scrutiny of CGT products, the limitations of current analytical frameworks are becoming more visible.
Addressing this gap will require both technological advancement and industry alignment. Efforts to develop standardized methods and reference materials are underway, but progress is uneven. In the meantime, developers must navigate a landscape where analytical uncertainty is an inherent part of the development process. This places greater emphasis on understanding the limitations of existing methods and incorporating that understanding into risk assessments and control strategies.
From Compliance to Control: The Evolution of Regulatory Expectations
Regulatory expectations for CGT materials have evolved rapidly, reflecting the complexity of the modalities and lessons learned from early development programs. Where traditional frameworks often emphasize compliance with predefined standards, the CGT paradigm is increasingly centered on science- and risk-based control.
This shift places a greater burden on developers to demonstrate not just that controls exist, but that they are appropriate. It requires a detailed understanding of how each material contributes to product quality and how risks are mitigated across the supply chain. Importantly, this expectation extends beyond the immediate manufacturing process to include upstream suppliers and even secondary sources of variability, such as animal- or human-derived components.
Regulators are also bringing a broader perspective to their evaluations. With visibility across multiple programs and modalities, they are often able to identify risks that may not be apparent within a single development effort. This dynamic has led to instances where regulatory feedback highlights gaps in material understanding or control strategies that developers had not fully considered.
At the same time, guidance documents and frameworks are evolving to support this shift. Chapters such as USP <1043> and <1040> provide valuable structure for thinking about material classification, risk tiering, and qualification. However, these frameworks are not prescriptive, and their implementation requires interpretation. As a result, there is an ongoing effort within the industry to translate high-level guidance into more actionable practices.
One emerging concept is the idea of material control as a continuum. Rather than viewing responsibility as confined to the developer, this perspective recognizes that effective control spans the entire lifecycle of a material, from its origin and manufacturing to its integration into the final product. Suppliers, CDMOs, and sponsors each play a role, but accountability ultimately rests with the sponsor. This reinforces the need for transparency, documentation, and active engagement across the supply chain.
A Strategic Imperative: Elevating Materials in Development Planning
Despite their importance, materials are often underprioritized in early-stage CGT development. Teams focus on process optimization, clinical timelines, and product characterization, while material strategy is addressed reactively. This imbalance is understandable given resource constraints, particularly in emerging companies, but it introduces significant risk.
The consequences of this underinvestment typically emerge later in development. Materials selected for convenience or cost may prove unsuitable for clinical or commercial use. Suppliers may lack the documentation or quality systems needed to support regulatory filings. Analytical gaps may complicate comparability assessments or delay validation efforts. In many cases, these issues require revisiting earlier decisions, leading to delays, increased costs, and, in some cases, compromised timelines.
A more effective model is to integrate material strategy into the earliest stages of development. This does not require exhaustive upfront analysis, but it does require a structured approach to identifying and prioritizing risk. High-level risk assessments can highlight critical materials, guide supplier selection, and inform early control strategies. As development progresses, these assessments can be refined and expanded to support more detailed characterization and validation.
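A high-level screen of this kind need not be elaborate to be useful. The sketch below is a hypothetical example under assumed 1-to-5 scoring scales and illustrative tier thresholds; real programs would calibrate the factors, weights, and cutoffs to their own modality and quality system.

```python
# Hypothetical early-stage material risk screen. The scoring scale, weighting,
# and tier thresholds are illustrative assumptions, not an industry standard.
def risk_tier(impact: int, detectability: int, supply_risk: int) -> str:
    """Combine 1-5 scores (higher = worse) into a coarse priority tier."""
    for score in (impact, detectability, supply_risk):
        if not 1 <= score <= 5:
            raise ValueError("scores must be between 1 and 5")
    # Weight product impact by detectability: a high-impact material whose
    # drift is hard to detect deserves the most early attention.
    total = impact * detectability + supply_risk
    if total >= 18:
        return "high"    # prioritize characterization and dual sourcing
    if total >= 9:
        return "medium"  # qualify supplier, track lot-to-lot variability
    return "low"         # standard incoming inspection

# Example scores (impact, detectability, supply_risk) -- illustrative only.
materials = {
    "plasmid DNA": (5, 4, 3),       # patient-facing input, drift hard to detect
    "basal medium": (3, 3, 2),
    "disposable tubing": (1, 1, 2),
}
tiers = {name: risk_tier(*scores) for name, scores in materials.items()}
```

As development progresses, the same structure supports the refinement the article describes: scores are revisited as characterization data accumulate, and materials migrate between tiers rather than being reassessed from scratch.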
Central to this approach is a shift in mindset, away from viewing materials as inputs and toward recognizing them as strategic assets. Supplier selection, for example, should not be treated as a procurement decision but as a long-term partnership. Factors such as technical capability, quality systems, and supply chain robustness are as important as cost, if not more so. The concept of “cost of quality” becomes particularly relevant, as early savings can be offset by downstream complications if materials fail to meet evolving requirements.
Ultimately, the success of CGT development depends on the ability to manage complexity without losing control. Materials sit at the intersection of this challenge, influencing process performance, product quality, and regulatory outcomes. As the field continues to evolve, organizations that proactively invest in material strategy will be best positioned to translate scientific innovation into scalable, approvable therapies.
From Supporting Role to Central Driver
Within the CGT landscape, materials are no longer a secondary consideration — they are a central driver of success. Developers are increasingly recognizing that early decisions around material classification, sourcing, and control have lasting implications, and regulators are demanding deeper understanding and more robust justification in turn. Industry groups are working to close gaps in guidance and standardization. Yet significant challenges remain, particularly in aligning analytical capabilities with material complexity and translating high-level frameworks into actionable strategies.
What is clear is that the traditional model, in which materials are addressed as needed, often late in development, is no longer sufficient. In its place, a more integrated, proactive approach is required, one that treats materials not as constraints but as levers for optimization and differentiation. In this context, the message is both simple and urgent: materials matter in every decision that shapes the path from discovery to delivery.