
Improving Your Development Program

Seven lab science trends driving the future of drug development.

By: Jenifer Vija, Liam Moran, and Chris Loosbroock

Charles River

The quality of laboratory science services can make or break drug development, keeping you on track or derailing your program altogether. A well-planned approach may even accelerate your timelines and make the difference in being first to market. Yet, many drug developers are at a loss when it comes to this critical aspect of their program, unsure how best to proceed. Many factors fuel this uncertainty, as we witness the arrival of new technologies, the rise of new types of therapies and modalities, and the downsizing of biotech and pharma’s internal drug development teams. To be certain, there are challenges to overcome. Will the latest trends in laboratory support services offer us solutions?

Cell, gene, and other biologic therapies
Therapies based on biological matter represent a whole new wave of drug development. Though filled with promise, these types of therapies also introduce a unique set of analytical challenges. By contrast, analytical and regulatory expectations for traditional small molecule bioanalysis are clear. As we move into therapies that are mixed, cell-based, or even oncolytic viruses, for example, their vanguard nature makes it difficult to “get it right.” Reliable data is needed to make better decisions, but we are often left wondering, how do we analyze this? Do we have the right equipment to obtain results? How do we get the information we need?

These therapies often require complex method development and validation, as well as new technologies and methods for data analysis. In addition, dose formulation for biologics, nanoparticles, and mixed-construct molecules requires a larger suite of technical capabilities, with different types of detectors and even different types of staff from those found in laboratories that support small molecule analysis. These substantial requirements need not deter you from pursuing the development of a cell-based therapy, but those venturing into this space should anticipate the differences from small molecule development and create a thoughtful plan to address them.

Increasing collaboration and consultation
More and more, today’s drug developers are actively seeking guidance and advice to move their programs along. The combination of new therapy types (e.g., cell and gene) and constantly evolving regulatory guidelines has made it unclear what to analyze and how to proceed. The proliferation of new guidance documents from around the world (the ICH, Japan, Europe, and other authorities) creates rules that are more complicated, but not always prescriptive for today’s newer therapies, whose complexity defies our ability to articulate best practices. How do we assess a therapy’s safety? How do we determine relevant endpoints? When a regulatory path for a molecule is not defined, developers must rely on the intellectual capital and experience of those who have gone before and can offer guidance based on what they’ve seen.

Seasoned advisors can determine which guidances are most applicable to the goals of the program, evaluate what has been done in the past, and apply these methods to like scenarios. The resulting collaboration stands to benefit drug developers of all kinds as we share data and create precedents that help smooth the path to clinic.

Outsourcing
With the overall downsizing of biotech and pharma operations and the rise of virtual companies, many drug developers lack the regulatory experience, laboratory facilities (dedicated space and technology), or subject matter experts needed to properly design and run a program end-to-end. Especially with modalities like nanoparticles and cell and gene therapies, you need to develop a strategy that focuses on obtaining answers to key questions to optimize your time and efforts.

Moreover, there are numerous elements (regulatory specifications, materials, and methods) that need to be in place before you are ready to go, and many developers lack the outside expertise to accurately schedule the inputs required before a study begins, including reference standards, test article formulation, procurement of study animals, etc. Sometimes it is not clear which technology will best serve new and emerging therapeutics; organizations need a wide selection of technologies on hand to deal with the diversity of therapies in development.

From a logistics standpoint, a program can require many different assays conducted in many different locations, adding to both the scientific and managerial complexity of the project. Thus, more and more developers are choosing to outsource their analytical work to entities like contract research organizations, which not only have the breadth of equipment and scientific skill to handle any type of drug in development, but also the expertise to advise on the best approach. Those who outsource at the earliest stages stand to benefit the most, as assistance with planning a program can yield real advantages in overall development timelines.

Biomarkers
Whether you are evaluating pharmacodynamic parameters or assessing safety, it is important to create studies that include relevant endpoints. Biomarkers illuminate how a drug will influence the target and adjacent systems. We need research-based answers, so we know what to expect later (e.g., when we get to first-in-human doses).

Biomarkers are a hot topic, with abundant sources listing available biomarkers, applicable species, and associated ranges. For the newest classes of therapies, however, there is little data to support their development. How will the drug function in naïve animals? What is an adequate response? What are the relevant ranges? Novel therapeutics often lack a readily available, relevant assay, and there is no one-size-fits-all approach to identifying the right endpoints. The process starts in method development, using incurred samples from another trial to figure out the necessary ranges. Method development, then, is an iterative process of getting to where you need to be. This is another scenario in which outsourcing to an experienced provider can help you narrow the number of endpoints you ultimately examine, saving you time and money.

Microsampling
How many samples need to be collected in a study to provide the most information for the program? Typical programs require sample collections to monitor exposure, clinical chemistry, pharmacodynamic markers, and biomarkers. Once biomarkers have been established, how will samples be collected? Providing many benefits to research models and developers alike, microsampling is a trend that won’t be going away any time soon. But those who wish to support the 3Rs by using this technique are often unaware of the analytical work required up front to ensure that the method is feasible for their therapy. Will microsampling sacrifice sensitivity for the sake of study design?

Over the last decade, microsampling and reduced-volume techniques have been shown to yield data equivalent to that from standard collection procedures. You first need to verify that the exposures you can measure with a limited sample volume will serve your program. Running studies that prove your data will be acceptable to regulatory authorities helps de-risk your program. These preliminary studies can be costly, but the investment can be offset by the ability to collect more data from one study (with more samples serving multiple endpoints) and by the reduction in the number of research models you ultimately require. If you decide to switch over mid-study, regulatory agencies may require bridging bioequivalence studies to verify that there is no disconnect between the two techniques. Choosing to microsample is a delicate balance between regulatory acceptance and doing the right thing from a 3Rs perspective, but it is still worthy of consideration.

Technology and automation
There is no shortage of new technology to support the business of drug development. New equipment arrives on the scene and more is in development every day. Some experts predict there will soon be instruments and techniques capable of multiplexing different types of analyses (e.g., PK, biomarkers, and immunogenicity). While this is a boon to the field of laboratory science, organizations must continuously evolve, making robust, regular investments in technology like LIMS, Provantis, ELN, and other systems that ease management and support faster turnaround. Of course, challenges exist with adopting new technology.

Often, the proliferation of new technologies results in an abundance of data, which must then be stored, managed, and analyzed. Every new system requires training at multiple levels of personnel, and the existence of multiple systems can make it difficult for data to be shared between them. The gains, however, far outweigh the difficulties, and the expectation is that laboratory science facilities will have access to the latest and greatest technology. Some labs prepare by getting in on the ground floor, engaging in working partnerships with the companies that design and manufacture these technologies in the interest of developing new features that meet their desired specifications. This results in overall wins for the industry.

As for data, cloud services would appear to be the optimal solution for the electronic lab notebook, allowing unlimited data to be securely stored and accessed from anywhere around the globe. Such a configuration fosters data sharing and collaboration between multiple sites, which can be key to reducing timelines. Smaller labs are blazing the path in this area, while drug developers and contract labs alike work to accept and trust the certification of a system they can’t see. In many organizations, the paper trail remains the only source of confirmation for QA/QC and regulatory auditors, who may not be trained on the different systems that produce data, so this aspect of the technology trend may take more time to be fully adopted.

The use of artificial intelligence (AI) in the laboratory science space is still limited, despite the need to analyze the tremendous amount of data that comes from even a single study. As with the training required for new systems, AI likewise needs a substantial investment of time from human counterparts to accurately “learn” and thus automate roles once performed by people. That is not to say it’s not on the horizon. Even now, there are tools that allow one system to maintain and analyze vast amounts of data (for example, flow cytometry results), effectively simplifying and streamlining the process while requiring less manpower.

Accelerating timelines
Perhaps the biggest trend of all is the desire to speed up the development process. The future is wide open for the arrival of unique therapies, but time is still of the essence. In today’s competitive market, if you’re not first, you’re last. Everyone wants to shave days, months, or even years off their development cycles. How do you do that? Isn’t that the million-dollar question?

It’s important to remember a trend is just a trend, and not everyone who jumps on the bandwagon will succeed. Drug development is filled with challenges, and your decisions around bioanalysis play a key role in your ability to meet milestones. Regardless of the tools and resources you choose to use, anticipation and careful planning of next steps is critical—from the outset of your program all the way through to the end.
