Online Exclusives

Bio News & Views Interview

By Gil Roth | November 3, 2006

An online exclusive interview with Dr. James Jersey

In September 2006, Tandem Labs acquired PharmaKD, a Boston-based, mass spectrometry-oriented provider of drug metabolite and biomarker identification services. With the acquisition, Tandem obtained proprietary informatics that can "reduce the difficulty, complexity, and time-consuming nature of developing and launching a drug to market," according to the company. This included MarkerScan, a biomarker discovery and screening process to "efficiently identify potential biomarkers that can be used to increase the understanding of disease, diagnosis, and predictive models as well as drug compound safety, efficacy, and side-effect profiles."

Dr. James Jersey was named vice president and general manager of the new Tandem Labs - New England. From 1996 to 2004, Dr. Jersey founded and managed the rapid growth of a contract, fee-for-service bioanalytical services division within Primedica Corp., which was acquired by Charles River Laboratories in March 2001. While at Charles River Labs, he established and served as president of a new fee-for-service contract proteomic research services company in 2003. He also served as founder, chief technical officer, and senior scientist of the Bioanalytical Division within Triangle Laboratories.

We spoke a week after his new appointment to discuss the acquisition and the current state of the biomarker field.

Contract Pharma: Where was PharmaKD prior to this acquisition by Tandem?

Dr. Jersey: PharmaKD was an existing business, approximately two years old at the time of the acquisition. In that span, we were very focused on biomarker discovery, or identification -- fee-for-service discovery work on potential biomarkers.

PharmaKD worked very closely with sponsors to understand what the questions were, and to develop an experimental approach to get the best chance of success. We were using mass spectrometry and liquid chromatography together with some proprietary data management tools to make the process as efficient as possible.

So the value-add comes in with the use of proprietary systems?

Yes. These experiments yield tremendous volumes of data, so the ability to manage that dataset effectively is a critical measure of success.

How did this service attract Tandem's attention?

I think it was very complementary with what they do. Tandem worked extensively -- but not exclusively -- in the regulated bioanalysis market. They did some discovery bioanalysis, but what we were doing at PharmaKD was more qualitative in nature, identifying potential biomarkers rather than quantifying them. Our technology was complementary.

PharmaKD also performed drug metabolism identification. That is, knowing what the metabolites are, what can we do to improve the pharmacokinetics for a compound? The same informatic tools that we use in biomarkers are also applicable in drug metabolism identification. That complements well what Tandem was doing.

The third reason for their interest was that PharmaKD is based in the Massachusetts area, which is one of the pharma R&D hubs.

What's your definition of a biomarker?

Now, "biomarker" is a widely used term, interpreted in many different ways...

I would go back to the FDA's Critical Path and use their definition. It's a molecule -- large or small -- that in some way is indicative or predictive of potential efficacy or adverse events, safety issues, and the like. The FDA has a wealth of information on its site.

I think it's still very much a maturing science. It can be a discrete molecule. Some biomarkers have been around for a very long time. A biomarker can consist of panels of compounds that collectively have predictive power. Conversely, it can be patterns where you don't necessarily know the identity of each constituent. So it's a pretty inclusive definition we're looking at, and it's my belief that we're still very early on.

I think if you look at the pharma R&D productivity, efficiency, return on dollar, there's clearly a role that biomarkers are going to play in improving the process. Ultimately, they're going to be tools for making better decisions.

In terms of making a fail-fast decision?

I think that's correct. That's the most commercially developed direction. But there will also be markers of efficacy. I think there'll be markers used in diagnostics, personalized medicine. "Who's going to be a responder, and who's not going to be a responder?" So they're not always going to be adverse, fail-fast markers; there'll be positive benefits as well.

Still, I think fail-fast will be the first significant application. There are still too many compounds that are failing late, at costly stages in development. That's the most fertile ground to plow at this point.

What advances have occurred in biomarkers in recent years?

The biggest advancement is the understanding within the marketplace of the importance of biomarkers and how difficult it's going to be to have validated ones. It's very easy to have potential biomarkers, but to actually validate that biologically is another exercise.

In the beginning of the genomics and proteomics wave, everyone was under the belief that there was going to be an instantaneous revolution. Expectations exceeded the technology's ability to deliver. It's not going to happen overnight. So I think the biggest advancement is understanding what the rate of development is going to be for these contributions.

Technologically, I don't know if there have been any huge advances or breakthroughs; it's more about the community getting a conceptual grasp of what's going on.

Are we looking at a period of incremental improvements, or is there room for a major leap?

I think we're poised for incremental improvement. I don't see a differentiating technology yet.

So there's no J. Craig Venter out there to come up with a shotgun method?

You know, since the mid-1990s, there's been a quantum increase in information management, computational power, biology, ADME, combi-chem, high throughput screening. I think they've actually done a good job in reducing drug compound failures due to pharmacokinetic issues. That's actually gotten better. But look at the investment and time-span that's gone by.

These tools make it easy to generate information, but not necessarily knowledge. That's key right now. We can generate plenty of information, but what does it mean? Was the experiment designed properly? I think we've entered the golden age of biology. That's where the winners and losers are going to be determined. It's all about target assessment, target validation.

Who's out in front?

I think Pfizer's doing a good job. There's a lot of emphasis toward finding markers of toxicology, to avoid adverse events and safety issues. Lots of emphasis on the preclinical side. It's along the lines of killing the bad compound quickly, that fail-fast concept.

I don't think it's reasonable to expect rapid change. I just don't think it happens that way. There are regulatory issues. I mean, who wants to be first? There's career-risk in that.

There's no lottery mentality when it comes to biomarkers. It's going to take a lot of people rolling up their sleeves and doing hard work for a number of years for this to bear fruit. No instant gratification here.
