I wish that I had a dollar — heck, I wish that I had a dime — for every time someone talks about a discovery that’s going to revolutionize the drug discovery business. I wouldn’t be sitting here, I can tell you that much. Probably be logging into the satellite internet feed from my own private tropical island, yelling at the guys in the matching colored jumpsuits to get the receiver turned around faster, and looking for the white Persian cat’s food dish.
No, we get to hear about revolutionary discoveries all the time, but honestly, how many of them ever are? Let’s ditch all the stuff from the press release pile just to make it easier, and I think everyone will agree that it’s not going to be much of a loss. Concentrating on the things that have a real case, what are we left with? Molecular biology: now that was a revolution. The switch to cloned proteins for assays during the 1980s is a good example. When I started working on muscarinic receptors back around then, the literature was still full of possum trachea and rat vas deferens tissue assays, but that era was already disappearing.
The advent of high-throughput screening was a revolution, too, although (like protein molecular biology), it came on in stages. People had always wanted to screen a lot of compounds, and had generally worked through as many as they were able, but the robotics and high-capacity plates gradually made those dreams come true. If you’d told someone back in 1950 about screening millions of compounds in an assay, they’d probably have been in awe, but they’d also be pretty sure that you must have been able to find a lot of good leads, and that’s where the dreams ran up against reality.
So we can argue about how successful those particular hand-in-hand revolutions have been, because there’s room to think that they’ve both been taken too far over the years. But their impact on the business can’t be overstated. What do we have to point at since then, though? I ask because I get asked this question myself by people who don’t know the drug industry, and who assume that things are always in a tumult of discovery over here. It’s hard to come up with a good post-1990 answer.
RNA interference, for example, has certainly been a big deal in biology, but it hasn’t exactly changed the way that we do drug discovery, or at least not yet. Biomarkers are potentially a big deal, but not enough of them have come along to make a revolution. Genetic sequencing was supposed to change our world back about 10 or 12 years ago, as everyone knows, but that’s another one that’s coming on slowly. In fact, all the “-omics” fields could probably be put in the same category: very interesting, often useful, but certainly not completely transforming everything we do. We’re still picking drug targets and doing clinical trials pretty much the way we always have, albeit with better data collection.
One theory I have is that potentially revolutionary technologies have to overcome a lot of counteracting forces in order to reach their potential. We learned how to screen millions of compounds, but we ran up against targets that still won’t give useful hits. We learned how to read the entire human genome, which told us that we didn’t understand how complex it was.
We learned how to pay attention to molecular properties and formulations, which raised our success rates in Phase I trials, only to run into our lack of understanding of human biology in Phases II and III. There always seem to be compensating factors, some of which compensate so much that there’s not much of a revolution left after they’ve finished. Combichem, anyone?
But people assume that any highly technological field must have gone through all kinds of upheavals over the last decade or two. That’s probably a side effect of Moore’s Law and the way that computational power has transformed everything that it’s capable of transforming.
Computers and consumer electronics are unrecognizable compared to their 1980s counterparts, so anything that involves lab coats must be, too, right? What the state of the drug industry tells you, though, is that we’re not one of those transformed things. That’s because we’re not computationally limited; faster computers are not going to get us out of this, not without having a much better idea of what to compute.
How about something a bit less scientifically oriented, then? Outsourcing of research tasks would qualify as a revolution, for sure. Back in that same late-1980s era, the idea of hiring cheap overseas labor to make compounds for you would have seemed bizarre. (And we can argue about just how successful that movement has been, too, although it’s probably too early to say, as China’s Zhou Enlai is supposed to have said about the French Revolution when someone asked him in the early 1970s.) The “virtual company” model would have seemed not just strange, but possibly crazy. There weren’t as many service providers in the industry as there are now, ready to make your compounds, run your assays, set up your clinical trials, and so on.
That’s just a backdrop, though, to the real revolution: cost containment. I don’t want to propose that the older drug companies never thought about keeping costs down; I’m sure that they did. But I can’t imagine that it was quite as urgent as it is now, with the hugely increased bills for clinical trials and regulatory affairs, lower success rates for R&D, and pricing pressure coming from all sides. Keeping costs down has gone from something a good manager should do to something that every manager has to be doing all the time, if they’re going to survive.
I’m not saying that this is a good thing. In fact, I think that on balance it’s going to turn out to be a very bad thing indeed, because this environment isn’t a good fit for drug discovery timelines, or for most kinds of R&D at all. A lot of great ideas have looked like money sinks for quite a while before they ever paid off, but ideas like that are being born into a hard world these days. And another problem is that the constant pressure to cut has led (and will continue to lead) people to go too far. As they say, you don’t know if you’ve cut too deeply until you see blood — but in this business, with our complexities, you may not see the blood for a few years, by which time it’s rather late to do anything about it.
Well, that’s not a very cheerful note, when the biggest change I can think of is a negative one. Here’s hoping that we’re overdue for a few disruptive revolutions on the positive side, things that have the potential to get us out of holes rather than illuminate newer, ever more spacious ones.
Derek B. Lowe has been employed since 1989 in pharmaceutical drug discovery in several therapeutic areas. His blog, In the Pipeline, is located at www.corante.com/pipeline and is an awfully good read. He can be reached at email@example.com.