
Pharmaceuticals

Improving Efficiency

To eliminate R&D bottlenecks, drug companies are evaluating all phases of discovery and development and are using novel approaches to speed them up

by Stu Borman
June 19, 2006 | A version of this story appeared in Volume 84, Issue 25

Credit: Photodisc/Getty Images

Over the past few decades, there have been many breakthroughs in the discovery and development of new medications to treat diseases. Yet a number of devastating human ailments still cannot be treated effectively with drugs. Cancer, autoimmune diseases, circulatory conditions, neurodegeneration, and other ills continue to resist medicinal intervention to one degree or another.

Researchers in academia, government, and the pharmaceutical industry wage battle against these diseases and win many victories. But questions have been raised about whether their efforts are as effective as they could be. Has the efficiency of drug discovery and development slipped? If so, how can the process be improved? Many people are endeavoring to answer these questions.

Data indicate that the productivity of the pharmaceutical industry has slipped in recent years. A study last year (J. Am. Med. Assoc. 2005, 294, 1333) showed that U.S. biomedical research funding (adjusted for inflation) rose from $48 billion to $94 billion over a decade beginning in the early 1990s. "That's a major increase in investment," says Food & Drug Administration Deputy Commissioner for Operations Janet Woodcock, but "the curve of submissions of new drugs and biologics to FDA is in the opposite direction, almost a mirror image of the investment curve."

In addition, the Centre for Medicines Research, Epsom, England, recently found that 2004 "marked a worldwide nadir, a 20-year low, in the introduction of new molecular entities into worldwide markets," Woodcock says. "We've talked to the British, to the European Union, and to Australia, and everyone is seeing a decline in drug submission rates."

With such increases in investment and simultaneous decreases in drug submissions, "you've got a productivity problem that's very significant," she says. "I think everybody acknowledges that. Costs of the preclinical and clinical part of the process in particular, which we call the critical path, have escalated remarkably in the past three or four years. So we think there is a real problem out there with the drug pipeline and with productivity."

Consultant David Brown of Alchemy Biomedical Consulting, Cambridge, England, says it's "a stark fact that overall the pharmaceutical and biotechnology industry is now submitting almost 50% fewer New Drug Applications to FDA than it did a decade earlier, despite an increase in annual expenditure over that period." The number of new molecular entities (active compounds not previously approved as drugs) reaching the market annually has declined, Brown says.

Meanwhile, clinical failure rates have climbed well above historic averages, he says. "Critically, the failure rate in Phase III trials," the most expensive part of drug discovery and development, "has risen dramatically, such that only about half of drugs have succeeded in this phase in recent years, whereas historically up to 85% succeeded."

To some extent, Brown says, "these developments reflect the success of previous efforts in drug discovery. Effective medicines are now available, often off-patent as generics at low cost, to treat many diseases. Pharmaceutical scientists face increased barriers in that now the drugs required are often either second- or third-generation drugs for diseases now moderately well-treated or first-generation drugs for difficult diseases. Either way, scientists face higher hurdles" in trying to discover new drugs.

"Have the reasons for failure changed over the past decade with the introduction of new methods?" Brown asks rhetorically. Between 1991 and 2001, he says, failures of drug development projects for pharmacokinetic reasons-absorption, distribution, tissue localization, duration of action, and excretion problems-declined owing to improvements in industry practices for handling such issues, he says. "What's not improving?" he asks. In fact, the incidence of drug failures from most other causes, such as toxicology and clinical safety, increased over the past decade, he says. And the percentage of losses caused by drug efficacy problems declined only marginally after a decade of the widespread use of target-based drug discovery, which was particularly intended to reduce the incidence of this cause of loss.

Overall, the pharmaceutical industry is "facing a number of unprecedented challenges," said John J. Orloff, vice president of development for lead innovation projects at Novartis Pharma, East Hanover, N.J., at a February American Enterprise Institute (AEI) conference in Washington, D.C. "Productivity has stagnated, and development costs have increased. The number of new molecular entities that have been approved by FDA, EMEA [the European Agency for the Evaluation of Medicinal Products], and other authorities has stagnated over the same period of time."

Such trends "are simply unsustainable over the long term," Orloff said. "So while the drug development model used today has served us well in the past and indeed has delivered products that save and improve patients' lives, it's clear that we can't afford to be complacent, that we have to continuously innovate and evolve. ... We believe there's a growing imperative to modernize the drug development process and incorporate advances in science and technology into a new development model."

Researchers, drug and biotech industry executives, government officials, and consultants are all seeking ways to make drug discovery and development more productive. Perhaps the most ambitious attempt to find ways to modernize and improve drug R&D is the Critical Path Initiative (CPI), an FDA program that released its initial report and recommendations this March. According to the CPI report, the problem with drug R&D isn't that drug companies are failing to come up with good drug candidates. "Instead, we think there's a problem in getting them evaluated in the development stage," says Woodcock, who heads the initiative. "New science has not been applied to development, which is still very traditional," and that has to change, she says.

The CPI report includes an "Opportunities List" of 76 projects to speed drug development. A recently established nonprofit research and education consortium called the Critical Path Institute (C-Path), Tucson, Ariz., has already announced efforts to implement some of the CPI recommendations.

In addition, a number of novel and creative ideas for improving various stages of drug discovery and development—from target identification and validation to lead discovery and optimization, clinical testing, and drug production—are subjects of active discussion, study, and implementation in the drug industry.

Drug discovery today often begins not with a potential drug compound per se but instead with a possible biological target. A target is a protein in the body that is believed to play a key role in a disease or condition. After a target is identified, it is then "validated," that is, tested to determine its function and to ensure that it is "druggable," capable of being modulated by a drug to produce a therapeutic effect. Compounds likely to interact with the target are either designed from scratch or identified by screening compound collections (libraries) likely to include good candidates. Lead compounds (or leads) are identified and optimized by enhancing their activity, improving their pharmacokinetic properties, and evaluating their likelihood to cause side effects. Compounds that pass those hurdles may be moved into preclinical (animal) and clinical (human) testing. And if they pass muster in those trials, they are then submitted for approval, produced, and marketed.

Enumerating targets

In light of concerns that drug discovery may be slowing down because targets are running out, some researchers have been analyzing the number of targets that are actually ready and, hopefully, willing to be modulated by new drugs and using the information to guide R&D.

Associate research fellow Andrew L. Hopkins of Pfizer Global R&D, Sandwich, England, among others, has been trying to quantify the "druggable genome" (the total number of druggable targets in the body) as a means "to understand the potential and probabilities of projects we're working on and their chances of success," he says. "It's an attempt to identify the high-value real estate. We want to infer which target proteins in the genome are likely to deliver high-quality chemical matter so we can focus on developing leads for that subset of the genome."

For the number of druggable, disease-related targets, "we come up with a figure of 1,500, and it could be as small as 300 for higher levels of confidence," Hopkins says.

The number of primary targets of current small-molecule oral drugs is 180 to 190. These targets are mainly from a few big protein families, such as G-protein-coupled receptors, ion channels, kinases, proteases, and nuclear hormone receptors. "The worst-case scenario is that small-molecule oral drugs may only be developed for about another 100 targets or so," Hopkins says.
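
Conceptually, estimates like these amount to a set intersection: a target must belong to a protein family known to bind drug-like molecules and have credible disease linkage. Below is a minimal sketch of that logic in Python; the gene symbols and set sizes are tiny, invented stand-ins, not Hopkins's actual data, which involved thousands of genes.

```python
# Toy illustration of a "druggable genome" estimate as a set intersection.
# A real analysis would intersect thousands of genes drawn from
# sequence-homology searches (druggability) and genetic studies (disease).

# Genes whose protein products belong to families known to bind
# drug-like small molecules (GPCRs, ion channels, kinases, ...).
druggable = {"ADRB2", "EGFR", "SCN5A", "HDAC1", "PDE5A", "DRD2"}

# Genes with credible evidence of involvement in disease.
disease_linked = {"EGFR", "SCN5A", "DRD2", "HTT", "APP", "CFTR"}

# Hopkins-style estimate: only targets in BOTH sets are worth pursuing.
candidate_targets = druggable & disease_linked
print(f"{len(candidate_targets)} druggable, disease-linked targets:")
print(sorted(candidate_targets))   # ['DRD2', 'EGFR', 'SCN5A']
```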

Nevertheless, "there are some caveats to that worst-case Cassandra view," Hopkins adds. That worst-case estimate assumes that one drug hits one target to treat one disease, "but we're now starting to question that assumption," he says. "We're starting to see just how promiscuous many compounds are, that they hit multiple targets," a phenomenon called polypharmacology. "We're starting to realize that many single-targeted agents may fail to have the efficacy we expect because human biology is full of redundant systems, so that knocking out one specific protein by itself may have little effect."

Information about the druggable, disease-related genome that Hopkins and others have uncovered is already being used to make drug discovery more efficient. "The main way we use it at Pfizer is to prioritize our screening efforts to help decide which targets we work on," Hopkins says. "People have run drug discovery programs against a target only to find after several years that the target is inherently intractable and undruggable. Our analyses help by providing a map of the targets that are most likely to yield useful, high-quality leads."

Manuel A. Navia, a drug development strategic adviser at Oxford Bioscience Partners, in Boston, agrees that polypharmacology is an issue of growing significance. "It's particularly important in areas like metabolism, where things are under very tight control and there may be multiple correcting mechanisms," he says. "So if one mechanism is taken out, another one will pick up the slack. That kind of thing needs to be investigated a lot more than single targets. Obviously, it's much more difficult to do so, and the numbers of experiments needed to study multiple targets can very rapidly get beyond anything that one could do. You're then reduced to working on cell and animal models of a disease, which is the way people used to do things."

In a target- or mechanism-based (or so-called reductionist) approach, scientists first identify a protein or molecular mechanism with relevance to a disease and then use screening to find compounds that interact with or modulate it. Navia, among others, contends that the earlier approach of evaluating potential drugs by analyzing their effects on cells and animals rather than on targets is better in some respects. "The pendulum may have swung too far in the direction of the reductionism implicit in target- and mechanism-based drug discovery," he says.

"In the past, you needed more than some in vitro indication that an enzyme was a possible therapeutic target in order to get a drug discovery program off the ground," Navia says. Several type 2 diabetes drug classes, for example, emerged from keen observation of the behavior of compounds in living systems. Some of the drugs subsequently turned out to have complex, convoluted mechanisms, and they probably couldn't have been discovered through use of a reductionist approach, in which one would have had to first identify their targets or mechanisms, he says.

"You can work bottom-up to identify interesting targets and compounds that interact with them," Navia says, "but then, after you've put the compounds into animal systems or, horribly, into human beings, you may find that the underlying mechanism was more complicated than you thought. You've now spent a lot of money, you've committed resources, and these have come to naught. This is an argument for lesser reductionism."

Brown agrees that "until the 1970s, a lot of drug discovery was driven by phenotypic screening, in which compounds were tested for a desired readout in cell or animal systems" and their target and mechanism were typically unknown at first. "This approach was the only feasible means before the mid-1980s, because the then-current state of knowledge of human biochemistry and disease was inadequate in most cases to pinpoint a single biochemical mechanism that could be modulated to favorably impact a disease."


The phenotypic cell- and organism-based approach and the reductionist target- and mechanism-based strategy each has its advantages and disadvantages, but the shift to reductionism "may have reduced the thoughtfulness characteristic of earlier generations of drug discovery, when biologists and chemists worked together very closely at the target and lead stages and also were often very well-informed on clinical aspects of a disease," Brown says. Something important may have been lost in the change to modern ways of doing drug discovery, he says.

There have been some notable apparent successes with the reductionist approach, such as the mechanism-based discovery and development of imatinib (Gleevec). This Novartis drug was designed to prevent activation of Bcr-Abl, an abnormal tyrosine kinase that causes chronic myeloid leukemia. However, "there is growing evidence now that Gleevec and other new kinase inhibitors are far less selective than originally thought and that the promiscuity of these new drugs against a range of kinases is in fact key to their effects," Hopkins says.

The mechanism-based approach is unlikely to be abandoned anytime soon, but some people believe that the traditional cell- and organism-based strategy should not have been discarded completely and should now be given renewed emphasis. Indeed, a trend toward greater use of phenotypic compound screening, followed by determination of the mechanism of action of relevant hits, "is being led by the interdisciplinary chemical biology community in academia," Hopkins notes.

Among the scientific developments that have promised to improve target identification, and hence the efficiency of drug discovery, the 2001 sequencing of the human genome is considered one of the most important. The availability of the human genome sequence promised to make it easier to find drugs that interact with proteins encoded by human genes. For the most part, that promise has not yet been fulfilled, because the function of the vast majority of genomically encoded proteins remains unknown. It is therefore still extraordinarily difficult to predict the functional effects of directing a drug at a specific genomic target.

"The data that's emerged from the Human Genome Project is incredibly important," says Adriano M. Henney of the global discovery enabling capabilities and sciences department at AstraZeneca, Macclesfield, England. "But after the genome was mapped, expectations were raised that we'd know everything about disease and how to treat things. In hindsight, this was an unrealistic expectation."

Validating targets

Once a functionally interesting drug target has been identified, it's essential to validate it, that is, confirm that it plays a key role in disease and that a drug or drugs can influence its function to some extent.

When drugs fail to work in humans, it's often because the agents "don't engage the drug target, or the target in animals does not correspond to the pathophysiology of the disease in humans," says Malcolm MacCoss, vice president for basic chemistry and drug discovery sciences at Merck & Co., Rahway, N.J. "So looking at the horizon, the next round of things we have to be very aware of is validating the targets we work on and reevaluating the value of the animal models of some diseases. If you have to wait for a full-blown clinical trial before finding out that a compound is going to work on a disease state, you're going to waste an awful lot of time, money, and manpower." In the future, MacCoss says, "the folks who will likely be most successful at drug discovery will be those who are able to validate drug targets fastest."

"Once target validation has been completed, the full drug discovery capability of a pharmaceutical company can be focused with more confidence on identifying and developing drugs specific for the confirmed target," says Oliver C. Steinbach, head of functional genomics at Altana Pharma, Konstanz, Germany.

"A target is not truly validated until a drug is proven effective in human trials," Steinbach notes. Nevertheless, efforts are generally made to validate targets way before the clinical trial stage. A principal way this has been accomplished in recent years has been with gene knockdowns or knockouts, in which the expression of a gene for a target in a living animal is cut back drastically or eliminated entirely to see what effects that has.

Knockouts generally work well for target validation. "A retrospective study using mouse knockouts to assess the validity of targets of the 100 best-selling drugs demonstrated that in the vast majority of cases there is a direct correlation between the knockout phenotype [physical characteristics of the knockout mouse] and the proven clinical efficacy of drugs that modulate the specific target" knocked out in that mouse, Brown says.

In several respects, however, use of knockout animal models to validate drug targets is problematic. "Mouse gene-knockout technology provides a powerful means of elucidating gene function in vivo, but it is a tedious and time-consuming approach," Steinbach says. In addition, "published knockouts exist for only about 10% of mouse genes; many [knockouts] are limited in utility because they have not been made or phenotyped in standardized ways, and many are not freely available for use by researchers."

Efforts are being made to make knockout technology easier to carry out and more accessible. For example, an NIH-led initiative called KOMP, the Knockout Mouse Project, aims to produce a publicly available library of knockout mice. Each mouse will contain a knockout in one gene, and the library will range across the entire mouse genome.

Scientists have also been on the lookout for alternatives, or at least complements, to conventional knockouts. The most promising one so far has been RNA interference (RNAi), a technique in which a double-stranded RNA fragment called a short interfering RNA (siRNA) is used to degrade a messenger RNA, thereby silencing the gene from which the mRNA was transcribed. RNAi can generally silence a gene in less time and at lower cost than is possible with conventional knockout animal models.
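
As a rough illustration of how such reagents are chosen, the sketch below slides a 21-nucleotide window along a target mRNA and keeps windows whose GC content falls in a range often cited as favorable for silencing. The sequence and the thresholds are illustrative only, not a validated design algorithm.

```python
# Minimal sketch of siRNA candidate selection against a target mRNA.
# The 30-52% GC window is a common rule of thumb, nothing more.

COMPLEMENT = str.maketrans("AUGC", "UACG")

def gc_content(seq: str) -> float:
    return (seq.count("G") + seq.count("C")) / len(seq)

def sirna_candidates(mrna: str, length: int = 21):
    """Slide a window over the mRNA; keep windows with moderate GC content."""
    for i in range(len(mrna) - length + 1):
        sense = mrna[i:i + length]
        if 0.30 <= gc_content(sense) <= 0.52:
            # the guide (antisense) strand is the reverse complement
            antisense = sense.translate(COMPLEMENT)[::-1]
            yield i, sense, antisense

mrna = "AUGGCUAGCUUAGGAUCCAAGGCUUACGGAUUCCAAGGUUACG"  # made-up sequence
for pos, sense, guide in sirna_candidates(mrna):
    print(f"pos {pos:2d}  target {sense}  guide {guide}")
```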

RNAi is thus of growing importance for target validation, and research on the technique is very active. For example, professor of biochemistry and molecular biology Miles Wilkinson of the University of Texas M. D. Anderson Cancer Center, Houston, and coworkers recently demonstrated a way to use RNAi to silence selected genes in specific cell types or tissues in a mouse to determine gene function or to explore therapeutic applications.

RNAi is "a valid approach for sequence-specific suppression of gene expression and hence inhibition of the corresponding gene function in a cellular context," Steinbach says, but it's not free from disadvantages. As with other technologies that interfere with biological systems, "there is the risk of artifacts, false positives, and false negatives," he says, but "with reasonable efforts, these can be managed and mitigated to a certain extent."

Moitreyee Chatterjee-Kishore, principal research scientist at Wyeth Research, Cambridge, Mass., agrees that RNAi "is a cost-effective addition to the target validation toolbox. These reagents can be easily and effectively used in a large number of cell types. The list of cells that are transfectable [capable of being treated] with chemically synthesized RNAi reagents grows every day with the availability of newer, less toxic, and more efficacious delivery options."

Target validation with RNAi-based knockdowns takes much less time, and eventually will be less costly, than with traditional knockouts, Chatterjee-Kishore says. However, RNAi-induced downregulation of gene and protein expression may have significantly different effects from those induced by conventional knockout technology. Hence, "RNAi is not the be-all and end-all of target validation technologies, and for optimal efficacy it must be used in conjunction with other platforms," she says.

Another up-and-coming approach for analyzing the way drugs interact with biological targets is systems biology, the study of relationships and interactions among the various parts of biological systems or pathways. It can thus be used to better understand how to intervene medicinally in biological pathways.

"At the moment we have a huge amount of very detailed and interesting information, but what we don't know is how it all knits together in dictating how a cell or organ or organism responds to a particular stimulus, challenge, or drug therapy," Henney explains. "That's the aspiration and goal of systems biology. It's being able to use computational methods to model, predict, and simulate complex interactions" in a way that would be impractical with a single-target approach.

Just understanding how a single molecule operates "is not sufficient to be able to understand how a particular medicine is going to respond in an intact organism," Henney says. Given "the way we've taken a reductionist approach to investigating targets in isolation, it's not surprising that when you put them back into the intact organism you get some surprises. That's what systems biology is trying to address. It's trying to understand the complexity of interactions underpinning a particular biological effect." It could also be useful for assessing drug toxicology, he predicts.
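
A toy calculation shows why the redundancy Henney and Hopkins describe frustrates single-target intervention. In the sketch below, a metabolite is produced by two parallel pathways and cleared by a first-order process; completely inhibiting one pathway barely moves the steady state once the other compensates. All rate constants are invented for illustration.

```python
# Toy "systems" view of redundancy: metabolite X is produced by two
# parallel pathways (rates v1, v2) and cleared at rate k*X.

def steady_state(v1: float, v2: float, k: float) -> float:
    """Steady state of dX/dt = v1 + v2 - k*X."""
    return (v1 + v2) / k

k = 0.5                                          # clearance rate constant
baseline    = steady_state(v1=1.0, v2=1.0, k=k)  # untreated level
inhibited   = steady_state(v1=0.0, v2=1.0, k=k)  # perfect single-target drug
compensated = steady_state(v1=0.0, v2=1.8, k=k)  # redundant pathway upregulates

print(f"baseline X*:         {baseline:.2f}")     # 4.00
print(f"drug, no compensation: {inhibited:.2f}")  # 2.00
print(f"drug + compensation:   {compensated:.2f}")# 3.60, effect mostly erased
```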

Systems biology "is complex, demanding, and by no means a cakewalk," Henney says. "But the potential there is for it to be applied to answer some critical questions with some measure of success. It's a logical next step."


Designing compounds

Ultraspeedy
Roche scientist Larnie Myer uses ultra-high-throughput screening to accelerate the discovery of new active ingredients.
Credit: Roche Photo

If individual protein targets have nevertheless been identified, a key way to find drugs for them is by structure-based drug design. Structure-based design is a computational approach in which the three-dimensional structure of a protein target is analyzed and small molecules capable of interacting with its binding pockets are identified. Dozens of commercial drugs have been discovered at least in part by structure-based design, including Gleevec and drugs for AIDS and cancer, and the approach continues to grow in sophistication.

One structure-based technique that has been on the upswing is fragment-based drug discovery, or "discovering drugs in pieces," says Daniel A. Erlanson, senior scientist at Sunesis Pharmaceuticals, South San Francisco. In this approach, scientists identify small molecules (fragments) that can interact with part of a protein's binding site and then link them together or expand them so the interactions become additive. Alternatively, researchers can dissect a large molecule into components and then replace some of them with fragments to modulate its activity.

Compared with conventional whole-compound discovery techniques, fragment-based approaches have the advantage of being able to "probe more diversity more efficiently," Erlanson says. Because fragments are small and therefore less variable and less numerous than whole molecules, "by screening fragments, you can cover greater diversity space with fewer molecules," he points out. A book coedited by Erlanson, "Fragment-Based Approaches in Drug Discovery" (Wiley-VCH), is being published this month.
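
The arithmetic behind that coverage claim is simple: linking pairs of fragments from a modest screen implicitly probes a far larger space of assembled molecules. The library size and linker count below are illustrative assumptions, not figures from Erlanson.

```python
# Back-of-envelope arithmetic behind "probing more diversity with fewer
# molecules." All numbers are illustrative.

fragment_library = 1_000   # small fragments actually screened
linkers = 10               # assumed ways to join a pair of fragment hits

# Screening 1,000 fragments implicitly probes every linked pair:
implicit_molecules = fragment_library * fragment_library * linkers
print(f"{fragment_library:,} fragments screened -> "
      f"{implicit_molecules:,} linked molecules reachable")
# 1,000 fragments -> 10,000,000 linked molecules reachable
```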

Fragment-based discovery is on the rise, Erlanson says. "In addition to small companies closely identified with various flavors of fragment-based drug discovery, such as Sunesis, Astex Therapeutics, Plexxikon, SGX Pharmaceuticals, Locus Pharmaceuticals, and De Novo Pharmaceuticals, many larger companies (Abbott, Vertex, Lilly, Novartis, and Roche, to name a few) are using the technique as well," he says.

For example, Abbott has been working with a library of 10,000 fragments in its drug discovery program, according to James B. Summers, vice president of advanced technology for pharmaceutical discovery at the company. The technique has been particularly helpful in developing Bcl inhibitors as cancer treatments. Bcl is a difficult protein to target by conventional high-throughput screening, Summers says, "but the fragment approach has been quite effective."

Also continuing to advance is computational lead discovery, the use of computer modeling to identify compounds with favorable drug properties. A greater emphasis on use of computer simulations could help improve the efficiency of all phases of drug discovery and development, says Charles H. Reynolds, a research fellow and leader of a computer-aided drug discovery group at Johnson & Johnson Pharmaceutical R&D, Spring House, Pa. "Computer simulation is one of the few areas of drug discovery where capabilities are increasing rapidly and costs are coming down," he says.

Computational tools are singled out by FDA's CPI report, which notes that bioinformatics "holds the promise of reducing the size and scope of human and animal trials while improving development efficiency and predictability of results."

About a decade ago, the drug industry embraced combinatorial chemistry, the synthesis and screening of thousands or even millions of compounds en masse, as a way to discover drugs. The idea was that the rapid evaluation of large numbers of drug candidates might be easier and more efficient than the often tedious and time-consuming process of designing drugs one by one or screening compounds in small numbers. However, combichem has turned out to be a disappointment. It showed few tangible results, and the approach has been discredited.

"The productivity enhancements promised by combichem have not materialized, despite an enormous commitment of manpower and capital made by the industry," Oxford's Navia says. "In retrospect, the numerology that made the concept so compelling may have been flawed."

For example, protein structure modifications of just fractions of an angstrom can cause thousandfold changes in a protein's resistance to a drug and corresponding changes in the drug's efficacy. "The numbers that we talk about in combinatorial chemistry—in the millions of compounds at best with current technology—are simply too small to ensure coverage of chemical space at sufficient density to sample a receptor binding site" at a sub-angstrom level of precision, Navia says. Moreover, combinatorial compounds synthesized in such numbers "are not necessarily compatible with life," he says, whereas natural products "can coexist with most biological chemistry, cross biological compartments, and so on."
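
Navia's "numerology" point can be made in one line of arithmetic. A commonly cited order-of-magnitude estimate puts drug-like chemical space at roughly 10^60 molecules; the figure is an estimate, not a measurement, but against it even the largest combichem libraries sample a vanishing fraction.

```python
# How much of chemical space does a million-compound library cover?
# The 1e60 figure is a widely quoted order-of-magnitude estimate of the
# number of possible drug-like molecules, not a measured quantity.

library_size = 1e6       # a large combinatorial library
chemical_space = 1e60    # estimated drug-like molecules

print(f"fraction of chemical space sampled: {library_size / chemical_space:.0e}")
# fraction of chemical space sampled: 1e-54
```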

A strategy called diversity-oriented synthesis (DOS) was developed to address this biocompatibility issue head-on. DOS is a technique for creating libraries of complex compounds that closely resemble natural products. Several Centers for Chemical Methodologies & Library Development (CMLDs), set up with support from the National Institutes of Health's National Institute of General Medical Sciences, are dedicated to the design, synthesis, analysis, and management of such chemical-diversity libraries. CMLDs are key components of the NIH Roadmap for Medical Research, an initiative to help maximize the agency's impact on medical research progress.

Once a drug lead has been identified, whether by structure-based design, screening of corporate compound libraries, or combinatorial chemistry, the lead must be further improved before it can enter clinical trials. This lead optimization process presents significant challenges to researchers. "I don't think people outside the industry appreciate how hard it is to make a drug," Merck's MacCoss says. After a compound with nanomolar activity against a receptor of interest has been identified, "90% of the time you're still only 10% of the way there," he says. "We often evaluate 5,000 to 10,000 molecules just to find a compound that will get forwarded" into safety assessment or clinical trials, he explains.

Lead optimization involves "fine-tuning orthogonal structure-activity relationships—not just making your compound more potent for your target, but also reducing side effects, addressing drug metabolism issues, fixing pharmacokinetic problems, and dealing with toxicology from off-target interactions," MacCoss says. "And all this has to be done in a way that keeps you in the patent space of your own intellectual property. Very creative people are needed to track these multidimensional problems."
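
One common way to keep such orthogonal properties in view simultaneously is a weighted multiparameter score that ranks analogs on potency, ADME behavior, and safety together rather than on potency alone. Everything in the sketch below, the property names, 0-to-1 scales, and weights, is invented for illustration; it is not Merck's method.

```python
# Minimal sketch of multiparameter lead optimization: rank analogs by a
# weighted score over several properties instead of potency alone.

analogs = {
    # name: (potency, metabolic stability, off-target safety), each 0-1
    "analog-A": (0.95, 0.30, 0.40),   # very potent, poor everything else
    "analog-B": (0.70, 0.80, 0.75),   # balanced profile
    "analog-C": (0.50, 0.90, 0.85),   # modest potency, clean otherwise
}
weights = (0.4, 0.3, 0.3)

def score(props):
    return sum(w * p for w, p in zip(weights, props))

for name, props in sorted(analogs.items(), key=lambda kv: -score(kv[1])):
    print(f"{name}: {score(props):.2f}")
# analog-B (0.74) edges out the most potent compound (analog-A, 0.59)
# once metabolism and safety are allowed to count.
```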

A crucial part of lead optimization is reducing the toxicity of drug leads or eliminating toxic compounds from further consideration if their toxicity can't be reduced. "Predicting things accurately and early in the discovery process is the real goal, and I would have to rank toxicity, especially human toxicity, as one of the most critical issues to predict," says F. Peter Guengerich, professor of biochemistry and director of the Center in Molecular Toxicology at Vanderbilt University School of Medicine, Nashville. When it comes to drug toxicity and side effects, one doesn't want any surprises, as the recent withdrawal of the arthritis drug Vioxx demonstrates. In 2004, Merck took the drug off the market after a postmarketing clinical trial indicated that it increased the risk of cardiovascular side effects, such as heart attack and stroke.

Various approaches are being tried to improve the efficiency of drug toxicity assessment. One is transcriptomics, the large-scale study of gene expression at the mRNA level, typically by using microarrays. The technique enables monitoring of gene transcription that is boosted or reduced in response to administration of a drug.

"People went into transcriptomics with the idea that 'This is great; we'll probably see a couple of changes that will reflect the drug's influence on specific targets, and life will be simple,'" Guengerich says. "The problem is that it's not unusual in this kind of assay for 10 to 20% of all genes to go up or down in animals. We're left with way too much data. Some of it is probably relevant, and some of it is a downstream effect and may not be as useful. So the problem is we get too much data out of those systems, and we're not quite at the point where we understand it well yet."

One set of toxicological techniques seeing increasing use is metabolic activation assessment, evaluating the metabolic system's tendency to convert drugs into products with potentially problematic types of activity. When this happens, the original drug essentially acts as a prodrug of the active, metabolic products. Metabolic activation is the source of the liver toxicity caused by some pain medications.

For this reason, "there are a bunch of structures—for example, furans and thiophenes—that most companies will avoid if they can," Guengerich says. "Also, acetylene groups can be tricky. You'll find examples of compounds that contain those groups and that haven't caused trouble. But in general, people do tend to look out for some of these troublemakers so they don't come back to sting you."

Active, rather than passive, methods are also being used to seek out problematic metabolic reactions, such as the use of chemical trapping agents to form conjugates with highly active drug metabolites. These conjugates can then be identified, characterized, and assessed for their potential to cause trouble.

On to the clinic

The preclinical and clinical stages of drug development are also being given a good going-over as the drug industry tries to eliminate bottlenecks. "Currently, approximately 30% of new molecular entities fail in Phase I clinical testing, and although a 20% failure rate in Phase III trials was common in the early 1990s, this figure is now closer to 50%," writes analyst Hermann A. M. Mucke of H.M. Pharma Consultancy, Vienna, Austria, in a recent report ("A New Paradigm for Clinical Development: The Clinical Trial in 2015," Cambridge Healthtech Associates, Waltham, Mass.). "Of all compounds that begin human clinical testing, more than 80% fail because of either efficacy or safety issues. For those candidates that made it to U.S. pharmacy shelves as prescription drugs, no fewer than 10% faced market withdrawal or severe use restrictions between 1975 and 2000."
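
Chaining Mucke's phase-by-phase figures together reproduces his overall attrition number. The Phase II success rate below is an assumption, chosen so that the cumulative result is consistent with his statement that more than 80% of clinical candidates fail.

```python
# Chain per-phase success rates into an overall clinical survival rate.
phase_success = {
    "Phase I":   0.70,  # "approximately 30% ... fail in Phase I"
    "Phase II":  0.55,  # assumed, to match the >80% overall failure figure
    "Phase III": 0.50,  # failure rate "now closer to 50%"
}

surviving = 1.0
for phase, p in phase_success.items():
    surviving *= p
    print(f"after {phase}: {surviving:.1%} of clinical entrants remain")
# After Phase III roughly 19% remain, i.e. more than 80% of compounds
# that enter the clinic fail, matching the report's overall figure.
```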


"Even when drugs are approved, some are retracted, get black-box warnings for safety reasons, or are ineffective in a large proportion of patients," Mucke tells C&EN. "It's terribly noneconomic."

One problem is that the current clinical trial process is inefficient, he says. "You have more than enough drugs that perform beautifully in Phase II, and then in Phase III they crash, mostly for safety reasons but in part also for efficacy. I've seen it happen many times. So obviously what we are doing up to and including Phase II is not predictive of Phase III. And if we pull those drugs through Phase III and they are approved, even then they sometimes need to be retracted after one or two years. In many cases, to the best knowledge of the sponsor and FDA, this is a very good drug, and then within two years you see strange side effects popping up, and an investigation shows that in some rare cases there is an interaction or incompatibility."

One approach being considered to improve trials is microdosing, that is, getting drugs into humans at extremely low doses prior to conventional Phase I trials. The idea is to obtain pharmacokinetic data from the microdosing trials, relate the results to existing animal data on the same drugs, and decide whether the compound should be discontinued without a standard Phase I trial or whether the data should be used to develop derivatives with improved properties. "It's a lead optimization process that is carried out at the clinical trial stage instead of the preclinical drug discovery phase," Mucke says.

"This is a new mode of doing clinical trials that has already been approved by FDA and EMEA," Mucke says. "The Europeans had a microdosing recommendation out in July 2003, FDA followed in January of this year, and both agencies have issued guidelines on how to prepare a submission for a microdosing study."

A microdosing trial is "what you might call 'phase zero,'" Mucke says. "Technically it's still Phase I, but because you are using compounds at one-hundredth the maximum concentration that would be deemed effective, and often much lower doses, there is little chance of any compound exerting any toxic actions." So microdosing trials are not safety trials, although safety would still be monitored in them.

Studies have shown that microdosing is a very good predictor of whether a compound is a suitable drug from a pharmacokinetic standpoint, Mucke notes. Microdosing makes it possible to collect such pharmacokinetic data many months earlier than with conventional clinical trials, he says.
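
The pharmacokinetic parameters a microdose study yields can be computed with standard noncompartmental formulas: clearance as dose divided by the area under the concentration-time curve, and half-life from the terminal slope. All numbers in the sketch below are invented, and a real analysis would also extrapolate the AUC to infinity.

```python
# What a "phase zero" microdose study actually yields: basic PK parameters
# from concentration-time data at a dose far below the pharmacologic range.
import math

dose_ug = 100.0                               # 100-microgram IV microdose
times_h = [0.5, 1, 2, 4, 8, 12]               # sampling times (h)
conc_ng_ml = [9.0, 8.1, 6.6, 4.4, 1.9, 0.85]  # plasma concentrations (ng/mL)

# AUC by the trapezoidal rule (ng*h/mL), over the sampled interval only
auc = sum((t2 - t1) * (c1 + c2) / 2
          for t1, t2, c1, c2 in zip(times_h, times_h[1:],
                                    conc_ng_ml, conc_ng_ml[1:]))

# Terminal slope from the last two points gives the elimination rate
k_el = math.log(conc_ng_ml[-2] / conc_ng_ml[-1]) / (times_h[-1] - times_h[-2])
half_life = math.log(2) / k_el

clearance_ml_h = (dose_ug * 1000) / auc       # dose in ng over AUC
print(f"AUC(0.5-12 h): {auc:.1f} ng*h/mL")
print(f"t1/2: {half_life:.1f} h, CL: {clearance_ml_h / 1000:.1f} L/h")
```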

Clinical trials could also be reorganized in other ways. For example, Mucke says, one could test a drug on a much smaller number of patients than in conventional clinical trials, approve it if the trial results are favorable, and then monitor the users. Currently, market monitoring is carried out primarily on an adverse-event basis when dramatic problems arise. Reorganizing trials in this way "might lead you to an approved and marketed drug much more quickly," and problems with drugs would still be caught, he says.

FDA's CPI report discusses other ideas for improving clinical trials. One is adaptive trial design, in which clinical trial protocols are modified in response to early or interim trial results. Another is enhancing the way patient responses to medicines are assessed and measured in clinical trials.
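
In its simplest form, an adaptive design adds an interim look at the data with a prespecified stopping rule. The toy simulation below, with invented response rates and an invented futility cutoff, stops a trial halfway through if too few patients have responded, sparing the second half of enrollment when a drug is clearly not working.

```python
# Toy version of one adaptive-design idea: an interim futility stop.
import random
random.seed(1)

def run_trial(true_response_rate: float, n: int = 200, interim_min: int = 25):
    """Enroll n patients, but stop after n//2 if responders < interim_min."""
    first_half = sum(random.random() < true_response_rate for _ in range(n // 2))
    if first_half < interim_min:
        return "stopped at interim for futility", n // 2
    second_half = sum(random.random() < true_response_rate
                      for _ in range(n - n // 2))
    return f"completed, {first_half + second_half} responders", n

print(run_trial(0.15))  # ineffective drug: usually stops, sparing 100 patients
print(run_trial(0.40))  # effective drug: usually runs to completion
```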

Despite the problems with clinical trials, Navia advises that trials shouldn't be unduly cautious. "There is a price to be paid for overcaution in trials and drug approvals," he says, "though that price is seldom adequately quantified or brought to bear in our societal discussions about risk." An example is the type 2 diabetes drug metformin. "Metformin was first launched in France in 1958, but it was not until December 1994 that the drug was finally fully approved in the U.S., due to concerns over lactic acidosis, a potentially fatal side effect," Navia says. "Ironically, there appears to be no increased risk of lactic acidosis due to metformin after all. How many diabetes patients might have avoided kidney transplants, retinopathy, and limb amputations in the intervening decades that the drug was not available in this country? If you raise societal expectations to the point where people believe that anything they pick up at a pharmacy has to be 100% safe and also 100% effective, this isn't going to work."

Researchers often use outdated techniques in clinical trials instead of modern approaches "that could be giving us important safety information much sooner with much more precision," said Scott Gottlieb, FDA deputy commissioner for medical and scientific affairs, at the February AEI conference in Washington, D.C. In some cases, they're using "tools that were literally developed decades ago, in some cases 50 years or more, to test and evaluate new medicines. There is so much risk in the drug development process, so much uncertainty," that it's extremely difficult for a company with an innovative idea about drug testing to bring it to FDA and get it approved and implemented. "And so invariably things get stuck in place, and science gets fixed in time," he said.

FDA is thus trying to modernize and improve the tools the agency and drugmakers use to evaluate drugs, Gottlieb said. And FDA is looking internally at how it addresses scientific problems and how it lets creativity and innovation into the drug development process.

Within the next decade or so "there will be a fabulous array of tools to figure out subsets of diseases, to predict who is going to respond to treatment, to screen out people at high risk for an adverse event, and to monitor people during therapy, so if they're not responding they can be taken off a drug quickly," Woodcock says. "Most of those tools are in the biomarker area. There has to be a big effort on biomarkers to qualify them and develop them for various uses." Biomarkers are measurable substances, such as proteins or metabolites, whose presence or concentration varies in response to a drug.

The CPI report recommends increased development and use of biomarkers in clinical trials and in earlier stages of drug R&D. "A new generation of predictive biomarkers would dramatically improve the efficiency of product development, help identify safety problems before a product is on the market and even before it is tested in humans, and facilitate the development of new types of clinical trials that will produce better data faster," Woodcock says.

One effort in this area is a recently announced collaboration between the Global Alliance for TB Drug Development (the TB Alliance) and BG Medicine to identify biomarkers to assess the efficacy of tuberculosis drugs. "The clock is ticking, we need to find a faster cure for TB, and new biomarkers could expedite our search," said Maria C. Freire, president and chief executive officer of the TB Alliance, when the collaboration was announced. "Clinical trials for TB drugs are especially time-consuming, so a new biomarker that helps us test novel medicines faster and more efficiently will be a tremendous asset in developing a better, affordable TB cure."

Another such effort is the Oncology Biomarker Qualification Initiative, a collaboration among FDA, the National Cancer Institute, and the Centers for Medicare & Medicaid Services to improve cancer therapies and outcomes for cancer patients through biomarker development and evaluation. The goals are to identify biomarkers that can shorten clinical trials, reduce time and resources spent in drug development, and increase the safety and appropriateness of drug choices for cancer patients.

Some of FDA's CPI recommendations relate to a very late stage of the drug R&D process: the way pharmaceutical manufacturing is currently organized and carried out. Drug manufacturing "needs major improvement," Woodcock says. "R&D has been the centerpiece of what the pharmaceutical industry does, and manufacturing has been kind of a poor stepchild. There's been a tremendous amount of waste and downtime. Some of the consultants we spoke to found that the utilization rate in some pharmaceutical factories was about 15% and that some companies actually spend more on manufacturing their pills than they do on R&D." The increased use of advanced process technologies currently used in the fine chemicals and food industries, for example, could provide part of the answer.

Organizational change

Anticancer Inhibitor
Infinity Pharmaceuticals researchers collaborated with professor of biological chemistry and molecular pharmacology Gerhard Wagner of Harvard Medical School and coworkers in a molecular modeling study in which inhibitors of Bcl-2 were developed. The inhibitors target a binding groove on the protein (blue, red, and white structure) used by its endogenous ligand, the peptide BH3 (gold spiral). Inhibitors have potential utility as anticancer agents.
Credit: Infinity Pharmaceuticals

Some observers note that changes in the overall strategy and management of drug discovery in pharma and biotech companies and even in academia are also required if the efficiency of the drug enterprise is to be improved. For example, managing the discovery process so drug candidates doomed to failure are quickly screened out of the R&D pipeline is a big priority.


"Costs have been going up at the same time as regulatory hurdles have been getting higher, particularly from the point of view of drug safety," says Julian Adams, president and chief scientific officer of Infinity Pharmaceuticals, Cambridge, Mass. "More and more, it has become imperative for the industry to try to figure out the liabilities of any given drug candidate as early as possible into the process and into the investment, prior to engaging in large Phase III trials that are extremely expensive."

In fact, the biggest challenges facing the drug industry today, Adams says, "are learning as early as possible in the process that a biological target is relevant to a disease process, that a drug is hitting its biological target, and that there is no major liability of the drug candidate in terms of a safety concern. No one has solved these issues. There are many approaches, and many of them have merit. But there is not a universal solution. How to deal with these kinds of bottlenecks is still very much an art form and a risk management issue for pharmaceutical businesses."

Unfortunately, according to Reynolds, a drug's absorption, metabolism, excretion, and toxicology (ADME/Tox) properties are "very difficult to assess before going to the clinic." The high attrition rate in development is likely a consequence of the temptation to first optimize potency, a property that can easily be measured, and wait until later to look at everything else, he notes. Efforts are thus being made to assess ADME/Tox properties at an earlier stage of drug R&D. "The trend is in the right direction," he says.

At Wyeth, a new initiative called 6×6 is being pursued to better manage drug discovery. The plan aims to reduce the time needed to identify drug candidates and the time needed to test them preclinically to six months each. That means the time from identifying initial hits to human clinical trials would be about one year, whereas currently these two stages typically require nearly twice as long. In addition, Wyeth plans to shepherd four or five candidates per treatment through the discovery process simultaneously, compared with just one or two currently, and more data than before will be obtained on them at an early point in the process.

The 6×6 drug discovery program will be more streamlined than what Wyeth has been using up until now, "and we'll have more quality added in because we'll be looking at multiple candidates," says Magid Abou-Gharbia, senior vice president and head of chemical and screening sciences at Wyeth Research, Princeton, N.J., and a member of C&EN's advisory board. "Then we'll make a decision to move with the top compound based on the data." He notes that Wyeth has also recently boosted the size of its corporate compound library and increased its high-throughput-screening capacity in an effort to improve the efficiency with which it identifies drug candidates.

Others say that the way academic drug research is organized and supported could also use some improvement. Christopher A. Lipinski, a former Pfizer researcher and developer of the Rule of Five, a widely used set of criteria for evaluating drug leads, notes that "many truly novel targets emerge from academic biology programs, not at pharmaceutical companies. Drug companies generally don't have the time and money to engage in basic research that is unlikely to lead directly to new drugs.

"The problem is that academic biologists know tons about biology but generally don't know much about medicinal chemistry," Lipinski continues. "They buy a compound library and think they can take their hits and start using them in a drug discovery program, and a lot of times you just can't do that" unless you also use medicinal chemistry to optimize the hits.

"An academic group that's involved in some aspect of drug discovery really needs to have chemistry advice," Lipinski says, but "most medicinal chemists are in industry. There are programs for executives to help business start-ups. Why aren't there programs like this on the chemistry side? There are lots of industrial medicinal chemists who retire. What happens to their expertise? Does it get tapped into organizations that could really use it?"

Lipinski notes that one example of this kind of medicinal chemistry assistance program is Wellcome Trust's Seeding Drug Discovery project. The Wellcome Trust "requests proposals and interesting ideas from academics," he says. "If it funds those proposals, grantees are provided resources for project management, for outsourcing studies of compound druggability and ADME/Tox properties, and for consulting with experts as needed. It's an effort to bridge what I would call the biology-chemistry gap."

Abou-Gharbia agrees that chemistry expertise is essential. "How are we going to overcome the bottlenecks in drug R&D?" he asks. "If you ask drug companies this question, you'll get one answer from all of them—chemists. If you don't have capable chemistry, you won't have successful drug discovery because the pipeline predominantly involves small molecules."

But more than just chemists are needed for successful drug R&D. "For all the efforts to industrialize and automate discovery, history suggests drug discovery is art as well as science and relies heavily on the skill of experienced drug hunters," Reynolds says. "It's the people who carry out drug R&D and their ability to exercise good judgment at many stages of the process that have perhaps always been and will continue to be the key to making drug discovery more efficient."
