Analyze This

Everything Old is New Again

Back to the future in analytical methods.

By: Emil W. Ciurczak

Independent Pharmaceuticals Professional

The Pure Food and Drug Act of 1906, also known as Dr. Wiley’s Law, was the first of a series of significant consumer protection laws enacted by Congress in the 20th century, and it led to the creation of the Food and Drug Administration (FDA). Its main purpose was to ban foreign and interstate traffic in adulterated or mislabeled food and drug products, and it directed the U.S. Bureau of Chemistry to inspect products and refer offenders to prosecutors. It required that active ingredients be listed on the label of a drug’s packaging and that drugs not fall below the purity levels established by the United States Pharmacopeia (USP) or the National Formulary (NF).

Good Manufacturing Practices (GMP) were first introduced in the early 20th century to ensure that products were manufactured according to strict quality standards. The first World Health Organization (WHO) draft text on GMP was adopted in 1968, and the concept was formally codified by WHO in 1975. In 1987, the FDA published the “Guideline on General Principles of Process Validation,” which advised industry that manufacturers must validate their processes, where necessary, to ensure that those processes consistently produce acceptable results.

All these rules were needed and were good for both consumers and suppliers. However, keep in mind that the tests available to the QC lab were, by today’s standards, “quaint.” The API in the dosage form was checked by titration or ultraviolet/visible (UV/Vis) spectroscopy. The safety/purity of the drugs and excipients was assessed (most often) by the inferential tests of the USP/NF/EP/JP and so forth. What do I mean by inferential? Mostly, that there were no straightforward tests in the USP monographs.

This was largely because the USP was written for small formulating pharmacies or small commercial pharmaceutical manufacturers. Not only did they lack huge budgets for analysts or equipment, but the tools themselves were limited. Until well into the 1970s, manufacturers were not allowed to use gas or liquid chromatography to release a product. One very good reason was that the packings used for the columns weren’t yet standardized. I remember coating silica with a mixture of silicone oil and solvent, rotating the flask under vacuum until the solvent was removed, and hoping my GC packing would resemble my last batch.

So, a monograph for, say, Lactose, USP might include a mid-infrared spectrum, but almost all sugars look similar and, without a valid computer algorithm, other tests had to be run. The material was boiled with a copper salt and, if it turned red, it was a “reducing sugar.” Dissolving the unknown in an ammonia solution and following its optical rotation until it shifted just far enough negative implied that it was lactose.

Heavy metals? Use a long tube and a hydrogen sulfide solution; look down the tube and see whether the opacity is better or worse than a standard “heavy metals solution.” Melting range? Why, we simply placed the sample in a capillary tube, attached it to a thermometer with a rubber band, immersed it in mineral oil, warmed it, and noted when the material melted. And so on and so forth. Modern, cutting-edge science in 1956, but not in 1996.

By the late 1980s, we were using thermal analysis for melting range. Particle size? Forget sieve analyses; we had laser light-scattering devices. And, praise the chromatography gods, we had STANDARDIZED gas and liquid chromatographic packings, so GC and LC became the “gold standard” for dosage form release. UV/Vis spectrometers got far better, and even multicomponent dosage forms could be quantified with simultaneous-equation software. FTIR and Raman spectra were spectacularly sharp, and the accompanying software made ID a piece of cake to perform. So, had we reached the pinnacle of lab analysis? Sort of.
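
For those who never ran it, the “simultaneous-equation” software was just Beer’s law written as a small linear system: measure the mixture’s absorbance at as many wavelengths as there are components (or more), then solve for the concentrations. Here is a minimal sketch in Python; the absorptivities and absorbances are invented for illustration, since the real values come from standards run on your own instrument:

    # Two-component UV/Vis assay by "simultaneous equations."
    # All numbers below are illustrative placeholders.
    import numpy as np

    # Rows = wavelengths, columns = components.
    # E[i, j] = absorptivity of component j at wavelength i
    # (path length folded in), determined from pure standards.
    E = np.array([
        [0.85, 0.12],   # e.g., 245 nm
        [0.30, 0.95],   # e.g., 275 nm
        [0.10, 0.40],   # a third wavelength over-determines the system
    ])

    # Measured absorbances of the mixture at the same wavelengths.
    A = np.array([0.52, 0.71, 0.25])

    # Beer's law in matrix form, A = E @ c; solve for the concentrations.
    c, residual, rank, _ = np.linalg.lstsq(E, A, rcond=None)
    print(f"Component 1: {c[0]:.3f} mg/mL, component 2: {c[1]:.3f} mg/mL")

Using more wavelengths than components costs nothing and leaves a least-squares residual, which serves as a built-in sanity check on the fit.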

As John Lennon sang, “Life is what happens to you while you’re busy making other plans.” While we were busy bringing our labs up to the Jetsons’ level, PAT/QbD (Process Analytical Technology/Quality by Design) happened. Suddenly, these ($100,000) works of art weren’t “good enough” for the production floor. Why? Well, several reasons:

1.  They ran on house current (110 or 220 V), which does not play well with the powders or organic fumes found in a production facility.
2.  They couldn’t withstand being moved to, or sitting at, the process line. They were designed for accuracy and sensitivity, not ruggedness.
3.  They weren’t fast enough. Yes, one minute for an outstanding IR spectrum is great, but the sample (in most cases) had to be prepared before it could be measured. Meanwhile, 100,000 tablets were pressed, untested and uncorrected.

Fortunately, in 1983, European regulators (forerunners of today’s EMA) thought it would be a good idea to qualify 100% of all containers of every material entering a production facility. This was fine for the Europeans, who used trains, but in the US of A we lean toward truck deliveries. So, 3,000 kg of lactose, for example, could be four or five bags on a train, but up to several hundred on a truck. Using EP or USP tests to qualify them could take weeks—and that’s for just one excipient. We received more than 200 bags of lactose each time!

Thus began the “new age” of pharma analyses. We used HPLC freely for assay and content uniformity, and NIR helped by qualifying all incoming raw materials. Raman was added, and even far-infrared (renamed terahertz) spectroscopy became a tool in the toolbox. But wait: the US FDA then encouraged the industry to adopt PAT to monitor each step of the process.

That was fine since, for example, a “modern” near-infrared spectrometer could measure a sample of powder or a tablet in a few seconds. That meant in-line units could be designed and built for blend uniformity, allowing operators to move from mixing for a fixed time (under GMP) to “blend until endpoint” under PAT; a sketch of how an endpoint is called appears below. The technology was also applied to drying (pan and fluidized bed) and to coating pans. These processes run and change at speeds easily monitored by the available instruments.
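
How is “endpoint” actually decided? A common approach, sketched below, is a moving-block standard deviation: collect an NIR spectrum every few blender revolutions and track the spectrum-to-spectrum variation; when it flattens out and stays flat, the blend has stopped changing. This is a minimal sketch, and the block size, threshold, and hold count are placeholders, not validated values:

    # Blend-endpoint detection by moving-block standard deviation (MBSD).
    # Parameters here are illustrative, not validated.
    import numpy as np

    def mbsd(spectra: np.ndarray, block: int = 5) -> np.ndarray:
        """Mean standard deviation over a sliding block of consecutive
        NIR spectra (rows = time points, columns = wavelengths)."""
        n = spectra.shape[0] - block + 1
        return np.array([spectra[i:i + block].std(axis=0).mean()
                         for i in range(n)])

    def blend_endpoint(spectra: np.ndarray, block: int = 5,
                       threshold: float = 0.002, hold: int = 3):
        """Return the index at which the MBSD has stayed below
        `threshold` for `hold` consecutive blocks (i.e., the blend
        has stopped changing), or None if it never settles."""
        below = 0
        for i, value in enumerate(mbsd(spectra, block)):
            below = below + 1 if value < threshold else 0
            if below >= hold:
                return i
        return None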

Even more tablets could be measured, in a timelier fashion, with NIRS. The first application was for Viagra. A quality control room was built between tableting machines in the production area. Ten to twenty tablets were taken from each machine, every hour, and analyzed. Of course, to facilitate the analysis and smooth out process variations, there was some sample preparation: the tablets were pressed into thin disks with a Carver press and run on a lab-type NIR unit, equipped with a carousel capable of running 20 tablets in transmission mode.

This greatly increased the number of samples analyzed from every batch: from twenty at the end of the run to ten or twenty every hour. So, for a one-million-tablet lot running for, say, twelve hours, up to 240 assays were performed, not just ten or twenty. It also gave the operators an indication that a process was either running as planned or drifting out of specification. While any modification could only save a portion of a batch, that was still an improvement over “sell or toss,” as was the custom.

Over the last few years, a technique called “spatially resolved spectroscopy” has been adopted by several companies for Raman and near-infrared work. For Raman, it was employed to minimize the effects of fluorescence. In NIR, it is combined with “push-broom” imaging to rapidly measure tablets—as they are pressed—in real time! At normal production speeds, the millisecond acquisition times of the device allow multiple measurements per tablet, as the rough arithmetic below shows.
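
A back-of-the-envelope check makes the point; the press output and acquisition time below are assumptions for illustration, not specifications of any particular device:

    # Why millisecond acquisition permits several spectra per tablet.
    # Both inputs are assumed, round numbers.
    tablets_per_hour = 100_000                  # assumed single-press output
    acquisition_ms = 2                          # assumed time per spectrum
    dwell_ms = 3_600_000 / tablets_per_hour     # time window per tablet: 36 ms
    spectra_per_tablet = int(dwell_ms // acquisition_ms)
    print(f"{dwell_ms:.0f} ms per tablet -> up to {spectra_per_tablet} spectra each")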

What this means is that every tablet or capsule can be analyzed, not merely a small percentage, affording immediate feedback for true PAT or continuous manufacturing (CM). The downside is that NIR is not a “primary” method of analysis.

That is, it must be calibrated against a validated laboratory technique, e.g., HPLC. One upside is the information carried by the distribution of the light and the shape of the resulting dispersion pattern: since several pictures are taken of each tablet, the method allows not just assay and content uniformity; the pattern also discloses the density/hardness, which, in turn, can be correlated with dissolution profiles. A sketch of what such a secondary calibration looks like follows.
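
In practice, the secondary calibration is usually a chemometric model, most often a partial least squares (PLS) regression of the spectra onto the reference HPLC assay values. Here is a minimal sketch with scikit-learn; the spectra and assay values are simulated, and the number of PLS factors is a placeholder that would be chosen by cross-validation on real tablets:

    # Calibrating NIR predictions against a primary (HPLC) method via PLS.
    # Data are simulated stand-ins; nothing here is a validated model.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(60, 200))   # 60 tablets x 200 wavelengths (simulated)
    y = 2.0 * X[:, 50] + rng.normal(scale=0.1, size=60)  # stand-in HPLC assays

    pls = PLSRegression(n_components=5)   # factor count: tune by cross-validation
    print(cross_val_score(pls, X, y, cv=5, scoring="r2"))

    pls.fit(X, y)                    # calibrated model, ready for in-line use
    predicted = pls.predict(X[:1])   # assay estimate for one tablet's spectrum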

These correlations, too, need to be calibrated against current lab-validated methods. But we now have the tools to measure the chemical and physical properties of 100% of a lot and physically remove outliers in real time. The irony isn’t lost on me: we started producing large lots of tablets and capsules, and their ingredients, using tests that were, in many cases, inferential. We progressed to faster analyzers but were limited by the speeds at which we could test, so, even with excellent monitors, only the slower processes—including, by the way, biochemical reactions—could be monitored.

With the newest instrumentation, we can now measure every unit of every batch, as long as the inferential tests are validated against the “old-fashioned” methods. What a brave new world in which we live.


Emil W. Ciurczak has worked in the pharmaceutical industry since 1970 for companies that include Ciba-Geigy, Sandoz, Berlex, Merck, and Purdue Pharma, where he specialized in performing method development on most types of analytical equipment.
