In the old days of process analytical technology (PAT)—anything more than 10 years ago is old—the paradigm was to capture samples at the point of origin, measure them with the near-IR, Raman, UV/visible, or whichever instrument was being used to control the process, then send the samples for lab analysis. To set the record straight, this worked quite nicely for some time, since process control was much better after PAT controls were initiated and used routinely. But it was a tad time-consuming for any new assay or product line, and there was always the time lag from sampling to analysis to worry about, especially where volatiles or a continuing reaction was being measured. There could be variations in the amount of volatile(s) and temperature, or simply in capturing the same sample that is scanned by the instrument.
These potential variances have been somewhat obviated in the past by simply making many measurements on numerous lots of product. While statistically sound, large numbers of samples are time-consuming and tie up equipment and staff. These time and personnel constraints are a large part of the reluctance of both larger and smaller companies to adopt PAT/QbD (quality by design) programs. However, recent developments in instrumentation may have changed the outlook.
Smaller and, in some cases, less expensive hardware has come along. In addition to NIR and Raman, which were already being made smaller and less expensive, LIBS (LASER-induced breakdown spectroscopy), X-ray fluorescence, and 2-D NMR (nuclear magnetic resonance spectrometry) are now available for PAT and QbD applications. So, not only are on-line measurements becoming easier to calibrate, new parameters are being added to the mix.
I’ll start with the most obvious: near-infrared spectroscopy. NIR has been in use for raw materials qualification since the mid-1980s. In the beginning, samples were taken in the warehouse and physically brought to the NIR instrument in a lab. At first, the samples were poured into a cup, then inserted into an instrument to be measured. A lot of 225 bags of lactose would take roughly one-half day to sample and analyze, certainly an improvement over the two weeks needed using USP or EP wet methods. Very soon after, a fiber-optic probe allowed samples to be read through the wall of their plastic Whirl-Pak bags, speeding the process and lessening the amount of cleaning needed between readings.
The first foray into in situ measurements came c. 1990, when Bran+Luebbe (formerly Technicon) built an instrument capable of rolling into the warehouse for measurements. Some companies combined vendor validation with statistical sampling to lower the number of containers that needed to be opened and inspected. Nonetheless, the process of RM (raw material) qualification cannot be sped up much more, even with the introduction of hand-held NIR units (c. 2000). Several companies now sell such units and, depending on what the plant needs, materials are measured for anything from simple ID to parameters such as particle size, moisture, and crystallinity—both percent crystallinity and polymorphic forms.
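The column does not specify which statistical sampling plan those companies used; a common rule of thumb in pharma raw-material qualification (allowed, for example, under reduced-sampling provisions when a vendor is validated) is the √N + 1 rule. A minimal sketch, assuming that rule:

```python
import math

def containers_to_sample(n_containers: int) -> int:
    """Classic pharma rule of thumb for reduced raw-material sampling:
    open and test sqrt(N) + 1 containers (rounded up), never more than N."""
    if n_containers < 1:
        raise ValueError("a lot must contain at least one container")
    return min(n_containers, math.ceil(math.sqrt(n_containers)) + 1)

# For the 225-bag lactose lot mentioned earlier:
# sqrt(225) + 1 = 16 bags to open, instead of all 225.
print(containers_to_sample(225))
```

This illustrates why combining vendor validation with a reduced plan shrinks the workload so dramatically: the number of opened containers grows only with the square root of the lot size.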
It is not surprising that the first in-line technology was NIRS. Developed at Pfizer (by Zeiss) for following blend uniformity, it was the first of several, but still relatively large and definitely not easily portable. Portability was left to Brimrose: their AOTF (acousto-optic tunable filter) instrument was used in blending control, and the spectrometer itself could be removed from the mixer cover and used as a hand-held unit. Others have followed.
The next technique to go mobile was Raman. This technique has a number of advantages over NIR, one of which is that the resultant spectra are more classic in that they resemble mid-infrared (MIR) spectra. The plethora of distinct, reproducible peaks, less affected by physical parameters, gives easily identifiable spectra for rapid RMID (raw material ID). And, as with MIR, unknowns are more easily identified than from the less clean NIR spectra, which consist mostly of broad, overlapping peaks.
Somewhat less well known is LIBS (LASER-induced breakdown spectroscopy). It is simply emission spectroscopy (ES) of solid/semi-solid samples. In classic ES, the sample is first dissolved in a solution (usually acidic), then aspirated into a plume of (argon) plasma, vaporized, and the atoms reduced to a plasma (see Figure 1). As the nuclei and electrons re-unite, they release light, allowing the analyst to quickly determine which elements are present and in what concentrations. Early, and current, instruments were the size of armoires and needed extensive ventilation for the plumes of chemicals released by the LASER burns.
In fact, early editions of LIBS were exclusively used for metal atoms—iron, titanium, and magnesium. Iron and titanium are often found in coatings, and magnesium in the tablet core—magnesium stearate is one of the most popular lubricants. By pulsing the LASER, a small channel is formed, with each pulse vaporizing material deeper and deeper into the dosage form. The resultant spectra, gleaned from a pattern of spots, allow analysts/formulators/production managers to ascertain 1) the distribution of lubricant within a tablet, and 2) the distribution (uniformity) of Fe and Ti in the coating (see Figure 2).
Later, non-metals were added, allowing analysts to determine the distribution of the active pharmaceutical ingredient (API) and some excipients. The most common elements measured are Mg, Cl, Na, K, I, Br, and Ca. Since many APIs contain Cl or Br, this makes following their distribution simple. Lab-based units have also been used to determine carbon distribution within a dosage form.
I also mentioned 2-D NMR because of its importance in recent history. Several years ago, a number of people were killed by deliberately adulterated heparin. The immediate problem was a shortage of porcine intestines in China, so exporters bulked up their supply of heparin with over-sulfated chondroitin sulfate—as much as 50%—a substance taken orally for arthritis pain and joint health. Taken orally, the chemical is safe and benign, but taken intravenously, it is toxic. Sadly, wet compendial methods cannot distinguish between the two moieties, a fact the suppliers were aware of. The USP therefore held a conference to determine the best method of analysis for heparin.
After all the presentations, the consensus was that 2-D NMR was the way to go. At that time, the available instruments were liquid-helium cooled, with levels maintained by liquid nitrogen, and cost many thousands of dollars a day to maintain. Several years after that suggestion, I discovered an interesting unit built by Magritek named Spinsolve. It is a compact, bench-top, room-temperature unit capable of measuring 1H and 19F nuclei; it can perform 1-D and 2-D experiments, needs no sample spinning (thus no compressed gas), and, most importantly from a maintenance and cost standpoint, requires no cryogens for operation. This unit should help prevent future outbreaks of patient illness/death from adulterated materials, and is quite useful for API synthesis and control.
In future columns, I will address some of the other technologies transforming the industry, speeding up and improving the way we analyze products, intermediates, and raw materials.
Emil W. Ciurczak
Emil W. Ciurczak has worked in the pharmaceutical industry since 1970 for companies that include Ciba-Geigy, Sandoz, Berlex, Merck, and Purdue Pharma, where he specialized in performing method development on most types of analytical equipment. In 1983, he introduced NIR spectroscopy to pharmaceutical applications, and is generally credited as one of the first to use process analytical technologies (PAT) in drug manufacturing and development.