Emil Ciurczak, Contributing Editor | 01.21.21
Back in 1993, I had the pleasure of owning a Ford SHO (Super High Output). For those of you who are not familiar with the car, it strongly resembles a “normal” Taurus… but on steroids. It was one of the fastest American cars made (at the time), accelerating from zero to OMG in a few seconds. When my son asked how fast I had ever driven it, I simply answered, “75 MPH.” Little things like road construction and speed limits were involved. [I dreamed of the German Autobahn, but… And 75 MPH was less than halfway up the tachometer.] Why do I mention this “beast?” Well, our current PAT/QbD efforts are almost parallel: we rarely open them up, either. In this column, I will explain the benefits of “unleashing the beast” that is QbD.
All the tools, already in hand, used for “normal” PAT/QbD process improvements may also be expanded to increase the profit of almost any product. In addition to “merely” controlling production and assuring continuity and reproducibility of products, the concept may be expanded in all directions. Let’s look at three major places where these tools will help:
1. API synthesis. Yes, the actual chemistry is already being helped by continuous processing at some companies (the engineering is simpler than for making tablets and such), but I am referring to the particular form of the API: both the counter-ion (Na+, Ca2+, etc.) and the polymorphic form are variable. Within our “silo-type” organization [meaning we all do our share, staying in our “own lane,” without consideration or even awareness of the previous or next step in the process], the synthetic organic chemists aim for the highest yields, without consideration of the processability or long-term stability of the final dosage formulation.
If the QbD process were extended to all parts of the process, the people making the API would be privy to the processing characteristics of each salt or polymorph. So, for example, if the formulators were to show better processability (or stability) with form “B,” even though form “A” gives a higher yield in the synthesis process, the lower yield would be more than compensated by lower production costs for the final dosage form.
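The yield-versus-processing trade-off above is easy to put in back-of-the-envelope terms. The sketch below is purely illustrative: every cost figure and yield fraction is invented, not taken from any real process.

```python
# Hypothetical comparison of two API forms. Form "A" synthesizes at a
# higher yield; form "B" costs less to process downstream into the
# finished dosage form. All numbers are invented for illustration only.

def cost_per_kg_product(yield_fraction, synthesis_cost_per_kg, downstream_cost_per_kg):
    """Total cost to deliver 1 kg of API in the finished dosage form."""
    api_cost = synthesis_cost_per_kg / yield_fraction  # lower yield raises API cost
    return api_cost + downstream_cost_per_kg

form_a = cost_per_kg_product(0.90, 1000.0, 800.0)  # high yield, costly processing
form_b = cost_per_kg_product(0.75, 1000.0, 350.0)  # lower yield, easy processing

print(f"Form A: ${form_a:,.0f}/kg vs. Form B: ${form_b:,.0f}/kg")
# form B wins despite the lower synthesis yield
```

With these (made-up) inputs, form “B” comes out cheaper per kilogram of finished product even though 25% of the input is lost in synthesis, which is exactly the kind of whole-process accounting a cross-silo QbD program enables.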
2. Pre-Formulation/Formulation. The (stated) purpose of normal pre-formulation studies is to screen for potential incompatibilities between the API(s) and individual excipients. However, the traditional methodology of making 1:1 mixes of the API and a single excipient gives only limited information. The reason for such limited data generation is the labor-intensive format: a number of these 1:1 mixtures are made and placed into temperature/humidity cabinets (RT, 50°C, 35°C/70% RH, etc.). At specified time periods (4 weeks, 8 weeks, 12 weeks, etc.), a set of mixtures is analyzed for breakdown products.
The resultant data only shows whether the API is compatible with that one excipient. The interactions between the API and multiple excipients (mirroring what occurs in production and the final dosage form) are not explored, just extrapolated. Ideally, we would perform a true Design of Experiment (DoE), wherein a statistical model is generated with mixes of the API and several excipients, simultaneously. These data would really, really give the formulators a leg up on designing a final formulation. So, why isn’t it done?
The time and cost of doing a traditional “sacrifice” of samples at pre-set time points would not be as valuable for several reasons:
a. If we only analyze the samples at four- to twelve-week time points, we know that there was some interaction, but not the relative rate at which the samples began reacting (a little Heisenberg action here).
b. The work involved in the analyses would be expensive, and the results would not be obtained in a timely fashion. Rapid testing would allow for faster determination of “better” mixtures.
The answer to the problem would be to proceed and make the “designed” mixtures, but examine them non-destructively, using spectroscopic methods. Equipment exists that can be fitted with (at least) four fiber-optic probes for near-infrared or Raman spectroscopy. If the vials, for example, were in a test-tube rack, four across, they could be taken from the stability chambers, capped (to maintain the humidity levels), scanned through the base of the vial… then returned to the chambers. This may be repeated daily to limit both the amount of API used and the work performed on samples showing no reaction(s).
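The daily scan-and-compare routine just described can be sketched in a few lines: compare each vial’s latest spectrum to its day-zero reference and flag any vial whose spectral correlation drops below a threshold. This is a minimal illustration, not a validated method; the vial IDs, spectra, and the 0.999 threshold are all invented.

```python
# Flag vials whose latest NIR/Raman spectrum no longer matches the day-0
# reference. Spectra are plain lists of absorbance values; the 0.999
# correlation threshold is an arbitrary placeholder for illustration.

from math import sqrt

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length spectra."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def flag_changed_vials(references, todays_scans, threshold=0.999):
    """Return IDs of vials whose spectrum has drifted from its reference."""
    return [vial for vial, ref in references.items()
            if pearson(ref, todays_scans[vial]) < threshold]

day0 = {"A1": [0.10, 0.50, 0.90, 0.50, 0.10]}
today = {"A1": [0.10, 0.50, 0.90, 0.50, 0.40]}
print(flag_changed_vials(day0, today))  # ['A1'] — this vial's spectrum drifted
```

In practice, a chemometric metric (spectral residuals against a PCA model, for instance) would replace the raw correlation, but the logic is the same: only vials that trip the flag go on to full destructive analysis.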
If any one of the vials shows spectral differences, it may be analyzed (there will, of course, be replicates available) and the reaction determined. The benefits of this approach are:
a. The relative rates of any breakdowns can both point to the best formulations and hint at potential expiry dates for final dosage forms.
b. Any potential positive interactions could also be noted. That is, the API might interact badly with excipient “A,” but not be affected by a mixture of “A & B.”
c. Various polymorphs and salts could be added to the DoE matrix, further adding to the determination of the “best” formulation possible. In other words, not only would we be testing the drug-excipient interactions, but helping define which form of the drug would be most successful in a dosage form.
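The DoE matrix described above is straightforward to enumerate. The sketch below builds a full-factorial design from hypothetical factor levels (the API forms and excipients named here are placeholders, not a recommendation); a real study would likely use statistical software and a fractional design to trim the run count.

```python
# Build a full-factorial pre-formulation design: every combination of
# API form (salt/polymorph) with candidate excipients. Factor levels
# below are hypothetical placeholders.

from itertools import product

api_forms = ["Na-salt form A", "Na-salt form B", "Ca-salt form A"]
fillers = ["lactose", "mannitol", "MCC"]
lubricants = ["Mg stearate", "stearic acid"]

design = [{"api": a, "filler": f, "lubricant": l}
          for a, f, l in product(api_forms, fillers, lubricants)]

print(len(design))  # 3 x 3 x 2 = 18 mixtures to place on accelerated stability
```

Eighteen vials (plus replicates) is entirely manageable when each one is scanned non-destructively in seconds, which is precisely why the fiber-optic approach unlocks a true DoE where sacrificial testing could not.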
3. Stability Extension and Expiry Recalibration. Currently, we use “time-tested” (a.k.a., old-fashioned and “safe”) methods for setting expiration dates and performing stability studies. In most cases, a predetermined number of containers of the finished product are placed on stability (at RT and under stressful temperature and humidity conditions) and, at set times, one container is removed and tests such as assay (including degradation byproducts), physical parameters (e.g., color or hardness), and dissolution profile are performed.
These tests were instituted decades ago, long before concepts (and analytical equipment) commonly applied to PAT and QbD were available. Today, multiple, rapid, and non-destructive tests are widely available for stability testing. When a company is running a successful QbD program, all the parameters are monitored: all the chemical and physical properties of raw materials, intermediates, and finished product.
The very basic tenet of QbD is the “design space,” or the process parameters that may be modified, in situ, to obtain the optimal product. For any batch, then, we have a complete record of the source and attributes of each ingredient and intermediate. With more complete stability measurements, these data can be correlated to render a clearer picture of each product’s status. So, how may this better mousetrap be built?
a. Measure every dosage form. Currently, if we have a 100-count container of tablets or capsules, we actually test only a portion of the contents. Even performing content uniformity (10 units) and dissolution (6 units), without the need for a retest, that leaves 84 units undisturbed.
Using a non-destructive technique, such as near-infrared or Raman, all 100 units in each bottle may be scanned (or, for blister packs, every unit can be scanned through the plastic). In fact, this scanning may be used as a method for choosing the units for assay or dissolution, allowing potentially OOS tablets/capsules to be further assayed by the compendial method.
b. Correlate the number and time of OOS samples with design space. In other words, we can show that Batch “A” has four questionable units at one year, Batch “B” has ten, and so forth. By correlating the production parameters (design space) with the percent good/questionable units, the design space can be modified to improve the stability of the product (we assume the assay and dissolution of the initial finished product already pass QC standards).
c. Cement the design space and assure longer stability. This is one of the great powers of QbD: not only to make a product for release that is within all CPPs (Critical Process Parameters), but to “fine-tune” the process to extend the expiry date. That alone could add $$$$ to the value of a drug. One more year on the shelf cuts back drastically on recalls and potential OOS findings by the Agency.
And then there are numerous (almost unintended) benefits from this “expanded” application and, by adding continuous manufacturing, we see even more time and cost savings:
1. Any time you can know which combination of parameters will give you extended stability, the additional year (or two) on an expiry date could be worth millions.
2. The information from the pre-formulation studies may be used to optimize the clinical trial samples, giving formulation scientists a “leg-up” on developing the final formulation.
3. If we use continuous manufacturing, the change of design space to assure better stability can be performed with some cost savings:
a. Since smaller batches are needed for CM, the design of experiment can be performed using far, far less of the expensive API (especially if it is not yet made in bulk).
b. Many “batches” of modified product can be made in a single day, rather than the weeks needed when cGMP requires “commercial-level” batches for a DoE.
c. Time savings are realized both in the analyses of numerous batches and in the cleaning/cleaning validation of larger equipment between experimental batches, since CM units need not be disconnected between variations.
d. The increase in stability performance helps companies fulfill the 2018 USFDA Guidance on lifecycle management [i.e., Q12: Technical and Regulatory Considerations for Pharmaceutical Product Lifecycle Management]. This allows the company to comply and, in the course of better understanding, make a better product at a lower cost.
So, we see that “opening the throttle” on the engine named QbD would lead to better products, time savings, and improved profitability. And, for innovator companies, this would allow a retail price reduction while still leaving a reasonable profit in the face of generic competition.