Process Validation

By Peter H. Calcott, Calcott Consulting LLC | October 7, 2015

Has it changed since the issuance of the FDA and EMA guidances?

Both guidances, FDA and EMA, were issued more than three years ago and have supposedly become the recommended approach to process validation in the industry.1,2 So how well have they been implemented? Have they simplified life? Has process validation changed and become more of a value-added activity? The answer to these questions is a resounding "it depends." It depends on the company you talk to, the size of the company, and the progressiveness of the company. I spend a lot of time training people and working on projects with various companies in this technical area, and that is what I have found.

In this article, I am going to focus on some of the areas where challenges exist, both technical and semantic: areas where there are misperceptions of what the agencies really want, and areas where companies make things harder for themselves, lose track of the purpose, and create obstacles that cause failure.

Every company develops a manufacturing process for its candidate product. Some use the modern Quality by Design (QbD) approach while others use older approaches.3,4,5,6 In reality, it does not much matter; in both cases the objective is the same: a reproducible process that reliably makes the product you want within constraints of cost, yield, and quality. Some companies that have not embraced QbD actually generate processes that are better than some using QbD. But whether you use QbD or not, you end up with knowledge. That knowledge includes:
  • What unit operations must I have, and in what order?
  • What does the output of each unit operation need to look like to be processed satisfactorily downstream, remembering that the output of operation n-1 is the input to operation n?
  • Which raw materials are critical and must be controlled tightly, and which are more forgiving?
  • Which parameters in each unit operation are important and must be controlled, and which do not really matter?
In one training exercise on QbD, the process staff had read all the ICH guidances but had not grasped exactly what QbD was. During the course of the training and discussion, I suddenly saw a light bulb go on in the eyes of the technical leader. He indicated that they were already using many of the principles I was describing, having worked them out for themselves. To them, QbD was simply a more formalized process development exercise with a more structured approach, but one still yielding the key elements described in the bullets above. They viewed it as more evolution than revolution. And they were right.

We are all very familiar with how these four elements are linked together. For a small molecular weight drug there might be 5 unit operations, while for a biological product we may be dealing with 15, but the principles are the same. Each unit operation is fed (input) with the result of the previous step plus new raw materials. Some of those raw materials may be critical and influence the operation, while others may not. The output becomes the input for the next step. A unit operation may comprise a single step or several. Each step will have parameters that can be monitored and/or controlled, and only some of these are critical and influence the outputs. Good process development (QbD or not) entails determining what is important and what is not, and then controlling the important ones.
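The chained input/output structure described above can be sketched in code. This is purely illustrative and not from the article; the operation names, parameters, and the toy relationships inside them are all invented for the sake of the example.

```python
# Illustrative sketch of a process train as chained unit operations:
# the output of operation n-1 is the input to operation n.
# All names and numeric relationships here are hypothetical.

def granulate(batch, params):
    # Only parameters identified as critical are controlled here,
    # e.g. capping moisture for a hygroscopic API (toy logic).
    batch["moisture"] = min(batch["moisture"], params["max_moisture"])
    return batch

def compress(batch, params):
    # Toy relationship between compression force and tablet hardness.
    batch["hardness"] = params["force"] * 0.5
    return batch

def run_train(batch, steps):
    """Feed each unit operation with the previous step's output."""
    for operation, params in steps:
        batch = operation(batch, params)
    return batch

result = run_train(
    {"moisture": 4.2},
    [(granulate, {"max_moisture": 2.0}), (compress, {"force": 10.0})],
)
```

The point of the structure is the same one the author makes: each step's acceptance criteria are defined in terms of what the *next* step can tolerate as input.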

For companies using QbD, the process is more formal, with true linkages between all the information, more documented structure, and a highly systematic approach. Older process development work was documented, but less holistically and more driven by gut feeling. With QbD there is even a section where you define the Target Product Profile (TPP): what the product must look like or be. In the older, pre-QbD approach, this is either understood but not formally documented, or the product simply is what it is at the end of the development work. This is another example of the systematic and structured nature of QbD.

But you may be asking why we are talking about QbD and process development when the article is about implementation of process validation. It is because without the development knowledge illustrated in the bullet points, we cannot effectively validate the process. We cannot know what is important and what is not.

A modern validation of the process calls for sufficient runs to convince you the process works. These are generally done at commercial scale in the commercial plant, or at least the launch facility. So have the classic 3 runs disappeared? From an FDA perspective, yes. Word of mouth indicates that FDA has approved products (small molecular weight drugs) with a single large-scale batch. I am not aware of any approvals with zero batches. But for biological products, where the complexity of the process is clearly higher, the classic three is still the norm.

Of course, before you run those batches, you will validate each of the unit operations, one by one. But to do that you need the information described in the four bullets, and this is where the problems begin. Each protocol for each unit operation needs a clear definition of the steps and their linkages (forward and backward), the impact of raw materials and other inputs, which variables might be controlled and which are critical, and exactly what the output needs to look like to be processable downstream. Getting this right is the challenge.

In the subsequent sections I am going to define the important elements and also describe where I have seen problems in implementing a good strategy for process validation.

The sequence of steps or unit operations
In a well-designed process development exercise, QbD or not, each unit operation employed has a well-defined role and is placed at a scientifically logical point in the process train. To do this you must understand the challenges of the process stream at each stage. In addition, you must understand what the step does do and what it does not do, and what the step might contribute that makes life more difficult downstream. Let me illustrate with two examples.

In a tableting operation I was consulting on, the unit operations were very straightforward. A powdered bulk substance, the active pharmaceutical ingredient (API), was compounded with a sugar and a lubricant to yield tablets after compression, which were subsequently coated for controlled release. However, it became clear that high variability in the API sometimes yielded tablets that dissolved too quickly or not at all. So granulation and milling steps were introduced into the process to create more consistency in the materials entering compression. These two steps added time to the process and, the API being hygroscopic, it began to absorb moisture; several steps therefore had to be performed under controlled humidity to minimize the impact of the hygroscopicity. Without this type of careful analysis to understand the process and its liabilities, it is difficult to justify the process and its controls.

A classic purification train for a monoclonal antibody is illustrated in Figure 1. Each step is introduced to eliminate a specific set of contaminants. Further, one step (Protein A chromatography) introduces a contaminant of its own: Protein A that leaches off the column. A subsequent column must be optimized to remove that contaminant.

It is this effort that defines what unit operation validations are needed and where.

What does the output of each step need to look like?
This element is really all about Critical Quality Attributes (CQAs). The problem people run into is that they are always looking at the final product specifications. For a small molecular weight drug with a simple 2-step process, those specifications might be completely relevant. But are all specifications critical? While you must pass all specifications to release, the truly critical attributes are most probably a subset. And for steps before the final drug product step, the CQAs may take on a different set of parameters and values. This is probably best demonstrated using a biotechnology product profile, as Table 1 below illustrates. As can be seen, the CQAs for the cell expansion reactor and the large production reactor are very different from what you would see for the vialed product.

These CQAs become the acceptance criteria that must be met in each unit operation validation. But be careful what you choose; only choose those that are truly critical. If your downstream process is robust, you may not need to define a parameter at this step if it is taken care of at a subsequent step.
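The idea of checking a unit operation's output against only the CQAs designated critical at that stage can be sketched as follows. The attribute names, units, and limits below are invented for illustration and are not from the article or any real process.

```python
# Hypothetical sketch: stage-specific CQA acceptance criteria.
# (None means "no limit on that side" -- e.g. a lower limit only.)
cqa_limits = {
    "viable_cell_density": (1.0e6, None),  # cells/mL, lower limit only
    "bioburden": (None, 10.0),             # CFU/mL, upper limit only
}

def meets_cqas(measurements, limits):
    """Return True only if every critical attribute is within its range."""
    for attr, (low, high) in limits.items():
        value = measurements[attr]
        if low is not None and value < low:
            return False
        if high is not None and value > high:
            return False
    return True
```

The design point mirrors the text: the dictionary holds only the truly critical subset for this stage, not the full release specification, and a later stage would carry a different dictionary.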

The impact of inputs—raw materials and previous outputs
During audits of many companies I always look at the requirements for raw materials as defined by their specifications. What do I see? Compendial grade (USP, EP, etc.) seems to predominate. This is usually indicative of a process group that has not evaluated the criticality of each and every material. It is very prevalent with biotechnology products, where raw materials are usually numerous, although I have seen it in small molecular weight API manufacture as well. In these companies, the process groups have simply grabbed what was available. To default to pharmacopeial grade for all raw materials is both unnecessary and costly. In addition, when you examine the specifications, you see full compendial testing described, and I doubt all of those tests are really needed to assure the product meets its specifications. While it is pretty universal to use pharmacopeial grade for excipients that go directly into the product, many materials upstream definitely do not need to be. Early in my career I was charged with assessing raw material requirements for an E. coli fermentation producing a recombinant product, and indeed USP grade was not needed. We used food-grade materials in the fermentation broth with great technical as well as regulatory success. But that was only accomplished with good scientific data and a sound approach. The cost of goods, by the way, was reduced to exceedingly low levels.

Another watch-out during development is the failure to examine variability in the supply of raw materials. A colleague of mine described a situation where all development and clinical manufacturing was performed with one batch of resin. When it came to commercial scale, a new batch of resin was introduced and, you guessed it, this batch did not behave exactly like the original. They had to resort to choosing batches from the vendor that more closely matched the original clinical batch. That added extra cost and effort to operations, as a use test had to be employed.

These critical reagents and their specifications become the cornerstones for the unit operation validation and subsequent change controls.

Which parameters are variable, which are critical to control, and how are we going to control them?

Each unit operation, whether it is a simple one-step operation (e.g., milling of an API) or a multi-step operation (e.g., a Protein A purification step), has a set of parameters that can be monitored and/or controlled. Deciding which are critical and which are non-critical is the essential element of good process development. There are well-established multivariate analysis approaches using half-normal plots that allow screening of process steps to separate the critical from the non-critical (Figure 2). Only after that work can you choose the key elements or parameters that define success. Even then, beware of the process scientist who lists every parameter as critical. When I suspect that is the case, I interview them and ask what happens when each parameter deviates. After the explanation, I ask whether the product is still OK or whether they would throw it away. In most cases, the answer is that yield is affected but the product is good. That is not a critical parameter. Only when you cannot process further, or the product will not meet the eventual final specifications, have you defined a critical parameter.
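The half-normal screening the author mentions rests on a simple computation: order the absolute effect estimates from a screening design and pair them with half-normal quantiles; effects that sit far above the near-zero bulk are the candidates for "critical." A minimal sketch, with invented parameter names and effect values, might look like this:

```python
from statistics import NormalDist

def half_normal_ranking(effects):
    """Pair absolute effect estimates with half-normal plotting positions.

    effects: dict of parameter name -> estimated effect (any sign).
    Returns (name, |effect|, quantile) tuples sorted by |effect|;
    points far above the trend of the small effects suggest criticality.
    """
    m = len(effects)
    ordered = sorted(effects.items(), key=lambda kv: abs(kv[1]))
    nd = NormalDist()
    points = []
    for i, (name, effect) in enumerate(ordered, start=1):
        # Standard half-normal plotting position for the i-th ordered value.
        q = nd.inv_cdf(0.5 + 0.5 * (i - 0.5) / m)
        points.append((name, abs(effect), q))
    return points

# Hypothetical screening results: two large effects, two near noise level.
points = half_normal_ranking({"temp": 8.1, "stir": 0.4, "pH": 6.5, "feed": 0.2})
```

In a real exercise the (|effect|, quantile) pairs would be plotted; parameters lying on the straight line through the origin behave like noise, while those above it (here "pH" and "temp") merit designation as critical, subject to the product-impact questioning the author describes.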

Setting limits or operating ranges is also important. Beware of limits that are unattainable in the large-scale equipment. The developed process must be operational in the larger-scale equipment, and what you found at small scale may be difficult to reproduce at the larger scale. The first few engineering batches should be used to test the scale-up of the process, and ranges adjusted, before rushing into the process validation exercises.

When operating processes, we often see transients: a parameter that is important may move out of range for a short time. These transients may be real or an artifact of the equipment setup. But if you impose a tight limit on a parameter, be prepared to live by it. An example I have experienced is pH in a production bacterial fermenter. Early work indicated an optimal range, and if we attempted to grow outside that range, product quality was affected. However, if the process was controlled within the normal operating range, the system could experience transients with no impact on product quality, at least early in the culture age. So our acceptance criteria described the acceptable range but indicated that transients up to a certain duration were not detrimental before a certain culture age. You have to be careful setting these limits and creative in writing them.
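The transient-tolerant acceptance criterion described above can be made concrete in code. This is a sketch under assumptions of my own (a sampled pH time series and invented numeric limits), not the author's actual criterion:

```python
# Hypothetical sketch of a transient-tolerant pH acceptance criterion:
# pH must stay within [low, high], but excursions shorter than
# max_transient_h are acceptable before cutoff_age_h of culture age.

def transient_acceptable(samples, low, high, max_transient_h, cutoff_age_h):
    """samples: list of (culture_age_h, pH) pairs in time order."""
    excursion_start = None
    for age, ph in samples:
        if ph < low or ph > high:
            if age >= cutoff_age_h:
                return False  # no excursions tolerated late in the culture
            if excursion_start is None:
                excursion_start = age  # excursion begins
            elif age - excursion_start > max_transient_h:
                return False  # transient lasted too long
        else:
            excursion_start = None  # back in range; reset
    return True
```

Writing the criterion this way forces the protocol author to state the three numbers explicitly (range, maximum transient duration, cutoff age), which is exactly the kind of careful, creative limit-setting the text calls for.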

In an article of this length it is impossible to describe all the elements or examples, but this illustrates how careful thinking early in development lays the foundation for transferring the process to the launch-scale equipment, where you are required to perform the process validation. After validation, which lays the foundation for routine operations, you are on your way to the continuous verification stage called for by both the FDA and EMA guidances.


Peter H. Calcott, Ph.D., is president and chief executive officer of Calcott Consulting. A native of the UK, Dr. Calcott has a career spanning 4 decades and has worked for pharma companies such as SmithKline Beecham, Monsanto and Bayer as well as biotechnology companies such as Immunex, Chiron and PDL Biopharma. He has held positions in R&D, manufacturing, QA & QC, process development, regulatory affairs, corporate compliance, business development, and government affairs. He has worked on biologics (vaccines, recombinant proteins, monoclonal antibodies, blood products), drugs and generics and devices from early development to commercial. Visit www.calcott-consulting.com for more information.

References
  1. EMA Guideline on Process Validation (2012), EMA/CHMP/CVMP/QWP/70278/2012-Rev1
  2. FDA Guidance for Industry: Process Validation: General Principles and Practices (January 2011)
  3. ICH Q8(R2), Pharmaceutical Development (2009)
  4. ICH Q11, Development and Manufacture of Drug Substances (Chemical Entities and Biotechnological/Biological Entities) (2012)
  5. ICH Q9, Quality Risk Management (2005)
  6. ICH Q10, Pharmaceutical Quality System (2008)