Technology to secure and protect an organization’s environment has come a long way. Virtual Private Networks (VPN), Advanced Encryption Standard (AES) encryption, firewalls with intrusion detection, and Secure Sockets Layer (SSL) and its successor, Transport Layer Security (TLS), all provide unprecedented levels of protection.
Unfortunately, data breaches still occur far too often. Root cause analysis indicates that technology is not the culprit, as recognized by the National Security Agency (NSA); rather, improper human behavior most often results in breaches. Examples of such behavior include failing to upgrade software or patch known bugs (yes, organizations still use Windows XP!), sharing passwords or using obvious ones, visiting unknown sites that host viruses, casually exchanging files on USB sticks, or not keeping anti-virus software up to date. Here, the industry’s culture of quality is an asset, because there is a shared understanding of what it means to follow procedures and, conversely, of the grave consequences of not following them.
To address this topic, a multi-layer approach must be adopted that builds security into each layer: the network, operating systems, applications, and people. Organizations would be wise to consider the platform-as-a-service (PaaS) offerings of some suppliers. PaaS provides an integrated set of tools to manage and configure IT environments in which security is built in across all layers. At the application layer, organizations should also consider two-factor authentication, which may combine a physical token such as RSA SecurID with a VPN layer.
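Token-based two-factor authentication of this kind typically pairs a password with a time-based one-time code. As a minimal sketch, the widely used TOTP algorithm (RFC 6238, SHA-1 variant) can be implemented with the Python standard library; the secret below is the RFC test value, used here purely for illustration:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, for_time=None, digits=6, step=30):
    """Compute an RFC 6238 time-based one-time password (SHA-1 variant).

    secret_b32 is the shared secret, base32-encoded as most tokens expect.
    """
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((time.time() if for_time is None else for_time) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation per RFC 4226
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: ASCII secret "12345678901234567890" at T = 59 s
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", for_time=59, digits=8))  # prints 94287082
```

A real deployment would of course rely on a vetted authentication product rather than hand-rolled code; the sketch only shows why the server and token agree on a code without any network exchange.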
Finally, to safeguard against security breaches caused by human error, consider using tools that ensure the rules are followed. First and foremost, proper training should be administered regularly, and employment agreements should dissuade people from taking a cavalier attitude towards security. Tools that force employees to change passwords, or that prevent them from installing non-approved software, should also be used.
When the MES solution is in the hands of a third-party organization, auditing the supplier becomes a key security requirement. One will need to ensure that the company complies with its own procedures and requirements and that its internal processes are sound. Certifications are available that describe the compliance status of reputable suppliers against best practices, and most suppliers will provide, under a non-disclosure agreement, some kind of certification statement or a third-party audit report. Keep in mind, however, that many industry security certifications have limited relevance to GxP compliance, so pharmaceutical companies should still perform their own security audits.
The second concern is data integrity. The obligation to ensure data integrity does not go away because a company outsources its IT environment, particularly since data integrity remains one of the major types of observations cited on FDA Form 483s. Standard requirements therefore need to be addressed irrespective of the outsourcing model.
Conceptually, compliance in this area can be broken down into two domains. The first concerns the application, where standard practices apply and data integrity requirements can be evaluated using standard techniques. The second concerns the infrastructure, where auditing and data transfer agreements need to be established to outline what suppliers can do and what procedures they must follow.
Organizations should work with their suppliers to understand where data is located and in what type of environment it resides, as well as how the supplier will ensure that data is correct and cannot be falsified. They will also need to establish how data can be retrieved, whether its history can be permanently erased, and how long it can be retained, depending on business and regulatory needs.
Similarly to data integrity, regulatory requirements do not change because the application is deployed to the cloud: regulated companies remain responsible for compliance, and applications need to be validated on a risk basis. From an application point of view, standard validation practices can be used to verify that the software works as intended, keeping in mind that the application is now more likely to fall into the off-the-shelf class of software. Unfortunately, companies are likely to find that most cloud providers do not have the formal quality systems the pharmaceutical industry is accustomed to; instead, cloud suppliers rely on a series of controls and mitigation procedures, many of them fully automated. Consequently, when auditing them, it is important to keep an open mind, focusing on the methods they use to provide quality products and services and on whether they comply with their own standards.
Given the risk perspective of validation, one area to focus on is understanding the risks associated with deploying to the cloud. Because cloud deployments carry more risks, and these risks may be hard to identify, we recommend that organizations consult the recently published concept papers from the GAMP Cloud SIG,1,2 which provide comprehensive overviews.
Responsibilities for risk management must be defined early on and enforced through solid contracts. However, one difficult area concerns the qualification of the infrastructure when that infrastructure is dynamically allocated and loosely defined. This is clearly an area where industry guidelines are welcome in order to audit suppliers and qualify their environment.
Availability cannot be taken for granted, and all organizations must have a disaster recovery plan in place to maintain data integrity. To some extent, availability is becoming less of an issue, especially given the network bandwidth that can now be purchased and the high reliability of electronic devices. Nevertheless, there is no substitute for a strong backup plan that is tested regularly.
One way to address availability is through a quality or service level agreement that specifies the level of service in critical areas such as incident management, disaster recovery, audit support, software upgrade processes, notification processes, issue investigation, and key performance metrics.
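One way to make such an agreement operational is to express its key metrics as data and check supplier reports against them automatically. The sketch below is illustrative only; the SLA terms, threshold values, and report field names are all hypothetical:

```python
# Hypothetical SLA terms agreed with a cloud supplier
SLA = {
    "incident_response_hours": 4,         # max time to acknowledge a critical incident
    "recovery_time_objective_hours": 24,  # disaster-recovery restore target
    "uptime_percent": 99.9,               # minimum monthly availability
}

def sla_breaches(report, sla=SLA):
    """Return the list of SLA terms a supplier's monthly report fails to meet."""
    breaches = []
    if report["worst_incident_response_hours"] > sla["incident_response_hours"]:
        breaches.append("incident_response_hours")
    if report["last_dr_test_restore_hours"] > sla["recovery_time_objective_hours"]:
        breaches.append("recovery_time_objective_hours")
    if report["uptime_percent"] < sla["uptime_percent"]:
        breaches.append("uptime_percent")
    return breaches
```

Encoding the agreement this way turns a contractual document into a checklist that can be run against every monthly service report, making drift from the agreed service levels visible early.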
In a world where systems may all reside in the cloud, how do these systems communicate with each other? Let’s assume that production capabilities are still physical and located in a facility. There are two main challenges associated with integration: the integration of business systems with each other, and the integration of the MES with the equipment.
A current trend is towards using a cloud middleware solution to connect the various business systems (ERP, LIMS, CAPA, analytics). This enables connectivity between cloud applications, but also between the cloud and systems deployed within production facilities. Some vendors have developed “connectors” that make information available to other systems, reducing the connection between systems to drag-and-drop operations in which data objects from one system are mapped to data objects in another.
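Under the hood, such a connector is essentially a declarative field mapping between one system’s data model and another’s. A minimal sketch, where the ERP and MES field names are hypothetical:

```python
# Hypothetical mapping from an ERP "ProcessOrder" to an MES "WorkOrder"
FIELD_MAP = {
    "order_number": "wo_id",
    "material_code": "product_id",
    "target_quantity": "planned_qty",
}

def translate(source_record, field_map):
    """Map source-system fields onto target-system fields, dropping unmapped keys."""
    return {dst: source_record[src]
            for src, dst in field_map.items()
            if src in source_record}

erp_order = {"order_number": "PO-1001", "material_code": "MAT-7",
             "target_quantity": 500, "plant_notes": "ignored downstream"}
print(translate(erp_order, FIELD_MAP))
# prints {'wo_id': 'PO-1001', 'product_id': 'MAT-7', 'planned_qty': 500}
```

The drag-and-drop tooling described above effectively lets an administrator build `FIELD_MAP` graphically; the middleware then applies the translation to every message that flows between the two systems.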
Separately, how do we connect a piece of hardware to an MES in the cloud? This can be done easily with inexpensive hardware components that make devices visible on the internet by assigning a Transmission Control Protocol/Internet Protocol (TCP/IP) address and port combination to the device. Interestingly, if the MES application is designed properly, this approach has very little impact on performance.
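As a minimal sketch of this pattern, the snippet below opens a TCP connection to a device at a known address and port and reads back a one-line reply. The query command and reply format are hypothetical; real equipment speaks protocols such as OPC UA, Modbus TCP, or vendor-specific dialects:

```python
import socket

def read_device_value(host, port, command=b"READ?\n", timeout=2.0):
    """Query a network-addressable device over TCP/IP and return its
    one-line reply (the request/response protocol here is hypothetical)."""
    with socket.create_connection((host, port), timeout=timeout) as sock:
        sock.sendall(command)
        with sock.makefile("rb") as stream:
            return stream.readline().decode("ascii").strip()
```

In practice a cloud-hosted MES would sit behind an on-premises gateway that performs such reads and forwards the results upstream, rather than exposing production equipment directly to the internet.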
To begin taking steps towards the deployment of a cloud-based MES, organizations need to develop a good understanding of their business requirements. As with any IT project, these requirements form the basis for the definition and configuration of solutions.
Manufacturers must also prepare themselves to delegate control and responsibility while still maintaining accountability from a regulatory perspective. This is a significant change in mindset, which in the short term creates the need for a strong focus on establishing robust service level agreements. But if the level of interest in cloud-based solutions is any indication, the industry appears ready to seek this next level of improvement. And because some third-party vendors are eager to take advantage of this new market opportunity, they too are changing their perspective by developing compliant solutions.
Most in the industry would agree that specific guidelines in the areas of auditing, validation and data integrity are now necessary to ensure that migration of MES to the cloud is most effective. This would no doubt fuel broad adoption by providing industry best practices on how processes should be deployed. That being said, the benefits of the cloud are already very clear, and manufacturers who have adopted cloud-based MES, mostly in discrete industries, are gaining significant competitive advantage through improved scalability, flexibility and responsiveness. MES is increasingly moving away from being a facility floor system and taking its rightful place alongside enterprise technologies.
Let’s keep in mind that the industry went through a similar phase with the outsourcing of manufacturing operations to third-party organizations. Along the way, issues were identified and resolved to support a vibrant industry in which some drug companies have become “virtual” organizations focusing mostly on drug development and little on drug production. This augurs well for the future.
- ISPE GAMP Cloud SIG Concept Papers (July 2016): SaaS in a Regulated Environment – The Impact of Multi-Tenancy and Subcontracting; Using SaaS in a Regulated Environment – A Life Cycle Approach to Risk Management; Evolution of the Cloud: A Risk-Based Perspective on Leveraging PaaS within a Regulated Life Science Company
- ISPE GAMP Cloud SIG: Mike Rutherford (Lilly / USA), Kathy Gniecko (Roche / Switzerland).
Christian Fortunel is vice president at LZ Lifescience Inc. (USA). He is responsible for operations management and the successful financial running of the U.S. business, as well as bringing a wealth of MES deployment expertise to his role.