Migration to the Cloud & Data Integrity

For several years now, cloud computing has attracted growing interest from all economic sectors, and particularly from regulated industries (pharmaceuticals, medical devices, etc.).
While the economic benefits of such an architecture are promoted by the main players in the market (Amazon Web Services, Microsoft Azure, Google…), healthcare companies must question the impact of such a change of hardware and software infrastructure on their regulated applications:

What about the location of the data, the supposed "opacity" of change management, the confidentiality of the hosted data…? The purpose of this article is to introduce this new paradigm of application service delivery through the example of an infrastructure migration to a cloud environment across an international company, while preserving the quality and integrity of regulated data.

Some definitions
Let's start by defining the term "cloud computing" using the standardized NIST definition(2): cloud computing is a model that allows convenient, on-demand access, from any terminal connected to the Internet, to shared and configurable computing resources (e.g. network components, servers, storage, applications and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction.

This model is composed of five essential characteristics, three service models and four deployment models.

The five essential characteristics

1. On-demand self-service
A customer can provision data processing capacity, such as server time and storage capacity, as needed, without requiring human interaction with each service provider.

2. Broad network access
The capabilities are available over the network and can be accessed through standard mechanisms that promote use by heterogeneous thin or thick client platforms (for example, mobile telephones, tablets, laptops and workstations).

3. Resource pooling
The provider's computing resources are pooled to serve multiple customers using a multi-tenant model, with different physical and virtual resources dynamically allocated and reallocated based on consumer demand. There is a sense of location independence, in that the customer usually has no control over or knowledge of the exact location of the resources provided, but may be able to specify a location at a higher level of abstraction (e.g. country, state or data center). Available resources include, for example, storage, processing, memory and network bandwidth.

4. Rapid elasticity
Capabilities can be provisioned and released elastically, in some cases automatically, scaling up or down rapidly with demand. To the consumer, the capabilities available often appear unlimited and can be appropriated in any quantity at any time.

5. Measured service
Cloud systems automatically control and optimize the use of resources by operating a metering capability at a level of abstraction appropriate to the type of service (for example storage, processing, bandwidth and active user accounts). Resource usage can be monitored, controlled and reported, which provides transparency for both the provider and the consumer.

Service models

  • Software as a Service (SaaS)

This model gives the consumer the possibility of using the provider's applications running on a cloud infrastructure. The applications can be accessed from various client devices through a thin client interface such as a Web browser (for example, Web-based email) or a programmable interface (API). The consumer neither manages nor controls the underlying cloud infrastructure, including the network, servers, operating systems, storage, or even individual application capabilities, with the possible exception of limited, user-specific application configuration settings.

  • Platform as a Service (PaaS)

This model allows the consumer to deploy applications created or acquired by the consumer onto the cloud infrastructure, using programming languages, libraries, services and tools supported by the provider. The consumer does not manage or control the underlying cloud infrastructure, including the network, servers, operating systems or storage, but controls the deployed applications and possibly the configuration settings of the application-hosting environment.

  • Infrastructure as a Service (IaaS)

This model allows the consumer to provision the processing, storage, networks and other basic IT resources needed to deploy and run software of their choice, which may include operating systems and applications. The consumer neither manages nor controls the underlying cloud infrastructure, but controls the operating systems, storage and deployed applications, and possibly has limited control over selected network components (for example, host firewalls).
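
To make this division of control concrete, here is a minimal provisioning sketch in Python with the boto3 AWS SDK (the AMI ID, key pair and security group ID are hypothetical placeholders): the consumer chooses the machine image, instance size and firewall rules, while the physical hosts and network fabric remain invisible and provider-managed.

```python
import boto3

# IaaS in practice: the consumer provisions compute capacity declaratively;
# the physical host, network fabric and data center stay under the
# provider's control.
ec2 = boto3.client("ec2", region_name="eu-west-1")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",            # hypothetical machine image
    InstanceType="t3.medium",                   # sizing chosen by the consumer
    MinCount=1,
    MaxCount=1,
    KeyName="my-key-pair",                      # hypothetical SSH key pair
    SecurityGroupIds=["sg-0123456789abcdef0"],  # consumer-managed firewall
)
print(response["Instances"][0]["InstanceId"])
```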

Deployment models

  • Private cloud

In this mode the cloud infrastructure is provided for the exclusive use of a single organization comprising several consumers (for example, business units). It may be owned, managed and operated by the organization, a third party or a combination of these, and it may exist on or off the organization's premises.

  • Community cloud

The cloud infrastructure is implemented for the exclusive use of a specific community of consumers from organizations with common concerns (for example, mission, security requirements, regulatory rules and constraints). It may be owned, managed and operated by one or more organizations of the community, a third party or a combination of these, and it may exist on their own premises or externally. For example, air transport companies have implemented this type of deployment to facilitate the numerous exchanges between them (reservations, cross-invoicing…).

  • Public cloud

The cloud infrastructure is provisioned for open use by the general public. It may be owned, managed and operated by a commercial, academic or governmental body, or a combination of these. It exists on the premises of the cloud provider.

  • Hybrid cloud

The cloud infrastructure is a composition of two or more distinct cloud infrastructures (private, community or public) which remain unique entities but are linked by a standardized or proprietary technology that allows portability of data and applications (for example, load balancing between distributed servers).

The main reference standards
The main players in the cloud market are confronted with numerous and varied requirements, given the different professions and fields that they serve.
They must therefore adopt the strictest quality requirements to satisfy the level of security and confidentiality demanded by the most exacting sectors (banking, health, defense…). Most of them are certified against the main reference standards in these different areas:

  • ISO 27001:2013

International standard for information security management systems, published in October 2005 and revised in 2013. This is the most widely used security standard; it specifies the requirements for establishing, implementing, maintaining and continually improving an information security management system in the context of an organization.

  • SOC 1, 2, 3

Service Organization Control (SOC) reports are prepared by an auditor in compliance with American Institute of Certified Public Accountants (AICPA) standards and are specifically intended to evaluate a service organization's controls around five trust principles: security, availability, processing integrity, confidentiality and privacy.

  • PCI-DSS

The PCI DSS standard is established by the payment card providers and is managed by the PCI Security Standards Council (an open international forum for the improvement, dissemination and implementation of security standards for the protection of account data). This standard was created to increase control over cardholder information with the aim of reducing fraudulent use of payment instruments.

  • HIPAA

The second title of the Health Insurance Portability and Accountability Act (HIPAA) defines the American standards for the electronic management of health insurance, the transmission of electronic claim forms and all the identifiers necessary for the dematerialization of health insurance claims.

  • GDPR (General Data Protection Regulation)

Applicable from 25 May 2018, the European regulation on data protection imposes specific obligations on processors, who may be held liable in the event of a breach. These obligations affect all bodies that process personal data on behalf of another body as part of a service, such as IT service providers (hosting, maintenance…). Processors are bound to comply with specific obligations regarding security, confidentiality and documentation of their activity. They must take account of data protection from the design stage of the service or product and, by default, must put in place measures that ensure optimal protection of the data.

Most of these standards are not specific to the pharmaceutical industry, but they cover most of the data security requirements, i.e.:

  • Integrity. The data must be what they are expected to be and must not be corrupted accidentally or intentionally.
  • Confidentiality. Only authorized persons have access to the information intended for them. Any unwanted access must be prevented.
  • Availability. The system must operate faultlessly during the planned usage slots, and guarantee access to the installed services and resources within the expected response time.

But also relative to:

  • Non-repudiation and imputation. No user must be able to dispute the operations they have performed as part of their authorized activities, and no third party must be able to claim another user's activities as their own.
  • Authentication. Identification of users is crucial for managing access to relevant workspaces and maintaining confidence in interactive relationships.

As regards regulatory texts more specifically related to the pharmaceutical business, references to "Cloud computing" can be found in the following standards:

  • 21 CFR Part 11(3)

Of particular interest is the notion of an open system, defined as an environment in which access to the system is not controlled by the persons responsible for the content of the electronic records resident in that system. For these systems, 21 CFR Part 11 requires in §11.30 that "persons who use open systems to create, modify, maintain, or transmit electronic records shall employ procedures and controls designed to ensure the authenticity, integrity, and, as appropriate, the confidentiality of electronic records from the point of their creation to the point of their receipt. Such procedures and controls shall include those identified in §11.10, as appropriate, and additional measures such as document encryption and use of appropriate digital signature standards to ensure, as necessary under the circumstances, record authenticity, integrity, and confidentiality."
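
As an illustration of the "additional measures" mentioned in §11.30, the following minimal sketch (Python, using the widely available cryptography package; key generation and key management are deliberately oversimplified) signs a record to protect its authenticity and integrity and encrypts it for confidentiality in transit over an open system:

```python
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

record = b"batch=BX-1234;result=conform;operator=jdoe"  # hypothetical record

# Digital signature: supports authenticity and integrity of the record.
signing_key = Ed25519PrivateKey.generate()   # in practice, a managed key
signature = signing_key.sign(record)

# Symmetric encryption: preserves confidentiality in transit.
fernet_key = Fernet.generate_key()           # in practice, exchanged securely
ciphertext = Fernet(fernet_key).encrypt(record)

# On receipt: decrypt, then verify the signature with the sender's public key.
plaintext = Fernet(fernet_key).decrypt(ciphertext)
signing_key.public_key().verify(signature, plaintext)  # raises InvalidSignature if tampered
print("record authentic and intact")
```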

  • WHO(4)

Regarding subcontracting, the WHO guide emphasizes the need for the responsibilities of the contract giver and the contract acceptor to be defined in a written agreement, as described in the World Health Organization guidelines, fully covering the data integrity processes of both parties for the outsourced work or services provided… These responsibilities extend to all IT service providers, data centers, database maintenance personnel and contracted computerized systems, as well as cloud computing solution providers… Staff who periodically audit and evaluate the competence of a contracted organization or service provider should have the knowledge, qualifications, experience and training appropriate to assessing data integrity governance systems and detecting validity issues. The frequency and approach of the periodic monitoring or evaluation of the contract acceptor should be based on a documented risk assessment that includes an assessment of the data processes. Finally, this document emphasizes the importance of including "planned data integrity monitoring strategies" in quality agreements and in the written contractual and technical arrangements, where applicable, between the contract giver and the contract acceptor. These should include provisions allowing the contract giver access to all data held by the contracted organization in relation to the contract giver's product or service, as well as all relevant quality system records. This should include the client's access to electronic records, including audit trails, held in the contracted organization's computerized systems, as well as all relevant printed reports and other paper or electronic documents.
When the retention of data and documents is entrusted to a third party, particular attention should be paid to understanding the ownership and retrieval of the data held under this arrangement. The physical location in which the data are stored, including the impact of the laws applicable to that geographic location, should also be taken into account. Agreements and contracts should establish mutually agreed consequences if the contract acceptor denies or restricts the contract giver's access to the records it holds. When outsourcing databases, the contract giver should ensure that subcontractors, in particular cloud service providers, are included in the quality agreement and are properly trained in records and data management. Their activities should be subject to regular monitoring, at a frequency determined by the risk assessment.

  • US FDA(5)

This draft guidance proposes interesting recommendations on the use of electronic records and signatures in clinical studies, and more particularly on the use of application services in the cloud:

- Validation documentation, defined through a risk-based approach, which may rely on vendor documentation and procedures.
- The ability to generate accurate and complete copies of regulated records.
- The availability and retention of records for inspection for as long as they are required by the applicable regulations.
- The archiving capabilities of the solution.
- Access controls and verification of the authorizations granted to users.
- Secure, system-generated, timestamped audit trails of user actions and changes to the data (a minimal sketch of such a trail follows this list).
- Encryption of data at rest and in transit.
- Electronic signature methods.
- Records of the performance of the electronic service provider and of the electronic service provided.
- The ability to monitor the electronic service provider's compliance with service security and data integrity controls.
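
To illustrate the audit trail recommendation above, here is a minimal sketch (plain Python; the field names are hypothetical) of a timestamped, tamper-evident trail in which each entry is chained to the hash of the previous one, so that any retrospective alteration becomes detectable:

```python
import hashlib
import json
from datetime import datetime, timezone

audit_trail = []

def append_entry(user, action, detail):
    """Append a timestamped entry chained to the hash of the previous one."""
    previous_hash = audit_trail[-1]["hash"] if audit_trail else "0" * 64
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "action": action,
        "detail": detail,
        "previous_hash": previous_hash,
    }
    entry["hash"] = hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()
    audit_trail.append(entry)

def verify_trail():
    """Recompute every hash; returns False if any entry was altered."""
    for i, entry in enumerate(audit_trail):
        expected_previous = audit_trail[i - 1]["hash"] if i else "0" * 64
        body = {k: v for k, v in entry.items() if k != "hash"}
        recomputed = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if entry["previous_hash"] != expected_previous or entry["hash"] != recomputed:
            return False
    return True

append_entry("jdoe", "UPDATE", "sample 42: result 7.1 -> 7.2")
print(verify_trail())  # True until any stored entry is modified
```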

An infrastructure migration project in the Cloud at international company level
In June 2017, Baxter Healthcare initiated a project to migrate its IT servers to a cloud architecture, representing approximately 450 servers and a portfolio of 20 GxP applications and 80 non-GxP applications (finance, sales…).

The preparation phase lasted six months, concluding at the end of December 2017 with the selection of Amazon Web Services (AWS), the market leader(1) and a supplier meeting the criteria of the call for tenders and holding the quality certifications required for such a project.
During this preparation phase, and given the large number of servers to migrate, a preliminary assessment was made to classify the systems and determine the possible migration options among the following (a simple triage sketch follows the list):

  1. Decommissioning. The application is at the end of its life and can be stopped without difficulty for the business. Users are informed, and the infrastructure is decommissioned in accordance with a pre-established procedure. For each application identified as GxP, a decommissioning plan is produced, with archiving of the GxP records required for the legal retention period.
  2. Rationalization. Several instances of the same application can be combined in order to reduce their number. This pooling effort must be carried out in accordance with the change management procedure, with great care paid to data confidentiality.
  3. Simple rehosting. The application can be hosted in a cloud infrastructure with minimal change. The application and infrastructure change management process is applied when moving to the future data center.
  4. Cloud conversion. The application must be upgraded to be hosted in the cloud. A project is initiated according to the application change management procedure, incorporating the necessary specification and design steps as well as the resulting validation phases.
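
A minimal sketch of how such a triage might be encoded (plain Python; the CMDB field names are hypothetical, and a real assessment would weigh many more criteria):

```python
def migration_option(app):
    """Simplified triage of one application record extracted from the CMDB."""
    if app["end_of_life"]:
        return "1. Decommissioning"
    if app["duplicate_instances"] > 1:
        return "2. Rationalization"
    if app["cloud_compatible"]:
        return "3. Simple rehosting"
    return "4. Cloud conversion"

inventory = [  # hypothetical CMDB extract
    {"name": "LIMS-EU", "end_of_life": False, "duplicate_instances": 1, "cloud_compatible": True},
    {"name": "legacy-QC", "end_of_life": True, "duplicate_instances": 1, "cloud_compatible": False},
]
for app in inventory:
    print(app["name"], "->", migration_option(app))
```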

An approach focused on data security and integrity
In the case of system decommissioning, the following two options are used:

  1. The database is kept read-only, and the regulated data can be accessed via validated reports or queries.
  2. The application is kept on a virtual server in order to retain access to the data through the application; this virtual machine, which is activated solely on request in the event of an audit or an inspection, must be given particular attention (link between the database and the application, specific IP name or address…). This method of retention can be used in the majority of cases, except where operating system incompatibilities arise. A sketch of this on-demand activation follows the list.
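
For the second option, the sketch below (Python with the boto3 AWS SDK; the instance ID is a hypothetical placeholder) shows how the retained virtual machine can be kept stopped and started only for the duration of an audit or inspection:

```python
import boto3

ec2 = boto3.client("ec2", region_name="eu-west-1")
ARCHIVE_VM = "i-0123456789abcdef0"  # hypothetical ID of the retained application VM

def open_audit_access():
    """Start the archived VM only when an auditor or inspector requests access."""
    ec2.start_instances(InstanceIds=[ARCHIVE_VM])
    ec2.get_waiter("instance_running").wait(InstanceIds=[ARCHIVE_VM])

def close_audit_access():
    """Stop the VM again once the consultation is over."""
    ec2.stop_instances(InstanceIds=[ARCHIVE_VM])
```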

For each system/application concerned by the project and identified in the configuration management database (CMDB), a quality risk analysis is performed to determine the degree of validation required for the change of infrastructure. A specific regulatory risk analysis is conducted to assess the impact and define any specific measures needed to guarantee data integrity, together with a security risk analysis that defines the security provisions in the cloud (virtual private cloud, multi-factor authentication, configuration of the virtual machine image (AMI)…).
All of these analyses give rise to a report that consolidates the results and the risk-control measures for each system concerned.
The project can then proceed with the writing of the specifications for the future system (functional and technical specifications), which enable identification of the migration tooling to be implemented.

The development environment is built on the cloud platform and, for applications requiring rebuilding, the application code, secured in a source code manager, is modified according to the approved specifications. Extensive tests are carried out in this environment to avoid the appearance of anomalies during the validation phase. Once these so-called "dry run" tests are finalized and conclusive, a change request is initiated and pre-approved after verification of the application's identification and specifications, the risk analysis report, and the build specifications for the qualification environment, which must be identical to the environment verified in development.

The qualification environment is constructed in accordance with the approved specifications, and the application is qualified for its basic functionalities and critical functions (communication, interfaces, printing, database access…). The tests are recorded; any errors are analyzed and, after correction, the tests are re-run in order to check the effectiveness of the corrective action.
The change management records are updated with the references of the tests carried out, and the risk analysis report is approved along with the change.
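
Part of this qualification can be automated. The sketch below (Python standard library only; the host name, port and database path are hypothetical) illustrates basic checks of the kind listed above, such as interface connectivity and database access:

```python
import socket
import sqlite3

def check_interface(host, port, timeout=5):
    """Critical function: can the application reach its interface partner?"""
    with socket.create_connection((host, port), timeout=timeout):
        return True

def check_database(path):
    """Critical function: is the migrated database reachable and queryable?"""
    with sqlite3.connect(path) as conn:
        conn.execute("SELECT 1")
        return True

def run_check(name, check, *args):
    """Record PASS/FAIL so the evidence can be attached to the change record."""
    try:
        check(*args)
        return name, "PASS"
    except Exception as exc:
        return name, f"FAIL: {exc}"

print(run_check("ERP interface", check_interface, "erp.example.internal", 443))
print(run_check("database access", check_database, "application.db"))
```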

The production environment can then be constructed, and specific checks may be carried out if necessary. A monitoring phase of the application in production is set up and the change record is given final approval (closure).

Key points
Each critical step of the project is subject to strict review and approval management. Transitions from development instances to verification instances (QA) and migration to production instances are subject to the approval of a Change Advisory Board (CAB).

The change management process is at the heart of the project. A change is initiated for each database migration and for each environment, taking care to separate the verification (Quality) and production environments. Each change must include a fallback plan (possible rollback) and be approved by the process owner ("Business Process Owner" or BPO), the Chief Security Officer (CSO) and the Quality Assurance Representative (QSR).
Once the migration has been carried out, an integrity verification report for the migrated data is attached to the change request.
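
A minimal sketch of the kind of evidence such a verification can rest on (Python, using sqlite3 purely for illustration; a real migration would query the actual source and target database engines): row counts and a per-table checksum, compared before and after migration:

```python
import hashlib
import sqlite3

def table_fingerprint(conn, table):
    """Row count plus a SHA-256 digest over the table's rows, sorted for stability."""
    rows = conn.execute(f"SELECT * FROM {table}").fetchall()
    digest = hashlib.sha256(repr(sorted(rows, key=repr)).encode()).hexdigest()
    return len(rows), digest

def verify_migration(source_db, target_db, tables):
    """Compare fingerprints table by table; the result feeds the integrity report."""
    with sqlite3.connect(source_db) as src, sqlite3.connect(target_db) as dst:
        return {t: table_fingerprint(src, t) == table_fingerprint(dst, t) for t in tables}

# Hypothetical usage: verify_migration("before.db", "after.db", ["batches", "results"])
```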
The automation of certain phases results in a noticeable time-saving in implementation. Testing, deviation management and change management are managed through dedicated applications. The infrastructure proposed by Amazon allows segregation of responsibilities:

  • Maintenance of the underlying cloud infrastructure is provided by Amazon (AWS), which has the required technical resources, staff and quality certifications; AWS is responsible for protecting the infrastructure running all the services offered in the cloud. This infrastructure consists of the hardware, software, network and facilities running the AWS cloud services.
  • Maintenance in the cloud is Baxter’s responsibility, without any change relative to the previous infrastructure supplier; Baxter remains in charge of the maintenance and security of operating systems and applications (antivirus…). When Baxter deploys an Amazon EC2 instance, it is responsible for the management of the guest operating system (including updates and security patches), of any software, applications and utilities installed on the instance(s), and of the configuration of the firewall supplied by AWS (the security group) for each instance.

Thus, AWS personnel have no access to the application infrastructure and data deployed by Baxter, which preserves their integrity and security. The location of the data is guaranteed, and change management is shared.
For example:

  • Patch Management

AWS is responsible for the correction of faults linked to the infrastructure, but Baxter is responsible for patching its own operating system(s) and application(s).

  • Configuration Management

AWS maintains the configuration of its infrastructure, but Baxter must configure its own operating systems, databases and guest applications (see the security group sketch after these examples).

  • Knowledge and training

AWS trains its employees and Baxter is responsible for training its own employees.
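
As an illustration of this customer-side configuration responsibility, here is a minimal sketch (Python with the boto3 AWS SDK; the VPC ID and address range are hypothetical placeholders) that creates a security group allowing only HTTPS from a corporate network range:

```python
import boto3

ec2 = boto3.client("ec2", region_name="eu-west-1")

# AWS supplies the firewall mechanism (the security group); the customer
# decides, documents and maintains the actual rules.
group = ec2.create_security_group(
    GroupName="gxp-app-https-only",
    Description="Allow HTTPS from the corporate range only",
    VpcId="vpc-0123456789abcdef0",               # hypothetical virtual private cloud
)
ec2.authorize_security_group_ingress(
    GroupId=group["GroupId"],
    IpPermissions=[{
        "IpProtocol": "tcp",
        "FromPort": 443,
        "ToPort": 443,
        "IpRanges": [{"CidrIp": "10.0.0.0/8"}],  # hypothetical corporate range
    }],
)
```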

In conclusion
Although it is difficult to anticipate short-term financial benefits, savings should be expected in the medium term (3-5 years), as Baxter will no longer have to cope with the obsolescence of network components and IT equipment. In addition, the gradual optimization of the solutions available on this platform will enable noticeable savings on license costs. This project, currently being finalized, demonstrates the capacity and feasibility of migrating to a cloud architecture at the level of an international company while ensuring the integrity of the data and applications transferred.

Jean-Louis JOUVE – COETIC

Since November 2004, Jean-Louis JOUVE has been the manager and principal consultant of COETIC, an expertise and consulting company dedicated to regulated industries such as the pharmaceutical and cosmetic industries, medical device manufacturers, biotechnology companies and producers of active pharmaceutical ingredients. COETIC's missions focus on support in system selection, project management assistance, product expertise, and validation of its customers' systems and processes according to US FDA and EU GMP requirements and industry standards such as GAMP 5 or ISO/TR 80002-2:2017. Before the creation of COETIC, Jean-Louis JOUVE was the general manager of a company specializing in the computerization of the quality processes of regulated companies: more than 50 systems for about 30 national and international customers were implemented in this period. Jean-Louis JOUVE holds an engineering degree from the École Supérieure de Chimie Industrielle de Lyon (CPE LYON) and a Diploma of Advanced Studies (DEA) in Analytical Chemistry from the University of Lyon I.

jean-louis.jouve@coetic.com

Gregory FRANCKX – BAXTER

Currently IT Validation Manager EMEA, Gregory Franckx has been working for more than 10 years for Baxter World Trade, in charge of the validation of infrastructures and GxP applications for Europe. He is a specialist in computerized system integrity and data integrity, and is also a certified Lead Auditor for software, data center and cloud providers.

gregory_franckx@baxter.com

Jean-Sébastien DUFRASNE – BAXTER

Currently responsible worldwide for IT Quality and Compliance at Baxter Healthcare, including the Global Data Integrity Program across manufacturing plants.
Over twenty-three years, Jean-Sébastien has confirmed his leadership in the fields of medical devices (Baxter), biologicals (GSK Vaccines), chemical manufacturing and banking. Jean-Sébastien is a Baxter expert in computerized systems and data integrity.

jean_sebastien_dufrasne@baxter.com

Bibliography

1. https://www.skyhighnetworks.com/cloud-security-blog/microsoft-azure-closes-iaas-adoption-gap-with-amazon-aws/
2. National Institute of Standards and Technology, Special Publication 800-145, "The NIST Definition of Cloud Computing," September 2011.
3. US FDA, 21 CFR Part 11, "Electronic Records; Electronic Signatures; Final Rule," Federal Register Vol. 62, No. 54, 13429, March 1997.
4. WHO, Annex 5, Technical Report Series No. 996, "Guidance on Good Data and Record Management Practices," May 2016.
5. US FDA, Guidance for Industry (Draft), "Use of Electronic Records and Electronic Signatures in Clinical Investigations Under 21 CFR Part 11," June 2017.