Wednesday, August 29, 2012

Parameterised IQ Protocols

Another question from the 21CFRPart11 forum - not strictly relating to ERES, but interesting all the same:

Q. I am wondering about a project, and whether the FDA would see it as a valid way to execute qualification protocols.

Here is the idea: we have validated our document management system, including its electronic signature functionality, and documents can be developed as PDF forms in which certain fields are writable. We could therefore develop our qualification protocols as PDF forms, with the mandatory protocol fields left writable, to be filled in with the qualification information.

Is this an approach which the FDA would see as a correct way to develop and execute protocols?


A. There should be no problem at all with the approach, as long as the final protocols (i.e. with the parameters entered) are still subject to peer review prior to (and usually after) execution.

We do this with such documents and we have also used a similar approach using HP QualityCenter (developing the basic protocol as a test case in QualityCenter and entering the parameters for each instance of the protocol that is being run).

The peer review process is of course also much simpler, because the reviewer can focus on the correct parameters having been specified rather than the (unchanging) body of the document.
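
For illustration, here is a minimal sketch of how the parameters might be entered into the protocol's PDF form programmatically, using the open-source pypdf library; the template filename and field names are assumptions, and in practice the fields may equally be completed interactively in the document management system:

```python
from pypdf import PdfReader, PdfWriter

# Hypothetical protocol template; the real field names can be listed
# with PdfReader("...").get_fields()
reader = PdfReader("iq_protocol_template.pdf")
writer = PdfWriter()
writer.append(reader)  # copy the pages (and the form) into the writer

# Hypothetical parameters for this instance of the protocol
parameters = {
    "equipment_id": "EQ-0042",
    "serial_number": "SN-2012-1138",
    "firmware_version": "3.1.4",
}
writer.update_page_form_field_values(writer.pages[0], parameters)

with open("iq_protocol_instance.pdf", "wb") as out:
    writer.write(out)
```

The completed instance would then go through peer review and signature in the validated document management system, as described above.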

Scalable Database Controls for Part 11 Compliance

A good question posted on-line in the 21CFRPart11 Yahoo Group, going to the heart of security controls around Electronic Records:

Q. Consider an electronic record system utilizing a typical database, where a User ID and Password combination is used as an electronic signature to sign the records. Assume the front-end application has sufficient controls to prevent manipulation of the signature by the users.

Without getting too technical, what types of controls and solutions have you folks seen for ensuring compliance with the requirements of 11.70 with regard to access to the database on the back end, typically by a DBA?

How do you ensure that the linkage from record to signature is secure enough in the database to prevent manipulation by the DBA via ordinary means (i.e. with standard functionality and tools)?

Is there a risk justification for allowing the DBA some ability to manipulate the signature linkage, or should it be prevented by all but extraordinary means (i.e. using non-sanctioned tools, hacking, etc.)?


A. Without getting too technical, there are a variety of controls that can be used. Which of them is implemented depends on risk and should be decided based upon the outcome of a documented risk assessment.

The first step is to use digital rather than electronic signatures. At its simplest, this means calculating some sort of 'hash' (a fancy checksum) over the contents of both the record and signature components. If either the record or signature components are changed, any attempt to recalculate the hash will come up with a different answer and you will then know that something has changed.
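
To make this concrete, here is a minimal sketch of such a record/signature linkage check in Python; the field names are invented, and the use of a keyed HMAC (rather than a full public-key digital signature) is a simplifying assumption:

```python
import hashlib
import hmac
import json

# Assumption: the key is held by the application layer, not by the DBA.
SIGNING_KEY = b"application-held-secret"

def link_hash(record: dict, signature: dict) -> str:
    """Compute a keyed hash over both the record and its signature components."""
    payload = json.dumps({"record": record, "signature": signature},
                         sort_keys=True).encode("utf-8")
    return hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()

record = {"batch": "B-1234", "result": "PASS"}
signature = {"user": "jsmith", "meaning": "approved",
             "timestamp": "2012-08-29T10:15:00Z"}
stored_hash = link_hash(record, signature)

# If anyone later changes the record (or the signature) in the database,
# recalculating the hash gives a different answer and the change is detected.
record["result"] = "FAIL"
assert link_hash(record, signature) != stored_hash
```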

It won't, however, tell you what was changed - for that you need to rely on audit trails, which can be set up either at the application level (looking at changes to the defined record) or at the database level, to identify any inadvertent or accidental changes made directly to the data.

There is still the chance that the DBA can turn off the audit trails, so with some databases (e.g. Oracle) you have a couple of other options. The first is to write the audit trails into a separate database schema to which the DBA does not have access. This can use something like Audit Vault technology, which means the audit trails are written to a separate database server. The second is to make the database tables 'virtual private' database tables, which means the DBA can't even see them, let alone access them.

There may be circumstances when the DBA needs to access the database to make changes at the data level e.g. data corruption, or an accidental deletion by a user. It is permissible to make corrections under these circumstances, but you need a strict change control procedure to record what is being fixed, why and when (i.e. to preserve the audit trail). In such circumstances you would typically issue the DBA with a 'Super DBA' user account and password to make the changes, and then change the password afterwards (obviously someone needs to be trusted at some point).

One important principle is the idea of 'motivation' - why would a DBA want to make such changes? Organisational controls should be in place (including training and sanctions) to prevent e.g. a DBA and a system user from working in collusion to falsify records. Clearly system users should not also be DBAs.

It is therefore possible to make your Part 11 records and signatures very, very secure, but the more secure they are, the more complex and expensive the solution. That's where the risk assessment is important - to identify what is appropriate for any given set of records.

Friday, August 24, 2012

Measuring the Value of Validation?

An interesting question today on measuring the value of validation and assigning resources accordingly - something we see very few Life Sciences organizations doing well.

Q. How do others measure the value that validation creates within an organisation?
Does anyone have any experience of assigning value to validation activities?

I'm interested in how others allocate resources, and how, within the validation planning process, limits are (or can be) defined in terms of the cost of man-hours against the economic benefits created by validation.
 
A. We have a process as part of our 'Lean IS Compliance' programme to put KPIs in place and measure cost and compliance levels (http://www.businessdecision-lifesciences.com/1170-lean-is-compliance.htm if anyone is interested). The challenge with measuring the value of validation is that it's difficult to compare projects with and without validation.
 
Most projects are different and many companies only see the cost of validation and not the value.
However, it can be done when you are, for example, rolling out an ERP system within a wider organisation where some business units need the system validated (APIs) and other business units do not (bulk chemicals).
 
Even in these cases most organisations only measure the cost and see validation as a 'negative', but if you are clever (which is what Lean IS Compliance is all about) you can also measure the value in terms of hard metrics (time and cost to fix software defects that make it into production) as well as soft metrics (user satisfaction when the system does - or does not - work right first time).
 
However, these are general benefits which accrue to any software quality process.
 
Although the principle of risk-based validation is to assign greater resources to systems and functions of greater risk, most companies again see this as an opportunity to reduce the level of resources assigned to lower risk systems/functions, and the focus is again not on benefits.
 
It is possible to look at the implementation of systems of comparable size/complexity, where one system is high risk (and has more validation resources/focus) and another system has low/no risk (and few/no validation resources). Our work clearly shows that the higher risk systems do indeed go into production with fewer software issues and that this does have operational benefit (hard and soft benefits).
 
However, few companies track and recognise this, and so cannot correlate the investment in validation (quality) with the operational benefits. This is often an accounting issue, because the costs are usually capital expenditure and the benefits are associated with (lower) operational expenditure.
 
To really pull together a programme like Lean IS Compliance needs:
  • Quality/Regulatory to truly accept a risk based approach (and that enough is enough)
  • IT to understand the value of software quality activities (to which formal validation adds a veneer of documentation)
  • Accountants to be willing to look at ROI over longer timescales than is often the case.

Thursday, August 23, 2012

Risk-Based Approach to Clinical Electronic Records/Signatures

Q. I'm looking at complying with the regulations in order to validate some lab software collecting data from an ICP. Anyone have any ideas on attacking this? The software claims to be "part 11 capable" but it's pretty soft. E-sigs are 'loose' and audit trails are also 'loose'. For something like this, would you attach a risk assessment to each part of the regs to determine what level of testing to perform?

A. The reality is that with the 2003 Scope and Application guidance introducing a risk-based approach to e-records, the ability to use digital, electronic or no signatures (the latter being conditional on predicate rule requirements), and with the risk-based approach in Annex 11, taking anything other than a risk-based approach makes no sense.

You should therefore conduct a risk assessment against each of the applicable parts of Part 11, and also for each applicable record and signature. The latter because the risk impact associated with different records/signatures may well be different (in terms of risk to the subject and subsequent risk to the wider patient population), and also because different technical controls may be applied to different areas of the system.

This will allow you to assess the risk impact, likelihood and detectability for each record/signature and to determine whether the in-built controls are appropriate to the risk. If they are not, you can either find alternative solutions (e.g. print out the records and sign them, or validate a process to copy data/records to an external, secure system and sign them there) or introduce additional procedural controls. If there are no acceptable alternative controls, then you may well need to be looking at an alternative piece of software.
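
As an illustration of this kind of assessment, here is a minimal sketch that scores each record/signature on impact, likelihood and detectability; the 1-3 scales, the example records and the action threshold are all hypothetical:

```python
# Hypothetical 1-3 scales: higher impact/likelihood means higher risk;
# higher detectability means lower risk, hence the (4 - detectability) term.
def risk_priority(impact: int, likelihood: int, detectability: int) -> int:
    """Simple risk priority number for one record or signature."""
    return impact * likelihood * (4 - detectability)

# Hypothetical records/signatures from the lab system, each scored separately.
assessments = {
    "sample_result_record":  {"impact": 3, "likelihood": 2, "detectability": 1},
    "instrument_config_log": {"impact": 2, "likelihood": 1, "detectability": 3},
    "review_e_signature":    {"impact": 3, "likelihood": 1, "detectability": 2},
}

for name, scores in assessments.items():
    rpn = risk_priority(**scores)
    # Hypothetical threshold for deciding where in-built controls need supplementing.
    action = "supplement controls" if rpn >= 9 else "in-built controls acceptable"
    print(f"{name}: RPN={rpn} -> {action}")
```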

Wednesday, August 22, 2012

Verifying Computer Clocks in Production

Another good question asked online, that we share here...

Q. In a mechanically driven process, when Time is a critical parameter, the timer would be qualified for accuracy and reliability. Probably challenged against another calibrated clock for various durations that would align with the process requirements/settings. No doubt Preventive Maintenance and Calibration schedules would be developed and approved during OQ.

For automated processes that have Time as a critical parameter, where the Time is measured by the PLC's internal clock, what strategies are best for providing documented evidence that the internal clock is accurate and reliable (especially over time)?


A. It's a good question - I can remember a certain DCS (back in the 1980s) that had two versions of its real-time clock chip: one came in a ceramic package and the other in plastic. The plastic clock chips could lose more than 15 minutes in an hour (honestly - I had to write the code to work around the issue).

On the basis of risk, and following ASTM E2500 / GAMP principles of leveraging the work of your supplier, if you believe that things have improved in the last 20 years you can assume that the system is acceptably accurate and include a check of elapsed time as part of your OQ or PQ.

Confirming this across a number of intervals and process runs over time should allow you to confirm that the clock is sufficiently accurate across a range of intervals with negligible drift over time (you will need to define what 'acceptable' is in your protocol and use a time reference to measure real time).
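
As a simple illustration, the elapsed-time check might be scripted along the following lines; how the PLC clock is actually read is an assumption here and would depend on the controller's interface, and the interval and tolerance would come from your protocol:

```python
import time

def read_plc_clock() -> float:
    """Placeholder for reading the PLC's internal clock, in seconds.
    A real check would query the controller over its comms interface."""
    return time.time()  # stand-in so the sketch runs

TOLERANCE_S = 1.0   # hypothetical acceptance criterion from the protocol
INTERVAL_S = 600    # one 10-minute interval; repeat for other durations/runs

plc_start = read_plc_clock()
ref_start = time.monotonic()  # a calibrated time reference stands in here

time.sleep(INTERVAL_S)

plc_elapsed = read_plc_clock() - plc_start
ref_elapsed = time.monotonic() - ref_start
drift = plc_elapsed - ref_elapsed

print(f"PLC elapsed: {plc_elapsed:.3f}s, reference: {ref_elapsed:.3f}s, "
      f"drift: {drift:+.3f}s")
assert abs(drift) <= TOLERANCE_S, "Drift outside protocol acceptance criterion"
```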

Taking a risk-based approach, I would only expect to specifically test the clock if I had reason to suspect that there might be an issue.

When are e-Signatures Required?


We answer a lot of questions submitted to us by e-mail or in various on-line groups. We thought that some of these deserve a wider audience, so from now on we're going to publish the best questions (and answers) here on the Life Sciences Blog.

Q. I had a question regarding 21 CFR Part 11 electronic records. If a system has electronic records but no electronic signatures, yet does contain an audit trail, does that mean each record created within the system has to be printed in hard copy and signed, so as to associate that record with a signature?

A. The fact that the system has an audit trail which links the user name and the action on the record is sufficient to meet the 21 CFR Part 11 electronic records criteria. If the record requires no signature (either explicit or implied from a predicate rule perspective) then there is no need to print and sign anything (either copies of the record or the audit trail).

The (secure) recording of what, who (optionally) and why (inferred from the transaction) will be sufficient to demonstrate compliance with the predicate rule.

A good example would be a training record. The predicate rule does not require training records to be signed, but you would still want to record what (training was undertaken), who (who was trained) and why (the training topic or syllabus). Even though an audit trail would be maintained for the training record, there would be no need to print and sign it because the predicate rule requires no signature.
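
For illustration, here is a minimal sketch of what such an audit trail entry might capture for the training record example; the structure and field names are assumptions, not a prescribed format:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass(frozen=True)
class AuditTrailEntry:
    """One audit trail entry: what happened, who did it, and why."""
    what: str  # the action or record change
    who: str   # the authenticated user
    why: str   # inferred from the transaction type
    when: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())

# Training record example: the predicate rule requires no signature,
# but what/who/why is still recorded securely.
audit_trail = []  # in a real system: secure, append-only storage
audit_trail.append(AuditTrailEntry(
    what="Completed training course GMP-101",
    who="jsmith",
    why="Annual GMP refresher (syllabus v4)",
))
print(audit_trail[0])
```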

Tuesday, August 21, 2012

Cloud Computing Comes of Age in Life Sciences

For a while now we have been saying the Cloud was coming of age in the Life Sciences industry.

Business & Decision, along with a small number of other providers, has been providing Infrastructure-as-a-Service and Platform-as-a-Service for some time.

We also said that as far as Software-as-a-Service was concerned, we would see Life Sciences specialist vendors (e.g. LIMS, Quality Management, Learning Management etc) providing compliant Software-as-a-Service solutions - simply because they understand our industry both at the functional level and also at the regulatory level.

We are working with a number of such vendors to deploy their software on our Platform-as-a-Service solutions, leveraging virtualization to provision solutions that are inherently flexible, scalable and - perhaps just as importantly - compliant.

At the same time, we have just started to engineer our first compliant 'Cloud Anywhere' solutions - which allow us to deploy pre-engineered and pre-qualified Platforms (hardware, power, HVAC, storage, virtualization, operating systems, database servers and application servers) anywhere in the world. This was an idea first developed with Oracle for their Exadata and Exalogic machines (for which Business & Decision developed standard Qualification Packs).

Based upon a wider and more affordable technology base, 'Cloud Anywhere' allows Business & Decision to leverage our investment in our Quality Management System to provision compliant Private or Community Cloud solutions with the minimum of additional qualification activities. These can be installed on client sites, in third-party data centres or in the data centres of our software partners.

As well as deploying the solution, these 'Cloud Anywhere' solutions also come complete with Managed Services from Business & Decision - meaning that clients, partners etc no longer need to worry about the management of the Platform. All of this is taken care of remotely by our own staff (with the exception of local power and network connections of course) and the solutions can also be engineered to automatically failover to a remote Disaster Recovery site.

In the last couple of years we have seen people asking "How long will it be before everything is in the Cloud?", but the reality is that this will never be the case in Life Sciences. There will always be Life Sciences companies who need or want some infrastructure on their own sites (because of network latency issues or data integrity issues), and the reality is that we are moving towards a mixed-model Cloud environment.
We will see a mixture of non-clouded Infrastructure, Platforms and Software, and various Cloud models, including On-Premise & Off-Premise and Public & Private Clouds.

The coming of age of safe, secure multi-tenanted Software-as-a-Service and the availability of solutions such as 'Cloud Anywhere' means that Life Sciences companies now have the ability to mix'n'match their Cloud environments to meet their specific business needs - and address their regulatory compliance requirements.

It may not seem like it now, but in the next few years we will see these solutions move from leading-edge to mainstream and we will wonder what all the fuss about Cloud was for.