Thanks once again to those of you who joined us for yesterday's webcast, the second stop on our "virtual book tour", which looked at practical risk management. A good number of questions were submitted as part of the registration process and we addressed those during the webcast (you can watch a recording of the webcast and download the slides here), but unfortunately we didn't have time to answer all of the questions asked during the session.
As usual, we have taken the time to answer the outstanding questions here on the Life Sciences blog.
Q. Can you say more about regulators who are worried about misused risk assessments?
A. During the webcast we mentioned that a number of inspectors from European and US regulatory agencies have commented that they have concerns about the quality of risk assessments and the resulting validation. These comments have been made during informal discussions and, in one case, at a conference.
Their concern is that the resulting validation is either not broad enough in terms of scope or not rigorous enough in terms of depth and that this has been uncovered during inspection of what they believe to be relatively critical systems. In a couple of cases inspectors have commented that they believe that this is a case of the companies involved using risk assessment as an excuse to reduce the level of effort and resources applied in validating such systems.
We know from their comments that in a number of cases this has led to inspection observations and enforcement actions, and it appears that a number of regulatory inspectors are, in their words, "wise to the trick". As we said in yesterday's webcast, it is important that the scope and rigour of any validation is appropriate to the system, and that the risk assessment is used to determine which areas and functions of the system require greater focus. The objective of risk-based validation is not simply to reduce the level of effort and expenditure but to ensure that effort and resources are applied where they are needed most.
Q. How much time and effort can be saved by using the right risk assessment approach?
A. Our experience is that by using a relative risk assessment process rather than a quantitative risk assessment process it is possible to reduce the time and effort spent on assessing risks by 50% to 75%. We have also studied the outputs of both types of risk assessment process on very similar systems, and it is encouraging to note that in many cases both processes have provided very similar outputs in terms of the distribution of high, medium and low risk priorities, both in the relative number of items in each risk priority grouping and in the functions allocated to each group.
This means that for lower risk enterprise systems it is possible to reduce the time spent assessing risks by half to three quarters and still produce results which are sufficiently accurate to support appropriate risk-based validation. This is why it is so important that regulated companies have a variety of risk management processes and tools available to them, so they can use the most appropriate and cost-effective approach.
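As an illustrative sketch (not taken from the webcast itself), a relative risk assessment can be as simple as a qualitative lookup matrix: each function is rated High or Low for impact and likelihood, and the matrix assigns the priority band directly, with no numeric scoring required. The category names and example functions below are hypothetical.

```python
# A relative (qualitative) risk assessment sketch: a simple lookup matrix
# replaces numeric scoring, which is where the time savings come from.
RISK_MATRIX = {
    ("High", "High"): "High",
    ("High", "Low"): "Medium",
    ("Low", "High"): "Medium",
    ("Low", "Low"): "Low",
}

def relative_risk(impact: str, likelihood: str) -> str:
    """Return a High/Medium/Low risk priority from two qualitative ratings."""
    return RISK_MATRIX[(impact, likelihood)]

# Hypothetical functions from an enterprise system, rated by the team.
functions = {
    "Batch release approval": ("High", "High"),
    "Audit trail review report": ("High", "Low"),
    "User preference settings": ("Low", "Low"),
}

priorities = {name: relative_risk(i, l) for name, (i, l) in functions.items()}
```

The output of such an exercise (a High/Medium/Low grouping of functions) is the same shape as the output of a quantitative process, which is why the two approaches can be compared as described above.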
Q. When would you use a quantitative risk assessment approach? For what type of systems?
A. You would typically use a quantitative risk assessment approach where it is necessary to distinguish low, medium and high risk impact amongst a variety of requirements or functions that are all, or mostly, of high GxP significance. In this case a quantitative (numeric) approach allows you to take a more granular view and focus your verification activities on the requirements or functions with the highest risk impact.
Typically these will be systems which are safety critical. While this approach can be very useful for manufacturing systems, for enterprise systems we see it being applied to the most critical systems such as adverse event systems (AES), LIMS used for product release, MES, etc. Even with these systems, quantitative risk assessment can be used on a selective basis for those modules which the initial risk assessment determines to be most critical.
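One common quantitative technique (offered here as an assumed illustration, not as the specific method discussed in the webcast) is to score each function numerically for severity, likelihood and detectability, and rank by the product. The module names and scores below are invented.

```python
# Quantitative sketch: a numeric risk priority number (RPN) gives a more
# granular ranking than High/Medium/Low bands, useful when most functions
# are of high GxP significance.

def rpn(severity: int, likelihood: int, detectability: int) -> int:
    # Each factor is scored 1-5; a higher detectability score means the
    # failure is HARDER to detect, so it increases the overall RPN.
    return severity * likelihood * detectability

# Hypothetical modules of a critical enterprise system, scored by the team.
modules = {
    "Adverse event case intake": (5, 3, 4),
    "Product release disposition": (5, 2, 3),
    "Report formatting": (2, 2, 1),
}

# Rank modules so verification effort is focused on the highest RPN first.
ranked = sorted(modules, key=lambda m: rpn(*modules[m]), reverse=True)
```

Ranking by RPN lets the team distinguish between two modules that would both land in a "High" band under a purely relative scheme.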
Q. Who should conduct the risk assessment of an EDMS supporting the whole enterprise?
A. Risk assessments cannot be conducted alone. This was a key point brought out in this week's GAMP UK meeting, where we ran a risk assessment exercise and it was clearly valuable to have a variety of opinions and experience feeding into the process. You need people who understand the requirements, the business processes and the resulting risks to contribute their expertise with respect to risk impact.
You also need technical subject matter experts from the engineering or IT group who are much more likely to understand the risk likelihood. Both groups can contribute to thinking about risk detectability, either in terms of detecting risks within the system or as part of the normal business process checks.
It is therefore very important to invite the right people with the right breadth and depth of knowledge to any risk assessment exercise and to allow sufficient time for the relevant risk scenarios to be identified and assessed.
Thank you as ever for your interesting questions - we hope you find the answers above useful. Remember that you can join us on 17th October when we will be looking at the very thorny issue of validating enterprise systems in the Cloud as Software-as-a-Service (registration is free and open here).
Thursday, October 4, 2012
Friday, March 30, 2012
Computer System Validation Policy on Software-as-a-Service (SaaS)
In a recent LinkedIn Group discussion (Computerized Systems Validation Group: "Validation of Cloud"), the topic of Software-as-a-Service (SaaS) was widely debated, along with the need to identify appropriate controls in Computer System Validation (CSV) policies.
The reality is that relatively few compliant, validated SaaS solutions are out there, and relatively few Life Sciences companies have CSV policies that address this.
However, there are a few CSV policies that I've worked on that do address this, and although client confidentiality means that I can't share the documents, I did volunteer to publish some content on what could be included in a CSV policy to address SaaS.
Based on the assumption that any CSV policy leveraging a risk-based approach needs to provide a flexible framework which is instantiated on a project-specific basis in the Validation (Master) Plan, I've provided some notes below (in italics) which may be useful in providing policy guidance. These would need to be incorporated in a CSV Policy using appropriate language (some Regulated Companies' CSV Policies are more prescriptive than others and the language should reflect this).
"When the use of Software-as-a-Service (SaaS) is considered, additional risks should be identified and accounted for in the risk assessment and in the development of the computer system validation approach in the Validation Plan. These are in addition to the issues that need to be considered with any third party service provider (e.g. general hosting and managed services). These include:
- How much control the Regulated Company has over the configuration of the application, to meet their specific regulatory or business needs (by definition, SaaS applications provide the Regulated Company (Consumer) with little or no control over the application configuration)
  - How does the Provider communicate application changes to the Regulated Company, where the Regulated Company has no direct control of the application?
  - What if Provider-controlled changes mean that the application no longer complies with regulatory requirements?
- The ability/willingness (or otherwise) of the Provider to support compliance audits
- As part of the validation process, whether or not the Regulated Company can effectively test or otherwise verify that their regulatory requirements have been fulfilled
  - Does the Provider provide a separate Test/QA/Validation instance?
  - Whether it is practical to test in the Production instance prior to Production use (can such test records be clearly differentiated from production records, by time or unique identification)?
  - Can the functioning of the SaaS application be verified against User Requirements as part of the vendor/package selection process? (prior to contract - applicable to higher risk applications)
  - Can the functioning of the SaaS application be verified against User Requirements once in production use? (after contract - may be acceptable for lower risk applications)
- Whether or not the Provider applies application changes directly to the Production instance, or whether they are tested in a separate Test/QA instance
- Security and data integrity risks associated with the use of a multi-tenanted SaaS application (i.e. one that is also used by other companies), including:
  - Whether or not different companies' data is contained in the same database, or the same database tables
  - The security controls that are implemented within the SaaS application and/or database, to ensure that companies cannot read, write or delete other companies' data
- Where appropriate, whether or not copies of only the Regulated Company's data can be provided to regulatory authorities, in accordance with regulatory requirements (e.g. 21 CFR Part 11)
- Where appropriate, whether or not the Regulated Company's data can be archived
- Whether it is likely that the SaaS application will be de-clouded (brought in-house or moved to another Provider):
  - Can the Regulated Company's data be extracted from the SaaS application?
  - Can the Regulated Company's data be deleted from the original SaaS application?
If these issues cannot be adequately addressed (and risks mitigated), alternative options may be considered. These may include:
- Acquiring similar software from an acceptable SaaS Provider
- Provisioning the same software as a Private Cloud, single tenancy application (if allowed by the Provider)
- Managing a similar application (under the direct control of the Regulated Company), deployed on a Platform-as-a-Service (PaaS)"
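The policy questions above lend themselves to being tracked as a simple checklist during provider assessment. The sketch below is my own illustration (not part of the policy text): each question is marked resolved or not, and any open items are flagged for mitigation or for considering the alternative options listed.

```python
# Illustrative tracking of the SaaS policy questions as a checklist.
# The specific entries and their True/False states are hypothetical.
saas_checklist = {
    "Provider communicates application changes": True,
    "Provider supports compliance audits": True,
    "Separate Test/QA/Validation instance available": False,
    "Tenant data segregated and access-controlled": True,
    "Data can be extracted if application is de-clouded": False,
}

def open_issues(checklist: dict) -> list:
    """Return the policy questions that remain unresolved."""
    return [question for question, resolved in checklist.items() if not resolved]

# Unresolved items drive either risk mitigation or the alternative options
# (different Provider, Private Cloud single tenancy, or PaaS deployment).
issues = open_issues(saas_checklist)
```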
Tuesday, September 27, 2011
Software as a Service - Questions Answered
As we expected, last week's webcast on Software as a Service (Compliant Cloud Computing - Applications and SaaS) garnered a good deal of interest with some great questions and some interesting votes.
Cloud computing is, not surprisingly, the big topic of interest in the IT industry and much of business in general. Cloud will change the IT and business models in many companies and Life Sciences is no different in that respect.
We have covered this extensively during the last few months, leaning heavily on the draft NIST Definition of Cloud Computing, which is becoming the de facto standard for talking about the Cloud - regardless of Cloud Service Providers constantly inventing their own terminology and services!
Unfortunately we ran out of time before we could answer all of your questions. We did manage to get around to answering the following questions (see webcast for answers)
- Would you agree that we may have to really escrow applications with third parties in order to be able to retrieve data throughout data retention periods?
- How is security managed with a SaaS provider? Do they have to have Admin access, which allows them access to our data?
- How do you recommend the Change Management (control) of the SaaS software be managed?
- How can we use Cloud but still have real control over our applications?
- What should we do if procurement and IT have already outsourced to a Saas provider, but we haven't done an audit?
As promised, we have answered the remaining questions we didn't get time to address below.
If you missed any of the previous webcasts they were
- Qualifying the Cloud: Fact or Fiction?
- Leveraging Infrastructure as a Service
- Leveraging Platform as a Service
There are of course specific issues that we need to address in Life Sciences and our work as part of the Stevens Institute of Technology Cloud Computing Consortium is helping to define good governance models for Cloud Computing. These can be leveraged by Regulated Companies in the Life Sciences industry, but it is still important to address the questions and issues covered in our Cloud webcasts.
As we described in our last session, Software as a Service isn't for everyone and although it is the model that many would like to adopt, there are very few SaaS solutions that allow Regulated Companies to maintain compliance of their GxP applications 'out-of-the-box'. This is starting to change, but for now we're putting our money (literally - investment in our qualified data center) into Platform as a Service, which we believe offers the best solution for companies looking to leverage the advantages of Cloud Computing with the necessary control over their GxP applications.
But on to those SaaS questions we didn't get around to last week:
Q. Are you aware of any compliant ERP solutions available as SaaS?
A. We're not. We work with a number of major ERP vendors who are developing Cloud solutions, but their applications aren't yet truly multi-tenanted (see the SaaS webcast for the issues). Other Providers do offer true multi-tenanted ERP solutions, but they are not aimed specifically at Life Sciences. We're currently working with Regulated Company clients and their SaaS Cloud Service Providers to address a number of issues around infrastructure qualification, training of staff, testing of software releases, etc. Things are getting better for a number of Providers, but we're not aware of anyone who yet meets the regulatory needs of Life Sciences as a standard part of the service.
The issue is that this would add costs, and this isn't the model that most SaaS vendors are looking for. It's an increasingly competitive market and it's cost sensitive. This is why we believe that niche Life Sciences vendors (e.g. LIMS, EDMS vendors) will get there first, when they combine their existing knowledge of Life Sciences with true multi-tenanted versions of their applications (and, of course, deliver the Essential Characteristics of Cloud Computing - see webcasts).
Q. You clearly don't think that SaaS is yet suitable for high risk applications. What about low risk applications?
A. Risk severity of the application is one dimension of the risk calculation. The other is risk likelihood, where you are heavily dependent on your Cloud Services Provider. If you select a good Provider with good general controls (a well designed SaaS application, good physical and logical security, mature support and maintenance processes) then it should be possible to balance the risks and consider SaaS, certainly for lower risk applications.
It still doesn't mean that as a Regulated Company you won't have additional costs on top of the cost of the service. You need to align processes and provide ongoing oversight, and you should expect that this will add to the cost and slow down provisioning. However, it should be possible to move lower risk applications into the Cloud as SaaS, assuming that you go in with your eyes open and with realistic expectations of what is required and what is available.
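The two-dimensional reasoning above (application severity on one axis, provider-dependent likelihood on the other) can be sketched as a simple decision rule. The thresholds and labels below are my own assumptions for illustration, not regulatory rules.

```python
# Sketch of the severity/likelihood balance for SaaS adoption decisions.
# Labels and thresholds are illustrative assumptions only.

def saas_suitability(severity: str, provider_controls: str) -> str:
    """severity: 'High'/'Medium'/'Low'; provider_controls: 'strong'/'weak'."""
    # A good Provider with good general controls lowers risk likelihood.
    likelihood = {"strong": "Low", "weak": "High"}[provider_controls]
    if severity == "High":
        return "Avoid SaaS for now"
    if severity == "Low" and likelihood == "Low":
        return "SaaS feasible with oversight"
    return "Mitigate risks before adopting SaaS"
```

Note that even the "feasible" outcome still carries the oversight cost described above: aligned processes and ongoing supplier monitoring.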
Q. What strategy should we adopt to the Cloud, as a small-medium Life Sciences company?
A. This is something we're helping companies with and although every organization is different, our approach is generally:
- Brief everyone on the advantages of Cloud, what the regulatory expectations are and what to expect. 'Everyone' means IT, Procurement, Finance, the business (Process Owners) and of course Quality.
- Use your system inventory to identify potential applications for Clouding (you do have one, don't you?). Look at which services and applications are suitable for Clouding (using the IaaS, PaaS and SaaS, Private/Public/Community models) and decide how far you want to go. For some organizations IaaS/PaaS is enough to start with, but for other organizations there will be a desire to move to SaaS. Don't forget to think about new services and applications that may be coming along in foreseeable timescales.
- If you are looking at SaaS, start with lower risk applications, get your toe in the water and gradually move higher risk applications into the Cloud as your experience (and confidence) grows. This could take years, and remember that experience with one SaaS Provider does not automatically transfer to another Provider.
- Look to leverage one or two Providers for IaaS and PaaS - the economies of scale are useful, but it's good to share the work/risk.
- Carefully assess all Providers (our webcasts will show you what to look for) and don't be tempted to cut audits short. It is time well worth investing and provides significant ROI.
- Only sign contracts when important compliance issues have been addressed, or are included as part of the contractual requirements. That way there won't be any cost surprises later on.
- Remember to consider un-Clouding. We've talked about this in our webcasts, but one day you may want to switch Providers or move some services or applications out of the Cloud.
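The inventory step in the approach above can be sketched as a simple filter: tag each system with its GxP risk and a candidate cloud model, then shortlist the lower risk SaaS candidates to start with. The field names and inventory entries below are hypothetical.

```python
# Sketch of using a system inventory to shortlist initial SaaS candidates.
# Entries and field names are invented for illustration.
inventory = [
    {"name": "Document management", "gxp_risk": "Low", "model": "SaaS"},
    {"name": "Training records", "gxp_risk": "Low", "model": "SaaS"},
    {"name": "Batch release LIMS", "gxp_risk": "High", "model": "PaaS"},
]

def saas_candidates(systems: list) -> list:
    """Start with lower risk applications, as the strategy above suggests."""
    return [s["name"] for s in systems
            if s["model"] == "SaaS" and s["gxp_risk"] == "Low"]

shortlist = saas_candidates(inventory)
```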
E-mail us at life.sciences@businessdecision.com
Tuesday, September 20, 2011
GAMP® Conference: Cost-Effective Compliance – Practical Solutions for Computerised Systems
A very interesting and useful conference held here in Brussels over the past two days, with a focus on achieving IS compliance in a cost-effective and pragmatic way. It's good to see ISPE / GAMP® moving past the basics and getting into some more advanced explorations of how to apply risk-based approaches to projects and also to the operational phase of the system life cycle.
There was understandably a lot of discussion and highlighting of the new Annex 11 (Computerised Systems), with many of the presenters tying their topics back to the new guidance document, which has now been in effect for just two and a half months.
One of the most interesting sessions was when Audny Stenbråten, a Pharmaceutical Inspector of the Norwegian Regulator (Statens Legemiddelverk) provided a perspective of Annex 11 from the point of view of the regulator. It was good to see an open approach to the use of pragmatic risk-based solutions, but as was highlighted throughout the conference, risk-based approaches require a well-documented rationale.
Chris Reid of Integrity Solutions presented a very good session on Managing Suppliers and Service Providers and Tim Goossens of MSD outlined how his company is currently approaching Annex 11.
Siôn Wyn, of Conformity, provided an update on 21 CFR Part 11, which was really ‘no change’. The FDA are continuing with their add-on Part 11 inspections for the foreseeable future, with no planned end date and no defined plans on how to address updates or any changes to Part 11.
On the second day, after yours truly presented some case studies on practical risk management in the Business & Decision Life Sciences CRO and our qualified data center, Jürgen Schmitz of Novartis Vaccines and Diagnostics presented an interesting session on how IT is embedded into their major projects.
Mick Symonds of Atos Origin presented on Business Continuity in what I thought was an informative and highly entertaining presentation, but which was non-industry specific and was just a little too commercial for my liking.
Yves Samson (Kereon AG) and Chris Reid led some useful workshops looking at the broader impacts of IT Change Control and the scope, and scalability of Periodic Evaluations. These were good, interactive sessions and I’m sure that everyone benefitted from the interaction and discussion.
In the final afternoon René Van Opstal, (Van Opstal Consulting) gave an interesting presentation on aligning project management and validation and Rob Stephenson (Rob Stephenson Consultancy) presented a case study on Decommissioning which, although it had previously been presented at a GAMP UK meeting, was well worth airing to a wider audience.
All in all it was a good couple of days with some useful sessions, living up to its billing as suitable for intermediate to advanced attendees. On the basis of this session I’d certainly recommend similar sessions to those responsible for IS Compliance in either a QA or IT role and I’m looking forward to the next GAMP UK meeting, and to presenting at the ISPE UK AGM meeting and also the ISPE Global AGM meeting later in the year.
Thursday, February 11, 2010
Risk Likelihood of New Software
Here's a question submitted to validation@businessdecision.com - which we thought deserved a wider airing.
Q. To perform a Risk Assessment you need experience about the software performance. In the case of new software without previous history, how can you handle it?
A. We are really talking about the risk likelihood dimension of risk assessment here.
GAMP suggests that when determining the risk likelihood you look at the ‘novelty’ of the supplier and the software (we sometimes use the opposite term – maturity – but we’re talking about the same thing).
If you have no personal experience with the software you can conduct market research – are there any reviews on the internet, any discussions on discussion boards or is there a software user group the Regulated Company could join? All of this will help to determine whether or not the software is ‘novel’ in the Life Sciences industry, whether it has been used by other Regulated Companies and whether there are any specific, known problems that will be the source of an unacceptable risk (or a risk that cannot be mitigated).
If it is a new product from a mature supplier then you can only assess risk based on the defect/support history of the supplier's previous products and an assessment of their quality management system. If it is a completely new supplier to the market then you should conduct an appropriate supplier assessment and would generally assume a high risk likelihood, at least until a history is established through surveillance audits and use of the software.
All of these pieces of information should feed into your initial high level risk assessment and be considered as part of your validation planning. When working with ‘novel’ suppliers or software it is usual for the Regulated Company to provide more oversight and independent verification.
At the level of a detailed functional risk assessment the most usual approach is to be guided by software categories. Custom software (GAMP Category 5) is generally seen as having a higher risk likelihood than configurable software (GAMP Category 4), but this is not always the case (some configuration can be very complex). Our recent webcast on "Scaling Risk Assessment in Support of Risk Based Validation" has some more ideas on risk likelihood determination which you might find useful.
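The reasoning above can be summarized in a small lookup: supplier novelty and GAMP software category both raise or lower the assumed risk likelihood. The mapping below is an illustrative assumption consistent with the discussion, not a table defined by GAMP itself.

```python
# Illustrative mapping of supplier novelty and GAMP software category to an
# assumed risk likelihood. Not a GAMP-defined table; an assumption for sketching.

def risk_likelihood(gamp_category: int, supplier_is_novel: bool) -> str:
    if supplier_is_novel:
        return "High"    # no history: assume high until established via audits/use
    if gamp_category == 5:
        return "High"    # custom software: generally higher likelihood
    if gamp_category == 4:
        return "Medium"  # configurable software (though complex configs can be higher)
    return "Low"         # standard / infrastructure software

# e.g. a configurable package from an established supplier:
assumed = risk_likelihood(4, supplier_is_novel=False)
```

As noted above, this is only a starting point: complex configuration can push a Category 4 system toward a higher likelihood, and the assessment should be revisited as supplier history accumulates.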