Thursday, November 22, 2012

The revised FDA draft guidance “Electronic Source Data in Clinical Investigations” provides guidance to clinical trial sponsors, CROs, data managers, clinical investigators and others involved in the capture, review and archiving of electronic source data in FDA-regulated clinical investigations.
The original January 2011 draft guidance has been updated to clarify a number of points raised with the FDA by industry commentators, and the new draft has been published to collect additional public comments.
It's good to see industry and regulators working to develop guidance on the use of electronic Case Report Forms (eCRFs), recognising that capturing clinical trial data electronically at source significantly reduces the number of transcription errors requiring resolution, does away with unnecessary duplication of data and provides more timely access for data reviewers.
While much of the guidance contained in the draft would be seen as common sense in much of the industry, it does start to provide a consensus on important issues such as associating authorised data originators with data elements, the scope of 21 CFR Part 11 with respect to the use of such records, and interfaces between medical devices or Electronic Health Records and the eCRF.
No doubt a number of the recommendations contained in the draft guidance document will be of concern to software vendors whose systems do not currently meet the technical recommendations provided. We will therefore surely see a variety of comments from “non-compliant” vendors trying to water down the recommendations until such time as their systems can meet what is already accepted good practice.
One key issue that appears to be missing is the use of default values on eCRFs, which we know has been a concern in a number of systems and clinical trials, i.e. where the investigator has skipped over a field, leaving the data element at its default value. This is something we have provided feedback on, and we would encourage everybody in the industry to review the new draft guidance and provide comments.
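To make the concern concrete, here is a minimal sketch (ours, not anything in the draft guidance) of how an eCRF system might flag data elements that were left at their pre-populated default and never touched by the investigator; the field names and the `touched` flag are hypothetical illustrations:

```python
from dataclasses import dataclass

@dataclass
class Field:
    name: str
    default: object   # value pre-populated on the eCRF
    value: object     # value present at form submission
    touched: bool     # hypothetical flag: did the investigator interact with the field?

def flag_possible_skips(fields: list[Field]) -> list[str]:
    """Return field names whose value still equals the form default and which
    were never touched - candidates for a data query, since a deliberate
    confirmation cannot be distinguished from a skipped field."""
    return [f.name for f in fields if f.value == f.default and not f.touched]

# Illustrative example: 'route' was skipped and still holds its default.
fields = [
    Field("dose_mg", default=None, value=25, touched=True),
    Field("route", default="oral", value="oral", touched=False),
]
print(flag_possible_skips(fields))   # ['route']
```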
You can view a copy of the new draft guidance at http://www.fda.gov/downloads/Drugs/GuidanceComplianceRegulatoryInformation/Guidances/UCM328691.pdf and comment through the usual FDA process at https://www.federalregister.gov/articles/2012/11/20/2012-28198/draft-guidance-for-industry-on-electronic-source-data-in-clinical-investigations-availability
Wednesday, September 26, 2012
Global Outsourcing Conference Report
Over the last three days we've been taking part in, and presenting at, the third Global Outsourcing Conference, jointly organized by Xavier University and the US FDA.
Although not the best attended of conferences this year, it proved to be one of the best in terms of the content presented and the quality of the invited speakers, including a couple of keynote addresses from senior members of the US FDA. This resulted in some very interesting and beneficial discussions amongst the attendees, all of whom took home thought-provoking material and ideas for implementing positive change: better securing the supply chain, assuring product and patient safety, and optimizing the performance of their extended enterprises.
The conference looked at a wide range of outsourcing and supply chain issues, combining practical best practices from the pharmaceutical industry with research and experience from a number of leading universities working in the field (presentations are currently available on the Xavier GOC website).
Of significant interest were the FDA presentations looking at the implications of the recent FDA Safety and Innovation Act (FDASIA, recently signed into law) and the changes this will bring to GMP and GDP regulations.
There was significant interest in serialization and ePedigree, which were covered in a number of sessions. The signs are that companies now realize that rolling these solutions out will be necessary, and more difficult at full scale than the simpler pilot studies originally suggested.
Supplier selection, assessment and management were also key topics with the focus on developing partnerships and relationships as the best way of meeting forthcoming regulatory expectations for the management of suppliers.
Business & Decision presented a deep dive session on the future challenges faced by ERP System and Process Owners, looking at the need to integrate with serialization systems, master data management systems, and supply chain partners systems. Acknowledging that many ERP systems were never designed to handle such a level of integration, the session looked at how middleware solutions such as Business Process Management solutions and SOA can be used to better integrate the supply chain.
Outsourcing clearly isn't going away and although some companies are looking to in-source some strategic products and services once again, the issues associated with outsourcing cannot be ignored. Although examples from India and China were much in evidence, it was also acknowledged that outsourcing risks do not exist solely in so-called 'emerging economies'.
These issues exist not only with products (APIs, excipients and other starting materials), but also with services such as IT services, and it is clear that the US FDA expects companies to better manage their suppliers and supply chain.
For pharmaceutical companies looking to get involved in the debate there is the opportunity to follow the discussion on-line in the LinkedIn "Xavier Pharmaceutical Community".
In summary, the conference provided pharmaceutical companies with a comprehensive list of the topics they will need to address in the next 1-3 years, which now need to be developed into a road map leading to ongoing compliance, improved product and patient safety, and more efficient and cost-effective supply chain operations.
Friday, September 16, 2011
The use of Unique Device Identifiers in Healthcare
Monday 12th and Tuesday 13th September saw a very interesting public meeting organized by the US FDA, entitled "Unique Device Identification (UDI) for Postmarket Surveillance and Compliance".
Rather than looking at details of the rule currently being developed for the unique identification of medical devices (details of which can be found at http://www.fda.gov/udi), the meeting looked at how UDIs would be used in the real world.
Whereas the pharmaceutical sector is looking to reduce or prevent counterfeiting through the use of serialization (see our recent webcasts on serialization - "Strategic Management of Product Serial Identifiers" and "Serialized Labelling: Impacts on the Business Model"), in the medical devices sector there is a global drive to be able to uniquely identify medical devices at all points in the supply chain, at the point of initial use and throughout the life of the device. Whereas pharmaceutical products are clearly identified (e.g. via the National Drug Code [NDC] in the US), this is not the case for medical devices.
At the moment medical devices are identified inconsistently by manufacturer, model, product name, hospital-allocated item number, SKU number, etc. As the public meeting heard, the ability to uniquely identify a medical device has significant benefits in terms of:
- More accurate device registries (e.g. of implantable devices)
- Faster and more focused product recalls
- Fewer patient/device errors (ensuring the right patient receives the right device)
- Better post-marketing surveillance and adverse event reporting
Tracking devices via the Electronic Health Record (EHR) or Personal Health Record (PHR) is one of the most significant steps in enabling all of this: the EHR records the Unique Device Identifier, which can then be linked to the model number and manufacturer, to the batch/lot or serial number where required, and to a host of other associated device data available from a manufacturer's database.
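As a simple illustration of what a UDI carries (a sketch of ours; the example values are made up, and real barcode data involves FNC1 separators and variable-length rules that this ignores), the human-readable GS1 form of a UDI can be split into its Device Identifier (the GTIN, AI 01) and Production Identifiers (expiry, lot, serial):

```python
import re

# GS1 Application Identifiers commonly used in a UDI carrier:
#   01 = GTIN (the Device Identifier); 17 = expiry (YYMMDD),
#   10 = lot/batch, 21 = serial number (the Production Identifiers)
AI_NAMES = {"01": "gtin", "17": "expiry", "10": "lot", "21": "serial"}

def parse_udi(label: str) -> dict:
    """Parse the human-readable '(AI)value' form of a GS1 UDI string."""
    return {AI_NAMES.get(ai, ai): value
            for ai, value in re.findall(r"\((\d{2})\)([^(]+)", label)}

udi = "(01)00844588003288(17)141120(10)A213B1(21)1234"   # illustrative values
print(parse_udi(udi))
# {'gtin': '00844588003288', 'expiry': '141120', 'lot': 'A213B1', 'serial': '1234'}
```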
This is part of a global initiative to uniquely identify medical devices via a Global Medical Device Nomenclature - which matters when you consider how important it is for a German cardiac specialist to know exactly what sort of heart pacemaker is implanted in the Australian tourist who has just been rushed into the emergency room!
Although we're most likely a year away from finalizing the FDA rule on UDI, and two years away from the initial requirements for Class III devices, the use of UDI heralds the possibility of a new era of reduced hospital errors, better device safety, faster recalls, improved safety signal detection and the ability to use real evidence - and not marketing hype - to know what the best device is for any given patient.
Details of the public meeting program and presentations can be found on the US FDA website at http://www.fda.gov/MedicalDevices/NewsEvents/WorkshopsConferences/ucm263947.htm
Thursday, November 18, 2010
FDA to join PIC/S
On 8th and 9th November the Pharmaceutical Inspection Co-operation Scheme (PIC/S) met in Kuala Lumpur, Malaysia. The main news to come out of the meeting was that the US FDA will become a full member of PIC/S as of 1st January 2011. Some commentators may be saying “about time too”, but what does it actually mean for the industry?
The FDA has, rightly or wrongly, been thought of (at least in the USA) as having the highest standards in the world for pharmaceutical products and medical devices. Yet the FDA applied for membership of PIC/S as far back as 2005. So why did it take so long to join?
It was rumoured that under the previous US Administration there was a reluctance to allow overseas agencies to review the FDA's own internal quality management system, which is required for new members to join PIC/S.
Another reason, if recent press reports are anything to go by, is that the PIC/S organization looked at the FDA regulations and concluded that they were insufficiently rigorous for admission to PIC/S. Given that the FDA has now been admitted, it is an open question how far the FDA will go to comply with the full intentions of the scheme. Will this mean a tightening of regulations or just reciprocal recognition of inspections?
At the same time as the FDA joins PIC/S, the Ukrainian regulatory agency (SIQCM) is also being admitted. Does this mean that inspections of pharmaceutical plants in Ukraine by the SIQCM will be recognized by the FDA as being of the same rigor as its own inspections? This is as much a political as a regulatory question, and it remains to be seen how far the FDA is prepared to go to comply with the PIC/S principles, and whether we can expect any change to regulatory law or guidance in the USA as a result.
[Posted on behalf of David Hawley]
Wednesday, October 27, 2010
Regulatory Changes In - and For - China
There were some interesting presentations in this morning's keynote sessions at the ISPE-CCPIE conference in Beijing.
The Chinese State FDA presented a brief history of the Chinese GMP regulations, comparing them to other international regulations (e.g. WHO), and although they provided an outline of the new Chinese GMP regulations there was no commitment to a date by which these will become effective.
Cynics at the conference suggested after the session that this is because 90% of local Chinese companies would not comply with the new Chinese GMP regulations. While the SFDA do appear to be keeping their options open regarding timing (and appear to be moving away from introducing a target date with a period of grace during which companies could move to compliance), a session from the US FDA gave a different picture.
It is twelve months since the US FDA set up shop (a field office) in China, and although there are still only seven full-time FDA staff in country - with only half of these conducting for-cause and high-priority inspections - there appears to have been good progress in working with the Chinese State FDA (SFDA) as well as some of the provincial FDA offices.
What we appear to be seeing is the Chinese authorities committing to address the regulatory/quality concerns that threatened to impact their export markets last year while also starting to address the reform of regulations in their home market, recognizing that the latter will take a while to address in a market consisting of literally thousands of (rapidly consolidating) manufacturers and distributors.
The US FDA and SFDA now meet on a monthly basis, with the SFDA acting as observers on some US FDA inspections. The US FDA is helping to fund training for the SFDA, has reviewed the pending Chinese GMP regulations and has provided 10 GCP regulations for the SFDA to translate. This is all part of the US FDA strategy of helping to educate other regulatory agencies on the requirements of the US market and of helping to build inspection capacity (through the education of a cadre of SFDA inspectors trained in international regulatory expectations).
At the same time, multinational Life Sciences companies are sharing concerns about Chinese products with the US FDA, who are in turn discussing issues with the SFDA and there is also agreement between the US FDA and SFDA to focus on ten high risk products (mainly pharmaceutical, but some medical devices).
This co-operation provides evidence of the US FDA's desire to work more effectively with other regulatory agencies and will certainly start to address concerns about Chinese product.
At the same time it will also help the Chinese authorities to better regulate their own market, which is forecast to be the world's third largest by 2013 (behind the US and Japan). While 'rogue traders' operating out of China will undoubtedly be of continuing concern with respect to product quality and counterfeiting, at least problems in the legitimate market are starting to be addressed.
Whatever people may think about the Chinese government's method of implementing change, there is no doubt that effective reforms can be implemented, and probably more quickly than in many other markets. Although this is just the beginning, there is no doubt that regulatory change is happening - which will be good for patients in China as well as the rest of the world.
Tuesday, October 12, 2010
Marked Decline in Sponsored Link Advertising following US FDA Enforcement
An interesting recent piece of news was that sponsored link advertisements for pharmaceutical products have declined more than 50% following a spate of Warning Letters from the US FDA. According to various news articles published on March 26, 2009, the Division of Drug Marketing, Advertising, and Communications (DDMAC) of the U.S. Food and Drug Administration (FDA) sent warning letters to 14 major pharmaceutical manufacturers identifying specific brands as being in violation of FDA fair balance guidelines. The letters stated that sponsored link advertisements for specific drugs were misleading due to the exclusion of risk information associated with the use of the drug.
Most of these companies quickly removed their sponsored ads for these products and others not specifically mentioned in the letters. As a result, the number of sponsored links for pharmaceutical brands has dramatically declined as manufacturers changed their strategies to ensure compliance.
This illustrates neatly what we have been pointing out for some time; there is nothing special about the Internet as far as regulation is concerned. Advertising on the Internet is governed in the same way as advertising using any other medium, and non-compliance with the rules that govern advertising will be dealt with similarly.
This poses a challenge on a number of levels for the companies concerned. The very nature of sponsored link advertisements mandates brevity. In a magazine you can have your ad on one page and the list of warnings in small type on the next page or two, but how is this to be handled in a sponsored link of 25 words or even fewer?
This isn’t the only problem with on-line ads. They might be a common sight to US-based consumers, but on a global stage they are unusual. In fact the only countries that I know of where direct-to-patient advertising is permitted are the USA and New Zealand. Other countries, such as the UK, specifically prohibit the advertising of Prescription Only Medicines to non-health professionals. This places the onus squarely on the regulated company to make sure that they target their adverts correctly, otherwise they risk action from other regulatory bodies and not just the FDA.
For more on this topic see the Business & Decision webcast "Controlling Life Sciences Promotional Activities on the Internet"
Wednesday, May 19, 2010
New Part 11 Assessment (and Enforcement) Is Coming
As we report in this month's copy of our ValidIT newsletter (focusing on IS Compliance and Computer System Validation issues) it looks as if the US FDA's CDER division wants to assess the state of play in the industry with respect to 21 CFR Part 11 (Electronic Records, Electronic Signatures).
Although the scope and specific program areas aren't yet decided, the way they intend to do this is to ask specialists in the Agency to accompany Inspectors from the Field Divisions on routine inspections and look at issues around Part 11, taking appropriate enforcement action where necessary. This will help them to understand how the industry is responding to the 2003 Scope and Application Guidance and help the Agency decide how to revise Part 11.
This demonstrates a pragmatic approach to resolving this open issue, and the Agency is to be applauded for taking a proactive yet measured approach (other Divisions within FDA aren't directly involved and are holding a watching brief).
I hope that what they'll find - certainly on inspections in North America and Europe - is that:
- Most Regulated Users have taken Part 11 on board and are responding well, applying a risk-based approach where appropriate,
- Technology has moved on, allowing Suppliers to meet Part 11 technical requirements much more easily, leveraging significant advances in security, encryption and digital signatures (a simple illustration follows below).
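To illustrate the second point (a sketch of ours, not an FDA expectation, and only a small part of what Part 11 actually requires alongside identity controls, audit trails and procedural controls): standard library primitives now make tamper-evident records straightforward, e.g. signing a record together with its signing metadata so that any later change is detectable. The key and record fields are illustrative:

```python
import hashlib, hmac, json

SIGNING_KEY = b"demo-key"   # illustrative only; real systems use managed keys or PKI

def sign_record(record: dict, user: str, timestamp: str) -> str:
    """Produce a tamper-evident signature over a record plus its signing
    metadata; any later change to the record invalidates the signature."""
    payload = json.dumps({"record": record, "user": user, "ts": timestamp},
                         sort_keys=True).encode()
    return hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()

def verify(record: dict, user: str, timestamp: str, signature: str) -> bool:
    return hmac.compare_digest(sign_record(record, user, timestamp), signature)

sig = sign_record({"batch": "B123", "result": "pass"}, "jsmith", "2010-05-19T10:00Z")
print(verify({"batch": "B123", "result": "pass"}, "jsmith", "2010-05-19T10:00Z", sig))   # True
print(verify({"batch": "B123", "result": "FAIL"}, "jsmith", "2010-05-19T10:00Z", sig))   # False
```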
However, there is also a significant number of indigenous API, pharmaceutical and medical device companies in emerging markets, often using local software developers to avoid the licensing costs or overseas development costs of more established software from multinational vendors. Our experience in these markets is that in many cases the requirements of Part 11 are far less well understood.
Looking at any of the on-line forums that exist, anyone who reads some of the Part 11 questions posed by certain individuals and organizations from some countries will realize that in many cases the current level of understanding (and in some cases the level of technology) is where North America and Europe were ten or more years ago. What we might consider to be 'simple' questions and answers only appear simple based on more than fifteen years of discussion and experience in the industry, which many newcomers understandably lack.
Now this isn't a rant about moving jobs 'overseas' or about how unfair lower cost labor is in emerging markets - we all compete in a global economy. Working with end users and suppliers in these markets I know how well educated their labor pool is, how hard working they are and how innovative they can be.
What I hope is that the FDA will not restrict their assessment of the state of Part 11 to just their domestic market or traditional developed markets (Canada and Europe), but that they will also include a broader set of overseas manufacturers, to determine what the overall state of the market is with respect to Part 11 compliance.
Without this representative sample there are two potential risks:
- The Agency concludes that things are in relatively good shape and that no great changes are needed to Part 11, or to its enforcement. This has the potential to miss issues in emerging economies, where fraud can be as much of an issue as accidental problems with electronic records and signatures,
- The Agency develops a revised Part 11 based upon the assumption that all software developers (and their clients) generally have access to the latest technology, which can again lead to compliance risks. Any new Part 11 should clearly avoid the problems created by the original preamble and should not focus on any specific technologies or techniques.
Thursday, April 22, 2010
Computer System Validation – Business as Usual?
A colleague asked me earlier today what the big issues in computer system validation were at the moment – and I couldn’t really think of any.
After more than twenty years of introducing computer system validation to a lot of companies, consulting on Part 11, getting ready for Y2K, responding to Part 11, addressing infrastructure qualification and adopting a risk-based approach to validation, the question is very much ‘where next?’.
To some extent it depends on what happens with risk-based validation. As the results from our webcast polls show, many Life Sciences organisations are still struggling to adopt a justifiable risk-based and cost effective approach to computer system validation.
At the moment it does appear to be business as usual – we still see computer system validation issues cited in FDA Warning Letters (and anecdotally reported by other regulatory agencies), but it’s at a justified and reasonable level in comparison to other more pressing topics – very much what we were used to around a decade ago.
However, if companies continue to use taking a risk-based approach as an excuse for simply doing less – rather than providing a real risk-based rationale for shifting resources to the areas of greatest risk – things may change. Some regulatory agencies have already commented that they are getting wise to ‘risk-based’ equating to ‘simply doing less’, with companies adopting GAMP® 5 as a flag of convenience for reducing spending on computer system validation without any clear rationale for doing less. Some inspectors have warned that they will take enforcement action unless there is a clear and sound risk-based rationale for reducing the level of validation. Efficiency savings are fine, but only when the same goals are met.
There is then a possibility that we could see an increase in enforcement actions in response to Life Sciences companies taking the cost savings too far, but hopefully common sense will prevail as more individuals and organisations really start to understand how to achieve the same objectives with less time and effort.
That leaves us with the other ‘big issue’ – how the industry is looking at changes in IT – such as cloud computing, virtualization, outsourcing and the like – and wondering how to apply risk-based principles to new technologies and different business models.
While many Life Sciences companies are still relatively slow to change others are quietly moving ahead and the immediate future is probably one of evolution and not revolution. That’s not to say however that such evolution isn’t exciting – there is great potential to leverage newer technologies and models to deliver enhanced business performance, reduce costs and help restore the bottom line. If we can seize these opportunities and also address the compliance and validation issues in a cost effective manner then we’re in for a very interesting time – even if it’s not quite as exciting as when the regulators were giving everyone a hard time.
Wednesday, February 10, 2010
Answers to Webcast Questions - Testing Best Practices: 5 Years of the GAMP Good Practice Guide
The following answers are provided to questions submitted during the "Testing Best Practices: 5 Years of the GAMP Good Practice Guide" webcast which we did not have time to answer while we were live.
May we thank you all for taking the time to submit such interesting questions.
Q. Retesting: What is your opinion on retesting requirements when infrastructure components are upgraded? i.e. O/S patches, database upgrades, web server upgrades
A. The GAMP "IT Infrastructure Control and Compliance" Good Practice Guide specifically addresses this question. In summary, it recommends a risk-based approach to the testing of infrastructure patches, upgrades etc. Based on risk severity, likelihood and detectability, this may require little or no testing, will sometimes require testing in a Test/QA instance, and in some cases changes may or should be rolled out directly to the Production environment (e.g. anti-virus updates). Remember - with a risk-based approach there is no 'one-size-fits-all' answer.
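By way of illustration (the scales and thresholds below are our own assumptions, not the Guide's), severity, likelihood and detectability scores can be mapped to a level of retest rigor along these lines:

```python
def test_rigor(severity: int, likelihood: int, detectability: int) -> str:
    """Map 1-3 scores for an infrastructure change (e.g. an O/S patch) to a
    retest level. Detectability is scored 1 (hard to detect) to 3 (easy to
    detect), so it is inverted: harder-to-detect failures raise the score.
    Thresholds are illustrative, not from the Good Practice Guide."""
    score = severity * likelihood * (4 - detectability)
    if score >= 12:
        return "full regression test in a Test/QA instance"
    if score >= 6:
        return "targeted testing of affected functions"
    return "verify deployment only (e.g. anti-virus style updates)"

# Hypothetical database upgrade: high severity, moderate likelihood, hard to detect
print(test_rigor(severity=3, likelihood=2, detectability=1))   # full regression test...
```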
Q. No value add for independent review and oversight? Why not staff SQE's?
A. Assuming that 'SQE' means Software Quality Engineer, we would agree that independent review by such SQEs does add value, specifically because they are experts in software and should understand software testing best practices. Where we do question the value of quality reviews (based on current guidance) is where the Quality Unit has no such expertise to draw upon. In these cases the independent Quality Unit still has a useful, value-adding role to play, but it is an oversight role: ensuring that test processes and procedures are followed (by review of Test Strategies/Plans/Reports and/or periodic review or internal audit).
Q. What FDA guidance was being referred to re: QA review of test scripts etc not being necessary?
A. The FDA Final Guidance document “General Principles of Software Validation” doesn’t specifically state that QA review of test scripts is not necessary, but like the GAMP “Testing of GxP Systems“ Good Practice Guide, GAMP 5 and ASTM E2500, it places the emphasis on independent PEER review. i.e. by suitably qualified, trained or experienced peers (e.g. software developers, testers etc) who are able to independently review test cases. Although QA IT people may well have the necessary technical background to play a useful part in this process (guiding, supporting etc) this is not always the case for the independent Quality Unit who are primarily responsible for product (drug, medical device etc) quality.
Q. Do the regulators accept the concept of risk-based testing?
A. As we stated in response to a similar question in the webcast, regulatory authorities generally accept risk-based testing when it is done well. There is a concern amongst some regulators (US FDA and some European inspectors) that in some cases risk-assessments are being used to justify decisions that are actually taken based on timescale or cost constraints.
In the case of testing, the scope and rigor of testing is sometimes determined in advance and the risk assessment (risk criteria, weightings etc.) is 'adjusted' to give the desired answer, e.g. "Look - we don't need to do any negative case testing after all!"
The better informed regulators are aware of this issue, but where testing is generally risk-based our experience is that this is viewed positively by most inspectors.
Q. Do you think that there is a difference in testing good practices in different sectors, e.g. pharma vs. medical device vs. biomedical?
A. There shouldn't be, but in reality the history of individual Divisions in the FDA (and European agencies) means that there are certain hot topics in some sectors, e.g.
- Because of well-understood failures to perform regression analysis and testing, CBER are very hot on this topic in blood banking.
- Because of the relatively high risk of software embedded in medical devices, some inspectors place a lot of focus on structural testing.
Q. Leaving GMP systems aside and referring to GxP IT, Clinical and Regulatory applications: how do you handle a vendor's minimum hardware spec for an application in a virtual environment?
We have found that vendors overstate the minimums (# of CPUs, CPU spec, minimum RAM, disk space usage, etc.) by a huge margin when comparing actual usage after a system is in place.
A large pharma I used to work for used a standard VM build of 512 MB RAM, to be increased if needed. This was waived for additional servers of the same application. In the newest version of VMware (vSphere 4) all of these items can be changed while the guest server is running.
A. Software vendors do tend to cover themselves for the 'worst case' (peak loading of simultaneous resource-intensive tasks, maximum concurrent users etc. - and then add a margin) to ensure that the performance of their software isn't a problem. The basic answer is to use your own experience, based on a good Capacity Planning and Performance Management process (see the GAMP "IT Infrastructure Control and Compliance" Good Practice Guide again). This should tell you whether your hardware is over-specified, and you can use historic data to size your hardware. It can also be useful to seek out the opinions of other users via user groups, discussion boards, forums etc.
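To make that concrete (our sketch, with made-up numbers), historic monitoring samples can be reduced to a peak percentile plus headroom when deciding whether a vendor's stated minimum is really needed:

```python
def sizing_estimate(samples_mb: list[float], headroom: float = 0.25) -> float:
    """Size RAM from observed usage: take the 95th-percentile sample and add
    headroom, rather than accepting a vendor's worst-case figure outright."""
    ranked = sorted(samples_mb)
    p95 = ranked[int(0.95 * (len(ranked) - 1))]
    return p95 * (1 + headroom)

# Hypothetical hourly RAM samples (MB) from a month of monitoring
samples = [310, 295, 340, 360, 330, 415, 390, 305, 350, 370]
print(f"suggested allocation: {sizing_estimate(samples):.0f} MB")   # ~488 MB
```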
Modern virtualization (which we also covered in a previous webcast, "Qualification of Virtualized Environments") does allow the flexibility to modify capacity on the fly, but this isn't an option for Regulated Companies running in a traditional hardware environment. Some hardware vendors will allow you to install additional capacity and only pay for it when it is 'turned on', but these tend to be large servers with multiple processors.
At the end of the day it comes down to risk assessment - do you take the risk of not going with the software vendor's recommendation for the sake of reducing the cost of the hardware? This is the usual issue of balancing the project capex budget against the cost to the business of poor performance.