Tuesday, October 22, 2019

Verification and Validation Webinar - Recent Q&A,   John E. Lincoln,  10/22/2019

Some production-oriented V&V questions:

1.     When it comes to authoring and execution of test protocols, what is my role / responsibility as an automation engineer who has designed and implemented a change to the validated system?  What specifically is validation?

Ans:  You, or whoever in your company is responsible for validations, must reverify / revalidate whatever has been impacted by the change.  Referencing the original validation, the new V&V would normally not be as extensive as the original, unless the whole equipment / system was changed / replaced.  The amount of effort / depth is based on risk to the ultimate end-user / patient.

I define validation as a series of verifications (tests, checks, inspections) to prove that the item being validated meets its intended purpose(s), as defined by requirements / standards / guidance documents, et al.  The formal CGMP and ISO definitions say much the same.  Those requirements, et al, are converted into test cases, which are then run to prove that each requirement exists and has been met, without any negative results.
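The requirements-to-test-case mapping described above can be sketched as a simple traceability check.  This is only an illustration of the idea (the requirement IDs, test case names, and helper function are all hypothetical), not a validated tool:

```python
# Minimal requirements-to-test-case traceability sketch.
# All IDs and descriptions below are hypothetical examples.
requirements = {
    "REQ-001": "Pump delivers 5 mL/min +/- 2%",
    "REQ-002": "Alarm sounds on occlusion",
}
test_cases = {
    "TC-01": ["REQ-001"],  # each test case lists the requirements it challenges
    "TC-02": ["REQ-002"],
}

def untested(reqs, cases):
    """Return requirement IDs not covered by any test case."""
    covered = {r for refs in cases.values() for r in refs}
    return sorted(set(reqs) - covered)

print(untested(requirements, test_cases))  # → [] (every requirement has a test case)
```

A gap in the returned list is exactly the situation a traceability matrix in a V&V protocol is meant to prevent: a requirement with no test case proving it exists and has been met.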

2.     I noticed in your sample IQ, OQ, and PQs, the evidence collected to prove operational acceptance appears to be strictly signatures.  What is your opinion on screenshots?  If required by validation, whose responsibility (validation, automation, production, etc.) should it be to take screenshots (collect evidence)?

Ans:  My definitions:  The IQ is a checklist, where each installed requirement / item is checked by a qualified individual (per HR records) who signs off on the presence and functionality of that item on the checklist.  The OQ is composed of test cases to verify the presence and functionality of each requirement.  The PQs (of which there are several, depending upon input variables) rechallenge those OQ items / requirements that are subject to further variability, using test cases with sample sizes larger than the n = 1 of the OQ test cases.  So each PQ has various test cases challenging requirements subject to variability, with each PQ test case using samples of n = 30, or 125, etc., per a valid sampling plan based on a solid statistical rationale (quote / reference a textbook or a standard, e.g., Z1.4, or use an industrial statistician / consultant).
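One common statistical rationale for attribute (pass/fail) PQ sample sizes is the "success-run" formula, n = ln(1 − C) / ln(R): the number of consecutive passing samples needed to claim reliability R with confidence C.  A minimal sketch of that arithmetic (one possible rationale, not a substitute for a sampling standard such as Z1.4 or a statistician's plan):

```python
import math

def success_run_n(confidence, reliability):
    """Zero-failure attribute sample size: n = ln(1 - C) / ln(R)."""
    return math.ceil(math.log(1.0 - confidence) / math.log(reliability))

# e.g., demonstrating 90% reliability at 95% confidence with zero failures:
print(success_run_n(0.95, 0.90))  # → 29
```

Whatever plan is used, the point made above stands: the protocol should quote the source of the rationale so the sample size is defensible to an auditor.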

Screen shots are an excellent tool, and I and others have made much use of them.  However, they must be described in the narrative, may need a unique ID number for each, and each should be signed / initialed and dated, and added to the Test Report.  Responsibility for the screen shots can vary, should be defined by SOP (or the test protocol), but is usually done by those qualified to obtain such and/or who have a vested interest in obtaining them (the one handling the V&V usually). 

3.     In your OQ example, you did specify a separation between tester and verifier.  Does this imply that validation should not execute test protocols?  Or is this simply a matter of capability?  I.e., if validation is capable, there is no issue with a computer system validation engineer functioning as a tester?

Ans:  The tester can be the operator, an engineer / tech, or another person trained in / familiar with the operation being run.  The verifier is often a QC person, not necessarily overly familiar with the specific operation being run (the test case), but who verifies that the instructions in the test case were followed, and that the recorded results were what was actually achieved.

4.     In general, it seemed like the focus was on devices, and I'm looking for clarification on where control systems and devices might differ in terms of risk.  For example, on SW V&V Elements – 1, when speaking of LOC (Min, Mod, Maj), you mentioned that a Class II device must be elevated to Mod.  How does this relate to software in a data-rich control system environment?  Our systems are primarily Class 4 and 5 software systems (PLC, Operator Interface, Batch Reporting), and for the majority of changes there is little risk to patient or product.  However, because we are modifying Class 4 and 5 systems, it is often hard for us to convince our validation and quality partners that the risk is negligible, and they feel their one-size-fits-all approach is therefore justified.

Ans:  The principle of risk (ultimately to the patient / end user) still applies, though with some operations it's difficult to trace through; that's why reference to an appropriate ISO 14971 or ICH Q9 Risk Management File is useful.  To somewhat eliminate second-guessing by stakeholders, including government regulators and internal regulatory, anticipate such push-back and include an analysis of risk, tied to those files, in your V&V Test Report documentation.  I also try to include such references, tied to specific Risk File line items, with the appropriate test cases.
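The risk-proportionate approach argued for here can be expressed as a simple lookup table of the kind an SOP might define, mapping an assessed patient-risk level to a documented V&V effort.  The categories and effort descriptions below are hypothetical examples, not a regulatory requirement:

```python
# Hypothetical risk-to-effort table an SOP might define; illustrative only.
VV_DEPTH = {
    "negligible": "log entry + engineering verification",
    "minor":      "targeted regression test cases",
    "moderate":   "partial revalidation (impacted OQ/PQ cases)",
    "major":      "full revalidation (IQ/OQ/PQ)",
}

def vv_effort(patient_risk):
    """Map an assessed patient-risk level to a documented V&V effort."""
    # Unrecognized levels default to escalation rather than a guess.
    return VV_DEPTH.get(patient_risk, "escalate to quality for assessment")

print(vv_effort("minor"))    # → targeted regression test cases
print(vv_effort("unknown"))  # → escalate to quality for assessment
```

Writing the table into the SOP, with each entry tied back to the Risk Management File, is one way to give engineers the documented leeway discussed below while still satisfying quality stakeholders.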

"One size fits all" is safest from a bureaucratic standpoint, but is extremely wasteful of resources, and unnecessary.  Your SOPs on V&V should allow some leeway / “trust” in those trained (engineers) to make supported / documented decisions (e.g., I wouldn’t do the degree of effort on a PLC V&V as on a complex software V&V), yet I have done work with companies that required the same level of documentation / approvals for both (painful).

5.     What is your opinion on validation's role when it comes to installing software / firmware patches?  Would it ever be appropriate for IT/Automation to determine the level of risk and be allowed to decide if a log entry is adequate vs determining when full blown change management is required?

Ans:  First, your SOPs must clearly state the methods to be used, and how they are documented.  Second, from a QA standpoint, my opinion is that any patch should be documented by a rev-level change, unless there's some identifier in the code that is easily accessed.  In other words, there has to be 1) a clear distinction for each change made to the software, 2) V&V and approval of the changes themselves, including QC/QA, and 3) documentation of all of the above.

If a log entry is defined by the company as valid, it probably should be supported in documentation somewhere by at least two signatures approving the change.

From a regulatory standpoint, you can't have one version / release in the field (or in production / manufacturing) where few or none of the units of that version / release are identical to one another (sadly, another problem I've seen).  There must be a clear documented history of each.  How you as a company choose to work that out (and document it for forward / backward traceability) is up to you, subject to the above considerations.
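One way to provide the "easily accessed identifier in the code" and the documented change history mentioned above is to carry a revision string and a change log in the software itself.  A minimal sketch (the version scheme, log format, and approval initials are all invented for illustration):

```python
# Sketch of an in-code revision identifier plus change history.
# Version string, entries, and approvals below are illustrative, not a standard.
__version__ = "2.1.3"  # bumped for every patch, however small

CHANGE_LOG = [
    # (version, change description, approval signatures)
    ("2.1.3", "patched CSV export rounding", ("eng: JS", "qa: MT")),
    ("2.1.2", "vendor firmware patch applied", ("eng: JS", "qa: MT")),
]

def current_entry():
    """Return the change-log entry matching the running version."""
    return next(e for e in CHANGE_LOG if e[0] == __version__)

# Enforce the two-signature idea from the text: every release entry
# must carry at least two approvals before it is considered valid.
assert len(current_entry()[2]) >= 2
print(current_entry()[1])  # → patched CSV export rounding
```

The point is forward / backward traceability: any unit in the field reporting "2.1.3" should be identical to every other "2.1.3", and the log ties that revision to its approved change.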

Remember:  I mentioned that this presentation is only one valid approach to V&V, but one that has been field tested, and reviewed by US FDA and EU ISO / Notified-Body inspectors / auditors for over 37 years, with no objections. A company can have another viable and compliant method.

Further considerations:

V&V of production systems can generally be somewhat less deep than that of device software / firmware, primarily because there are usually redundant checks / verifications "wired" into the production process downstream of a validated item, which are documented in the batch record.  This can also be referenced in the Test Report / Protocol as further justification, similar to patient risk, for the degree of effort in a V&V.

-- John E. Lincoln                                                          jel@jelincoln.com

Monday, May 13, 2019

Sources of Medical Device / Equipment Field Use / Quality History

Ques (heavily redacted):  I am a South African medical doctor.  I have been tasked by the radiology department of one of our hospitals to do a review of medical diagnostic ultrasound systems, specifically whether there are studies looking at the quality of the procured equipment from the start.  I came across one of your articles and thought you may have some insights...

Ans:  I'm not sure what article you're referring to, or the full nature of the assignment with which you've been tasked.  But it sounds like you are to set up a system to review the quality of ultrasound products the hospital is considering purchasing.  If so, here are some approaches / suggestions:
1.  ECRI Institute:  https://www.ecri.org/
     I haven't looked at them for many years, but they used to provide analyses, like the U.S.
     Consumer Reports, but on medical products;
2.  The U.S. FDA:  https://www.fda.gov/
     The FDA maintains the MAUDE (Manufacturer and User Facility Device Experience) database,
     which lists products that have been voluntarily reported to have problems (adverse events) in
     the field that could have or did cause serious injury or death.  It's not complete, but it can
     provide an idea of field / use issues facing families of products cleared / approved by the FDA.
     You have to get the product's regulation number, by searching the US Federal Register, 21
     CFR 800-series, and use that number in the MAUDE database.  You can also search it by
     common name / description, or manufacturer.
     New US and EU labeling requirements -- UDI and the GUDID (a database) -- are new but should
     ultimately provide similar data globally.
3.  Only purchase products from companies following a quality management system:  US FDA
     CGMPs, 21 CFR 820, for devices / equipment, or ISO 13485 / EU MDD / MDR for
     CE-marked equipment in Europe.  These companies are periodically inspected for
     adherence to those respective QMSs, which means the equipment meets applicable
     standards / requirements and was built / documented per the required QMS law.


Tuesday, March 26, 2019

ISO 11135:2014 and The QMS

A little more on my answer to the question posed after the EO sterilization webinar:
Ques:  I have a vendor EtO-sterilize a load for me... what do I need to ensure that they let the load sit or air out between PQ runs?  Is that an ISO 11135 or FDA requirement?

Ans:  As mentioned early in the presentation, ISO 11135 presupposes the existence of a viable QMS / CGMP system, as well as adherence to the validation requirements of the standard, to complete a successful validation.  Since ISO 11135 is an international standard, it specifically references ISO 13485 requirements under (page 11 of the standard) 4 Quality Management System: 4.1 Documentation, 4.2 Management Responsibility, and 4.3 Product Realization, et al.  ISO 13485 requires, among other things, that a company and its supply chain / vendors adhere to the requirements of 13485 for medical devices.  The US FDA recognizes ISO 11135 as a consensus standard, so for the US, the QSR, 21 CFR 820 (the CGMPs), would be the device QMS in lieu of ISO 13485 (the two sets of requirements are very similar).
So such adherence would ensure that your question is addressed.  This can be reaffirmed by a Quality Agreement or contractual requirement, verified by certification / audit, and/or by other means.
-- John E. Lincoln     jel@jelincoln.com

Saturday, March 23, 2019

CRO and Client Disagreement

Here's my further response to the question from the Device Changes webinar of 03/20:
Ques:  What is a CRO's responsibility in educating the sponsor on whether or not the product is a medical device?  Oftentimes sponsors argue that their product is a cosmetic when it is more like a medical device.
Ans:  As I mentioned, I'm not qualified to directly answer this question.  It goes to the heart of how you as a CRO determine which potential clients you will accept as formal clients, and of your legal department's and corporate policies.  As mentioned, as a consultant, I lay out the terms under which I will take on a client, and one of those is clarity of the definition of the project and its scope.  That includes agreement on applicable FDA requirements and definitions.  If the definition is subject to disagreement, then clarify what the proposal will address.
If the definition of cosmetic vs. device is not clear, then define in your agreement just what your services cover, the approach required by the FDA based on the definition chosen, and that the results are dependent on that chosen definition, since it lays out the approaches taken.  This obviously has to be run through your legal department (as mentioned, I am not a lawyer and do not give legal advice).
You have the right (and responsibility) to turn down clients that will not work with you, and certainly those who will not abide by FDA requirements.
Ref:  https://www.fda.gov/downloads/RegulatoryInformation/Guidances/ucm127073.pdf
Hope that helps.

Friday, February 22, 2019

Does the addition of new production equipment require a revalidation of the sterilization process?

To answer that question:  If only the particulate count increases, that will not affect the sterilization validation.  If the bioburden load increases, then additional verification / testing will be required.  I doubt that the addition of another piece of equipment will increase bioburden, though it will increase particulates (and possibly oil vapor, if any compressed air escapes into the controlled environment; it should be plumbed out).  If the equipment also requires additional human handling of the product, then there could be increased product bioburden.
After the implementation of the new equipment, if there is increased product bioburden, then at the very least a half cycle should be run on the product, and then test the sterility of some product / PCDs in the most-difficult-to-sterilize locations in the load at that half cycle.  Then either complete the cycle, or do a full cycle on the load if that was a production run (with data to prove that additional sterilization runs do not negatively affect product function).
Generally the addition of some additional equipment into a controlled environment does not create a serious challenge to the sterilization cycle.
Of course document all the above.  Could be an addendum to the last sterilization validation test report.


Tuesday, September 18, 2018

Process Verification / Validation

An answer to a query:

Ques:  The company (site) that I work for manufactures intermediate products.  The question that we have is: if a change is made to a process, should we do validation, or is verification sufficient?  All our processes have QC / acceptance criteria.  For example, we got a validated instrument from another site (of our company).  We had the IQ/OQ done by the vendor at our site.  We also have the same instrument from another vendor, which will be retired in a couple of months.  Can we do verification of the new instrument by writing a technical report?  In other words, we will first conduct a risk assessment and then do experiments to compare the data of the two instruments and record them in a report.  Actually, we have already done feasibility studies to show that the new instrument gives the same data as the old one for the same samples.  I heard in your talk that the FDA doesn't require validation if we have the means for 100% checking.  We have similar situations in which we sometimes make a change in the process, but we always have means to verify the final product because we have acceptance criteria in place.  Is verification enough for such situations?  Please advise.

Ans:  "FDA doesn't require validation if ..."  The specific reference for my comment, for medical device manufacturing, is 820.75 Process validation, (a):  "Where the results of a process cannot be fully verified by subsequent inspection and test, the process shall be validated with a high degree of assurance..."  The key exception to this is automated (computerized) processes, which per 820.70(i) must be validated no matter what.
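That 820.75(a) / 820.70(i) logic reduces to a short decision rule.  The sketch below is a simplification for illustration only; the regulation text itself governs:

```python
def process_disposition(fully_verifiable, automated):
    """Simplified decision per 21 CFR 820.75(a) and 820.70(i)."""
    if automated:
        return "validate"  # 820.70(i): automated processes must be validated regardless
    if fully_verifiable:
        return "verify"    # 820.75(a) exception: results fully verified by inspection/test
    return "validate"      # results cannot be fully verified, so validate

print(process_disposition(fully_verifiable=True, automated=False))  # → verify
print(process_disposition(fully_verifiable=True, automated=True))   # → validate
```

Note the order of the checks: full verifiability never excuses an automated (computerized) process from validation.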

As to the rest of your question, part of the answer depends upon your definitions of verification, validation, technical report, feasibility studies, et al.  Any change requires some level of verification (testing, checking, feasibility studies, etc.), or a series of verifications (part of a validation), to prove that the change does what it should and doesn't do what it shouldn't.  While I won't state that what you've outlined will satisfy requirements (though on the surface it appears to), if you've satisfied the CGMPs and what I've outlined above, yielding documented data showing that nothing has changed and that product quality / specifications are assured after the changes, then you've met the requirements.

Saturday, August 18, 2018

Method Validations

Some Q&A from a recent webinar:

Ques: Do you have any recommended references to create risk categories for method validations?


JVT Article 2004, especially note the flow charts; caveat – include potential problems from normal use, not just from a failure mode:

PDA Slides:

Q8, Q9, & Q10 Questions and Answers, U.S. FDA:

U.S. FDA Guidance, 2015: Analytical Procedures and Methods Validation for Drugs and Biologics; note several of the headings may provide risk categories, e.g., Apparatus/Equipment, Operating Parameters, Reagents/Standards, Sample Preparation, Standards Control Solution Preparation, Procedure, System Suitability, Calculations, Data Reporting... :