Tuesday, September 20, 2022

Test case examples from my V&V webinars:


Activity Completed       Signed/Dated (or N/A’d)

01. Utilities:

o   Power Hooked Up Properly       _____________________
o   UPS Required/Added       _____________________
o   Air Hooked Up Properly       _____________________
o   Surge Tanks Required/Added       _____________________
o   Water Hooked Up Properly       _____________________
o   Incoming Air Filtered       _____________________
o   Incoming Water Treated/Filtered       _____________________
o   Exhaust Air/Oil Controlled       _____________________
o   Spent Water Controlled       _____________________

02. Equipment/Tool/Fixture Number Assigned/Affixed       _____________________

03. Instruments Calibrated or NCI Tagged       _____________________

04. Safety/OSHA Requirements Addressed       _____________________

05. Documentation (Dwgs, Software...) Received       _____________________

06. Floor/Area Layout Approved       _____________________

07. Environmental Testing Completed/Acceptable       _____________________

08. SOPs Written/Approved       _____________________

09. Meets QS Regulation requirements       _____________________

10. PM Program/Spare Parts       _____________________

11. Process Control Required/Addressed       _____________________

Comments: __________________________________________________________________ 


Verify proper blow molder heating, shot injection / size (no short shots), “preforms”:

Verification Element:  Heater elements, injection barrel; screw cycle / operation.
Expected Outcome:  Resin achieves proper melt.
Observed Outcome:  _____________________

Verification Element:  Screw / injection cycle.
Expected Outcome:  Proper shot size; “preforms” properly configured, with no molding anomalies / nonconformances; “preform” / cap interface to spec; no short shots, splay, black specks, thin walls, et al.
Observed Outcome:  _____________________

Verification Element:  “Preform” ejections.
Expected Outcome:  “Preforms” ejected properly and without damage.
Observed Outcome:  _____________________


Tested by:  ___________________     Date:  ______________

Verified by:  __________________     Date:  ______________

PQs (3 or more):

Similar to the above, but the test case "elements" are replaced with repetitive samples (e.g., n = 10, 30, 125, or ...).

-- jel@jelincoln.com


Working Definitions for V&V - Why Needed?
From several of my V&V webinar slides

ISO and CGMP definitions generally can’t be easily followed without some additional explanation.  Hence, the need for “working definitions”, workable explanations.

Note:  The information presented here has also specifically been field tested with the FDA and various N-Bs over several decades; it is not theoretical or derivative of other consultants' work.

These definitions are also in basic agreement with several guidance documents, ISO standards, and similar.

Verification:  Inspection, testing, or checking; includes most “qualifications” and decommissioning.  Product:  Design Output = Design Input.  Usually verifies one "Requirement".

Validation:  Includes a series of verifications, many involving destructive testing; may include “commissioning”.  Usually validates a series of "Requirements / Qualifications":
Product: Customer Needs (+ standards, guidance documents, etc.) = Resulting product
FAT, SAT (may need supplemental V&Vs)
Process / Equipment / Facility:  DQ, IQ, OQ, PQ
Software:  In-product, As-product, Process / Equipment; 11 elements (U.S. FDA guidance “model”)
Software:  QMS; 21 CFR Part 11 ER / ES
Master Validation Plan
Site “qualification”.

Product Verification / Testing Examples:

Biocompatibility:  Cytotoxicity, Hemolysis, Sensitization / Irritation, Carcinogenicity ...

Functional and In-process functional testing / QC

Software / firmware testing if appropriate

Accelerated aging, and start of concurrent real-time aging

Shake / drop, shipping

Product bioburden, LAL (bacterial endotoxin test), particulate

Sterilization (and residuals if EO)

Other testing as appropriate per product, standards, guidance documents

Each of the above tests is a verification.  Put them all together and you've validated a product.  Compile all verification documentation into the product validation package.

DQ, IQ, OQ, PQ “Working” Definitions  

DQ (Design Qualification):  Ensure requirements are ID’d = Requirements Spec(s) are complete (include applicable guidances and standards)

IQ (Installation Qualification): Verification that item is installed per vendor’s / company’s / legal & regulatory requirements

OQ (Operational Qualification):  Optimize settings / parameters and tolerance ranges; DOE; ensure all Requirements are functional

PQs (Performance Qualifications):  Prove system reproducibility / repeatability over extended time periods with expected company “allowable worst case” inputs (shifts, times, personnel, RM, et al) – 3 or more, the exact number of PQs determined by the number of different inputs; e.g., for 24/7 work shifts:  1-3 PQs for the 1st shift, 1 for Swing, 1 for Night, 1 for Weekend/Sunday, 1 for Holiday (next one coming up) = 5-7 PQs just to address a 24/7 production schedule.
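The PQ-count arithmetic above can be sketched in a few lines of Python (a hypothetical tally, not a prescribed formula; the function and condition names are illustrative):

```python
def pq_count(extra_conditions, first_shift_runs=1):
    """Tally PQ runs for a production schedule: 1-3 runs for the
    1st shift, plus one run per additional "allowable worst case"
    input condition (swing, night, weekend, holiday, etc.)."""
    return first_shift_runs + len(extra_conditions)

# The 24/7 example from the text: four conditions beyond the 1st shift
extra = ["swing shift", "night shift", "weekend/Sunday", "holiday"]
```

With 1 run for the 1st shift this yields 5 PQs; with 3 runs it yields 7, matching the 5-7 range above.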

Determine / Draft  Test  Cases: 

For IQs (usually a checklist, with each verified requirement signed by a qualified individual, e.g., plumber, electrician, rigger, etc.), OQs, PQs (test cases for OQ and PQs, signed by the one running the test case / operator, and verified by an impartial party, e.g., QC, and dated):

List all requirements; group installation requirements under the IQ, and all other requirements under the OQ (to prove they do what they should and don't do what they shouldn't); any OQ requirements that are subject to a company's "allowable worst case" inputs would also be challenged by 3 or more PQs, with each test case expanded by samples, e.g., n=10, n=30, n=125, etc., with justification for the sample sizes chosen.

Expand requirements to specific elements that support each requirement (rephrase each element into a question to assist);

Consider how each element can be verified;

Develop a test case for each element;

State the element, the expected observation / outcome / output;

Provide check boxes, fill in the blank, or area for actual observations / outcomes / outputs; and

Include provision for Tester’s signature / initials and date, and a Verifier’s signature / initials and date  (Initials require a “log” …).

Review / refine with team.
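The drafting steps above amount to a simple record per test case: the element, the expected outcome, space for the observed outcome, and signature/date provisions for tester and verifier. A minimal sketch (field names are illustrative, not a required format):

```python
from dataclasses import dataclass

@dataclass
class TestCase:
    """One OQ/PQ test case row, mirroring the drafting steps above."""
    requirement: str              # the requirement being challenged
    element: str                  # specific element supporting the requirement
    expected: str                 # expected observation / outcome / output
    observed: str = ""            # filled in during execution
    tester: str = ""              # signature or logged initials
    tester_date: str = ""
    verifier: str = ""            # impartial party, e.g., QC
    verifier_date: str = ""

# Example row drawn from the blow molder table earlier in this post
tc = TestCase(
    requirement="Proper blow molder shot injection / size",
    element="Screw / injection cycle",
    expected="Proper shot size; no short shots",
)
```

A blank `observed` or signature field makes incomplete test cases easy to spot before the package is compiled.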

-- jel@jelincoln.com


Monday, September 12, 2022

Trending Audit Findings:

Ques:  I have a question on the alert and action limits - can we set action and alert limits for audit findings (both internal and external)?  I am well aware of the sheets and formulas for calculating alert and action limits, but my question is different:  how can we set limits for complaint or audit findings?

Ans:  For audits, observations / corrective actions should be made into CAPAs, and CAPAs would then be trended as per normal CAPA trending.

However, it should be noted that when starting to trend, it's important that meaningful Alert and Action limits be established.  The standard deviation should be developed from many readings, not usually only one month's worth.  Control Charts usually require one hundred measurements to establish the standard deviation from which the Alert and Action levels are determined, and that's what I'd recommend here.  So in most cases a trend chart's Alert and Action limits won't have useful values until at least 100 measurements exist from which a representative standard deviation can be calculated.  Once established, those Alert and Action values carry over month to month, unless the categories change substantially or some other event requires a re-evaluation.  The re-evaluation could be part of the annual QA senior management review.
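The 100-measurement recommendation above can be sketched as follows (a minimal illustration; the function and key names are mine, not a prescribed tool, and the 2-sigma Alert follows the ~1.96-2 sigma convention used elsewhere on this blog):

```python
import statistics

def trend_limits(counts):
    """Derive trend-chart limits from a history of per-period CAPA
    counts: Alert ~= 2 sigma and Action = 3 sigma above the mean,
    using a standard deviation built from ~100 measurements."""
    if len(counts) < 100:
        raise ValueError("need ~100 measurements for a representative SD")
    mean = statistics.mean(counts)
    sd = statistics.stdev(counts)      # sample standard deviation
    return {"mean": mean,
            "alert": mean + 2 * sd,    # Alert limit
            "action": mean + 3 * sd}   # Action limit
```

Once computed, the returned limits would be held fixed month to month until a re-evaluation (e.g., at the annual management review) recalculates them.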

-- jel@jelincoln.com

Friday, September 9, 2022

 Trend Limits

Ques:  Solutions to CAPA data: 

Ans:  Re: the problem of expanding 3 sigma limits as complaints increase: the only approach I have used when this problem surfaces is to have SOP-defined steps to react to the absolute number of complaints (if of a high-risk nature) rather than the Alert (1.96-2 sigma) and Action (3.0 sigma) limits; e.g., a "may cause death" problem requires a resolution to that specific event, not to a trend.  The Alert / Action limits will only trigger action if there's a major bolus of one category of complaints in one month.
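The SOP-defined logic described above (react to the absolute count for high-risk categories; otherwise to the Alert / Action limits) can be sketched as follows. This is a hypothetical illustration; the function name, risk labels, and default threshold are mine:

```python
def complaint_disposition(category_risk, count, alert, action,
                          critical_threshold=1):
    """Route a period's complaint count for one category: high-risk
    categories react to the absolute count (any single "may cause
    death" event), lower-risk ones to the trend limits."""
    if category_risk == "high" and count >= critical_threshold:
        return "investigate event"   # resolve the specific event, not the trend
    if count > action:
        return "action"              # exceeds 3.0 sigma Action limit
    if count > alert:
        return "alert"               # exceeds ~2 sigma Alert limit
    return "monitor"
```

A single high-risk complaint triggers investigation regardless of where the trend limits sit.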

Ques:  Other process capability tools besides Cpk:

Ans:  The Control Chart; or % defective; or the range or SD (standard deviation) of a key dimension; or the reverse of Cpk, Zmax/3, based on the 3 sigma limit of the tail of the distribution / curve furthest from the specification midpoint (whereas Cpk = Zmin/3 is based on the 3 sigma limit of the tail of the distribution / curve closest to a specification limit); I've never seen this last one used.
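For reference, the conventional Cpk = Zmin/3 calculation can be sketched as below (a minimal illustration; the "reverse" variant mentioned above would take the tail furthest from the mean, i.e., `max` instead of `min`):

```python
import statistics

def cpk(samples, lsl, usl):
    """Cpk = Zmin / 3, where Zmin is the distance (in standard
    deviations) from the process mean to the NEAREST specification
    limit (lsl/usl = lower/upper spec limits)."""
    mean = statistics.mean(samples)
    sd = statistics.stdev(samples)
    z_upper = (usl - mean) / sd   # sigmas to the upper spec limit
    z_lower = (mean - lsl) / sd   # sigmas to the lower spec limit
    return min(z_upper, z_lower) / 3
```

A Cpk of 1.33 or better is the common rule of thumb for a capable process; an off-center process lowers Cpk because the nearer spec limit dominates.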

Saturday, July 30, 2022

 Contract Manufacturers, Compliance, Selection, and the FDA 

The FDA seldom gets involved in contract manufacturer selection.  They hold the top manufacturer responsible for selection and for auditing to ensure suppliers meet requirements.

The key point the FDA focuses on is the part's specification, and the thing they are most concerned with is that the part meets that specification (print, material, method of manufacture, QC requirements, etc.).  They require that you rank suppliers by the patient risk of the item they supply, and document selection and periodic inspection / auditing.

If a supplier's quality becomes unacceptable, they expect the company to have the systems in place to find and resolve that, largely under CAPA (Corrective and Preventive Action).

Your choices should be ranked roughly as follows (1 is best):

1.  FDA / CGMP, 21 CFR 820 compliant, and inspected by the FDA;

2.  ISO 13485:2016 compliant, and audited by a Notified Body;

3.  ISO 9001:2015 compliant, and audited by a Notified Body;

4.  Actively working to be compliant with 1, 2, or 3;

5.  Knows the basics of 820, ISO 13485, or ISO 9001, especially change control, and is willing to become compliant.

Ultimately you should have a Quality Agreement on CGMP compliance signed by the selected company and yours.  

And the above will have to be defined by SOP, and with records to prove compliance.

-- jel@jelincoln.com

Wednesday, July 27, 2022

 Use of Recycled Material In New Device Builds

From the Q&A on a webinar I conducted on Device changes and the 510(k):

Answer to reprocessing / recycling:

1. Reprocessing is the taking of a used single-use device and disinfecting, testing, repairing, cleaning, repackaging / relabeling, and resterilizing that SUD for reuse.  Usually done by a 3rd-party reprocessor under the reprocessor's 510(k) to allow that;

2. Recycling could be the incorporation of used material in a new build.  I used the example of plastic regrind, but it could be corrugate shippers using recycled cardboard, or recycled metal used in a part / component, etc.  In such instances the company would have to make decisions based on patient risk (ISO 14971:2019), and V&V against the part specification.  If the part using recycled raw material meets spec, then it should meet CGMPs.  Use of regrind and recycled corrugate are accepted in the device industry; the final decision here is based on cost: the cost of the regrind / corrugate vs. the cost of additional testing / monitoring, if an issue.

What are other companies doing?  The "c" in CGMP.   


Concerns to address include:

o  Unknown provenance / history of the recycled material; missing "documentation" needs to be addressed by filling in the gaps with new documentation based on data;

o  Unknown number of times the recycled material has already had a portion of it recycled;

o  The introduction of extra material / particulate during the handling / introduction of recycled material, and the cleanliness of the recycled stock; e.g., black specks, other particulate / scrap;

o  Potential weakness of the resulting part using recycled material, compared to a part using virgin material, caused by the extra processing and particulate introduction, et al.

Use of recycled subassemblies:  I would tend to question their use at all, as that gets into the definition of a new vs. used device.

Also check with your legal department; e.g., how would this use affect the outcome of a jury trial involving the malfunction of a device containing recycled materials?

See FDA Guidance on use of recycled material in food contact applications:  https://www.fda.gov/regulatory-information/search-fda-guidance-documents/guidance-industry-use-recycled-plastics-food-packaging-chemistry-considerations

I couldn't find any specific device information on the FDA's website on the above. 

Another follow-on point:

Rework of a failed / non-conforming new part / component is addressed in the CGMPs, requiring dispositioning per SOP, usually by a Material Review Board; defined in a rework Work Order; supported by test / verification data if necessary; and reviewed / approved by Engineering, maybe R&D / others, and QA.

-- jel@jelincoln.com

Saturday, June 11, 2022

Two Major CGMP Non-Conformances

I have recently completed US FDA remediation assistance with two companies outside the US, manufacturing product sold in the US - one in Europe, one in India.  They had had several Notified-Body inspections of their QMS to ISO 13485, with great results.  However, they failed their first CGMP inspection by a US FDA CSO.

The key reasons for failure in both cases, each resulting in a Warning Letter (primarily due to CAPA), were:

1.  Poor CAPA systems and lack of trending; and

2. Wrong definition of risk, as in "risk-based" activities.

Let's focus on the second (for information on CAPA trending, see the post of 08/04/2020; for CAPA problems leading to 483 Observations, see FDA's Inspectional Observations database at fda.gov -

https://www.fda.gov/inspections-compliance-enforcement-and-criminal-investigations/inspection-references/inspection-observations  ).

I find that outside the US, many companies adhere to the basic premise of ISO 31000, i.e., risk is business risk, legislative risk, regulatory risk, schedule risk, financial risk, budget risk, etc.  And these are real risks to a business, and have to be considered.  But - the key point of this blog - those other legitimate definitions of risk aren't what the FDA is focused on when it talks "risk".  

This broader definition of risk exists in US companies selling to the US market as well, but it's not supported by, or allowed to perpetuate in, the QMS by regulatory inspections, which in these cases are US FDA-administered.  Any confusion over definitions is quickly addressed by the first FDA inspection, and is not supported by any alternate QMS inspection paradigm (unless the company is also selling product outside the US, and thus subject to Notified Body audits as well), as was the case with those companies in Europe and Asia.

When the FDA talks "risk-based", they're talking about ISO 14971 risk ONLY, i.e., per ISO 14971:2019, "Introduction", pg. vi, para  3:

     1. Risk to patient (safety of the patient in use of the product);

     2.  Risk to the clinician (facilitating the patient's use of the product); and 

     3.  Risk to the use environment.

Nothing else!

Such a definition of risk must permeate the company's QMS / CGMP system, must be part of the product design process (ISO 14971 for devices under Design Control, 21 CFR 820; and ICH Q9 for pharma), and must be part of the Failure Investigation / Root Cause Analysis process in CAPA resolutions, Verification and Validation issues, and similar.

Note:  Another key reason for 483 observations, not part of the above discussion, is failure to follow one's own company's SOPs / WIs, leading to "adulterated" product.   

- jel@jelincoln.com

Updated 06/20/2022 - JEL