Tuesday, September 20, 2022

Test case examples from my V&V webinars:

IQ:


Activity Completed       Signed/Dated (or N/A’d)

01. Utilities:

o   Power Hooked Up Properly       _____________________
o   UPS Required/Added       _____________________
o   Air Hooked Up Properly       _____________________
o   Surge Tanks Required/Added       _____________________
o   Water Hooked Up Properly       _____________________
o   Incoming Air Filtered       _____________________
o   Incoming Water Treated/Filtered       _____________________
o   Exhaust Air/Oil Controlled       _____________________
o   Spent Water Controlled       _____________________

02. Equipment/Tool/Fixture Number Assigned/Affixed       _____________________

03. Instruments Calibrated or NCI Tagged       _____________________

04. Safety/OSHA Requirements Addressed       _____________________

05. Documentation (Dwgs, Software...) Received       _____________________

06. Floor/Area Layout Approved       _____________________

07. Environmental Testing Completed/Acceptable       _____________________

08. SOPs Written/Approved       _____________________

09. Meets QS Regulation Requirements       _____________________

10. PM Program/Spare Parts       _____________________

11. Process Control Required/Addressed       _____________________

Comments: __________________________________________________________________ 


OQ:

Verify proper blow molder heating, shot injection / size (no short shots), “preforms”:

Verification Element:  Heater elements inj. barrel; screw cycle / operation.
Expected Outcome:  Resin achieves proper melt.
Observed Outcome:  _____________________

Verification Element:  Screw / injection cycle.
Expected Outcome:  Proper shot size.
Observed Outcome:  _____________________

Verification Element:  “Preforms” properly configured; no molding anomalies / non conformances.
Expected Outcome:  “Preforms” / cap interface to spec.  No short shots, splay, black specs, thin walls, et al.
Observed Outcome:  _____________________

Verification Element:  “Preform” ejections.
Expected Outcome:  “Preforms” ejected properly and without damage.
Observed Outcome:  _____________________

Tested by:  ___________________     Date:  ______________

                                     Verified by:  __________________     Date:  ______________


PQs (3 or more):


Similar to the above, but the test case "elements" are replaced with repetitive samples (e.g., n = 10, 30, 125, or ...).

-- jel@jelincoln.com



Working Definitions for V&V - Why Needed?
From several of my V&V webinar slides
 

ISO and CGMP definitions generally can’t be easily followed without some additional explanation.  Hence the need for “working definitions”: workable explanations.

Note:  The information presented here has also been specifically field-tested with the FDA and various N-Bs (Notified Bodies) over several decades; it is not theoretical or derivative of other consultants’ work.

These definitions are also in basic agreement with several guidance documents, ISO standards, and similar.

Verification:  Inspection, testing, or checking; includes most “qualifications” and decommissioning.  Product: Design Output = Design Input.  Usually verifies one "Requirement".

Validation:  Includes a series of verifications, many involving destructive testing; may include “commissioning”.  Usually validates a series of "Requirements / Qualifications":
Product: Customer Needs (+ standards, guidance documents, etc.) = Resulting product
FAT, SAT (may need supplemental V&Vs)
Commissioning
Process / Equipment / Facility:  DQ, IQ, OQ, PQ
Software:  In-product, As-product, Process / Equipment; 11 elements (U.S. FDA guidance “model”)
Software:  QMS; 21 CFR Part 11 ER / ES
Master Validation Plan
Site “qualification”.

Product Verification / Testing Examples:

Biocompatibility:  Cytotoxicity, Hemolysis, Sensitization / Irritation, Carcinogenicity ...

Functional and In-process functional testing / QC

Software / firmware testing if appropriate

Accelerated aging, and start of concurrent real-time aging

Shake / drop, shipping

Product bioburden, LAL (bacterial endotoxin test), particulate

Sterilization (and residuals if EO)

Other testing as appropriate per product, standards, guidance documents

Each of the above tests is a verification.  Put them all together and you've validated a product.  Compile all verification documentation into the product validation package.

DQ, IQ, OQ, PQ “Working” Definitions  

DQ (Design Qualification):  Ensure requirements are ID’d = Requirements Spec(s) are complete (include applicable Guidances and Standards)

IQ (Installation Qualification):  Verification that the item is installed per vendor’s / company’s / legal & regulatory requirements

OQ (Operational Qualification):  Optimize settings / parameters and tolerance ranges; DOE; ensure all Requirements are functional

PQs (Performance Qualification):  Prove system reproducibility / repeatability over extended time periods with expected company “allowable worst case” inputs (shifts, times, personnel, RM, et al) – 3 or more, with the exact number of PQs determined by the number of different inputs.  E.g., for 24/7 work shifts:  1-3 PQs for the 1st shift, 1 for Swing, 1 for Night, 1 for weekend/Sunday, and 1 for holiday (the next one coming up) = 5-7 PQs just to address a 24/7 production schedule.
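As a rough illustration of the PQ-count arithmetic above, the short Python sketch below simply lists a set of "allowable worst case" input conditions and sums the runs assigned to each; the conditions and per-condition run counts are illustrative assumptions only, not a prescribed scheme.

# Illustrative only: PQ run count driven by the "allowable worst case"
# inputs a company chooses to challenge (here, a 24/7 work schedule).
pq_runs_per_condition = {
    "1st shift": 2,          # typically 1-3 runs on the primary shift
    "swing shift": 1,
    "night shift": 1,
    "weekend / Sunday": 1,
    "holiday": 1,
}

total_pqs = sum(pq_runs_per_condition.values())
print(total_pqs)   # 6 here, i.e., within the 5-7 PQ range cited above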

Determine / Draft Test Cases:

For IQs (usually a checklist, with each verified requirement signed by a qualified individual, e.g., plumber, electrician, rigger, etc.), and for OQs and PQs (test cases, signed and dated by the operator running the test case, and verified by an impartial party, e.g., QC):

List all requirements; group installation requirements under the IQ, and all other requirements under the OQ (to prove they do what they should and don't do what they shouldn't); any OQ requirements that are subject to a company's "allowable worst case inputs" would also be challenged by 3 or more PQs, with each test case expanded by samples, e.g., n=10, n=30, n=125, etc., with justification for the sample sizes chosen.

Expand requirements to specific elements that support each requirement (rephrase each element into a question to assist);

Consider how each element can be verified;

Develop a test case for each element;

State the element, the expected observation / outcome / output;

Provide check boxes, fill in the blank, or area for actual observations / outcomes / outputs; and

Include provision for Tester’s signature / initials and date, and a Verifier’s signature / initials and date  (Initials require a “log” …).

Review / refine with team.
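To make the resulting test case structure concrete, here is a minimal Python sketch of a single test case record built from the steps above; the class and field names are illustrative assumptions, not a required or standard format.

from dataclasses import dataclass
from typing import Optional

@dataclass
class TestCase:
    # One verification element and its expected / observed outcomes,
    # plus the Tester and Verifier sign-offs described above.
    element: str                              # specific element supporting a requirement
    expected_outcome: str                     # expected observation / outcome / output
    observed_outcome: Optional[str] = None    # actual observation, filled in at execution
    tester: Optional[str] = None              # Tester's signature / initials
    tester_date: Optional[str] = None
    verifier: Optional[str] = None            # impartial Verifier, e.g., QC
    verifier_date: Optional[str] = None

example = TestCase(
    element='"Preform" ejections',
    expected_outcome='"Preforms" ejected properly and without damage',
)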


-- jel@jelincoln.com

 

Monday, September 12, 2022

Trending Audit Findings:

Ques:  I have a question on the alert and action limits - can we set action and alert limits for audit findings (both internal and external)?  I am well aware of the sheets and formulas for calculating Alert and Action limits, but my question is different: how can we set limits for complaint or audit findings?

Ans:  For audits, the observations / corrective actions should be made into CAPAs, and those CAPAs would then be trended as per normal CAPA trending.

However, it should be noted that when starting trending, it's important to establish meaningful Alert and Action limits.  The standard deviation should be developed from many readings, not usually just one month's worth.  Control charts usually require one hundred measurements to establish the standard deviation from which the Alert and Action levels are determined, and that's what I'd recommend here.  So in most cases a trend chart's Alert and Action limits won't have a useful value until at least 100 measurements exist, from which a more representative standard deviation can be calculated for the trend chart.  Once established, those Alert and Action values would carry over month to month, unless the categories change substantially or some other event requires a re-evaluation.  The re-evaluation could be part of the annual QA senior management review.
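A minimal sketch of that calculation, assuming the historical counts (e.g., monthly CAPA counts per category) are available as a simple list; the function and data below are illustrative only.

from statistics import mean, stdev

def trend_limits(counts, min_points=100):
    # Returns (alert, action) limits, or None until enough measurements
    # exist to give a representative standard deviation.
    # Alert  = mean + 2 * SD  (~1.96-2 sigma)
    # Action = mean + 3 * SD
    if len(counts) < min_points:
        return None   # keep collecting data before relying on the limits
    mu = mean(counts)
    sd = stdev(counts)   # sample standard deviation
    return (mu + 2 * sd, mu + 3 * sd)

history = [4, 7, 5, 6, 3, 8, 5, 4, 6, 5] * 10   # 100 illustrative readings
print(trend_limits(history))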

-- jel@jelincoln.com

Friday, September 9, 2022

 Trend Limits

Ques:  Solutions to CAPA data: 

Ans:  Re: the problem of the expanding 3 sigma limits as complaints increase; the only approach I have used when this problem surfaces is to have SOP-defined steps to react to the absolute number of complaints (if of a high-risk nature) rather than to the Alert (1.96-2 sigma) and Action (3.0 sigma) limits; e.g., a "may cause death" problem requires a resolution to that specific event, not to a trend.  The Alert / Action limits will only trigger action if there's a major bolus of one category of complaints in one month.
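A small sketch of how such SOP-defined absolute-number triggers might sit alongside the Alert / Action limits; the categories and thresholds below are purely illustrative assumptions, not recommended values.

# Absolute-number triggers for high-risk complaint categories (illustrative only),
# checked in addition to the trend chart's Alert / Action limits.
HIGH_RISK_ABSOLUTE_LIMITS = {
    "may cause death or serious injury": 1,   # any single event requires resolution of that event
    "device failure, no injury": 5,
}

def absolute_trigger(category, count_this_month):
    limit = HIGH_RISK_ABSOLUTE_LIMITS.get(category)
    return limit is not None and count_this_month >= limit

print(absolute_trigger("may cause death or serious injury", 1))   # True -> act on the specific event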

Ques:  Other process capability tools other than Cpk:

Ans:  The Control Chart; or % defective; or the range or SD (standard deviation) of a key dimension; or the reverse of the Cpk, Zmax/3, based on the 3 sigma tail of the distribution / curve farthest from its specification limit (whereas Cpk = Zmin/3 is based on the tail closest to its specification limit); I've never seen this last one used.
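For reference, a short worked example of Cpk = Zmin/3 and its "reverse," Zmax/3, using the standard textbook Z definitions; the specification limits and process values below are illustrative only.

# Illustrative values only
USL, LSL = 10.5, 9.5         # upper / lower specification limits
mu, sigma = 10.1, 0.1        # process mean and standard deviation

z_upper = (USL - mu) / sigma     # sigmas from the mean to the upper spec limit
z_lower = (mu - LSL) / sigma     # sigmas from the mean to the lower spec limit

cpk    = min(z_upper, z_lower) / 3   # Cpk = Zmin/3: tail closest to a spec limit
zmax_3 = max(z_upper, z_lower) / 3   # the "reverse": Zmax/3, tail farthest from its spec limit

print(cpk, zmax_3)   # 1.333..., 2.0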