Tuesday, July 13, 2021

Part Validation for Single Lot Run

Ever have the need to validate one lot of parts from a vendor?  Here are some ideas, based on a one-lot run off a new molding tool by a vendor prior to shipping the tool to the manufacturer for use in ongoing production.  One possible approach --

________ Molding Validation Outline

John E. Lincoln


1.  Purpose:  To provide a short-term, one-lot validation of the Injection Molding Press, New Tool, and resulting small one-time run of a [part] lot of approx. [quantity] pieces, to be done by a contract molder [name, address]. After this run the tool will be shipped to [destination] for ongoing production. Once the tool is received by [Manufacturer], it will be formally validated in dedicated press(es) for subsequent production runs. The [part] is a low patient / user risk part [supported by ISO 14971:2019 Risk Management File].

2.  Name / address / contact of SLC Molder / Vendor;

3.  Injection Molding Press:  [Description – Manufacturer, Model, S/N, capacity];

4.  Tool / Mold Description / Number, cavities …;

5.  [Part Description], P/N, P/N Specification, Lot Number, Quantity [total, fall-off, released];

6.  Press calibration data (on gauges for:  Injection Pressure, Time, Tool Temperature …);

7.  Molding Set-up Card true copy;

8.  Molding Lot / Batch actual press run data [Injection Pressure, Time, Tool Temperature …] true copy;

9.  Part’s 1st Article Test Results true copy;

10.  Part’s QC In-process test data (if any) true copy;

11.  Part’s QA Finished Part test / release data true copy;

12.  Any Deviations, Non-conformances, Change Orders, if any, and how resolved – true copies (see below);

13.  [Manufacturer] receiving / IQC test data for this lot.

For these purposes, a true copy can be a xerographic copy, certified on each document as a “true, exact, complete and unaltered” copy, and signed and dated by an authorized representative of the vendor.  Either each document can be so annotated, or a single collective “certificate” can list each of the documents, state the above, and be signed / dated by an authorized representative of the vendor.

The above is provided to assist in drafting a “mini-validation” for that lot, based on the above outline, for [the Manufacturer's] files.

14.  Results:  Once the above data is gathered, compare the set-up data, the run data / parameters, the 1st Article data, the In-process QC and final lot QA test / dimension data, to the part drawings / specifications, add functional test data (including any destructive stress testing), assembly test data, visual inspection data, et al.

15.  Conclusion:  Evaluate the requirements for the part against the actual results of the parts in the lot run, and write up the conclusions.
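The spec-versus-results comparison in items 14 and 15 can be sketched as a simple tolerance check.  A minimal Python sketch follows; the dimension names and tolerance values are illustrative assumptions only, not from any actual part drawing:

```python
# Hypothetical sketch: compare lot measurement data to drawing tolerances.
# Dimension names and tolerance values below are assumed for illustration.

SPEC = {
    # dimension: (nominal, minus_tol, plus_tol) in mm -- assumed values
    "overall_length": (25.40, 0.10, 0.10),
    "bore_diameter": (6.35, 0.05, 0.05),
}

def evaluate_lot(measurements):
    """Return per-dimension sample count, out-of-spec count, and pass/fail
    for a lot of molded parts.

    measurements: dict mapping dimension name -> list of measured values.
    """
    results = {}
    for dim, (nominal, minus, plus) in SPEC.items():
        lo, hi = nominal - minus, nominal + plus
        values = measurements.get(dim, [])
        fails = [v for v in values if not (lo <= v <= hi)]
        results[dim] = {
            "n": len(values),
            "out_of_spec": len(fails),
            "pass": len(fails) == 0,
        }
    return results

# Example lot data (hypothetical): one overall_length value exceeds 25.50 mm
lot_data = {
    "overall_length": [25.38, 25.41, 25.44, 25.52],
    "bore_diameter": [6.33, 6.36, 6.34],
}
summary = evaluate_lot(lot_data)
print(summary)
```

The same pass/fail structure can carry functional, assembly, and visual inspection results alongside the dimensional data for the write-up.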

Follow up with a formal validation of the tool in its assigned injection molding press(es) for continued production.

-- jel@jelincoln.com

Monday, July 12, 2021


Further on:  "DHFs / D&DFs for Older Products -- Responses to Questions From My Recent Webinar" (see 10/09/2017 blog):

I recently reviewed my response to the above and feel the need to add an important qualifier.  Yes, the US DHF / EU D&DPF provides the development history of a device, while the EU Technical File / Design Dossier / Technical Documentation is primarily a "snapshot" in time: the current description of the device and how it meets the requirements of applicable EU regulations, especially the old EU MDD (Medical Device Directive, especially the Essential Requirements) and now the EU MDR (Medical Device Regulation, especially the General Safety and Performance Requirements).

The point I failed to sufficiently emphasize, and why Design Control / Design and Development Planning is to be addressed in the first place in the EU's Technical Documentation File (and why it is a focus of a device CGMP compliance inspection by the FDA), is not just the development history over time (which is also valuable IP), but primarily design control, i.e., control / evaluation of the changes in the design as it evolves during R&D and prior to Design Transfer for manufacture.

When Design Control, 21 CFR 820.30, was added as a CGMP requirement in 1996-97, FDA spokespersons such as Kim Trautman emphasized that its primary purpose was to control / formalize design changes under a review (2nd party) and verification system, replacing the previous poorly controlled R&D change control environment.  This addressed a fact the FDA had identified in its post-production device monitoring: changes to a device under development were often reactive to one problem, vetted for their positive effect on that problem, but not evaluated for possible negative effects, which also need to be identified and eliminated.  The DHF / D&DPF documents the design process, including such changes, resulting in a safe and effective device with minimal to no design flaws that could negatively impact a patient / end user.

-- jel@jelincoln.com

Thursday, July 8, 2021


My response to a query from one of the participants who attended my webinar “Design History Files (DHF), Device Master Records (DMR), Device History Records (DHR), Technical Documentation Files”.

Query:  Thank you for this webinar – I definitely learned a few things. I had asked a question about whether a full PV on the fully scaled-up manufacturing process was required for an EU MDR submission – and you indicated yes.  We have also had that interpretation from at least one other person.

Ans:  My answer was more geared to the FDA. For the EU MDR, double check with your Notified-Body.  But things can happen from pilot to scale-up, hence my answer.

Also my comments on "Like For Like" -- isn't!  There are always subtle differences in so-called "identical" equipment / processes, which must be addressed in a verification or validation, albeit perhaps in a reduced format, depending upon user / patient risk and/or the nature of the potential variance.

Ques: I was just thinking that through and am wondering if that means we need to expand the scope of our Design Inputs to include the stages of Process Transfer and Validation?

We use a DIOVV Matrix – Design Inputs, Outputs, Verification, and Validation – and start with a stakeholder need that then translates into one or more design inputs.  And of course then the design outputs, verification documents, and validation documents as appropriate for each design input get listed.

Just wondering if we need to add a section to the DIOVV so that we address Process Transfer and Validation?  Would you consider those things as legitimate Design Inputs??

Design inputs really are very tricky things – we spend a lot of time debating them.

Ans:  Agreed. As I mentioned in the webinar, the definitions of DI and DO elicit strong opinions, and when I meet with clients on Design Control, we spend the majority of a session just on those two elements of the 10 in Design Control.

Bottom line, as mentioned in the webinar:  You as a company define what will be a DI and a DO, as DI’s become interim DO’s, which become interim DI’s, ad infinitum. Certainly the initial DI’s would be under the DI category, as would the final DO’s be under the DO category, with the interim DI’s and DO’s placed where they are easiest to understand per your company’s methods / definitions, as spelled out in your SOPs.


Notice the last para (2nd half) on pg 3, Concurrent Engineering on pg 5, the discussion on DI and DO on pp. 13-21, and note especially the root problem at the top of page 20: “The design output in one stage is often part of the design input in subsequent stages.”

... and Design Transfer per 820.30 is a separate / discrete step / milestone in an R&D project subject to Design Control, i.e., one destined for commercialization (see below).

Ques: It’s potentially an interesting paradigm shift because typically R&D ‘owns’ the DIOVV – we do Process Transfer – and Operations executes the final PV(s) – and so this would mean engaging Operations earlier in the Development Process, which might be a good thing.

Ans:  Under Design Control, Design Transfer is a formal, discrete activity.  But per Concurrent Engineering in the above-referenced Guidance documents, a project goes easier if all stakeholders are involved from the very beginning, as mentioned in the webinar -- the “good thing” you mentioned above.  I would expand the early participants beyond Operations to include manufacturing / production, manufacturing engineering, QA/QC, RA, and others. I practiced that with new product development in the 1980’s to great effect, with minimal “surprises” and schedules met.

-- John E. Lincoln


My responses to questions re: my recent webinar on cybersecurity:

1.  The System Administrator is a key weak link?  How can we address that?  

Ans:  The Sys Ad can make changes to the program to adapt to user needs; this can be abused, and it needs checks and balances.  All changes should be documented by the Sys Ad and evaluated as to the need for V&V, with rationale; that documentation / log(?) / lab book(?) should be reviewed and signed off / approved by an independent party (QC/QA?). The Sys Ad should be a reliable, QA/RA-inclined IT individual.

2. We're moving more of our applications and storage to the cloud.  What should we be focusing on to reduce risk?  

Ans:  Focus on the cloud provider: do they understand and agree to the need for CGMP / change control, understand the need for their clients to validate the cloud programs, and are they willing to give advance notice of anticipated changes, well before those changes take effect, to allow the client to perform any necessary regression testing / V&V?

3.  It seems that phishing attacks are growing.  How can we compensate for the weak point - our people? 

Ans:  The weak point is people. Follow the NIST guidelines discussed to train personnel on how to scan e-mails, pop-ups, websites, et al, for authenticity (of URL, site ...) vs. spoofing; check the URLs (look for purposely misspelled URLs); and avoid unsolicited content, going back to the actual source from their own addresses / URLs rather than clicking the furnished link.  Train with unsolicited / unannounced company IT-initiated phishing "attacks" periodically (with personnel informed that such "tests" are part of the job, but not told when or by what means).

4.  Where in the software V&V should we best add cyber tests?  

Ans:  The OQ, since the primary goal is to prove the requirement has been met -- does what it should do, and doesn't do what it shouldn't.  If there is allowable "worst case" input variability in the company's implementation of that requirement, due to platforms, shifts, types of records, etc., then also expand that OQ test case into PQ test cases -- several PQs, each with many samples (with the rationale for the sample number selected included in the protocol).
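One common way to document the "rationale for sample number selected" in such a protocol is the success-run theorem for attribute (pass/fail) data with zero allowed failures.  A minimal Python sketch, assuming this attribute-sampling approach is the one chosen (the source does not prescribe a specific method):

```python
import math

def success_run_sample_size(confidence, reliability):
    """Attribute (pass/fail) sample size with zero allowed failures,
    per the success-run theorem: n = ln(1 - C) / ln(R).

    confidence:  desired confidence level, e.g., 0.95
    reliability: minimum fraction conforming to demonstrate, e.g., 0.90
    """
    return math.ceil(math.log(1.0 - confidence) / math.log(reliability))

# e.g., 95% confidence that at least 90% of units conform, no failures allowed:
n = success_run_sample_size(0.95, 0.90)
print(n)  # 29
```

Citing the formula, inputs, and resulting n in the protocol gives the reviewer a traceable basis for the sample count in each PQ run.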

-- John E. Lincoln


Tuesday, August 4, 2020


When your company team develops non-conformance / complaint / CAPA trend charts (usually from a spreadsheet), they'll need to ask the following questions and determine how to present and label / categorize the data so it can be acted upon (the key reason for collecting / trending in the first place):

  • Frequency - generally monthly presentations, also showing yearly for the bigger picture;
  • Areas charted, and categories within each chart - should be based on risk of the problem to the end user / patient as no. 1, with business / financial / GMP compliance issues also important, but secondary;
  • Is all the above structured and presented to assist in getting to root causes, and addressing and eliminating the problem in a timely manner;
  • Is all the above designed to show resolution / effectiveness (e.g., a decline in the number of occurrences in each area / category);
  • How will each chart be presented to senior management; how will action be taken on each / triggers (monthly Alert at 1.96 sigma (annual); monthly Action at 3.0 sigma (annual), or ...); how will such action(s) be documented monthly (a Trend Meeting, Agenda, and Minutes?).
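The Alert / Action triggers in the last bullet can be sketched in a few lines.  A minimal Python sketch, assuming the limits are derived from the mean and standard deviation of the prior year's monthly counts (one reasonable reading of "1.96 sigma (annual)"; the counts below are made up):

```python
import statistics

def trend_flags(monthly_counts, alert_z=1.96, action_z=3.0):
    """Flag the latest month against Alert / Action limits computed from
    the prior months' counts (ideally a rolling 12-month baseline).

    monthly_counts: list of counts, oldest first, latest month last.
    """
    baseline = monthly_counts[:-1]            # prior months
    mean = statistics.mean(baseline)
    sd = statistics.stdev(baseline)           # sample standard deviation
    latest = monthly_counts[-1]
    return {
        "mean": mean,
        "alert_limit": mean + alert_z * sd,   # 1.96 sigma -> Alert
        "action_limit": mean + action_z * sd, # 3.0 sigma  -> Action
        "alert": latest > mean + alert_z * sd,
        "action": latest > mean + action_z * sd,
    }

# Twelve months of history plus a spiking current month (hypothetical data):
history = [4, 6, 5, 7, 5, 6, 4, 5, 6, 5, 7, 5, 14]
flags = trend_flags(history)
print(flags["alert"], flags["action"])
```

A month breaching the Alert limit would be raised at the monthly Trend Meeting; a breach of the Action limit would trigger the documented action(s) described above.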

These (and possibly other) issues are what the CGMPs, especially CAPA (see 21 CFR 820.100 and note a.1-7), and the Trending requirements are driving.  The FDA (and any QMS, e.g., ISO 13485) want companies to have quality systems in place that recognize system and product / supply chain / manufacturing / testing problems, highlight those problems, and force / drive change (verified / validated / monitored for effectiveness) to eliminate those problems, then move on to the next inevitable group of problems.

-- jel@jelincoln.com

Tuesday, June 23, 2020

Virtual Site Inspections / Audits in Times of Pandemic, et al

an approach ...

Virtual Audits / Inspections, if applicable:  Using the information in a Vendor Audit SOP / Attachment, develop an Audit Plan / Script that can be conducted by a video conferencing tool, e.g., Skype®, Zoom®, WebEx®, GoToMeeting®, MS-Teams®, and/or the use of a phone camera or similar.  Enlist the help of a QA/QC contact person at the vendor to assist with the video.

·     Obtain e-copies, pdf, or paper copies pre-sent of the QM, SOP Index / Titles, specific applicable SOPs, Establishment Registration Number, applicable 510(k) K numbers / dates, catalog (or website screen captures), certifications, last US FDA 483 if available, and so on;  
·     “Tour” their facility, taking visuals with the digital camera;
·     Focus on areas of interest to you, e.g., IQC, Inventory Storage, Manufacturing, specific work stations, FG QA, post-production activities, if applicable, and similar;
·     Video specific activities and specific documents related to your product (or obtain e-copies, pdf, or paper copies pre-sent);
·     Request additional supplemental videos, if needed, during course of desk audit.
·     Compile a written “narrative” of the video for the file (or retain the actual video in 21 CFR 11 / system-validated storage).

    -- jel@jelincoln.com  

One Possible Approach to an Approved Vendor List

Based on part / component criticality (to patient / end user / clinician).

Based on product risk, supplier / vendor audit results, incoming QC trend data, and other factors, a list of approved vendors and their criticality should be established and maintained by QA, separately or in each vendor’s file.
Criticality (based on risk to the patient / end user / clinician):

  Services / Equipment                              Criticality Category
  -------------------------------------------------------------------------
  01.  Lab services – contracted – clean room /
       controlled manufacturing area testing,
       biocompatibility, EO / BI / gamma
       validation and testing, GMP
       compatibility issues
  02.  Calibration services – contracted
  03.  Patient fluid path contact                   High (Major)
  04.  Tissue, bone and dentin contact
  05.  Sterile barrier
  06.  Sterilization services – contracted
  07.  Patient contact, general
  08.  Test equipment, tooling, fixturing
       manufacturers – company validated
  -------------------------------------------------------------------------
  09.  Equipment, tooling, fixturing
       manufacturers – company validated
  10.  No patient contact                           Low (Minor)
  11.  Manufacturing Materials – fully
       removed during processing (validated)

Ranking explained (reference: FDA “Guidance for the Content of Premarket Submissions for Software Contained in Medical Devices,” May 11, 2005):

Major:  A failure or latent flaw could directly result in death or serious injury to the patient or operator. The level of concern is also Major if a failure or latent flaw could indirectly result in death or serious injury of the patient or operator through incorrect or delayed information or through the action of a care provider.

Moderate:  A failure or latent design flaw could directly result in minor injury to the patient or operator. The level of concern is also Moderate if a failure or latent flaw could indirectly result in minor injury to the patient or operator through incorrect or delayed information or through the action of a care provider.

Minor:  Failures or latent design flaws are unlikely to cause any injury to the patient or operator.


6.11  Critical Vendors:  Suppliers / vendors ranked “High / Major” on the above tables are determined to be “critical”.  Critical suppliers are to be subjected to an initial site (or virtual) audit.  Follow-up reviews / frequency will be determined by patient / user risk factors as mentioned above and/or a Product Risk Management File, the quality of their deliverables, and maintenance of their CGMP compliance.

The types of vendors to be considered “critical” could consist of:  independent test labs, calibration services, contract sterilizers, manufacturers of critical / high-risk components, compliance consultants, Notified-Bodies (generally will not be audited by the company, as they are subject to their own competent authority oversight), and similar.

Moderate risk vendors will be subject to an initial desk audit (see Attachment 1), supplemented by any additional follow-up phone interviews (documented in a memo to file) and/or e-mail correspondence (initialed and dated).

All High and Moderate vendors will be subject to a periodic review, usually annually, as to performance, retention or replacement, and/or the need for any supplemental re-audit / re-inspection, documented by a memo to their file.

Low risk ranked vendors will be subject to review / action if any deliverable is found non-conforming, and such review will be included in the NCMR or other CAPA documentation.
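The oversight regimen above can be captured as a simple lookup, which is one way to keep the criticality-to-audit mapping consistent across vendor files.  A minimal Python sketch; the category keys and action strings paraphrase the text above and are illustrative only:

```python
# Hypothetical sketch of the vendor oversight regimen described in 6.11;
# keys and action strings are a paraphrase of the text, for illustration.

AUDIT_REGIMEN = {
    "high": {
        "initial": "site or virtual audit",
        "periodic": "annual performance review (memo to file)",
    },
    "moderate": {
        "initial": "desk audit, plus documented phone / e-mail follow-up",
        "periodic": "annual performance review (memo to file)",
    },
    "low": {
        "initial": "none required",
        "periodic": "review only on a non-conforming deliverable (NCMR / CAPA)",
    },
}

def required_actions(criticality):
    """Look up the initial and periodic oversight for a vendor criticality
    ranking ("High", "Moderate", or "Low")."""
    return AUDIT_REGIMEN[criticality.lower()]

print(required_actions("High")["initial"])
```

Driving the vendor file checklist from one table like this helps ensure a "High / Major" vendor is never quietly handled under the desk-audit-only path.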

              --  jel@jelincoln.com